Why those particular rights? It seems rather convenient that they mostly lead to beneficial consequences and jibe with human intuitions. Kind of like how biblical apologists have explanations that just happen to coincide with our current understanding of history and physics.
If you lived in a world where your system of rights didn’t typically lead to beneficial consequences, would you still believe them to be correct?
What do you mean, “those particular rights”? I haven’t presented a list. I mentioned one right that I think we probably have.
It seems rather convenient that they mostly lead to beneficial consequences and jibe with human intuitions. Kind of like how biblical apologists have explanations that just happen to coincide with our current understanding of history and physics.
Oh, now, that was low.
If you lived in a world where your system of rights didn’t typically lead to beneficial consequences, would you still believe them to be correct?
Do you mean: does Alicorn’s nearest counterpart who grew up in such a world share her opinions? Or do you mean: if the Alicorn from this world were transported to a world like this, would she modify her ethics to suit the new context? They’re different questions.
Yeah, but most people don’t come up with a moral system that leads to undesirable consequences in typical circumstances. Ditto for going against human intuitions/culture.
They’re different questions.
Now I’m curious. Is your answer to them different? Could you please answer both of those hypotheticals?
ETA: If your answer is different, then isn’t your morality in fact based solely on the consequences, and not on some innate thing that comes along with personhood?
does Alicorn’s nearest counterpart who grew up in such a world share her opinions?
Almost certainly, she does not. Otherworldly-Alicorn-Counterpart (OAC) has a very different causal history from me. I would not be surprised to find that she and I differ on any given opinion, ethical opinions included. She probably doesn’t even like chocolate chip cookie dough ice cream.
if the Alicorn from this world were transported to a world like this, would she modify her ethics to suit the new context?
No. However: after an adjustment period in which I became accustomed to the new world, my epistemic state about the likely consequences of various actions would change, and that epistemic state has moral force in my system as it stands. The system doesn’t have to change at all for a change in circumstance, with its accompanying new consequential regularities, to motivate changes in my behavior, as long as I have my eyes open. This doesn’t make my morality “based on consequences”; it just means that my intentions are informed by my expectations, which are in turn influenced by inductive reasoning from the past.
I guess the question I meant to ask was: In a world where your deontology would lead to horrible consequences, do you think it is likely for someone to come up with a totally different deontology that just happens to have good consequences most of the time in that world?
A ridiculous example: If an orphanage exploded every time someone did nothing in a moral dilemma, wouldn’t OAC be likely to invent a moral system saying that inaction is worse than action? Wouldn’t OAC also likely believe that inaction is inherently bad? I doubt OAC would say, “I privilege the null action, but since orphanages explode every time we do nothing, we have to weigh those consequences against that (lack of) action.”
Your right not to be killed has a list of exceptions. To me this indicates a layer of simpler rules underneath. Your preference for inaction has exceptions for suitably bad consequences. To me this seems like you’re peeking at consequentialism whenever the consequences of your deontology are bad enough to go against your intuitions.
I guess the question I meant to ask was: In a world where your deontology would lead to horrible consequences, do you think it is likely for someone to come up with a totally different deontology that just happens to have good consequences most of the time in that world?
It seems likely indeed that someone would do that.
If an orphanage exploded every time someone did nothing in a moral dilemma
I think that in this case, one ought to go about getting the orphans into foster homes as quickly as possible.
One thing that’s very complicated and not fully fleshed out, which I didn’t mention: in certain cases, one might be obliged to waive one’s own rights, such that failing to do so is a contextually relevant wrong act and forfeits the rights anyway. It seems plausible that this could apply to cases where failing to waive some right will lead to an orphanage exploding.
It seems rather convenient that they mostly lead to beneficial consequences and jibe with human intuitions.
Agreed. It is also rather convenient that maximizing preference satisfaction rarely involves violating anyone’s rights and mostly jibes with human intuitions.
And that’s because normative ethics is just about trying to come up with nice-sounding theories to explain our ethical intuitions.
Umm… torture vs. dust specks is both counterintuitive and rights-violating. Utilitarian consequentialists also flip the switch in the trolley problem, again violating rights.
It doesn’t sound nice or explain our intuitions. Instead, the goal is the most good for the most people.
maximizing preference satisfaction rarely involves violating anyone’s rights and mostly jibes with human intuitions.
Those two examples are contrived to demonstrate the differences between utilitarianism and other theories. They hardly represent typical moral judgments.
Because she says so. Which is a good reason. Much as I have preferences for possible worlds because I say so.