I think it’s a category error to see ethics as only being about what one likes (even if that involves some work getting rid of obvious contradictions). In such a case, doing ethics would just be descriptive, it would tell us nothing new, and the outcome would be whatever evolution arbitrarily equipped us with. Surely that’s not satisfying! If evolution had equipped us with a strong preference to generate paperclips, should our ethicists then be debating how best to fill the universe with paperclips? Rather, we should be trying to come up with better reasons than mere intuitions and preferences arbitrarily shaped by blind evolution.
If there were no suffering and no happiness, I might agree that ethics is just about whatever you like, and I’d add that one might as well change what one likes and do whatever, since nothing would then truly matter. But it’s a fact that suffering is intrinsically awful, in the only way something can be, for some first-person point of view. Of pain, one can want only one thing: that it stop. I know this about my pain as certainly as I know anything. And the fact that some other being’s pain is at another spatio-temporal location doesn’t change that. If I have to find good reasons for the things I want to do in life, nothing makes even remotely as much sense as trying to minimize suffering. Especially since caring about my future suffering might be no more rational than caring about all future suffering, as some views on personal identity imply.
> In such a case, doing ethics would just be descriptive, it would tell us nothing new, and the outcome would be whatever evolution arbitrarily equipped us with
I used to worry about that a lot, and then AndrewCritch explained at minicamp that the statement “I should do X” can mean “I want to want to do X.” In other words, I currently prefer to eat industrially raised chicken sometimes. It is a cold hard fact that I will frequently go to a restaurant that primarily serves torture-products, give them some money so that they can torture some more chickens, and then put the dead tortured chicken in my mouth. I wish I didn’t prefer to do that. I want to eat Subway footlongs, but I shouldn’t eat Subway footlongs. I aspire not to want to eat them in the future.
Also check out the Sequences article “Thou Art Godshatter.” Basically, we want any number of things that have only the most tenuous ties to evolutionary drives. Evolution may have equipped me with an interest in breasts, but it surely is indifferent to whether the lace on a girlfriend’s bra is dyed aquamarine and woven into a series of cardioids or dyed magenta and woven into a series of sinusoidal spirals—whereas I have a distinct preference. Eliezer explains it better than I do.
I’m not sure “intrinsically awful” means anything interesting. I mean, if you define suffering as an experience E had by person P such that P finds E awful, then, sure, suffering is intrinsically awful. But if you don’t define suffering that way, then there are at least some beings that won’t find a given E awful.
But suffering is bad no matter your basic preference architecture. That takes the arbitrariness out of ethics, since it applies to all such architectures. Suffering is bad (for the first-person point of view experiencing it) in all hypothetical universes; well, by definition. Culture isn’t. Biological complexity isn’t. Biodiversity isn’t.
Even if it’s not all that matters, it’s a good place to start. And a good way to test whether something else really matters too is to ask whether you’d be willing to trade a huge amount of suffering for whatever else you consider to matter, all else being equal (as I did in the example about the planet full of artifacts).
Yes, basically everyone agrees that suffering is bad and that reducing suffering is valuable.
And as you say, for most people there are things that they’d accept an increase in suffering for, which suggests that there are also other valuable things in the world.
The idea of using suffering-reduction as a commensurable common currency for all other values is an intriguing one, though.
(shrug) I agree that suffering is bad.
It doesn’t follow that the only thing that matters is reducing suffering.