I’m not saying we have a settled ethic here, and still less, that its rational structure is sufficiently natural and privileged that tons of agents will converge on it. Rather, my claim is that we have some ethic here – an ethic that behaves towards “agents with different values” in a manner importantly different from (and “nicer” than) paperclipping, utilitarianism, and a whole class of related forms of consequentialism; and in particular, an ethic that doesn’t view the mere presence of (law-abiding, cooperative) people-who-like-paperclips as a major problem.
For a rather more specific (though still in fact Utilitarian) formulation of this ethic, see: A Moral Case for Evolved-Sapience-Chauvinism. Briefly, if your civilization includes some humans-who-like-paperclips (or indeed some allied sapient aliens who we can get along with, who happen to like alien-paperclips), then the appropriate definition of “utility” for that civilization includes putting some value on letting them have the paperclips they want (as long as this isn’t massively decreasing anyone else’s utility). And likewise for Alicia, Jim, Felipe, Maria, and Jason: it includes letting them enjoy their various favorite pastimes (again, as long as these don’t heavily intrude on other people living their preferred lives). “Fun” or “utility” is in the eye of the beholder, just like beauty. I actually can’t tell you what you find fun: you’re the world expert on that, and I defer to your expertise (even if I have some suggestions of things you might want to try).
It’s also extremely likely that, not only does everyone’s view on how fun paperclips are in their own back yard get summed over or cohered, but also that “letting me do what I darn well want in my own house, as long as I’m not significantly harming anyone else” (a concept often called ‘personal freedom’) is in fact part of the complex and fragile “human values” that Yudkowsky talks about, and would thus become part of the “Coherent Extrapolated Volition” that Utilitarians such as him would like to see optimized. The definition of “utility” sums over or coheres the opinions of all citizens/moral patients in the society, not just Eliezer’s, and in your own back yard, your opinion is particularly relevant, since you spend more time there than anyone else. So Utilitarianism includes Liberalism, pro tanto: the opposition between them that you set up in a previous essay was the result of taking the Atheism viewpoint to a ludicrous, solipsistic extreme — it’s not what most actual Utilitarians are proposing, Yudkowsky (or me) included.