Sure, egoists assign some value to avoiding the end of the world.
For them, it isn’t billions of times worse than all their friends and relatives dying, though.
Smaller utilities mean that the “tiny chance times huge utility” sums don’t come out the same way for egoists as they do for utilitarians.
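For concreteness, here is a toy version of that arithmetic. The numbers are made up purely for illustration, and it assumes the two kinds of stake can be put on a single utility scale:

```python
# Toy expected-utility comparison; all numbers are illustrative assumptions.
p = 1e-6                  # tiny probability that the gamble pays off

u_utilitarian = 1e15      # stake: all future lives, an astronomical utility
u_egoist = 1e3            # stake: own life and circle, a bounded utility

print(p * u_utilitarian)  # 1e9   -> the tiny chance still dominates the decision
print(p * u_egoist)       # 0.001 -> negligible; ordinary concerns win out
```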
This results in disagreements over policy issues. For instance, an egoist might regard a utilitarian organisation such as the Singularity Institute gaining power as a bad thing, since the two plainly have very different values: the utilitarians would be willing to gamble on small chances of a huge utility, while the egoist might regard that huge utility as illusory.
This is a problem because (I claim) the actions of most people more closely approximate those of egoists than those of utilitarians, since people were built by natural selection to value their own inclusive fitness.
The Singularity Institute is a kind of utilitarian club, where utilitarians club together in an attempt to steal the future against practically everyone else’s wishes.
Beware Pascal’s wager. It is also worth noting that Eliezer himself doesn’t gamble on a small probability. But perhaps you were talking about the difference the egoist could make? If so, I agree it amounts to a much smaller probability.
On the other hand, I think the prospect of living a few aeons represents by itself a huge utility, even for an egoist. It might still be worth a long shot.
If an example of where the difference shows up would help, consider these two possibilities:

1. 1% of the population takes over the universe;
2. everyone is obliterated (99% chance), or “everyone” takes over the universe (1% chance).
To an egoist, those two possibilities look about equally bad.
To those whose main concern is existential risk, the second option looks a lot worse.
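To make that concrete, here is a toy calculation. The numbers are illustrative, and it assumes the egoist’s chance of ending up among the winning 1% in the first possibility is itself about 1%:

```python
# Toy comparison of the two possibilities; all numbers are illustrative assumptions.
# Possibility 1: 1% of the population takes over the universe (certain outcome).
# Possibility 2: 99% chance everyone is obliterated, 1% chance "everyone" takes over.

egoist_survival_1 = 0.01   # chance of being among the lucky 1%
egoist_survival_2 = 0.01   # chance the gamble pays off for everyone

extinction_risk_1 = 0.0    # some of humanity certainly continues
extinction_risk_2 = 0.99   # humanity is probably wiped out entirely

print(egoist_survival_1, egoist_survival_2)  # 0.01 0.01 -> about equally bad for the egoist
print(extinction_risk_1, extinction_risk_2)  # 0.0 0.99  -> far worse on the existential-risk view
```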
I would call myself more of an egoist, and I would say the first possibility looks really good and the second possibility looks pretty bad. I of course assume that I am part of the 1%.