EY: The desire not to be optimized too hard by an outside agent is one of the structurally nontrivial aspects of human morality.
The vast majority of optimization-capable agents encountered by humans during their evolutionary history were selfish entities, squeezing their futures into their preferred regions. Given enough evolutionary time, any mutant humans who didn’t resist outside manipulation would end up ‘optimized’ to serve as slave labor in favor of the ‘optimizers’.
EY: would be the gift of a world that works on improved rules
Yes, just plug the most important holes (accidental death, unwanted suffering, illness, injustice, asteroids, etc.), and let people have fun.