I personally didn’t find the actual experience at Equestria itself terrifying at all. It was a little disturbing at first, but almost all of that was sheer physical disgust or a knee-jerk sour grapes reaction. But it seems to avoid almost all of the pitfalls of failed Utopias everywhere:
You interact with real, sentient creatures who are independent and have their own values and desires. Thunder is capable of getting hurt, angry, and frustrated with his wife. Limeade is capable of feeling envious of her friend. They are in no way less than any complete human mind, and are given the same moral weight. They satisfy Lavender’s values, but only as an effect of satisfying their own values, not as their primary directive. The love and friendship are real.
You’re not isolated from other uploaded humans. Very few shards of Equestria grow around only one upload; most uploads interact with others from their Earthly lives.
It’s not stagnant—jryy, guvf vf n ovg qrongnoyr, V xabj, orpnhfr bs gur Ybbc Vzzbegnyf, but there are always new things to learn and discover and opportunities for growth and enlightenment if you so choose.
It’s not devoid of pain or sadness; it’s only devoid of arbitrary pain or sadness. It recognizes that to live a fully human life, you need sadness and frustration sometimes; it just makes sure that the pain is, as Paul Graham said, the pain of running a marathon, not the pain of stepping on a nail. Not everything is perfect.
That said, there were moments of genuine horror, mainly stuff people have pointed out before:
Perhaps trillions and trillions of sentient alien species were wiped out to expand Celestia’s empire.
The people left behind, who didn’t upload, are living in a post-apocalyptic wasteland. Celestia was no doubt capable of arranging for functional societies and amenities for those who chose not to upload, but her primary directive was to satisfy values through friendship and ponies, and making life hell for those who held out made them more likely to upload quickly.
Fridge Logic: One of Síofra’s coworkers said his version of the PonyPad game was like God of War; he brutally slaughtered and tortured ponies as part of Celestia’s palace guard. Well, what would happen to his shard of Equestria when he uploaded? Would he be massacring living minds? Presumably the ponies in Horndog Dan’s version of Equestria truly desired him and satisfied their own values by having sex with him, but what about the ones who were killed to satisfy the other colleague’s desire for heroism? Presumably his values don’t involve killing ponies who are essentially automata that exist only to be killed; he wants to kill genuinely evil enemy minds, not drones. Also, how does Celestia manage to satisfy the values of sociopaths with “friendship and ponies”?
I suspect your fridge logic would be solved by fvzcyl abg trggvat gb qb jung ur jnagrq, hagvy ur jvfurq ng fbzr cbvag gung ur jbhyq abg or n fbpvbcngu. I’m more worried about the part you rot13ed, and I suspect it’s part of what makes Eliezer consider it horror. I feel that’s the main horror part of the story.
There are also the issues of Celestia lying to Lavender when she clearly wants the truth on some level, the worry about those who would have uploaded (or uploaded earlier) if they had a human option, and the lack of obviously possible medical and other care for the unuploaded humans (whose values could be satisfied almost as much as those of the ponies). These are instances of an AI that is almost-but-not-quite Friendly (and, in the simple fictional story as opposed to everyday life, they could have been easily avoided by telling Celestia to “satisfy values” and that most people she meets initially want friendship and ponies). These are probably the parts Eliezer is referring to, given his work on avoiding uFAI and almost-FAI. On the other hand, they are far better than his default scenario, the no-AI scenario, and the Failed Utopia #4-2 scenario in the OP. EDIT: Additionally, in the story at least, everything except the lying was easily avoidable by having Celestia just maximize values, while telling her that most people she meets early on will value friendship and ponies (and the lying at the end seems somewhat out of character, because it doesn’t actually maximize values).
One other thing some might find horrifying, but probably not Eliezer, is the “Does Síofra die?” question. To me, and I presume to him, the answer is “surely not,” and the ethical question boils down to a simple check: does there ever exist an observer-moment without a successor; i.e., has somebody died? Obviously some people do die preventable deaths, but Síofra isn’t one of them.