I started reading the first page, and it looks like it’s a fic about an engineered utopia with constructed simulated minds.
That’s enough right there. Large-scale manufacture of pony minds for a video game is exactly the kind of thing that Eliezer would call horrifying, even if it didn’t wrench his guts with terror. Think of it as endorsing the claim that you should feel horror in response to the Optimalverse, even though you probably won’t, because it’s nicer than present reality in many respects and because human emotions don’t properly reflect reality (scope neglect, the hedonic treadmill, not crying when you walk through a cemetery until you reach the row of infant graves). Or maybe his guts do wrench. Different people get upset about all sorts of stimuli, from squirting blood to scraping nails to clicking computers to a microcosm inhabited by adorable inhuman sentients to whom no one gives proper moral consideration.
It also happens to serve Eliezer’s interests to make it seem that he is an expert designer of utopias, against whose work everything else falls disastrously short. But that’s not such a high standard.
That’s what I thought until the last chapter. The whole time, I was waiting for something horrible to happen. I thought that in the last chapter, at the latest, things would have to take a very dark turn indeed to make up for all the full-on utopianism that came before.
Instead, the protagonist becomes a godlike intelligence herself. She not only achieves things outside her virtual world, but gains a far deeper understanding of the physical universe than she could ever have as a human. And she’s one of trillions to do so. I can’t fathom any rational reason anyone wouldn’t want to be that transpony, or one of the Superhappies from Three Worlds Collide.
Actually, if I recall correctly, in the original Friendship is Optimal, once they were constructed, the non-uploaded people received the same moral consideration as those originally human. They were designed to fit into a preconceived world but they weren’t slaves. I’m not quite sure whether that feels bad because it’s actually bad or because it’s so very different from our current methods of manufacturing minds (roll genetic dice, expose to local memes, hope for the best).
I don’t see it as bad at all, and I suspect most who do see it as bad do so because it’s different from the current method. These minds are designed to have lives that humans would consider valuable and that they themselves enjoy in all their complexity. It is like making new humans the usual way, but without the problems of an abusive upbringing (the one pony with an abusive upbringing wasn’t a person at the time) or the other bad things that can happen to a human.
Some people find house elves horrifying, even the cheerful ones.
Making house elves is horrible, but once they exist it’s ceteris paribus better to satisfy their desire to serve than not.
(Making house elves is horrible because they are capable of suffering but not resistance. It’s the “I Have No Mouth And I Must Scream” situation.)
I’m curious: why was this comment retracted? Have you changed your opinion?