I liked the story, though I had a handful of thoughts.
The first is about humans who have incompatible goals. I don’t think an Olympic Gold Medalist like Usain Colt would be happy with just being the best in his shard or taking turns on the leaderboard. He wants to be the best there is, period. If there are 10 trillion people, he wants to be rank 1 out of 10 trillion. There were 10,000 Olympians in 2012, and all of them wanted to be the best and devoted their lives to that goal, even if they didn’t take gold.
Going further than that, I can’t imagine how to reconcile the 1% of the population who are psychopaths, or how to reconcile the 1% of the 1% who are truly destructive. Ted Bundy killed because he wanted to feel his power over others and murder was the best high he had ever had; such a man would think it better to harm real people instead of constructs.
I assume that Celestia would modify them rather than make everyone mutually delusional in thinking they were the best (or killing others), but I had hoped that would be touched on. I understand mutually incompatible goals are a problem for CEV and not just for your story. The way an AI resolves these mutually incompatible goals is a lot of what’s scary about an AI-controlled future.
(As a minor side point, what of humans who are okay with ponies but value not living in a matrix or being wireheaded? Do they get engineered pony bodies, or does that value just get modified out? I’d assume the latter, or that anyone who holds that value ends up like Abdul.)
The other main thought I had was that I would have liked chapter 10 as an ending rather than 11. It has a strong emotional resonance with me that the individual stories don’t, although I do realize it would change the tone of the ending somewhat significantly. Vafgrnq bs na ‘rirelobql yvirf unccvyl rire nsgre, fbegn’ raqvat vg jbhyq cebonoyl or ‘bu Tbq na bzavcbgrag NV vf gvyvat gur havirefr jvgu cbavrf naq rirelobql’f BX jvgu gung’ raqvat.
I do want to reiterate how much I liked your story. Very well done and thank you for sharing.
There were 10,000 Olympians in 2012, and all of them wanted to be the best and devoted their lives to that goal, even if they didn’t take gold.
I don’t think there could possibly be a more stupid, pointless, and horribly depressing zero-sum game than being a professional athlete, if all you care about is being “the best”. The same goes for being a scientist who only cares about scoring a Nobel rather than achieving results.
For the psychopaths, I figured this was covered with the “block” feature. If it extends from the game into the emulation, Celestia just has to tell the destructive psychopath that the ponies they are “killing” are real (she can lie) and keep them in a shard away from those who do not want to be killed. She doesn’t even need to lie per se: she can create emulation-ponies that harbor a deep desire to be killed while behaving like they want to live. And of course, introduce psychopath-ponies (or gryphons, or diamond dogs) that will act as their “friends.”
Thus, satisfying their values through friendship and ponies.
As for the Olympian… they probably do want to be the best, but that won’t always happen. There are lots of things to be the best at, though, and it wouldn’t be unreasonable for there to be different leaderboards for “in shard” and “in universe.” And Celestia explicitly notes that she is not maximizing happiness, but values: the Olympian probably values the effort and work more than just the happiness of believing themselves to be number 1. (Those who go the other way get locked in shards where they are number 1 and simply aren’t introduced to the wider world.)
But that’s me interpreting another person’s story about a super-intelligent AI that is presumably smarter than me, or the author, or both of us combined.
I guess my main objection is how the conflict between “valuing being in the true world” and other values is portrayed. Celestia sides against valuing the true world, and this theme is covered in Light Sparks’s story, but it would become explicit if there were direct conflicts. In the end, what Celestia does is just very sophisticated networked wireheading.
In my mind, this means Hanna failed, but it isn’t quite portrayed as a failure. There is a hint of the horror as Celestia consumes galaxies and tiles the universe with wireheaded ponies, but just a hint. It’s more subtle than I’d prefer. Specifically, I’m worried that someone reading it without a LessWrong background would miss all that.
(Also, I just assumed the block function is a dummy button and doesn’t actually do anything when you press it.)
I don’t think there could possibly be a more stupid, pointless, and horribly depressing zero-sum game than being a professional athlete, if all you care about is being “the best”. The same goes for being a scientist who only cares about scoring a Nobel rather than achieving results.
Ouch.
I have a Less Wrong background, and I don’t get what the problem is with abandoning the True World for a universe that actually cares about us.