Nobody has ever come up with the correct solution to how Eliezer Yudkowsky won the AI-Box experiment in less than 15 minutes of effort. (This includes Eliezer himself). (75%)
Salivanth
Army1987: Not sure what the rules are for comments replying to the original, but hell. Voted down for agreement.
Agree the chance is >50%, but upvoted for overconfidence.
Okay, given the confusion, I’ve retracted my downvote. I’ve also seen a comment get about 27 karma on this thread replying to another post, and that comment was certainly not massively irrational, so I assume we vote normally if it’s not a first-order comment.
Similar to this are Quirrell’s mannerisms at the very start of his first class, meant to get you to think he’ll be the typical Quirrell before he bursts into a confident diatribe, but people have speculated on that one, too.
What makes you think a self-improving super-intelligence gone wrong will be restricted to a single planet?
Damn, I hadn’t thought of that...to spend MONTHS discovering something REALLY COOL like gravity, and then realising: You can never tell anyone. Ever.
So tempting...if only I had the ability and knowledge to write such a thing. I’ll certainly look back at this idea at a later date.
I think my ideal would be a combination of both: You’re allowed to know the level above your own, maybe even two levels up, but three levels up is a mystery. So you get to look forward to what comes next, but still not know EVERYTHING. And that way, you want to level up not only to get Level + 1, but to FIND OUT what you get in Level + 3...
Ben Jones didn’t recognise the dust speck as “trivial” on his torture scale, he identified it as “zero”. There is a difference: if dust-speck disutility is equal to zero, you shouldn’t pay one cent to save 3^^^3 people from it. 0 × 3^^^3 = 0, and the disutility of losing one cent is non-zero. If you assign an epsilon of disutility to a dust speck, then 3^^^3 × epsilon is way more than 1 person suffering 50 years of torture. For all intents and purposes, 3^^^3 = infinity. The only way that Infinity × X can fail to be worse than a finite number is if X is equal to 0. If X = 0.00000001, then torture is preferable to dust specks.
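The arithmetic here can be sketched directly. The following is a toy illustration only: a large finite number stands in for 3^^^3 (which is far too big to represent), and the figure for the torture’s disutility is arbitrary.

```python
# Toy illustration of the zero-vs-epsilon argument.
# HUGE is a stand-in for 3^^^3 (the real number is unimaginably larger);
# TORTURE is an arbitrary finite disutility for 50 years of torture.
HUGE = 10 ** 100
TORTURE = 10 ** 12

def speck_total(per_speck_disutility):
    """Total disutility of giving HUGE people one dust speck each."""
    return per_speck_disutility * HUGE

# If a speck's disutility is exactly zero, the total is zero,
# so any non-zero cost (one cent) outweighs preventing the specks:
assert speck_total(0) == 0

# But any positive epsilon, however tiny, swamps the torture:
assert speck_total(0.00000001) > TORTURE
```

The point is that the comparison flips discontinuously at zero: no finite torture figure changes the outcome once epsilon is positive.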
Actually, I ended up resolving this at some point. I would in fact pick the dust specks in this case, because the situations aren’t identical. I’d spend a lot of time in my 3^^^3 lives worrying if I’m going to start being tortured for 50 years, but I wouldn’t worry about the dust specks. Technically, the disutility of the dust specks is worse, but my brain can’t comprehend the number “3^^^3”, so it would worry more about the torture happening to me. Add in the disutility of worrying about the torture, even a small amount, across 3^^^3 / 2 lives, and it’s clear that I should pick the dust specks for myself in this situation, regardless of whether or not I choose torture in the original problem.
You might be right. I’ll have to think about this, and reconsider my stance. One billion is obviously far less than 3^^^3, but you are right in that I would prefer the 10 million dollars stolen by you to the 100,000 dollars stolen by Eliezer. I also consider losing 100,000 dollars less than or equal to 100,000 times as bad as losing one dollar. This indicates one of two things:
A) My utility system is deeply flawed.
B) My utility system includes some sort of ‘diffusion factor’ wherein a disutility of X becomes <X when divided among several people, and the disutility becomes lower the more people it’s divided among. In essence, there is some extra disutility in one person suffering a great deal that isn’t there when the suffering is divided among many people.
Of this, B seems more likely, and I didn’t take it into account when considering torture vs. dust specks. In any case, some introspection on this should help me further define my utility function, so thanks for giving me something to think about.
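Option B can be made concrete with a toy model. The functional form below, a power-law discount with an exponent alpha, is entirely my own invention for illustration; it is just one way such a ‘diffusion factor’ could behave.

```python
def diffused_disutility(total_harm, n_people, alpha=0.5):
    """Hypothetical 'diffusion factor' model (option B above).

    The same total harm, split evenly among n_people, counts for
    total_harm * n_people ** (alpha - 1). With alpha = 1 this reduces
    to ordinary additive disutility; with alpha < 1, harm spread over
    more people is discounted more heavily.
    """
    return total_harm * n_people ** (alpha - 1)

# Harm concentrated on one person counts at full value:
assert diffused_disutility(100_000, 1) == 100_000

# The same harm spread over a million people counts for far less:
assert diffused_disutility(100_000, 10 ** 6) < 100_000
```

Under any such model, dust specks divided among 3^^^3 people would be discounted relative to 50 years of torture concentrated on one person, which is consistent with the intuition described above.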
Then I choose the torture. I’ve grown a bit more comfortable with overriding intuition in regards to extremely large numbers since my original reply 3 months ago.
Need help with an MLP fanfiction with a transhumanist theme.
That’s...an interesting point. I never actually thought that Celestia and Luna could move the celestial bodies because they were alicorns. I always just thought they could move them because it was their special talents, and it was unique magic they could do because of their knowledge or talent, not because only they had the brute force to do it. After all, the only fight Celestia was ever in canonically, she lost, and it wasn’t even all that climactic either.
In the event that all alicorns have royal-sister levels of power (Again, my assumptions have blinded me here, I always thought that being an alicorn would basically be like being a very powerful unicorn, and Celestia and Luna were so powerful because, in addition to being alicorns, they were also thousands of years old and knew far more about magic than any mortal) that’s an extremely, EXTREMELY good reason not to do it.
I see no reason why earth ponies would lose their talents, though they might decide they want to do something else upon gaining MORE talents. That said, that could easily apply both ways: A pegasus who can now not only fly, but also have a deeper relationship with the earth might decide they’d rather farm than be a weather pony.
And you’re right about the diversity. This isn’t a very good Celestia argument imo, but if I expanded the story, it could certainly be a justification for certain groups to oppose the concept, such as groups that believe that the unique culture of their race would be destroyed by this move.
Even if I make my assumptions canon within the fic, there’s no reason to expect the average pony on the street would know it rather than assuming your assumptions were correct. And given the irrationality of humans, and that ponies seem to act mostly like humans, a mere statement that “This is the way alicorn power is” isn’t going to be enough to assuage the populace...
Lastly, should I fail to get any real discourse going here, I can delete it now. The LessWrong group on a popular MLP fanfiction site is much, much more active than it looked. It hadn’t gotten any activity in its forum in weeks, but when I posted this question there, I started getting replies very quickly. If I’d known that I’d get that level of activity, I wouldn’t have bothered to post this here at all.
Good point. It also makes Celestia look like a much more credible character. One of my biggest problems was “Why the hell hasn’t Celestia come up with this solution a thousand years ago?” and by making it genuinely really difficult to make the mass alicornification work properly, I can come up with a plausible answer for this that isn’t “Celestia isn’t rational.”
For what it’s worth, I think I’m going to keep the particular thing you quoted, because I think it makes significantly more logical sense for alicorns, which are supposed to emulate the strengths of all three races, to be able to do everything than to lose the ability to do certain things. But I’ll probably change the power level of alicorns to be more dangerous, as to me that makes just as much sense as my own interpretation of their power, and ought to be an equally challenging obstacle to the protagonists as the loss of talents would be.
A large number of the things you mention become less dangerous in the event of greater alicorn presence in Equestria, not more. Nightmare Moon, Discord and Chrysalis ALL almost won, and if even just a few dozen alicorns had existed, they wouldn’t have stood a chance in hell.
Now, the whole existential risk angle...is a very interesting point, since based on what I’ve just argued, the logical meeting-ground between the two would be to have a task force of alicorns, say, at least a dozen, but no more than a hundred, all comprised of ponies Celestia trusted sufficiently. The chance of alicorn-related existential risk increases, but the chance of the next season’s villain killing everyone plummets to nearly zero. So, given your predictions on the power of alicorns, you’re right. If we take the prior that alicorns automatically gain Celestia-level powers, it’s far, far too dangerous to give everyone that kind of power, and immortalising everyone is a very, very bad idea.
In fact, this very argument leads me to believe that, in order to provide the optimum amount of conflict in the story, alicorns need to be a lot more powerful than unicorns, but not automatically god-tier. Alicorns should have the potential to reach the power of Celestia and Luna, but imagine if Celestia and Luna were immortal unicorns: Based on their great amount of knowledge, they would still likely be more powerful mages than any other unicorn alive. So this could easily extend to alicorns as well. This still brings about the existential risk angle. Powerful mortals can already cast spells like Want-It-Need-It and alter Parasprites, but a lot more ponies would be capable of such things if they were alicornified. My own personal belief about alicorn power levels subscribes to this idea, but as another LWer pointed out, I shouldn’t make the world convenient for me. I should make it as inconvenient as possible while still allowing the protagonists to win, because that makes for a much, much better story than “Deathists are always wrong about everything forever.” But I don’t think this is a problem that allows rational protagonists to win. They’d have to back down.
As for your FAI question: The answer is, no, I don’t want to convince a Friendly AI of this, but Celestia is not a friendly AI. She’s immortal, she’s the ruler of Equestria, and she’s definitely much wiser than just about any mortal, but she’s not a superintelligence. She’s not so far beyond ponies in mental ability that the concept of challenging her judgement is ludicrous. She has pony-level intelligence, just a lot more years to learn things. But, as we can extrapolate from elderly humans, sometimes age has its deficits as well, making people more inflexible in their opinions. Your argument for existential risk is what would convince me if I were Twilight, not Celestia saying it’s too dangerous and me blindly trusting said judgement. Celestia knows more than Twilight, but not so much more that in an argument between the two, Celestia can never be wrong.
Wait, this is a thing? I’ve only ever seen one small one-shot that had a transhumanist vibe to it. (Mortality Report) All the other “Reactions to immortality” ones I’ve seen have been all about how terrible it was. If there are already a few well-written explorations of this exact concept, is there even a good reason to write this one?
Also, does anyone have some links to these, or at least names/authors? Whether my writing this fanfiction is still worth doing or not, getting more ideas is unlikely to be a bad thing.
I was referring to the concept as transequinism in my head, but I think “transponyism” is a lot more memorable, so I think I’m gonna go with that. Would you be okay with my using that as a title if I can’t think of something better?
Thanks for the inspiration for this idea, by the way :) I might not have thought of it if not for Luminosity and Radiance.
And, speaking of which, something I was wondering about: Is your name actually inspired by the alicorns from MLP? Believe it or not, I only thought of the association a few weeks ago, but I wasn’t curious enough to PM you about it.
Man, that’s a good one. It’s certainly interesting to know that my ability to override intuition when it comes to large numbers is far less effective when the question is applied to me personally. I’m assuming that this question involves no other ill effects from the specks. And I know I should pick the torture. I know that if the torture is the best outcome for other people, it’s the best outcome for myself. But if I were given that choice in real life, I don’t think I would as of writing this comment.
I have some correcting to do.