If you ask me, the prevalence of torture scenarios on this site has very little to do with clarity and a great deal to do with a certain kind of autism-y obsession with things that might happen but probably won’t.
It’s the same mental machinery that makes people avoid sidewalk cracks or worry their parents have poisoned their food.
A lot of times it seems the “rationality” around here simply consists of an environment that enables certain neuroses and personality problems while suppressing more typical ones.
I don’t think that being fascinated by extremely low-probability but dramatic possibilities has anything to do with autism. As you imply, people in general tend to do it; being terrified of airplane crashes might be a better example.
I’d come up with an evolutionary explanation, but a meteor would probably fall on my head if I did that.
I really don’t see how you could have drawn that conclusion. It’s not like anyone here is actually worried about being forced to choose between torture and dust specks, or being accosted by Omega and required to choose one box or two, or being counterfactually mugged. (And, if you were wondering, we don’t actually think Clippy is a real paperclip maximizer, either.) “Torture” is a convenient, agreeable stand-in for “something very strongly negatively valued” or “something you want to avoid more than almost anything else that could happen to you” in decision problems. I think it works pretty well for that purpose.
Yes, a recent now-deleted post proposed a torture scenario as something that might actually happen, but it was not a typical case and it was not well received. You need to provide more evidence that more than a few people here actually worry about that sort of thing, and that it’s more than just an Omega-like abstraction used to simplify decision problems by closing off the loopholes that let you dodge the real question.
How about this guy a couple comments down?
Actually, I’m not sure he’s even serious, but I’ve certainly seen that argument advanced before. And I’m pretty sure the parent post’s “1% chance” thing is a parody of the idea that you have to give anything at least a 1% chance (“because it’s all so messy, and how can you ever be sure?!”), which has shown up on this site on several occasions recently, particularly in relation to the very extreme fringe scenarios you say help people think more clearly.
Torture scenarios have LONG been advanced in this community as more than a trolley problem with added poor-taste hyperbole. Even if you go back to the SL4 mailing list, it’s full of discussions where someone says something about AI, and someone else replies, “What, so an AI is like God in this respect? What if it goes wrong? What if religious people make one? What if my mean neighbor gets uploaded? What if, what if, what if? WE’LL ALL BE TORTURED!”
I was serious that the probability of an explicitly described situation is orders of magnitude greater than the probability of a not-yet-chosen random scenario, in the same way that any particular hundred-digit number, once someone posts it online, becomes orders of magnitude more likely to appear elsewhere.
But I was joking in the “complaint” about the posting, because the probability both before and after the post is small enough that no reasonable person could worry about the thing happening.
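To put rough numbers on that intuition (the figures here are invented for illustration, not measured): a specific hundred-digit number has about a 10^-100 chance of being produced by any one uniform random draw, so even across, say, 10^15 independently generated numbers, the chance of it ever turning up is around 10^15 × 10^-100 = 10^-85. Once it’s posted, though, reappearance is governed by copying and quoting rather than chance, so the relevant figure is more like the base rate at which posted strings get requoted — call it 10^-2 — which is dozens of orders of magnitude larger, and yet the event is still nothing worth worrying about.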
This isn’t consistent with Roko’s post here, which took seriously the notion of a post-Singularity FAI precommitting to torturing some people for eternity. ETA: Although it does seem that part of the reason that post was downvoted heavily was that most people considered the situation ridiculous.
Pondering blue tentacle scenarios can be a displacement activity: a way to avoid dealing with things that really might happen, or could be made to happen, here and now.