A related point. I don’t think the creators of The Sims, for example, anticipated that perhaps the primary use of their game would be as a sadistic torture simulator. The game explicitly rewards certain actions within the game, namely improving your game-character’s status. The game tells you what you should want. But due to our twisted psychology, we get more satisfaction locking our character in a room with no toilet, bed, food or water, with a blaring TV playing all day and night. And then killing him in a fire.
Totally normal people who are not otherwise sadists will play The Sims in this fashion. Playing “Kill Your Character Horribly” is just more fun than playing the game they intended you to play. You get more utility from sadism. An AI with unanticipated internal drives will act in ways that “don’t make sense.” It will want things we didn’t tell it to want.
Yes, this is a good point. I tried to minimize this effect (the direct utility of having fun playing the game in certain ways) by providing external incentives, which are assumed to be large enough to override the fun for people.
However, after more thinking, I’m not sure any external incentives would work in the important cases. After all, this belief structure (the player’s knowledge of being in a game, and of getting outside utility from playing toward specified arbitrary goals) appears able to override the goals of any agent, including an FAI. But if the FAI is truly friendly, then it won’t play once the game’s quality is sufficiently advanced and the NPCs become real persons. The same is probably true for normal people: they wouldn’t torture ‘sufficiently real’ characters, not for any money.
Thus, the original idea of the grandfather comment is flawed...
You know torture and execution used to be major forms of entertainment, right?
Yeees… though I’m not sure how ‘normal’ it was; it could have been mainly group effects.
I can only extrapolate from a single point, and “there isn’t anything I find even the tiniest bit tempting about nailing the skins of Yermy Wibble’s family to a newsroom wall”. Sadistically killing a ‘not real’ game character can be fun, but if I try to imagine doing the same to a ‘sufficiently real’ character, like an uploaded person, then it… doesn’t work.
What does this distinction mean? A normal person in those groups would commit torture, and there’s no such thing as a ‘normal person’ completely abstracted from ‘group effects’; a Homo sapiens without memes isn’t really a person.
For large numbers of people to abhor torture as much as we do is a bizarre (from a historical POV) recent phenomenon, AFAIK.
Group effects (peer pressure, authority, etc.) apparently can easily override personal values in humans’ corrupted hardware.
I am not sure you’re right about the historical POV. I don’t think the higher primates deliberately torture each other for fun. I could be wrong, though...
So you’re claiming that there is a difference between “group effects” and “personal values”. I’m highly dubious.
In the interests of allowing you to extrapolate from two points, have an anecdote: Assuming ‘get away scot-free’ is included, I’d torture a thirty-year-old white male for a billion dollars. That’s my starting bid, you could probably bargain me down to less money or more objectionable kinds of people if you tried.
For altruistic utilitarian reasons?
Why do you ask?
Because I could see myself being persuaded in the altruistic case, but not in the selfish one.
Altruism: the best argument for torturing people.
This is why I’m always suspicious of altruism.
I defy the data.
Milgram experiment?
I’d say Khoth is obviously correct here.
People in the Milgram experiment didn’t say they would torture people.
Khoth and TheOtherDave answered a different question. The data I defied is the truth of the first-person statement “I imagined myself torturing a person for N dollars, and according to my self-simulation, I would do it”.
??? Are you conceding that people do in fact torture one another, but denying that pedanterrific is one of those people?
If not, I completely fail to understand you.
If so, on what basis?
Agreed. They did, however, choose to inflict pain, in many cases intolerable pain, on other people.
That seems sufficient grounds for me to conclude that they chose to torture people.
An apparently irrational belief about people on LW.
(Just in case there’s any confusion: I have never tortured anyone, and don’t plan to.)
Oh, right. Yeah, I oughtn’t have implied otherwise.
“… that some people do in fact torture one another, and therefore some people are willing to torture one another, but denying that pedanterrific...” is closer to what I actually wanted to mean.
Oh, okay. As a suggestion for the future: when you mean “I think you’re lying”, don’t say “I defy the data”. They don’t mean the same thing.
I didn’t think you were lying, I thought you were mistaken.
Could you explain one possible way someone could be mistaken about “I imagined myself torturing a person for N dollars, and according to my self-simulation, I would do it”?
Your original statement was much weaker, just “I’d torture… for a billion”. I thought you were mistaken because you didn’t actually imagine it. I clarified this later in the conversation.
Okay then.
Okay! Let’s get a third opinion in here. For those just joining us, the claim (paraphrased) is that a normal person would be willing to torture someone for a billion dollars.
Anyone want to chime in on this?
I’m not sure how much money it would take, but I think most normal people would do it for free if it was socially expected.
Sure, I’ll weigh in, since you’re asking.
History, including recent history, is full of people who tortured other people.
I see no reason to believe that defining all of those people as “not normal” is in the least bit justified; that seems more likely to be a No True Scotsman fallacy in action.
Adding a concrete incentive like money probably helps, if it’s a large enough sum, but honestly introducing money to the discussion seems to clutter the question unnecessarily. Normal people will torture one another for no money at all, under the right circumstances.
In fact due to the way taboo tradeoffs work, I suspect offering people money will make them less inclined to torture.
Yeah, that thought had crossed my mind as well; it’s certainly true for small-to-medium amounts of money.
Additional opinions wouldn’t help. I think your belief that you would torture people for a billion dollars is wrong.
This is an astonishing thing to say. You’re so confident I’m wrong that if someone else stated a similar opinion you would insist they were wrong too? What if that person had a verifiable job history as a professional torturer for some third-world dictator? There comes a point where the replications outweigh your ability to judge others’ knowledge of their own minds.
But we’re talking about ‘normal’ people, not professional torturers. I am fairly confident the LW community is free of torturers, DUST SPECKS controversy notwithstanding :)
My hypothesis is that you are making a logical declarative statement, not actually imagining the process.
But I guess if you insist that you are, and if many people agree with you, I will have to update...
What’s the distinction you’re making here? Do you think all professional torturers are evil mutants, or what?
I took this possibility seriously, so I just spent a minute imagining the process in as much detail as possible.
I’m willing to come down to a hundred million.
I don’t know. I never saw one in my life.
Hmm. Ok, another hypothesis: do you use an argument like “I’ll use the money I get to improve the conditions of lots of other people, and stop many other tortures going on all over the world”, or something similar?
It didn’t work for me, but it may make sense for a stronger rationalist.
Be careful about confusing utility and pleasure. But your point about unexpected drives leading to unexpected results is absolutely true.