I don’t actually go to meetups, but Harry’s comments about anti-conformity training made me wonder if it’d be worth trying.
You could retest the original experiment, to see whether lesswrongians can avoid the effect through knowledge of it.
You could mock obviously true statements to practice withstanding opposition.
You could practice doing harmless but nonconformist things, so that if a situation ever called for something unusual you wouldn’t be held back by conformity or embarrassment. (Each meeting attendee shall order a coffee whilst wearing the ceremonial tea-cosy!) I suspect some of this overlaps a little with PUA and easily veers into general confidence building.
I don’t know if rehearsals would do any good, but you could go through the motions of not complying with the Milgram experiment, making people handle little fake emergencies...
You could wonder if EY is planning things like this for the Center for Modern Rationality.
Really, this is how I feel. I’d be really surprised if a setup like that actually worked. I’m not sure Harry is supposed to actually believe (with any confidence) that it works for Chaos. Ultimately you know and everyone else knows that it’s just a charade, and that really your “nonconforming” is just conforming one level below surface: You stand there and take abuse that you know to be insincere, and then get a pat on the back about it later, just like everyone else did on their turn.
Hopefully CMR has a better exercise in mind. A really good anti-Asch training tool seems like a great thing to have.
You could mock obviously true statements to practice withstanding opposition.
The danger with this seems to be that you’ll also be developing skills for attacking correct positions. It’s training you to develop tactics for entrenching yourself in incorrect beliefs. Also it seems to lend itself to the view of arguments as status conflicts rather than group truth-investigation (though I suppose we do need to at least practice how to handle arguments with people who do perceive them this way).
I disagree. I think it would have a very good chance to work.
To a perfect Bayesian, the importance of an act is not what it looks like on the surface, but the state of the world that makes such an act possible. Unfortunately (or fortunately in this case), human minds are not perfectly Bayesian.
To the human mind, merely resembling another thing is enough to form connections and associations between the two. This is why public speaking courses can improve people’s abilities and lessen their fear of public speaking. Even though people know they’re just speaking in front of a class that is obligated to receive the speech well, their minds naturally reduce the anxiety they feel about any future speaking engagements. The mind says “eh, it’s close enough. I can do this,” just as anti-conformity training should fool the mind into considering it ‘close enough’ to real disagreement. Speech classes don’t work perfectly, just as chaos training (I assume) doesn’t work perfectly, but they’re pretty good.
Anti-conformity training seems practically identical to a proven training method, and thus I rate it highly likely to work.
Well, I guess it’s just an empirical question where we differ in predictions. Personally I don’t think the analogy with public speaking is very strong, because public speaking classes are actually public speaking. People stand up and speak in front of lots of people, that’s just what it is.
Upon reflection though, it does seem like there’s one way that it might help, which is that it might help you figure out how to go about non-conformity, what exactly you can do or say in such a situation. So even if your mind doesn’t buy into the charade, roleplaying with good partners might help you figure out ways to navigate a non-conformity situation. Having those methods worked out in advance might make you less hesitant to speak out in real world situations, but only to the extent that your hesitation is about not knowing what exactly to say or do (as opposed to fear of social punishment, the usual explanation for Asch’s results).
What I’ve always wondered about with Asch’s experiment is how much of a difference a small monetary incentive (say, $1 per correct answer) would make. It seems like the experiment is odd in that there is no incentive to give correct answers, but at least a potential or perceived social incentive to give conforming ones. This seems like it would be relevant to our disagreement because it’s a question of whether the situation becomes different when something is actually on the line. Unfortunately I can’t seem to google up any examples of variations like this.
I would be really interested in the result of this experiment.
Anti-conformity training seems practically identical to a proven training method, and thus I rate it highly likely to work.
There is a difference. Even if the class (during a public speaking course) is obligated to receive the speech well, you know that their approval might be insincere and that’s still scary. In the proposed nonconformity exercise you would be sure that the other participants don’t really disapprove.
I think that if you’ve got a deeply habitual inhibition against firmly disagreeing with people, even a known-to-be-simulated experience of breaking the inhibition can help quite a bit.
Put another way: if you have an inhibition against doing something, you will also have an inhibition against simulating it. But the latter is easier to break by reminding yourself that it is not real.
Personally, it struck me as something I would try on a group of first graders (provided I knew I wouldn’t be sued) but not on a group of adults. The children would know it’s just a game, and treat it as such, but nobody’s going to refuse, because to a first grader it sounds like a very fun game (approached from the right mindset, anyway, and provided you make sure the audience doesn’t take it too far; I’d probably hand-pick the people doing the “criticizing” and make myself a member of that group so I could step in if anyone was being problematic). So they all do it, and they all think it doesn’t matter outside the game, except that since it was such an unusual thing to ask of them, they’re going to remember when you tell them at the end of the lesson why they did it, and they’re not going to forget it anytime soon. Preferably do it when they would normally be having a “normal” lesson.
Harry’s army is about four years too old for that angle to work, so I wouldn’t expect much of the “training”. I’d expect more from the entirely conscious chain of reasoning: they respect General Potter, he has them do all kinds of weird “training” things and an awful lot of them turn out to be good ideas, and he told them outright that he thinks this is an important thing for them to try to do. But then, that’s a conscious chain of thought, and by the time an issue like this hits conscious thought it’s already passed all the lines of defense that matter. So I wouldn’t expect much of it. But hey, he’s already employing a scattershot approach towards their weirdness training, so if this one idea doesn’t work out it doesn’t cost him much more than the time it took him to plan the exercise.
But hey, he’s already employing a scattershot approach towards their weirdness training, so if this one idea doesn’t work out it doesn’t cost him much more than the time it took him to plan the exercise.
There’s almost certainly a way to do this sort of thing that would function quite well as a morale and team-building exercise even if it didn’t work at all on the conformity level.
Not that the Chaos Legion needs more morale.
Ultimately you know and everyone else knows that it’s just a charade, and that really your “nonconforming” is just conforming one level below surface: You stand there and take abuse that you know to be insincere, and then get a pat on the back about it later, just like everyone else did on their turn.
Indeed … This sounds like an initiation, not a rationality exercise.
In one sense, yes I agree it’s a charade, but people are non-rational and often very sensitive to the form of things. To me it sounds at least worth trying.
Pondering this further, I think the biggest problem is finding a way to measure conformity even in the face of people knowing they’re being tested for conformity.
Do not have the audience be part of the group being tested. Pull in confederates off the street, and tell them about the test. Do not allow subjects to see each other’s testing. Let’s say now that the current subject is Alex. Alex prefers vanilla ice cream to chocolate ice cream. Now go through the anti-conformity training.
After the training, hold a break (still with just Alex and the confederates). Offer ice cream in chocolate, vanilla, and, say, mango. Have most (maybe about 80%) of the confederates go for the chocolate, 10% for the vanilla, and 10% for the mango.
The mango should help to decrease the suspicion, as should having not everybody go for the chocolate. It may help to have the confederates go through the training as well, to decrease suspicion.
The problems I see with this are
a) Cost. This one I’ll ignore, because that is a matter of practicality.
b) The subject group is not the group conforming. This will decrease the likelihood of conforming.
The problem with having the subject group be the confederates, is that then the subject group knows how the test is being done.
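To get a rough sense of whether a test like this could detect anything, here is a quick Monte Carlo sketch. All the numbers (conformity rates, group sizes) are made-up assumptions for illustration; the point is just that with plausible effect sizes you need a fair number of subjects per group before the trained/untrained difference shows up reliably.

```python
import random

def count_conformers(n_subjects, p_conform, rng):
    """Each subject privately prefers vanilla; with probability p_conform
    they switch to chocolate when ~80% of confederates pick chocolate."""
    return sum(rng.random() < p_conform for _ in range(n_subjects))

def experiment(n_subjects, p_untrained, p_trained, trials, seed=0):
    """Fraction of simulated experiments in which the trained group
    shows strictly fewer conformers than the untrained group."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        untrained = count_conformers(n_subjects, p_untrained, rng)
        trained = count_conformers(n_subjects, p_trained, rng)
        if trained < untrained:
            wins += 1
    return wins / trials

# Hypothetical effect: untrained subjects conform 40% of the time,
# trained subjects 20%.  With 20 subjects per group the difference
# shows up in most, but not all, simulated runs.
print(experiment(20, 0.40, 0.20, trials=10_000))
```

This doesn’t fix problem b), of course; it only suggests how big the groups would need to be even if the measurement itself worked as intended.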
Based on my own experiences being a lone dissenter, the main thing that has allowed me to stand up and maintain my position consistently in the face of uniform opposition and derision, was not expecting much of everyone else in the first place.
For example, in an introductory logic course, the professor once made a mistake that everyone else in the class agreed with, and I was the sole person to disagree. I attempted to explain the error while the entire class brushed me off and laughed about how I thought I knew better when the answer was so obvious to everyone else. It didn’t seem weird to me at all that every other person would make the same mistake and I would be the only one to notice it; it wasn’t confusing to me. And my success in showing the professor after class, within a couple of minutes, that she had been mistaken after all confirmed that my expectations were on track.
Conforming to the beliefs of the crowd is perfectly sensible behavior, in domains where you have no reason to expect yourself to be more accurate than anyone else. Learning to disregard conforming instincts completely is a bad idea, because a lot of the time, it really will be everyone else who’s right, and you who’s making a stupid mistake which will make you feel like an idiot when you finally realize it. Refusing to conform is both achievable and proper when you have a palpable expectation that other people are going to be stupid.
Unfortunately, it’s rather easy for one’s expectations of other people’s intelligence and rationality to become poorly calibrated.
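That calibration point can be made quantitative with a toy model. A minimal sketch (the accuracy numbers are illustrative assumptions, and it treats the peers’ answers as independent, which the Asch setup deliberately breaks): starting from even prior odds, if your own judgment is right with probability p_self and each of n disagreeing peers is right with probability p_peer, the posterior odds that you are the one who’s correct are (p_self / (1 − p_self)) · ((1 − p_peer) / p_peer)^n.

```python
def prob_dissenter_is_right(p_self, p_peer, n_peers):
    """Posterior probability that your lone dissenting answer is correct,
    starting from even prior odds, when n_peers all disagree with you.
    Assumes everyone's errors are independent -- a big assumption."""
    odds = (p_self / (1 - p_self)) * ((1 - p_peer) / p_peer) ** n_peers
    return odds / (1 + odds)

# If peers are no better than chance on this question, their unanimity
# is no evidence at all and you keep your private 90% confidence:
print(prob_dissenter_is_right(0.9, 0.5, 5))   # ~0.9

# If each peer is even modestly accurate (60%), five unanimous
# disagreements should give you real pause:
print(prob_dissenter_is_right(0.9, 0.6, 5))   # ~0.54
```

This captures the point above: low expectations of others (p_peer near 0.5) make lone dissent rational, while well-calibrated respect for their accuracy makes deference the right move.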
Having an experience of being right where everyone else is wrong is good for breaking the fear of nonconformity.
When I was a child, I participated in a science olympiad, and on one question I gave an answer that seemed trivially wrong but was in fact correct. (There were two objects of different size, made of the same material, balanced on a lever, and then both were immersed in water. How will the balance change?) Everyone thought I was wrong, and the official solution confirmed it. Then the organizers realized they had made a mistake, and confirmed my solution.
Since then I have known (on an emotional level too) that it is possible to be right even if everyone else disagrees. Sometimes it is wise to keep quiet, because the social consequences of nonconformity are real, but being alone does not automatically make one wrong. It was a good lesson.
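For the curious, here is the standard analysis of one common version of that puzzle (I’m guessing at the intended setup: both objects fully submerged, denser than water, same material, unequal masses balanced on unequal lever arms). Buoyancy reduces each side’s weight by the same factor (1 − ρ_water/ρ), so both torques shrink proportionally and the lever stays balanced, an answer that can feel trivially wrong at first.

```python
# Illustrative numbers, not from the original problem statement.
RHO = 8000.0    # density of the objects' material, kg/m^3
RHO_W = 1000.0  # density of water, kg/m^3
G = 9.81        # gravitational acceleration, m/s^2

def net_torque(mass, arm):
    """Torque about the pivot for a fully submerged object:
    (weight - buoyant force) * lever arm."""
    volume = mass / RHO                 # same material => V = m / rho
    buoyancy = RHO_W * volume * G       # Archimedes' principle
    return (mass * G - buoyancy) * arm

# Unequal masses balanced in air: m1*d1 == m2*d2.
m1, d1 = 2.0, 0.1
m2, d2 = 1.0, 0.2

# Both torques are scaled by the same factor (1 - RHO_W/RHO),
# so the lever remains balanced under water.
print(net_torque(m1, d1), net_torque(m2, d2))
```

Whether this matches the olympiad’s exact wording I can’t say, but it shows how the seemingly obvious “the bigger object’s side rises, it gets more buoyancy” intuition goes wrong: the extra buoyancy sits on the proportionally shorter arm.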
On the other hand, the fact that you were comfortable being the lone dissenter while untrained in resisting conformity may indicate that your social wiring is atypical. Some people in some situations may interpret that difference as a socioemotional flaw.
It might be that my own experiences won’t generalize to other people, but in domains where I haven’t developed an expectation that everyone else really will make mistakes that I didn’t, if everyone else is saying I’m wrong, I’ll tend to conclude that I’m probably wrong.
I wouldn’t exactly say I was untrained, though. We learn from our life experiences. I was smarter than the great majority of my peers growing up, and fairly precocious, a stage ahead throughout my childhood. Seeing people around me believe things that seemed head-in-the-sand stupid, because the things they believed actually were stupid, the products of less developed minds, was simply the way of the world as I grew up.
The hard part, which I see reflected among many other people who grew up ahead of their peers, was developing a sense of when you really should expect other people not to be any less sensible than you are.
From www.hpmor.com/notes/82/:
The Center for Modern Rationality is offering $50 prizes for any suggested rationality training exercises that look good enough to test, and $500 prizes for any suggested exercises that we actually adopt into a unit. Specific descriptions of mental skills, accompanied by the request for exercises to teach them, have been posted for the units Be Specific and Check Consequentialism. (Think of this as trying to invent the actual content of the bizarre exercises that Harry has been inflicting on the Chaos Legion since Ch. 29… oh, wait, I haven’t mentioned those in the text yet, have I?)
Hmm, interesting. I may well have read that and forgotten the Harry reference; I knew he was working on exercises.
I doubt whether it’s good to do actual anti-conformity training, because it might make you too non-conforming (i.e. prone to sticking to wrong positions). Instead, maybe it would be better to do training on how to use others’ opinions as evidence, similar to calibration training. The approach of anti-conformity training sounds good, but I’d throw in some statements which are actually false; the goal here is to actually get to the right conclusions whether the rest of society is right or wrong.
It seems unlikely to be the sort of thing people will overcorrect on: the causes of conformity (both psychological and sociological) are immense, and a bit of training will not overbalance the scales.