If you want impact, use the narrative fallacy. What I mean is, use all of the other biases and fallacies you listed—tell a story about John, a guy who met a cool scientist when he was in primary school and whose life goal is now to be a scientist. He decides to work on global warming because ‘what could be more important than this issue?’ He expects to live in the city and be the head of a big lab… But he’s not very good at global warming science (maybe he’s not very good at research?), and he doesn’t seem to notice that the advice his colleagues give him isn’t helping. So he sticks to his guns, because he’s already got a degree in global warming, but he’s always stressing about not having a job...
And so on.
And then rewind. John discovers rationality as a young adult and becomes John-prime. Compare John to John-prime, whose rationality training lets him recognise the availability bias at work in his dream of being a scientist, and since scholarship is a virtue, he researches, interviews… and discovers that politics is a much better fit! His rationality tells him that the most important thing is improving quality of life, not global warming or power, so he donates to third-world charities and makes sure that when he runs for office he does so on a platform of improving social welfare and medical access. His rationality lets him evaluate advice-givers, so he sees through most of the self-serving advice—and when he finds a mentor who seems genuine, he sticks with that mentor, improving his success in politics...
And so on.
(And then the punchline: use a description of the narrative fallacy to explain why this story makes the audience feel like rationality is important!)
I like this approach.
A related idea: when, at some point in the future, someone runs a rationality seminar that costs money (a reasonable service to offer), the marketing pitch ends with:
“Now, if this were a regular sales pitch, we’d end by saying ‘normally we charge $500 for this workshop, but this month we’re actually having a discount, so you can get it for $400.’”
Beat.
“What I just did is called anchoring. Saying a number causes your brain to use that number as a reference point, whether you want it to or not. $400 sounds like a good deal compared to $500, and you’re probably going to have difficulty putting a value on the workshop now that ISN’T based on those numbers. This technique is used in marketing all the time, from coupons to car salesmen. Us? We just straight up charge $300.”
Beat.
If someone chimes in with “Hey, you just used anchoring AGAIN!”, they get a $50 discount.
Nice.
Excellent.
So basically, use dark side mind control tricks to convince them of something we couldn’t otherwise, but then claim it’s OK because we reveal the tricks at the end?
It doesn’t need to be “OK”.
I’m not sure this is true—I think we could convince them using other methods—but in either case, why tie our hands behind our back if we’re trying to win?
Because it’s unethical. I don’t think it’s so important to convince uninterested people that we should resort to unethical methods.
If we use unethical mind control tricks whose success is not correlated with the strength of our arguments, we lose an opportunity to discover that maybe we aren’t ready to be convincing people. What if we are wrong? What if rationality is not developed enough to have the results speak for themselves? How would we know?
The fact that dark side mind control tricks look attractive is evidence that the art is not developed enough that we should even be trying to convince people of its effectiveness. When the art is ready, we will not have to convince people; they will be asking how we do it.
If behaving ethically is more important in your ethics than helping people avoid huge mistakes that hurt them—like, say, choosing alternative therapies instead of something that actually cures a disease and dying because the side effects of the treatment are more available to your brain than the concept of dying—then I don’t think much of your ethics.
If there were a pill that would make people more rational, I’d be slipping it into their food without telling them. I’d be injecting it into the water supply. I’d be taking a huge dose and donating blood. Because there are people out there who refuse vaccinations, there are people out there who take alcohol and painkillers together, there are people out there who make simple silly mistakes and die. And that’s wrong.
First of all, what do you think of “Protected from Myself”?
We are not talking about slipping people some miracle cure that they are just being stupid about not taking. If that were the case, you would be right. But at this point we don’t actually know that it is a miracle cure; we would just be slipping them some dubious substance that shows promise but may or may not help. We need more interested people to develop the art, but not the kind of people who will only be convinced by dark side mind control tricks.
Maybe when LW rationality reaches a point where reasonable people can be convinced by empirical evidence, it will be a good idea to trick the rest.
Ethics isn’t just about right and wrong; it’s also about not doing stupid shit that’s going to bite you in the ass.
Very Ericksonian. I like it!