I’ve been trying to ease some friends into basic rationality materials but am running into a few obstacles. Is there a quick and dirty way to deal with the “but I don’t want to be rational” argument without seeming like Mr. Spock? Also, what’s a good source on the rational use of emotions?
I suggest the same techniques that work with any kind of evangelism. Convey that you are extremely sexually attractive and otherwise high in status by virtue of your rationalist identity. Let there be an unspoken threat in the background that if they don’t come to share your beliefs someone out there somewhere may just kill them or limit their mating potential.
Sad thought, but that explains what makes evangelism successful.
To whoever modded wedrifid down: was it because of the implicit endorsement of bad behavior, or because you have some reason to believe this is not how evangelism often works?
I think it’s worth distinguishing between two possible reasons to be against endorsement.
One is that this is bad epistemic hygiene.
The other is the possibility of lost purpose so that the person ends up trying to “act” rational rather than be rational.
In response to the former: epistemic hygiene is good and should be practiced when possible, but is not strictly necessary. Bullets kill good guys just as easily as bad guys, but guns remain a valuable tool if you’re sufficiently careful. I’m surprised there hasn’t been more discussion of when use of the ‘dark arts’ is acceptable.
In response to the latter: how might we make sure we don’t end up pursuing the wrong goal here?
“Implicit” endorsement?
Given that I think evangelising rationalist culture is possibly a net negative to the culture itself, even the implication is gone.
Could you rephrase that? I’m not sure what you think I should assume or what that assumption implies with regards to your original statement. My objection was similar to jimmy’s, if that helps.
I also implied a similarity between the in-group and out-groups, in particular a similarity to the out-group ‘religious believers’.
Then there is the fact that my suggestions just don’t really help Bindbreaker in a practical actionable way. Not that my suggestions weren’t an effective recipe for influence. It’s just that they are too general to be useful. Of course, I could be more specific about just what techniques Bindbreaker could use to generate the social dominance and influence he desires but that is just asking for trouble! ;)
Don’t sell yourself short! The first part (about conveying sexual attractiveness) might not be actionable, since people are generally already doing whatever they know how to do to maximize this or are okay with its current level.
But the second part (about the implied threat of not joining) certainly converts easily into actionable advice. At least, it’s far more specific and usable than most dating advice I’ve seen!
Interesting. My intuition would be that the ‘convey sexual attractiveness’ part is more actionable than the implied threat part. I think the amount of influence that can be gained by increasing personal status, per unit of effort, is greater than what can be expected from attempting to socially engineer an infrastructure that coercively penalises irrationality. Maybe that is just because I haven’t spent as much time researching the latter!
That’s an interesting proposition you have going. In order to convey the superior sexual attractiveness of rationality we need some sexy rationalists to proselytize. Thank you Carl Sagan! But seriously, the problem might be that basic rationality doesn’t translate easily into sexuality, threat, or other emotional appeal. Those things need to be brought in from other skill sets. Rationality can help apply skills and techniques to a given end, but it doesn’t give you those techniques or skills.
A significant part of my point is that rational persuasion isn’t the most effective way of influencing them or of drawing them into a belief system.
To achieve this:
it’s only necessary to convincingly give the impression that failure to join will have those negative consequences. You don’t need to actually move society in this direction!
What I had in mind for Bindbreaker’s case was something like, “If you’re not familiar with rationality, you leave yourself open for being turned into a money pump. I know a ton of people who know exactly how to do this [probably a lie], and I’d really hate for one of them to take advantage of you like that [truth]! I’d never forgive myself for not doing more to teach you about being a rationalist! [half-truth]”
Not that I’d advocate lying like that, of course :-(
Danger, Will Robinson:
“If you’re not familiar with [Jesus], you leave yourself open for [going to hell]. I know [the devil] knows exactly how to [send you there], and I’d really hate for [the devil] to take advantage of you like that. I’d never forgive myself for not doing more to teach you about [Jesus]!”
At least the argument for rationalism would be in terms they are familiar with, I suppose.
I said it would be more convincing, not that it would necessarily be a better argument. And I think the money pump is just a little more demonstrable than the devil.
In any case, the way you would achieve the subtle threats when evangelizing standard, popular religions wouldn’t be with any kind of direct argument like that one. Rather, you would innocently drop references to how popular the religion already is, how the social connections it provides help its members, how they have strength in numbers and in members’ fanaticism (hinting at how it can be deployed against those it deems a threat) … you get the idea.
I’ve asked this before: Why don’t rationalists run money pumps?
As far as I know, none of us are exploiting biases or irrationality for profit in any systematic way, which is itself irrational if we really believe this is an option.
We’re either an incredibly ethical group, or money pumping isn’t as easy as it would seem from reading the research.
I think you’ve answered your own question. Let me elaborate:
1) Rationalists significantly overestimate people’s vulnerability to money pumps, often based on mistaken views about how e.g. religious irrationality “must” spill over into other areas.
2) Even if you don’t care about ethics, scamming people will just make the population more suspicious of people claiming mastery of rationalist ideas.
To elaborate your elaboration:
To do money pumping on a grand scale, you have to be in the financial markets; but there are no money pumps there which aren’t being busily pumped away. (‘Bears make money, bulls make money; pigs get slaughtered’.) This is true for pumps like casinos, too—lots of competition.
And most ways to make a money pump in other areas have been outlawed or are regulated; working money pumps like Swoopo (see http://www.codinghorror.com/blog/archives/001196.html ) are usually walking a fine line.
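The pay-per-bid mechanism behind a site like Swoopo can be sketched in a few lines of arithmetic. All numbers below are made up for illustration, not Swoopo’s actual fee schedule; the point is just that the house collects from every bidder, not only the winner:

```python
# Back-of-envelope economics of a pay-per-bid ("penny") auction,
# with hypothetical numbers. Every bid costs a fee and bumps the
# sale price by one cent, so the house earns revenue from every
# bid placed, not just from the winner's final payment.

bid_fee = 60            # hypothetical fee per bid, in cents
price_increment = 1     # each bid raises the sale price by 1 cent
final_price = 5000      # the item ends up "selling" for $50.00
item_cost = 20000       # the item cost the house $200.00

num_bids = final_price // price_increment   # bids needed to reach that price
revenue = num_bids * bid_fee + final_price  # all bid fees + winner's payment
profit = revenue - item_cost

print(num_bids, revenue / 100, profit / 100)  # 5000 bids, $3050.00 revenue, $2850.00 profit
```

The winner may have gotten a bargain, but the thousands of losing bids are pure revenue, which is why these auctions walk the fine line mentioned above.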
The potential for a money pump indicates that the preference system is inconsistent, and therefore exploitable to some extent. It does not mean that the agent in question must be incapable of altering its preferences in the face of blatant exploitation. ‘Patching’ a utility function at the inconsistencies that are easiest to exploit comes naturally to humans.
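As a concrete sketch of what a money pump exploits, consider a hypothetical agent with cyclic preferences (the items, fee, and agent here are all invented for illustration):

```python
# A money pump exploits an agent whose preferences are cyclic:
# A preferred to B, B to C, C to A. Such an agent will pay a small
# fee for each "upgrade", so cycling it around its own preference
# loop drains its money -- until it patches its preferences.

FEE = 1  # hypothetical fee charged per trade

class CyclicAgent:
    """Agent with intransitive preferences A > B > C > A."""
    # Maps the currently held item to the item it will pay to swap to.
    prefers = {"B": "A", "C": "B", "A": "C"}

    def __init__(self, money, item):
        self.money, self.item = money, item

    def offer_swap(self, new_item):
        # The agent accepts any swap to an item it prefers, paying the fee.
        if self.prefers[self.item] == new_item and self.money >= FEE:
            self.item = new_item
            self.money -= FEE
            return True
        return False

agent = CyclicAgent(money=10, item="A")
pumped = 0
for new in ["C", "B", "A", "C", "B", "A"]:  # walk it around the cycle twice
    if agent.offer_swap(new):
        pumped += FEE

print(pumped, agent.money)  # six accepted trades: pumped 6, agent left with 4
```

A real human, of course, would notice after a loop or two and stop accepting the trades, which is the ‘patching’ described above.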
With that in mind, I observe that bizarre and irrational preferences, and the exploitation thereof, are extremely prevalent; I would go so far as to call them a significant driver of the economy. Of course, it isn’t only rationalists who enjoy the benefits of exploiting suckers.
I’m not disagreeing with you, I think. Until rationalists start showing tangible social benefits like the ones backing the subtle threats you mentioned, it will be hard to get people in the door who aren’t already predisposed.
Though I have had trouble developing a demonstrable money pump that can’t be averted by saying “I would simply be too suspicious of somebody who offered me repeated deals to allow them to continually take my money.” Of course, the standard retort might be “you play the lottery,” but then that’s not a great way to make people like rationalists/rationalism.
Okay, we’re in agreement; I just wasn’t sure what your ultimate point was, and used the opportunity to point out how the technique is used in other contexts.
The comparison I’m concerned with is “people who don’t conform to the beliefs of the orthodoxy are burned as heretics”.
To Eliezer’s list, I would add “Something To Protect” and the very end of “Circular Altruism”. When a friend of mine said something similar about not really wanting to be rational during a discussion of health care, I linked him to those two and summarized them like this (goes off and finds the discussion):
I don’t really care what you do on [the first thought experiment]. But I care very much what you do on [the second and third]. The importance of logic appears only when you have something that is more important to you than feeling good.
If your goal is to feel good, you can have whatever health system and whatever solution to the trolley problem makes you feel best. I mean, knowing that I didn’t let that poor old cancer patient die would make me feel really warm and fuzzy inside too. And I’d also feel really awful about pushing a fat man onto the tracks.
But if your goal is to save lives, you lose the right to do whatever you want, and you’d better start doing what’s logical. The logical solution to the two problems does, of course, save more lives than the warm fuzzy alternative.
So the question is: which is more important to you? Feeling good, or saving lives? As Overcoming Bias says:
“You know what? This isn’t about your feelings. A human life, with all its joys and all its pains, adding up over the course of decades, is worth far more than your brain’s feelings of comfort or discomfort with a plan. Does computing the expected utility feel too cold-blooded for your taste? Well, that feeling isn’t even a feather in the scales, when a life is at stake. Just shut up and multiply.”
If you’re using a different example with something less important than saving lives, maybe switch to something more important in the cosmic scheme of things. I’m very sympathetic to people who say good feelings are more important to them than a few extra bucks, and I don’t even think they’re being irrational most of the time. The more important the outcome, the more rationality matters relative to happy feelings.
I thought this might be of interest to some. There was an IAMA posted on reddit recently from a person who suffers from alexithymia, a lack of emotions. Check it out.
http://www.reddit.com/r/IAmA/comments/9xea8/i_am_unable_to_feel_most_emotion_i_have/
Are they saying that they don’t want to be rational, or just that they don’t want to be emotionless? I think people do want to be rational, in some sense, when dealing with emotions; they’re just never going to have interest in, say, Kahneman and Tversky, or other formal theory. I’ve noticed that some women I know have read “He’s Just Not That Into You”, which, from how they describe it, sounds like strategies for rationally dealing with strong emotions. I know it sounds hokey, but people have read that book and were able to put their emotions in a different light when it comes to romantic relationships. I couldn’t tell you whether the advice was good or not, but it does sound like there’s at least an audience for what you’re talking about.
People don’t want to go through the formal processes of being rational in many emotional situations (and they are often right not to). I think we should let people know that sometimes it’s rational not to go through the formal routes, because the outcome will be better if they don’t (and it’s rational to want the best outcome). For example, if you just met a person you might want a relationship with, don’t make said person fill out a questionnaire and subject them to a pros-and-cons list of starting said relationship. (I know this sounds absurd, but I know someone who did just this to all her boyfriends. Perhaps fittingly, she ended up engaged to an impotent Husserlian phenomenologist twice her age.)
Usually they seem to think that being rational is the same as being emotionless, despite my efforts to convince them otherwise. I think this may again be thanks largely to that dreaded Mr. Spock.
Just keep saying (with your voice clearly pained; no need to hide the feeling) “ugh… Spock, and Vulcans in general, are NOT rational. They are what silly, not-so-rational scriptwriters imagine rationality to be”, I guess?
I’d try playing taboo with the word “rational”.
You both agree that being Spock-like is bad, so instead of fighting those connotations, just try to point out that there’s a third alternative and why it’s better.
http://lesswrong.com/lw/hp/feeling_rational/
http://lesswrong.com/lw/go/why_truth_and/
http://yudkowsky.net/rational/virtues