It seems to me that, unless one is already a powerful person, the best thing one can do to gain optimization power is building relationships with people more powerful than oneself. Indeed, this easily trumps the vast majority of other failings (epistemic-rationality-wise) discussed on LW. So why aren’t we regularly discussing how to do better at this? A couple of explanations immediately leap to mind:
Not a core competency of the sort of people LW attracts.
Rewards not as immediate as the sort of epiphany porn that some of LW generates.
Ugh fields. Especially in regard to things that are considered manipulative when reasoned about explicitly, even though we all do them all the time anyway.
LW’s foundational posts are all very strongly biased towards epistemic rationality, and I think that strong bias still affects our attempts to talk about instrumental rationality. There are probably all sorts of instrumentally rational things we could be doing that we don’t talk about enough.
It would also be useful to know how to get other people around you to increase their meta-ness or Machiavellianism.
Do you have any experience doing this successfully? I’d assume that powerful people already have lots of folks trying to make friends with them.
Specifically for business, I do.
The general angle is asking intelligent, forward-pointing questions, specifically because deep processing of thoughts (as described in Thinking, Fast and Slow) is rare, even within the business community; so demonstrating understanding and curiosity (both of which are strengths of people on LW) is an almost instant win.
Two of the better guides on how to approach this intelligently are:
http://www.slideshare.net/foundercentric/how-not-to-suck-at-introductions
http://www.kalzumeus.com/standing-invitation/
The other aspect of this is Speaking the Lingo. The problem with LW is:
1, people developing gravity wells around specific topics, and having a very hard time talking about stuff others are interested in without bringing up pet topics of their own; and
2, the inference distance between the kind of stuff that puts people into powerful positions and the kind of stuff LW develops a gravity well around is, indeed, vast.
The operational hack here is: 1, listening; 2, building up the scaffolds on which these people hang their power; 3, recognizing whether you understand how those pieces fit together.
General algorithm for the networking dance:
1, Ask an intelligent question; listen intently.
2, Notice your brain popping up a question or handle that you have an urge to voice. Develop a classification algo to notice whether the question was generated by your pet gravity well or by novel understanding.
3, If the former, SHUT UP. If you really have the urge, mirror back what they’ve just said to internalize / develop your understanding (and move the conversation along).
Side-effects might include: developing an UGH-field towards browsing lesswrong, incorporating, and getting paid truckloads. YMMV.
If you have an “UGH-field towards”, do you mean attracted to, or repulsed by browsing LW, making money, etc?
The ‘towards’ scopes over browsing LW, not the rest of the itemized list: ‘1. developing an ugh-field (towards browsing LW); 2. incorporating (and building a business with your new spare time); 3. getting paid (truckloads).’
Unambiguous mistake or ambiguous parallel construction? I agree w/ your parse, on grounds of the indisputable goodness of truckloads of money.
I didn’t misunderstand it when I read it initially, so I think the latter.
Sure, but rationalists should win.
I’m not sure that being a rationalist gives you a significant advantage in interpersonal relationships. A lot of our brain seems to be specifically designed for social interactions; trying to use the rational part of your brain for social interactions is like using a CPU to do graphics instead of a GPU. You can do it, but it’ll be slower and less efficient and effective than using the hardware that’s designed for the job.
Perhaps the slow thinking could be used later, at home, to review the social interactions of the day. It will not directly fix the problems caused by quick wrong reactions, but it could help discover some strategic problems.
For example: you are failing to have a good relationship with this specific person, but maybe it’s just that person randomly disliking you, or remembering past errors of yours which you have since fixed. Try spending some time with other people and notice whether their reactions are different.
A more obvious, but less frequent, example: this person seems to like you and invites you to their cult. Be careful!
Yeah, that’s very true; I’m not claiming that rational thought is useless for social interaction. It is good to sometimes stop and think about your social interactions on your own when you have some downtime.
That being said, there are downsides as well. If you’re using rational thought instead of the social parts of your brain to decide how to react to social situations, you will tend to react differently. Not that you’re wrong, or irrational, but you just won’t respond to social cues in the way people expect, and that itself might give you a disadvantage.
Thinking about this, it actually reminds me of the behavior of a friend of mine who has a form of high-functioning autism; she’s very smart, but she reacts quite differently in social situations than most people would expect. Perhaps that is basically what she is doing.
Power isn’t one-dimensional. What matters isn’t so much building relationships with people who are more powerful than you in all domains, but building relationships with people who are powerful in some domain where you could ask them for help.
Because it’s hard. That’s what kept me from doing it.
I am very close to explicitly starting a project to do just that, and I didn’t even get to this point until one of my powerful friends explicitly advised me to take a particular strategy for building relationships with more powerful people.
I find myself unable to be motivated to do it without calling it “Networking the Hard Way”, to remind myself that yes, it’s hard, and that’s why it will work.
I would be interested in hearing about this strategy if you feel like sharing.
Soon. Would rather actually do it first, before reporting on my glorious future success.
Mmhmm, good catch. Thanks.
Not done much on it yet, but here’s the plan.
Thanks for sharing. Tell me if you want me to bug you about whether you’re following your plan at scheduled points in the future.
Thanks for the offer. It feels great when people make such offers now, because I no longer need that kind of help, which is such a relief. I use Beeminder now, which basically solves the “stay motivated to do quantifiable goal at some rate” problem.
Realistically, Less Wrong is most concerned with epistemic rationality: the idea that having an accurate map of the territory is very important to actually reaching your instrumental goals. If you imagine for a second a world where epistemic rationality isn’t that important, you don’t really need a site like Less Wrong. There are nods to “instrumental rationality”, but those are in the context of epistemic rationality getting you most of the way and being the base you work from; otherwise there’s no reason to be on Less Wrong instead of a specific site dealing with that sub-area.
Also, lots of “building relationships with powerful people” is zero sum at best, since it resembles influence peddling more than gains from informal trade.
Insofar as MIRI folk seem to be friends with Jaan Tallinn, Thiel, etc., they appear to be trying to do this, though they don’t seem to be teaching it as a great idea. But organizationally, if you’re trying to optimize the world in a more rational way, spreading rationality might be a better approach than trying to befriend less rational powerful people. Obviously this is less effective on a more personal basis.
Depends on how powerful you want to become. Those relationships will become a burden the moment you “surpass the masters”, so to speak. You may want to avoid building too many.