Making Rationality General-Interest

Introduction

Less Wrong currently represents a tiny, tiny, tiny segment of the population. In its current form, it might only appeal to a tiny, tiny segment of the population. Basically, the people who have a strong need for cognition, who are INTx on the Myers-Briggs (65% of us as per 2012 survey data), etc.

Raising the sanity waterline seems like a generally good idea. Smart people who believe stupid things, and go on to invest resources in stupid ways because of it, are frustrating. Trying to learn rationality skills in my 20s, when a bunch of thought patterns are already overlearned, is even more frustrating.

I have an intuition that a better future would be one where the concept of rationality (maybe called something different, but the same idea) is normal. Where it's as obvious as the idea that you shouldn't spend more money than you earn, or that you should live a healthy lifestyle, etc. The point isn't that everyone currently lives debt-free, eats decently well and exercises (that isn't the case), but these are normal things to do if you're a minimally proactive person who cares a bit about your future. No one has ever told me that doing taekwondo to stay fit is weird and culty, or that keeping a budget will make me unhappy because I'm overthinking things.

I think the questions of "whether we should try to do this" and "if so, how do we do it in practice?" are both interesting and valuable to discuss.

Is making rationality general-interest a good goal?

My intuitions are far from 100% reliable. I can think of a few reasons why this might be a bad idea:

1. A little bit of rationality can be damaging; it might push people in the direction of too much contrarianism, or something else I haven't thought of. Since introspection is imperfect, knowing a bit about cognitive biases and the mistakes that other people make might actually make people less likely to change their minds: they see other people making those well-known mistakes, but not themselves. Likewise, rationality taught only as a tool or skill, without any kind of underlying philosophy of why you should want to believe true things, might cause problems of a similar nature to martial arts skills taught without the traditional, often non-violent philosophies; it could result in people abusing the skill to win fights/debates, making the larger community worse off overall. (Credit to Yan Zhang for the martial arts metaphor.)

2. Making the concepts general-interest, or just growing too fast, might involve watering them down or changing them in some way such that the value of the LW microcommunity is lost. This could be worse for the people who currently enjoy LW even if it isn't worse overall. I don't know how easy that would be to avoid, or whether it would be a price worth paying.

3. It turns out that rationalists don't actually win, and x-rationality, as Yvain terms it, just isn't that amazing over-and-above already being proactive and doing stuff like keeping a budget. Yeah, you can say stuff like "the definition of rationality is that it helps you win", but if in real life, all the people who deliberately try to increase their rationality do worse overall, by their own standards (or even equally well, but with less time left over for other fun pursuits), than the people who aim for their life goals directly, I want to know that.

4. Making rationality general-interest is a good idea, but not the best thing to be spending time and energy on right now because of Mysterious Reasons X, Y, Z. Maybe I only think it is because of my personal bias towards liking community stuff (and wishing all of my friends were also friends with each other and liked the same activities, which would simplify my social life, but probably shouldn't happen for good reasons).

Obviously, if any of these are the case, I want to know about it. I also want to know if there are other reasons, off my radar, why this is a terrible idea.

What has to change for this to happen?

I don't really know, or I would be doing those things already (maybe, akrasia allowing). I have some ideas, though.

1. The jargon thing. I'm currently trying to compile a list of LW/CFAR jargon as a project for CFAR, and there are lots of terms I don't know. There are terms that I've realized, in retrospect, I was using incorrectly all along. This presents a large initial effort for someone interested in learning about rationality via the LW route, and it might also contribute to the looking-like-a-cult thing.

2. The gender ratio thing. This has been discussed before, and it's a controversial thing to discuss, and I don't know whether arguing about it in comments will produce any solutions. It seems pretty clear that if you want to appeal to the whole population, and a group that represents 50% of the general population only represents 10% of your participants (also as per 2012 survey data, see link above), there's going to be a problem somewhere down the road.

My data point: as a female on LW, I haven't experienced any discrimination, and I'm a bit baffled as to why the gender ratio is so skewed in the first place. Then again, I've already been through the filter of not caring if I'm the only girl at a meetup group. And I do hang out in female-dominated groups (e.g. the entire field of nursing), and fit in okay, but I'm probably not a good typical example to generalize from.

3. LW currently appeals to intelligent people, or at least people who self-identify as intelligent; according to the 2012 survey data, the median self-reported IQ is 138. This wouldn't be surprising, and isn't a problem until you want to appeal to more than 1% of the population. But intelligence and rationality are, in theory, orthogonal, or at least not the same thing. If I suffered a brain injury that reduced my IQ significantly but didn't otherwise affect my likes and dislikes, I expect I would still be interested in improving my rationality and think it was important, perhaps even more so, but I also think I would find it frustrating. And I might feel horribly out of place.

4. Rationality in general has a bad rap; specifically, the Spock thing. And this isn't just affecting whether or not people think Less Wrong the site is weird; it's affecting whether they want to think about their own decision-making.

This is only what I can think of in 5 minutes...

What's already happening?

Meetup groups are happening. CFAR is happening. And there are groups out there practicing skills similar or related to rationality, whether or not they call it the same thing.

Conclusion

Rationality, Less Wrong and CFAR have, gradually over the last 2-3 years, become a big part of my life. It's been fun, and I think it's made me stronger, and I would prefer a world where as many other people as possible have that. I'd like to know whether people think that's a) a good idea and b) feasible, and if so, c) how to do it practically.