2016 LessWrong Diaspora Survey Analysis: Part Four (Politics, Calibration & Probability, Futurology, Charity & Effective Altruism)


The LessWrong survey has a very involved section dedicated to politics. Previous analyses didn't fully realize its benefits. In the 2016 analysis we can look at not just the political affiliation of a respondent, but what beliefs are associated with a certain affiliation. The charts below summarize most of the results.

Political Opinions By Political Affiliation

Miscellaneous Politics

There were also some other questions in this section which aren't covered by the above charts.


On a scale from 1 (not interested at all) to 5 (extremely interested), how would you describe your level of interest in politics?

1: 67 (2.182%)

2: 257 (8.371%)

3: 461 (15.016%)

4: 595 (19.381%)

5: 312 (10.163%)


Did you vote in your country's last major national election? (LW Turnout Versus General Election Turnout By Country)
Group Turnout
LessWrong 68.9%
Australia 91%
Brazil 78.90%
Britain 66.4%
Canada 68.3%
Finland 70.1%
France 79.48%
Germany 71.5%
India 66.3%
Israel 72%
New Zealand 77.90%
Russia 65.25%
United States 54.9%
Numbers taken from Wikipedia, accurate as of the last general election in each country listed at time of writing.


If you are an American, what party are you registered with?

Democratic Party: 358 (24.5%)

Republican Party: 72 (4.9%)

Libertarian Party: 26 (1.8%)

Other third party: 16 (1.1%)

Not registered with a party: 451 (30.8%)

(Option for non-Americans): 541 (37.0%)

Calibration And Probability Questions

Calibration Questions

I just couldn't analyze these, sorry guys. I put many hours into trying to get them into a decent format I could even read, and that sucked up an incredible amount of time. It's why this part of the survey took so long to get out. Thankfully another LessWrong user, Houshalter, has kindly done their own analysis.

All my calibration questions were meant to satisfy a few essential properties:

  1. They should be 'self contained', i.e., something you can reasonably answer, or at least try to answer, with a 5th grade science education and normal life experience.

  2. They should, at least to a certain extent, be Fermi estimable.

  3. They should progressively scale in difficulty so you can see whether somebody understands basic probability or not. (e.g., in an 'or' question, do they put a probability of less than 50% of being right?)

At least one person requested a workbook, so I might write more in the future. I'll obviously write more for the survey.

Probability Questions

Question Mean Median Mode Stdev
Please give the obvious answer to this question, so I can automatically throw away all surveys that don't follow the rules: What is the probability of a fair coin coming up heads? 49.821 50.0 50.0 3.033
What is the probability that the Many Worlds interpretation of quantum mechanics is more or less correct? 44.599 50.0 50.0 29.193
What is the probability that non-human, non-Earthly intelligent life exists in the observable universe? 75.727 90.0 99.0 31.893
…in the Milky Way galaxy? 45.966 50.0 10.0 38.395
What is the probability that supernatural events (including God, ghosts, magic, etc.) have occurred since the beginning of the universe? 13.575 1.0 1.0 27.576
What is the probability that there is a god, defined as a supernatural intelligent entity who created the universe? 15.474 1.0 1.0 27.891
What is the probability that any of humankind's revealed religions is more or less correct? 10.624 0.5 1.0 26.257
What is the probability that an average person cryonically frozen today will be successfully restored to life at some future time, conditional on no global catastrophe destroying civilization before then? 21.225 10.0 5.0 26.782
What is the probability that at least one person living at this moment will reach an age of one thousand years, conditional on no global catastrophe destroying civilization in that time? 25.263 10.0 1.0 30.510
What is the probability that our universe is a simulation? 25.256 10.0 50.0 28.404
What is the probability that significant global warming is occurring or will soon occur, and is primarily caused by human actions? 83.307 90.0 90.0 23.167
What is the probability that the human race will make it to 2100 without any catastrophe that wipes out more than 90% of humanity? 76.310 80.0 80.0 22.933

Probability questions are probably the area of the survey I put the least effort into. My plan for next year is to overhaul these sections entirely and try including some Tetlock-esque forecasting questions, a link to some advice on how to make good predictions, etc.
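For reference, summary statistics like those in the table above can be computed with Python's standard statistics module. A minimal sketch; the responses list here is hypothetical, not real survey data:

```python
import statistics

# Hypothetical responses (percent probabilities) to a single question.
responses = [50.0, 50.0, 45.0, 60.0, 50.0, 40.0, 55.0]

print("Mean:  ", statistics.mean(responses))    # arithmetic mean
print("Median:", statistics.median(responses))  # middle value
print("Mode:  ", statistics.mode(responses))    # most common value
print("Stdev: ", statistics.stdev(responses))   # sample standard deviation
```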


Futurology

This section got a bit of a facelift this year, adding new questions on cryonics, genetic engineering, and technological unemployment to those from previous years.



Are you signed up for cryonics?

Yes—signed up or just finishing up paperwork: 48 (2.9%)

No—would like to sign up but unavailable in my area: 104 (6.3%)

No—would like to sign up but haven't gotten around to it: 180 (10.9%)

No—would like to sign up but can't afford it: 229 (13.8%)

No—still considering it: 557 (33.7%)

No—and do not want to sign up for cryonics: 468 (28.3%)

Never thought about it / don't understand: 68 (4.1%)


Do you think cryonics, as currently practiced by Alcor/Cryonics Institute, will work?

Yes: 106 (6.6%)

Maybe: 1041 (64.4%)

No: 470 (29.1%)

Interestingly enough, of those who think it will work with enough confidence to say 'yes', only 14 are actually signed up for cryonics.

sqlite> select count(*) from data where CryonicsNow="Yes" and Cryonics="Yes—signed up or just finishing up paperwork";
14


sqlite> select count(*) from data where CryonicsNow="Yes" and (Cryonics="Yes—signed up or just finishing up paperwork" OR Cryonics="No—would like to sign up but unavailable in my area" OR Cryonics="No—would like to sign up but haven't gotten around to it" OR Cryonics="No—would like to sign up but can't afford it");



Do you think cryonics works in principle?

Yes: 802 (49.3%)

Maybe: 701 (43.1%)

No: 125 (7.7%)

LessWrongers seem to be very bullish on the underlying physics of cryonics, even if they're not as enthusiastic about the current methods in use.

The Brain Preservation Foundation also did an analysis of cryonics responses to the LessWrong Survey.



By what year do you think the Singularity will occur? Answer such that you think, conditional on the Singularity occurring, there is an even chance of the Singularity falling before or after this year. If you think a singularity is so unlikely you don't even want to condition on it, leave this question blank.

Mean: 8.110300081581755e+16

Median: 2080.0

Mode: 2100.0

Stdev: 2.847858859055733e+18

I didn't bother to filter out the silly answers for this.

Obviously it's a bit hard to see without filtering out the uber-large answers, but the median doesn't seem to have changed much from the 2014 survey.
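One way to filter the uber-large answers before summarizing would be to restrict responses to a plausible window; a sketch under the assumption that anything outside 2016-3000 is noise (the years list here is hypothetical):

```python
import statistics

# Hypothetical raw responses, including a couple of "silly" answers.
years = [2050, 2080, 2100, 2045, 3000, 9.9e17, 2150]

# Drop anything outside a plausible window before summarizing.
sane = [y for y in years if 2016 <= y <= 3000]

print("Median (filtered):", statistics.median(sane))
```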

Genetic Engineering


Would you ever consider having your child genetically modified for any reason?

Yes: 1552 (95.921%)

No: 66 (4.079%)

Well, that's fairly overwhelming.


Would you be willing to have your child genetically modified to prevent them from getting an inheritable disease?

Yes: 1387 (85.5%)

Depends on the disease: 207 (12.8%)

No: 28 (1.7%)

I find it amusing how the strict "No" group shrinks considerably after this question.


Would you be willing to have your child genetically modified for improvement purposes? (e.g., to heighten their intelligence or reduce their risk of schizophrenia.)

Yes: 0 (0.0%)

Maybe a little: 176 (10.9%)

Depends on the strength of the improvements: 262 (16.2%)

No: 84 (5.2%)

Yes, I know 'yes' is bugged; I don't know what causes this bug, and despite my best efforts I couldn't track it down. There is also an issue here where 'reduce your risk of schizophrenia' is offered as an example, which might confuse people, but the actual science of things cuts closer to that than it does to a clean separation between disease risk and 'improvement'.

This question is too important to just not have an answer to, so I'll do it manually. Unfortunately I can't easily remove the 'excluded' entries so that we're dealing with the exact same distribution, but only 13 or so responses are filtered out anyway.

sqlite> select count(*) from data where GeneticImprovement="Yes";
1100


>>> 1100 + 176 + 262 + 84
1622
>>> round(1100 / 1622, 3)
0.678

67.8% are willing to genetically engineer their children for improvements.


Would you be willing to have your child genetically modified for cosmetic reasons? (e.g., to make them taller or have a certain eye color.)

Yes: 500 (31.0%)

Maybe a little: 381 (23.6%)

Depends on the strength of the improvements: 277 (17.2%)

No: 455 (28.2%)

These numbers go about how you would expect, with people being progressively less interested the more 'shallow' a genetic change is seen as.


What's your overall opinion of other people genetically modifying their children for disease prevention purposes?

Positive: 1177 (71.7%)

Mostly Positive: 311 (19.0%)

No strong opinion: 112 (6.8%)

Mostly Negative: 29 (1.8%)

Negative: 12 (0.7%)


What's your overall opinion of other people genetically modifying their children for improvement purposes?

Positive: 737 (44.9%)

Mostly Positive: 482 (29.4%)

No strong opinion: 273 (16.6%)

Mostly Negative: 111 (6.8%)

Negative: 38 (2.3%)


What's your overall opinion of other people genetically modifying their children for cosmetic reasons?

Positive: 291 (17.7%)

Mostly Positive: 290 (17.7%)

No strong opinion: 576 (35.1%)

Mostly Negative: 328 (20.0%)

Negative: 157 (9.6%)

All three of these seem largely consistent with people's personal preferences about modification. Were I inclined, I could do a deeper analysis that actually takes survey respondents row by row and looks at the correlation between preference for one's own children and preference for others'.
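As a sketch of what that deeper row-by-row analysis might look like, the paired responses could be cross-tabulated with sqlite3; the column names and data below are hypothetical stand-ins for the real survey export:

```python
import sqlite3
from collections import Counter

conn = sqlite3.connect(":memory:")
# Hypothetical columns: own-child preference vs. opinion of others doing it.
conn.execute("CREATE TABLE data (CosmeticSelf TEXT, CosmeticOthers TEXT)")
conn.executemany(
    "INSERT INTO data VALUES (?, ?)",
    [("Yes", "Positive"), ("No", "Negative"), ("Yes", "Mostly Positive"),
     ("No", "No strong opinion"), ("Yes", "Positive")],
)

# Count each (own preference, opinion of others) pair.
crosstab = Counter(conn.execute("SELECT CosmeticSelf, CosmeticOthers FROM data"))
for (self_pref, others_opinion), n in sorted(crosstab.items()):
    print(self_pref, "/", others_opinion, ":", n)
```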

Technological Unemployment


Do you think the Luddite's Fallacy is an actual fallacy?

Yes: 443 (30.936%)

No: 989 (69.064%)

We can use this as an overall measure of worry about technological unemployment, which would seem to be high among the LW demographic.


By what year do you think the majority of people in your country will have trouble finding employment for automation related reasons? If you think this is something that will never happen, leave this question blank.

Mean: 2102.9713740458014

Median: 2050.0

Mode: 2050.0

Stdev: 1180.2342850727339

This question is flawed because you can't distinguish answers of "never happen" from people who just didn't see the question.

It would be an interesting question to compare against the estimates for the Singularity.


Do you think the "end of work" would be a good thing?

Yes: 1238 (81.287%)

No: 285 (18.713%)

Fairly overwhelming consensus, but with a significant minority of people who have a dissenting opinion.


If machines end all or almost all employment, what are your biggest worries? Pick two.

Answer Count Percent
People will just idle about in destructive ways 513 16.71%
People need work to be fulfilled and if we eliminate work we'll all feel deep existential angst 543 17.687%
The rich are going to take all the resources for themselves and leave the rest of us to starve or live in poverty 1066 34.723%
The machines won't need us, and we'll starve to death or be otherwise liquidated 416 13.55%

This question is flawed because it demanded the user 'pick two' instead of up to two.

The plurality of worries are about elites who refuse to share their wealth.

Existential Risk


Which disaster do you think is most likely to wipe out greater than 90% of humanity before the year 2100?

Nuclear war: +4.800% 326 (20.6%)

Asteroid strike: −0.200% 64 (4.1%)

Unfriendly AI: +1.000% 271 (17.2%)

Nanotech / grey goo: −2.000% 18 (1.1%)

Pandemic (natural): +0.100% 120 (7.6%)

Pandemic (bioengineered): +1.900% 355 (22.5%)

Environmental collapse (including global warming): +1.500% 252 (16.0%)

Economic / political collapse: −1.400% 136 (8.6%)

Other: 35 (2.217%)

Significantly more people are worried about nuclear war than last year. Effect of new respondents, or the geopolitical situation? Who knows.

Charity And Effective Altruism

Charitable Giving


What is your approximate annual income in US dollars (non-Americans: convert at www.xe.com)? Obviously you don't need to answer this question if you don't want to. Please don't include commas or dollar signs.

Sum: 66054140.47384

Mean: 64569.052271593355

Median: 40000.0

Mode: 30000.0

Stdev: 107297.53606321265


How much money, in number of dollars, have you donated to charity over the past year? (non-Americans: convert to dollars at http://www.xe.com/ ). Please don't include commas or dollar signs in your answer. For example: 4000

Sum: 2389900.6530000004

Mean: 2914.5129914634144

Median: 353.0

Mode: 100.0

Stdev: 9471.962766896671


How much money have you donated to charities aiming to reduce existential risk (other than MIRI/CFAR) in the past year?

Sum: 169300.89

Mean: 1991.7751764705883

Median: 200.0

Mode: 100.0

Stdev: 9219.941506342007


How much have you donated in US dollars to the following charities in the past year? (Non-Americans: convert to dollars at http://www.xe.com/) Please don't include commas or dollar signs in your answer. Options starting with "any" aren't the name of a charity but a category of charity.

Charity Sum Mean Median Mode Stdev
Against Malaria Foundation 483935.027 1905.256 300.0 None 7216.020
Schistosomiasis Control Initiative 47908.0 840.491 200.0 1000.0 1618.785
Deworm the World Initiative 28820.0 565.098 150.0 500.0 1432.712
GiveDirectly 154410.177 1429.723 450.0 50.0 3472.082
Any kind of animal rights charity 83130.47 1093.821 154.235 500.0 2313.493
Any kind of bug rights charity 1083.0 270.75 157.5 None 353.396
Machine Intelligence Research Institute 141792.5 1417.925 100.0 100.0 5370.485
Any charity combating nuclear existential risk 491.0 81.833 75.0 100.0 68.060
Any charity combating global warming 13012.0 245.509 100.0 10.0 365.542
Center For Applied Rationality 127101.0 3177.525 150.0 100.0 12969.096
Strategies for Engineered Negligible Senescence Research Foundation 9429.0 554.647 100.0 20.0 1156.431
Wikipedia 12765.5 53.189 20.0 10.0 126.444
Internet Archive 2975.04 80.406 30.0 50.0 173.791
Any campaign for political office 38443.99 366.133 50.0 50.0 1374.305
Other 564890.46 1661.442 200.0 100.0 4670.805

The "Bug Rights" charity was supposed to be a troll fakeout, but apparently...

This table is interesting given the recent debates about how much money certain causes are 'taking up' in Effective Altruism.

Effective Altruism


Do you follow any dietary restrictions related to animal products?

Yes, I am vegan: 54 (3.4%)

Yes, I am vegetarian: 158 (10.0%)

Yes, I restrict meat some other way (pescetarian, flexitarian, try to only eat ethically sourced meat): 375 (23.7%)

No: 996 (62.9%)


Do you know what Effective Altruism is?

Yes: 1562 (89.3%)

No but I've heard of it: 114 (6.5%)

No: 74 (4.2%)


Do you self-identify as an Effective Altruist?

Yes: 665 (39.233%)

No: 1030 (60.767%)

The distribution given by the 2014 survey results does not sum to one, so it's difficult to determine if Effective Altruism's membership actually went up or not, but if we take the numbers at face value it experienced an 11.13% increase in membership.


Do you participate in the Effective Altruism community?

Yes: 314 (18.427%)

No: 1390 (81.573%)

Same issue as the last question; taking the numbers at face value, community participation went up by 5.727%.


Has Effective Altruism caused you to make donations you otherwise wouldn't?

Yes: 666 (39.269%)

No: 1030 (60.731%)


Effective Altruist Anxiety


Have you ever had any kind of moral anxiety over Effective Altruism?

Yes: 501 (29.6%)

Yes but only because I worry about everything: 184 (10.9%)

No: 1008 (59.5%)

There's an ongoing debate in Effective Altruism about what kind of rhetorical strategy is best for getting people on board, and whether Effective Altruism is causing people significant moral anxiety.

It certainly appears to be. But is moral anxiety effective? Let's look:

Sample Size: 244
Average amount of money donated by people anxious about EA who aren't EAs: 257.5409836065574

Sample Size: 679
Average amount of money donated by people who aren't anxious about EA who aren't EAs: 479.7501384388807

Sample Size: 249
Average amount of money donated by EAs anxious about EA: 1841.5292369477913

Sample Size: 314
Average amount of money donated by EAs not anxious about EA: 1837.8248407643312

It seems fairly conclusive that anxiety is not a good way to get people to donate more than they already are, but is it a good way to get people to become Effective Altruists?

Sample Size: 1685
P(Effective Altruist): 0.3940652818991098
P(EA Anxiety): 0.29554896142433235
P(Effective Altruist | EA Anxiety): 0.5
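That conditional probability falls straight out of the counts. A sketch working backwards from the proportions above (the 664 and 498 counts are inferred from the reported probabilities, not taken directly from the data):

```python
n = 1685           # respondents in this sub-analysis
eas = 664          # inferred: 664 / 1685 ≈ 0.3941
anxious = 498      # inferred: 498 / 1685 ≈ 0.2955
anxious_eas = 249  # sample size of the anxious-EA group above

# P(Effective Altruist | EA Anxiety) = anxious EAs / all anxious respondents
print(anxious_eas / anxious)  # 0.5
```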

Maybe. There is of course an argument to be made that sufficient good done by causing people anxiety outweighs feeding into people's scrupulosity, but it can be discussed after I get through explaining it on the phone to wealthy PR-conscious donors and telling the local all-kill shelter where I want my shipment of dead kittens.


What's your overall opinion of Effective Altruism?

Positive: 809 (47.6%)

Mostly Positive: 535 (31.5%)

No strong opinion: 258 (15.2%)

Mostly Negative: 75 (4.4%)

Negative: 24 (1.4%)

EA appears to be doing a pretty good job of getting people to like it.

Interesting Tables

Charity Donations By Political Affiliation
Affiliation Income Charity Contributions % Income Donated To Charity Total Survey Charity % Sample Size
Anarchist 1677900.0 72386.0 4.314% 3.004% 50
Communist 298700.0 19190.0 6.425% 0.796% 13
Conservative 1963000.04 62945.04 3.207% 2.612% 38
Futarchist 1497494.1099999999 166254.0 11.102% 6.899% 31
Left-Libertarian 9681635.613839999 416084.0 4.298% 17.266% 245
Libertarian 11698523.0 214101.0 1.83% 8.885% 190
Moderate 3225475.0 90518.0 2.806% 3.756% 67
Neoreactionary 1383976.0 30890.0 2.232% 1.282% 28
Objectivist 399000.0 1310.0 0.328% 0.054% 10
Other 3150618.0 85272.0 2.707% 3.539% 132
Pragmatist 5087007.609999999 266836.0 5.245% 11.073% 131
Progressive 8455500.440000001 368742.78 4.361% 15.302% 217
Social Democrat 8000266.54 218052.5 2.726% 9.049% 237
Socialist 2621693.66 78484.0 2.994% 3.257% 126

Number Of Effective Altruists In The Diaspora Communities
Community Count % In Community Sample Size
LessWrong 136 38.418% 354
LessWrong Meetups 109 50.463% 216
LessWrong Facebook Group 83 48.256% 172
LessWrong Slack 22 39.286% 56
SlateStarCodex 343 40.98% 837
Rationalist Tumblr 175 49.716% 352
Rationalist Facebook 89 58.94% 151
Rationalist Twitter 24 40.0% 60
Effective Altruism Hub 86 86.869% 99
Good Judgement(TM) Open 23 74.194% 31
PredictionBook 31 51.667% 60
Hacker News 91 35.968% 253
#lesswrong on freenode 19 24.675% 77
#slatestarcodex on freenode 9 24.324% 37
#chapelperilous on freenode 2 18.182% 11
/r/rational 117 42.545% 275
/r/HPMOR 110 47.414% 232
/r/SlateStarCodex 93 37.959% 245
One or more private 'rationalist' groups 91 47.15% 193

Effective Altruist Donations By Political Affiliation
Affiliation EA Income EA Charity Sample Size
Anarchist 761000.0 57500.0 18
Futarchist 559850.0 114830.0 15
Left-Libertarian 5332856.0 361975.0 112
Libertarian 2725390.0 114732.0 53
Moderate 583247.0 56495.0 22
Other 1428978.0 69950.0 49
Pragmatist 1442211.0 43780.0 43
Progressive 4004097.0 304337.78 107
Social Democrat 3423487.45 149199.0 93
Socialist 678360.0 34751.0 41
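Tables like the ones above can be generated with a single GROUP BY over the survey database; a sketch with hypothetical column names standing in for the real export:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical schema; the real survey export's column names may differ.
conn.execute("CREATE TABLE data (PoliticalAffiliation TEXT, Income REAL, Charity REAL)")
conn.executemany("INSERT INTO data VALUES (?, ?, ?)", [
    ("Socialist", 50000.0, 1500.0),
    ("Socialist", 30000.0, 500.0),
    ("Libertarian", 90000.0, 900.0),
])

# Total income, total charity, and % of income donated, per affiliation.
rows = conn.execute("""
    SELECT PoliticalAffiliation,
           SUM(Income),
           SUM(Charity),
           ROUND(100.0 * SUM(Charity) / SUM(Income), 3),
           COUNT(*)
    FROM data
    GROUP BY PoliticalAffiliation
    ORDER BY PoliticalAffiliation
""").fetchall()
for row in rows:
    print(row)
```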