Rule Thinkers In, Not Out


Imagine a black box which, when you pressed a button, would generate a scientific hypothesis. 50% of its hypotheses are false; 50% are true hypotheses as game-changing and elegant as relativity. Even despite the error rate, it’s easy to see this box would quickly surpass space capsules, da Vinci paintings, and printer ink cartridges to become the most valuable object in the world. Scientific progress on demand, and all you have to do is test some stuff to see if it’s true? I don’t want to devalue experimentalists. They do great work. But it’s appropriate that Einstein is more famous than Eddington. If you took away Eddington, someone else would have tested relativity; the bottleneck is in Einsteins. Einstein-in-a-box at the cost of requiring two Eddingtons per insight is a heck of a deal.

What if the box had only a 10% success rate? A 1% success rate? My guess is: still the most valuable object in the world. Even a 0.1% success rate seems pretty good, considering (what if we ask the box for cancer cures, then test them all on lab rats and volunteers?). You have to go pretty low before the box stops being great.
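To see why the box stays valuable even at low hit rates, here is a minimal back-of-the-envelope sketch in Python. The dollar figures (a $10B breakthrough, a $1M test) are invented for illustration, not taken from anywhere:

```python
# Back-of-the-envelope value of the hypothesis box at various hit rates.
# Both dollar figures below are made up purely for illustration.
VALUE_OF_BREAKTHROUGH = 10_000_000_000  # assumed payoff of one true idea
COST_PER_TEST = 1_000_000               # assumed cost to vet one idea

for success_rate in (0.5, 0.1, 0.01, 0.001):
    # On average you must test 1 / success_rate hypotheses per breakthrough.
    expected_cost = COST_PER_TEST / success_rate
    net_value = VALUE_OF_BREAKTHROUGH - expected_cost
    print(f"{success_rate:6.1%}: ~${expected_cost:,.0f} of testing per "
          f"breakthrough, net ~${net_value:,.0f}")
```

Under these made-up numbers, even at 0.1% the expected testing cost per breakthrough ($1B) is still an order of magnitude below the assumed payoff.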

I thought about this after reading this list of geniuses with terrible ideas. Linus Pauling thought Vitamin C cured everything. Isaac Newton spent half his time working on weird Bible codes. Nikola Tesla pursued mad energy beams that couldn’t work. Lynn Margulis revolutionized cell biology by discovering mitochondrial endosymbiosis, but was also a 9/11 truther and doubted HIV caused AIDS. Et cetera. Obviously this should happen. Genius often involves coming up with an outrageous idea contrary to conventional wisdom and pursuing it obsessively despite naysayers. But nobody can have a 100% success rate. People who do this successfully sometimes should also fail at it sometimes, just because they’re the kind of person who attempts it at all. Not everyone fails. Einstein seems to have batted a perfect 1.000 (unless you count his support for socialism). But failure shouldn’t surprise us.

Yet aren’t some of these examples unforgivably bad? Like, seriously Isaac – Bible codes? Well, granted, Newton’s chemical experiments may have exposed him to a little more mercury than can be entirely healthy. But remember: gravity was considered creepy occult pseudoscience by its early enemies. It subjected the earth and the heavens to the same law, which shocked 17th-century sensibilities the same way trying to link consciousness and matter would today. It postulated that objects could act on each other through invisible forces at a distance, which was equally outside the contemporaneous Overton Window. Newton’s exceptional genius, his exceptional ability to think outside all relevant boxes, and his exceptionally egregious mistakes are all the same phenomenon (plus or minus a little mercury).

Or think of it a different way. Newton stared at problems that had vexed generations before him, and noticed a subtle pattern everyone else had missed. He must have had amazingly hypersensitive pattern-matching going on. But people with such hypersensitivity should be most likely to see patterns where they don’t exist. Hence, Bible codes.

These geniuses are like our black boxes: generators of brilliant ideas, plus a certain failure rate. The failures can be easily discarded: physicists were able to take up Newton’s gravity without wasting time on his Bible codes. So we’re right to treat geniuses as valuable in the same way we would treat those boxes as valuable.

This goes not just for geniuses, but for anybody in the idea industry. Coming up with a genuinely original idea is a rare skill, much harder than judging ideas is. Somebody who comes up with one good original idea (plus ninety-nine really stupid cringeworthy takes) is a better use of your reading time than somebody who reliably never gets anything too wrong, but never says anything you find new or surprising. Alyssa Vance calls this positive selection – a single good call rules you in – as opposed to negative selection, where a single bad call rules you out. You should practice positive selection for geniuses and other intellectuals.
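If it helps to see the two filters side by side, here is a toy sketch; the idea “quality” scores and thresholds are made up purely for illustration:

```python
# Toy contrast between positive and negative selection over thinkers.
# All scores and thresholds below are invented for illustration.
thinkers = {
    "wild genius": [9.5, 1.0, 0.5, 1.5],       # one great idea, three duds
    "safe commentator": [5.0, 5.0, 5.0, 5.0],  # reliable, never surprising
}
GOOD, BAD = 9.0, 2.0

for name, ideas in thinkers.items():
    ruled_in = max(ideas) >= GOOD   # positive selection: best idea decides
    ruled_out = min(ideas) <= BAD   # negative selection: worst idea decides
    print(f"{name}: ruled in? {ruled_in}, ruled out? {ruled_out}")
```

Positive selection keeps the wild genius (the single 9.5 rules them in); negative selection throws the same person out for the duds, and keeps only the thinker who never tells you anything new.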

I think about this every time I hear someone say something like “I lost all respect for Steven Pinker after he said all that stupid stuff about AI”. Your problem was thinking of “respect” as a relevant predicate to apply to Steven Pinker in the first place. Is he your father? Your youth pastor? No? Then why are you worrying about whether or not to “respect” him? Steven Pinker is a black box who occasionally spits out ideas, opinions, and arguments for you to evaluate. If some of them are arguments you wouldn’t have come up with on your own, then he’s doing you a service. If 50% of them are false, then the best-case scenario is that they’re moronically, obviously false, so that you can reject them quickly and get on with your life.

I don’t want to take this too far. If someone has 99 stupid ideas and then 1 seemingly good one, obviously this should increase your probability that the seemingly good one is actually flawed in a way you haven’t noticed. If someone has 99 stupid ideas, obviously this should make you less willing to waste time reading their other ideas to see if they are really good. If you want to learn the basics of a field you know nothing about, obviously read a textbook. If you don’t trust your ability to figure out when people are wrong, obviously read someone with a track record of always representing the conventional wisdom correctly. And if you’re a social engineer trying to recommend what other people who are less intelligent than you should read, obviously steer them away from anyone who’s wrong too often. I just worry too many people wear their social engineer hat so often that they forget how to take it off, forget that “intellectual exploration” is a different job than “promote the right opinions about things” and requires different strategies.
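To make the first caveat concrete, here is a minimal Bayesian sketch; the two source types and their hit rates are invented numbers, not a claim about any real thinker:

```python
# After 99 duds, how much should you trust idea #100?
# Two hypothetical source types: "careful" thinkers whose ideas are good
# half the time, and "cranks" whose ideas are good 1% of the time.
P_GOOD = {"careful": 0.50, "crank": 0.01}
posterior = {"careful": 0.5, "crank": 0.5}  # 50/50 prior over source type

prior_next_good = sum(posterior[t] * P_GOOD[t] for t in posterior)

# Observe 99 bad ideas in a row, updating the posterior after each one.
for _ in range(99):
    unnormalized = {t: posterior[t] * (1 - P_GOOD[t]) for t in posterior}
    total = sum(unnormalized.values())
    posterior = {t: v / total for t, v in unnormalized.items()}

posterior_next_good = sum(posterior[t] * P_GOOD[t] for t in posterior)
print(f"P(next idea is good): {prior_next_good:.3f} before the streak, "
      f"{posterior_next_good:.4f} after 99 duds")
```

The 99 duds push the posterior almost entirely onto “crank”, so the hundredth idea, however good it looks, deserves extra scrutiny before you act on it.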

But consider the debate over “outrage culture”. Most of this focuses on moral outrage. Some smart person says something we consider evil, and so we stop listening to her or giving her a platform. There are arguments for and against this – at the very least it disincentivizes evil-seeming statements.

But I think there’s a similar phenomenon that gets less attention and is even less defensible – a sort of intellectual outrage culture. “How can you possibly read that guy when he’s said [stupid thing]?” I don’t want to get into defending every weird belief or conspiracy theory that’s ever been [stupid thing]. I just want to say it probably wasn’t as stupid as Bible codes. And yet, Newton.

Some of the people who have most inspired me have been inexcusably wrong on basic issues. But you only need one world-changing revelation to be worth reading.