Take heed, for it is a trap

If you have worked your way through most of the sequences, you are likely to agree with the majority of these statements:

  • When people die we should cut off their heads so we can preserve those heads and make the person come back to life in the (far, far) future.

  • It is possible to run a person on Conway's Game of Life. This would be a person as real as you or me, who wouldn't be able to tell he's in a virtual world because it looks exactly like ours.

  • Right now there exist many copies/clones of you, some of which are blissfully happy and some of which are being tortured, and we should not care about this at all.

  • Most scientists disagree with this, but that's just because it sounds counterintuitive and scientists are biased against counterintuitive explanations.

  • Besides, the scientific method is wrong because it is in conflict with probability theory. Oh, and probability is created by humans; it doesn't exist in the universe.

  • Every fraction of a second you split into thousands of copies of yourself. Of course you cannot detect these copies scientifically, but that's because science is wrong and stupid.

  • In fact, it's not just people that split but the entire universe that splits, over and over.

  • Time isn't real. There is no flow of time from 0 to now. All your future and past selves just exist.

  • Computers will soon become so fast that AI researchers will be able to create an artificial intelligence that's smarter than any human. When this happens, humanity will probably be wiped out.

  • To protect us against computers destroying humanity, we must create a super-powerful computer intelligence that won't destroy humanity.

  • Ethics are very important and we must take extreme caution to make sure we do the right thing. Also, we sometimes prefer torture to dust specks.

  • If everything goes to plan, a supercomputer will solve all problems (disease, famine, aging) and turn us into superhumans who can then go on to explore the galaxy and have fun.

  • And finally, the truth of all these statements is completely obvious to those who take the time to study the underlying arguments. People who disagree are just dumb, irrational, miseducated, or a combination thereof.

  • I learned this all from this website, by these guys who want us to give them our money.

In two words: crackpot beliefs.

These statements cover only a fraction of the sequences, and although they're deliberately phrased to incite knee-jerk disagreement and ugh-fields, I think most LW readers will find themselves in agreement with almost all of them. And if not, you can always come up with better examples that illustrate some of your non-mainstream beliefs.

Think back for a second to your pre-Bayesian days, to the time before your exposure to the sequences. Now the question is: what probability would you have assigned to any chain of arguments being able to persuade you that the statements above are true? In my case, it would be near zero.

You can take somebody who likes philosophy and is familiar with the different streams and philosophical dilemmas, who knows computation theory and classical physics, who has a good understanding of probability and math, and who is a naturally curious reductionist. This person will still roll his eyes and sarcastically dismiss the ideas enumerated above. After all, these are crackpot ideas, and people who believe them are so far "out there" they cannot be reasoned with!

That is really the bottom line here. You cannot explain the beliefs that follow from the sequences because they have too many dependencies. And even if you did have time to go through all the necessary dependencies, explaining a belief is still an order of magnitude more difficult than following an explanation written down by somebody else, because in order to explain something you have to juggle two mental models: your own and the listener's.

Some of the sequences touch on the concept of the cognitive gap (inferential distance). We have all learned the hard way that we can't expect people to just understand what we say, and that we can't expect short inferential distances. In practice there is just no way to bridge the cognitive gap. This isn't a big deal for most educated people, because people don't expect to understand complex arguments in other people's fields, and all educated intellectuals are on the same team anyway (well, most of the time). For crackpot LW beliefs it's a whole different story, though. I suspect most of us have found that out the hard way.

Rational Rian: What do you think is going to happen to the economy?

Bayesian Bob: I'm not sure. I think Krugman believes that a bigger cash injection is needed to prevent a second dip.

Rational Rian: Why do you always say what other people think? What's your opinion?

Bayesian Bob: I can't really distinguish between good economic reasoning and flawed economic reasoning because I'm a layman. So I tend to go with what Krugman writes, unless I have a good reason to believe he is wrong. I don't really have strong opinions about the economy; I just go with the evidence I have.

Rational Rian: Evidence? You mean his opinion.

Bayesian Bob: Yep.

Rational Rian: Eh? Opinions aren't evidence.

Bayesian Bob: (Whoops, now I have to either explain the nature of evidence on the spot or Rian will think I'm an idiot with crazy beliefs. Okay then, here goes.) An opinion reflects the belief of the expert. These beliefs can be uncorrelated with reality, negatively correlated, or positively correlated. If there is absolutely no relation between what an expert believes and what is true then, sure, it wouldn't count as evidence. However, it turns out that experts mostly believe true things (that's why they're called experts), so the beliefs of an expert are positively correlated with reality, and thus his opinion counts as evidence.

Rational Rian: That doesn't make sense. It's still just an opinion. Evidence comes from experiments.

Bayesian Bob: Yep, but experts have either done experiments themselves or read about experiments other people have done. That's what their opinions are based on. Suppose you take a random scientific statement. You have no idea what it says; the only thing you know is that 80% of the top researchers in that field agree with it. Would you then assume the statement is probably true? Would the agreement of these scientists be evidence for its truth?

Rational Rian: That's just an argument ad populus! Truth isn't governed by majority opinion! It is just religious nonsense that if enough people believe something then there must be some truth to it.

Bayesian Bob: (Ad populum! Populum! Ah, crud, I should've phrased that more carefully.) I don't mean that majority opinion proves that the statement is true; it's just evidence in favor of it. If there is counterevidence, the scale can tip the other way. In the case of religion there is overwhelming counterevidence. Scientifically speaking, religion is clearly false; no disagreement there.

Rational Rian: There's scientific counterevidence for religion? Science can't prove non-existence. You know that!

Bayesian Bob: (Oh god, not this again!) Absence of evidence is evidence of absence.

Rational Rian: Counterevidence is not the same as absence of evidence! Besides, stay with the point: science can't prove a negative.

Bayesian Bob: The certainty of our beliefs should be proportional to the amount of evidence we have in favor of them. Complex beliefs require more evidence than simple beliefs, and the laws of probability, Bayes' theorem specifically, tell us how to weigh new evidence. A statement, any statement, starts out with a 50% probability of being true, and then you adjust that percentage based on the evidence you come into contact with. (I shouldn't have said that 50% part. There's no way that's going to go over well. I'm such an idiot.)

Rational Rian: A statement without evidence is 50% likely to be true!? Have you forgotten everything from math class? This doesn't make sense on so many levels, I don't even know where to start!

Bayesian Bob: (There's no way to rescue this. I'm going to cut my losses.) I meant that in a vacuum we should believe it with 50% certainty, not that any arbitrary statement is 50% likely to accurately reflect reality. But no matter. Let's just get something to eat; I'm hungry.

Rational Rian: So we should believe something even if it's unlikely to be true? That's just stupid. Why do I even get into these conversations with you? *sigh* … So, how about Subway?
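For what it's worth, the point Bob fumbles can be stated in a few lines of arithmetic. The sketch below uses made-up reliability figures (the 80% and 30% rates are purely illustrative) to show exactly when an expert's opinion counts as evidence: when their endorsements are correlated with the truth.

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) via Bayes' rule, where E is the expert's endorsement."""
    # Total probability of seeing the endorsement at all:
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Suppose an expert endorses a claim, and experts endorse true claims
# 80% of the time but false claims only 30% of the time (made-up numbers).
# The endorsement then shifts a 50% prior upward:
print(round(update(0.5, 0.8, 0.3), 3))  # 0.727

# If the expert endorsed true and false claims at the same rate,
# the posterior would equal the prior: the opinion carries no evidence.
print(update(0.5, 0.6, 0.6))  # 0.5
```

This is Bob's correlation criterion in miniature: an opinion uncorrelated with reality leaves your belief exactly where it started, while a positively correlated one moves it, which is all "counting as evidence" means here.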

The moral here is that crackpot beliefs are low status. Not just low status like believing in a deity, but majorly low status. When you believe things that are perceived as crazy, and when you can't explain to people why you believe what you believe, the only result is that people will see you as "that crazy guy". They'll wonder, behind your back, why a smart person can have such stupid beliefs. Then they'll conclude that intelligence doesn't protect people against religion either, so there's no point in trying to talk about it.

If you fail to conceal your low-status beliefs, you'll be punished for it socially. If you think that they're in the wrong and that you're in the right, then you've missed the point. This isn't about right and wrong; this is about anticipating the consequences of your behavior. If you choose to talk about outlandish beliefs when you know you cannot convince people that your belief is justified, you hurt your credibility and get nothing for it in exchange. You cannot repair the damage easily, because even if your friends are patient and willing to listen to your complete reasoning, you'll (accidentally) expose three even crazier beliefs you hold.

An important life skill is the ability to get along with other people and not to expose yourself as a weirdo when it isn't in your interest to do so. So take heed and choose your words wisely, lest you fall into the trap.

EDIT: Google Survey by Pfft

PS: intended for /main, but since this is my first serious post I'll put it in discussion first to see if it's considered sufficiently insightful.