Of Two Minds

Follow-up to: The Intelligent Social Web

The human mind evolved under pressure to solve two kinds of problems:

  • How to physically move

  • What to do about other people

I don’t mean that list to be exhaustive. It doesn’t include maintaining homeostasis, for instance. But in practice I think it hits everything we might want to call “thinking”.

…which means we can think of the mind as having two types of reasoning: mechanical and social.

Mechanical reasoning is where our intuitions about “truth” ground out. You throw a ball in the air, your brain makes a prediction about how it’ll move and how to catch it, and either you catch it as expected or you don’t. We can imagine how to build an engine, and then build it, and then we can find out whether it works. You can try a handstand, notice how it fails, and try again… and after a while you’ll probably figure it out. It means something for our brains’ predictions to be right or wrong (or somewhere in between).

I recommend this TED Talk for a great overview of this point.

The fact that we can plan movements lets us do abstract truth-based reasoning. The book Where Mathematics Comes From digs into this in math. But for just one example, notice how set theory almost always uses container metaphors. E.g., we say elements are in sets like pebbles are in buckets. That physical intuition lets us use things like Venn diagrams to reason about sets and logic.
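To make the container metaphor concrete, here’s a minimal Python sketch (my own illustration, not from the book) showing how the pebbles-in-buckets intuition carries straight into formal set operations:

```python
# The container metaphor, literalized: elements sit "in" sets
# the way pebbles sit in buckets.
evens = {0, 2, 4, 6, 8}
small = {0, 1, 2, 3}

print(2 in evens)                # True: the pebble is in the bucket
print(evens & small)             # {0, 2}: the overlapping region of a Venn diagram
print(small - evens)             # {1, 3}: what's in one bucket but not the other
print(small <= (evens | small))  # True: one container fits inside another
```

Even the syntax leans on the metaphor: membership is literally spelled “in”.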

…well, at least until our intuitions are wrong. Then we get surprised. And then, like in learning to catch a ball, we change our anticipations. We update.

Mechanical reasoning seems to already obey Bayes’ Theorem for updating. This seems plausible from my read of Scott’s review of Surfing Uncertainty, and in the TED Talk I mentioned earlier Daniel Wolpert claims this has been measured. And it makes sense: evolution would have put a lot of pressure on our ancestors to get movement right.
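To make “obeys Bayes’ Theorem” concrete, here’s a toy sketch (assuming Gaussian noise, with made-up numbers of mine rather than anything from Wolpert’s work) of fusing a prior movement prediction with a noisy observation:

```python
# Toy Bayesian update for a movement prediction, assuming Gaussian noise.
# Prior belief about where the ball will land (meters):
prior_mean, prior_var = 2.0, 0.5 ** 2
# Noisy visual observation of the ball mid-flight:
obs, obs_var = 2.6, 0.3 ** 2

# Conjugate Gaussian update: a precision-weighted average of
# prediction and evidence, which is Bayes' Theorem for this model.
posterior_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
posterior_mean = posterior_var * (prior_mean / prior_var + obs / obs_var)

print(f"updated estimate: {posterior_mean:.2f} m (variance {posterior_var:.3f})")
```

The estimate shifts toward whichever source is more precise; “updating” is exactly this re-weighting as new evidence arrives.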

Why, then, is there systematic bias? Why do the Sequences help at all with thinking?

Sometimes, occasionally, it’s because of something structural, like how we systematically feel someone’s blow as harder than they felt it when they hit us. It just falls out of how our brains make physical predictions. If we know about this, we can try to correct for it when it matters.

But the rest of the time?

It’s because we predict it’s socially helpful to be biased that way.

When it comes to surviving and finding mates, having a place in the social web matters a lot more than being right, nearly always. If your access to food, sex, and others’ protection depends on your agreeing with others that the sky is green, you either find ways to conclude that the sky is green, or you don’t have many kids. If the social web puts a lot of effort into figuring out what you really think, then you’d better find some way to really think the sky is green, regardless of what your eyes tell you.

Is it any wonder that so many deviations from clear thinking are about social signaling?

The thing is, “clear thinking” here mostly points at mechanical reasoning. If we were to create a mechanical model of social dynamics… well, it might start looking like a recursively generated social web, and then mechanical reasoning would mostly derive the same thing the social mind already does.

…because that’s how the social mind evolved.

And once it evolved, it became overwhelmingly more important than everything else. Because a strong, healthy, physically coordinated, skilled warrior has almost no hope of defeating a weakling who can inspire many, many others to fight for them.

Thus whenever people’s social and mechanical minds disagree, the social mind almost always wins, even if it kills them.

You might hope that that “almost” includes things like engineering and hard science. But really, for the most part, we just figured out how to align social incentives with truth-seeking. And that’s important! We figured out that if we tie social standing to whether your rocket actually works, then being right matters socially, and now culture can care about truth.

But once there’s the slightest gap between cultural incentives and making physical things work, social forces take over.

This means that in any human interaction, if you don’t see how the social web causes each person’s actions, then you’re probably missing most of what’s going on, at least consciously.

And there’s probably a reason you’re missing it.
