Is Clickbait Destroying Our General Intelligence?

(Cross-posted from Facebook.)


Now and then people have asked me if I think that other people should also avoid high school or college if they want to develop new ideas. This always felt to me like a wrong way to look at the question, but I didn’t know a right one.

Recently I thought of a scary new viewpoint on that subject.

This started with a conversation with Arthur where he mentioned an idea by Yoshua Bengio about the software for general intelligence having been developed memetically. I remarked that I didn’t think duplicating this culturally transmitted software would be a significant part of the problem for AGI development. (Roughly: low-fidelity software tends to be algorithmically shallow. Further discussion moved to a comment below.)

But this conversation did get me thinking about the topic of culturally transmitted software that contributes to human general intelligence. That software can be an important gear even if it’s an algorithmically shallow part of the overall machinery. Removing a few simple gears that are 2% of a machine’s mass can reduce the machine’s performance by way more than 2%. Feral children would be the case in point.

A scary question is whether it’s possible to do subtler damage to the culturally transmitted software of general intelligence.

I’ve had the sense before that the Internet is turning our society stupider and meaner. My primary hypothesis is “The Internet is selecting harder on a larger population of ideas, and sanity falls off the selective frontier once you select hard enough.”

To review, there’s a general idea that strong (social) selection on a characteristic imperfectly correlated with some other metric of goodness can be bad for that metric, where weak (social) selection on that characteristic was good. If you press scientists a little for publishable work, they might do science that’s of greater interest to others. If you select very harshly on publication records, the academics spend all their time worrying about publishing and real science falls by the wayside.

On my feed yesterday was an essay complaining about how the intense competition to get into Harvard is producing a monoculture of students who’ve lined up every single standard accomplishment and how these students don’t know anything else they want to do with their lives. Gentle, soft competition on a few accomplishments might select genuinely stronger students; hypercompetition for the appearance of strength produces weakness, or just emptiness.

A hypothesis I find plausible is that the Internet, and maybe television before it, selected much more harshly from a much wider field of memes; and also allowed tailoring content more narrowly to narrower audiences. The Internet is making it possible for ideas that are optimized to appeal hedonically-virally within a filter bubble to outcompete ideas that have been even slightly optimized for anything else. We’re looking at a collapse of reference to expertise because deferring to expertise costs a couple of hedons compared to being told that all your intuitions are perfectly right, and at the harsh selective frontier there’s no room for that. We’re looking at a collapse of interaction between bubbles because there used to be just a few newspapers serving all the bubbles; and now that the bubbles have separated there’s little incentive to show people how to be fair in their judgment of ideas for other bubbles, it’s not the most appealing Tumblr content. Print magazines in the 1950s were hardly perfect, but they could get away with sometimes presenting complicated issues as complicated, because there weren’t a hundred blogs saying otherwise and stealing their clicks. Or at least, that’s the hypothesis.
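The selection dynamic above can be sketched as a toy simulation — my own illustration with made-up parameters, not anything from the original argument. Suppose each meme splits a fixed effort budget between substance and hedonic appeal, and audiences score mostly on appeal with only a slight weight on substance. Then the mean substance of the surviving memes collapses as the selection gets harsher:

```python
# Toy model (illustrative only): memes trade off substance against appeal,
# audiences score mostly on appeal, and we vary how harshly the top
# scorers are selected.  All numbers here are arbitrary assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
appeal = rng.uniform(0, 1, n)     # effort spent on hedonic appeal
substance = 1.0 - appeal          # the rest of the budget goes to substance
# Audience score: appeal dominates; substance earns only a small bonus.
score = appeal + 0.2 * substance + rng.normal(0, 0.02, n)

def mean_substance_of_survivors(keep_fraction):
    """Mean substance among the top-scoring fraction of memes."""
    k = max(1, int(n * keep_fraction))
    survivors = np.argsort(score)[-k:]   # harshest selection = smallest k
    return substance[survivors].mean()

for frac in (0.5, 0.1, 0.01, 0.001):
    print(f"keep top {frac:>6.1%}: mean substance = "
          f"{mean_substance_of_survivors(frac):.3f}")
```

Under gentle selection (keep half), survivors retain a fair amount of substance, because a slightly-substance-optimized meme can still make the cut; at the harsh frontier, only the memes that spent essentially their whole budget on appeal survive.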

It seems plausible to me that basic software for intelligent functioning is being damaged by this hypercompetition. Especially in a social context, but maybe even outside it; that kind of thing tends to slop over. When someone politely presents you with a careful argument, does your cultural software tell you that you’re supposed to listen and make a careful response, or make fun of the other person and then laugh about how they’re upset? What about when your own brain tries to generate a careful argument? Does your cultural milieu give you any examples of people showing how to really care deeply about something (i.e., debate consequences of paths and hew hard to the best one), or is everything you see just people competing to be loud in their identification? The Occupy movement not having any demands or agenda could represent mild damage to a gear of human general intelligence that was culturally transmitted and that enabled processing of a certain kind of goal-directed behavior. And I’m not sure to what extent that is merely a metaphor, versus it being simple fact if we could look at the true software laid out. If you look at how some bubbles are talking and thinking now, “intellectually feral children” doesn’t seem like entirely inappropriate language.

Shortly after that conversation with Arthur, it occurred to me that I was pretty much raised and socialized by my parents’ collection of science fiction.

My parents’ collection of old science fiction.

Isaac Asimov. H. Beam Piper. A. E. van Vogt. Early Heinlein, because my parents didn’t want me reading the later books.

And when I did try reading science fiction from later days, a lot of it struck me as… icky. Neuromancer, bleah, what is wrong with this book, it feels damaged, why do people like this, it feels like there’s way too much flash and it ate the substance, it’s showing off way too hard.

And now that I think about it, I feel like a lot of my writing on rationality would be a lot more popular if I could go back in time to the 1960s and present it there. “Twelve Virtues of Rationality” is what people could’ve been reading instead of Heinlein’s Stranger in a Strange Land, to take a different path from the branching point that found Stranger in a Strange Land appealing.

I didn’t stick to merely the culture I was raised in, because that wasn’t what that culture said to do. The characters I read didn’t keep to the way they were raised. They were constantly being challenged with new ideas and often modified or partially rejected those ideas in the course of absorbing them. If you were immersed in an alien civilization that had some good ideas, you were supposed to consider it open-mindedly and then steal only the good parts. Which… kind of sounds axiomatic to me? You could make a case that this is an obvious guideline for how to do generic optimization. It’s just what you do to process an input. And yet “when you encounter a different way of thinking, judge it open-mindedly and then steal only the good parts” is directly contradicted by some modern software that seems to be memetically hypercompetitive. It probably sounds a bit alien or weird to some people reading this, at least as something that you’d say out loud. Software contributing to generic optimization has been damaged.

Later the Internet came along and exposed me to some modern developments, some of which are indeed improvements. But only after I had a cognitive and ethical foundation that could judge which changes were progress versus damage. More importantly, a cognitive foundation that had the idea of even trying to do that. Tversky and Kahneman didn’t exist in the 1950s, but when I was exposed to this new cognitive biases literature, I reacted like an Isaac Asimov character trying to integrate it into their existing ideas about psychohistory, instead of a William Gibson character wondering how it would look on a black and chrome T-shirt. If that reference still means anything to anyone.

I suspect some culturally transmitted parts of the general intelligence software got damaged by radio, television, and the Internet, with a key causal step being an increased hypercompetition of ideas compared to earlier years. I suspect this independently of any other hypotheses about my origin story. It feels to me like the historical case for this thesis ought to be visible by mere observation to anyone who watched the quality of online discussion degrade from 2002 to 2017.

But if you consider me to be more than usually intellectually productive for an average Ashkenazic genius in the modern generation, then in this connection it’s an interesting and scary further observation that I was initially socialized by books written before the Great Stagnation. Or by books written by authors from only a single generation later, who read a lot of old books themselves and didn’t watch much television.

That hypothesis doesn’t feel wrong to me the way that “oh you just need to not go to college” feels wrong to me.