Why is the Future So Absurd?

Followup to: Stranger than History, Absurdity Heuristic / Absurdity Bias

Why is the future more absurd than people seem to expect? (That is: Why, historically, has the future so often turned out to be more “absurd” than people seem to have expected?)

One obvious reason is hindsight bias. Hindsight does not just cause people to severely underestimate how much they would have been surprised. Hindsight also leads people to overestimate how much attention they would have paid to the key factors, the factors that turned out to be important. As R. H. Tawney put it:

“Historians give an appearance of inevitability to an existing order by dragging into prominence the forces which have triumphed and thrusting into the background those which they have swallowed up.”

When people look at historical changes and think “I could have predicted X” or “You could have predicted X if you looked at factors 1, 2, and 3,” they forget that people did not, in fact, predict X, perhaps because they were distracted by factors 4 through 117. People read history books, see coherent narratives, and think that’s how Time works. Underestimating the surprise of the present, they overestimate the predictability of the future.

I suspect that a major factor contributing to absurdity bias is that, when we look over history, we see changes away from absurd conditions, such as everyone being a peasant farmer and women not having the vote, toward normal conditions like a majority middle class and equal rights. When people look at history, they see a series of normalizations. They learn the rule, “The future grows ever less absurd over time.”

Perhaps one way to comprehend the bizarreness of the future would be to try to imagine historical changes occurring in reverse—how absurd would it be if all your electrical appliances suddenly disappeared, or you were transformed into a peasant farmer? Even if the future is nicer than the past, it will feel at least that absurd.

The correspondence bias of social psychology may also play a role in how we fail to learn from history—or so my own experience suggests. When we read about the strange behaviors of people in other eras, we may see them as people with a disposition to that strange behavior, rather than properly comprehending the strangeness of the times. In the 16th century, one popular entertainment was setting a cat on fire. If you think to yourself “What horrible people they must be!” then you have, to the same extent, diminished your appreciation of what horrible times they lived in.

We see at least some social and technological changes during our own lifetime. We do have some experience of genuine future shock. Why wouldn’t this be enough to extrapolate forward?

According to Ray Kurzweil’s thesis of accelerating change, our intuitions about the future are linear—we expect around as much change as occurred in the past—but technological change feeds on itself, and therefore has a positive second derivative. We should expect more technological change in the future than we have seen in the past, and insofar as technology drives cultural change, we should expect more cultural change too.
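A minimal numerical sketch of the “feeds on itself” claim (the growth constant and step count here are my illustrative choices, not Kurzweil’s): if the rate of change is proportional to the technology already accumulated, each step adds more change than the last, i.e. the curve has a positive second derivative.

```python
def simulate(k=0.05, steps=100, x0=1.0):
    """Euler-step dx/dt = k * x: change proportional to what exists."""
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + k * xs[-1])
    return xs

xs = simulate()
# First differences = "change per step". They themselves keep growing,
# which is exactly the positive second derivative in discrete form.
diffs = [b - a for a, b in zip(xs, xs[1:])]
assert all(d2 > d1 for d1, d2 in zip(diffs, diffs[1:]))
```

Note that nothing here requires the curve to be smooth or exactly exponential; any self-reinforcing growth rule produces the same accelerating pattern.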

Or that, in my opinion, is the strongest version of Kurzweil’s theory that can be put forward. Kurzweil dwells on Moore’s Law and smoothly predictable exponential curves, but this seems to me both iffy and unnecessary. A curve does not need to be smooth or exponential to have a positive second derivative. And our cultural sensitivity to, say, computing power is probably logarithmic anyway, obeying Weber’s Law: a 20% increase in computing power probably feels the same whether it’s from 1MHz to 1.2MHz, or from 2GHz to 2.4GHz. In which case, people extrapolating the future “linearly” should get it pretty much correct.
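The Weber’s Law point can be checked with the post’s own numbers. A sketch, assuming perceived change is modeled as the logarithm of the new-to-old ratio (the log model is the standard Weber–Fechner form; the specific function is my choice):

```python
import math

def felt_change(old, new):
    """Perceived magnitude of an improvement: log of the ratio."""
    return math.log(new / old)

mhz_era = felt_change(1e6, 1.2e6)  # 1 MHz -> 1.2 MHz
ghz_era = felt_change(2e9, 2.4e9)  # 2 GHz -> 2.4 GHz

# Both are log(1.2): the same 20% jump feels identical at any scale,
# even though the absolute gain differs by a factor of two million.
assert abs(mhz_era - ghz_era) < 1e-12
```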

But if you pull back and view the last few millennia, not just the last few decades, the strength of the core idea becomes obvious—technological change does feed on itself, and therefore does speed up.

I would actually question Kurzweil’s assertion that people extrapolate the past linearly into the future. Kurzweil may be too optimistic here. As discussed earlier, dwellers on flood plains do not extrapolate from small floods to large floods; instead, small floods set a perceived upper bound on risk. I suspect that when people try to visualize the strangeness of the future, they focus on a single possible change, of no greater magnitude than the largest single change they remember in their own lifetime.

The real future is not composed of single developments, but of many developments together. Even if one change can pass the futurism filter, to suppose three absurdities simultaneously—never mind twenty—would entirely overload the absurdity meter. This may also explain why future projections get wronger and wronger as they go further out. People seem to imagine futures that are minimally counterintuitive, with one or two interesting changes to make a good story, rather than a realistic number of changes that would overload their extrapolation abilities.
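The conjunction effect here can be made concrete with a toy model (the 80% figure is my illustrative assumption, not from the post): even if each individual change seems fairly plausible on its own, the conjunction of many such changes quickly drops below any absurdity threshold.

```python
def conjunction_plausibility(p_single, n_changes):
    """Felt plausibility of n independent changes all happening:
    each change multiplies in another factor of p_single."""
    return p_single ** n_changes

one = conjunction_plausibility(0.8, 1)      # a single change: 0.8
three = conjunction_plausibility(0.8, 3)    # three at once: 0.512
twenty = conjunction_plausibility(0.8, 20)  # twenty at once: ~0.012
```

So a future described by twenty individually-reasonable changes can feel nearly a hundred times less plausible than a future with one, which rewards minimally counterintuitive stories over realistic ones.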

What other biases could lead us to underestimate the absurdity of the future?