Our society lacks good self-preservation mechanisms

The prospect of a dangerous collection of existential risks and risks of major civilizational-level catastrophes in the 21st century, combined with a distinct lack of agencies whose job it is to mitigate such risks, suggests that the world is in something of an emergency at the moment. Firstly, what do we mean by risks? Bostrom has a paper on existential risks, and he lists the following risks as being “most likely”:

  • Deliberate misuse of nanotechnology,

  • Nuclear holocaust,

  • Badly programmed superintelligence,

  • Genetically engineered biological agent,

  • Accidental misuse of nanotechnology (“gray goo”),

  • Physics disasters,

  • Naturally occurring disease,

  • Asteroid or comet impact,

  • Runaway global warming,

  • Resource depletion or ecological destruction,

  • Misguided world government or another static social equilibrium stops technological progress,

  • “Dysgenic” pressures (we might evolve into a less brainy but more fertile species, homo philoprogenitus, “lover of many offspring”),

  • Our potential or even our core values are eroded by evolutionary development,

  • Technological arrest,

  • Take-over by a transcending upload,

  • Flawed superintelligence,

  • [Stable] Repressive totalitarian global regime,

  • Hanson’s cosmic locusts scenario [added by author].

To which I would add various possibilities for major civilization-level disasters that aren’t existential risks, such as milder versions of all of the above, or the following:

  • convergence of computer viruses and cults/religions,

  • advanced personal weapons or surveillance devices such as nanotech or micro-UAV bugs (cyberpunk dystopia),

  • erosion of privacy and freedom through massively oppressive government,

  • highly effective meta-religions such as Scientology, or a much more virulent version of modern evangelical Christianity.

This collection is daunting, especially given that the human race doesn’t have any official agency dedicated to mitigating risks to its own medium- to long-term survival. We face a long list of challenges, yet we aren’t even formally trying to mitigate many of them in advance. In many past cases, mitigation of risks occurred on a last-minute, ad-hoc basis, such as individuals in the Cold War deciding not to initiate a nuclear exchange, particularly during the Cuban missile crisis.

So, a small group of people have realized that the likely outcome of a large and dangerous collection of risks, combined with a haphazard, informal methodology for dealing with them (driven by the efforts of individuals, charities, and public opinion), is that one of these potential risks will actually be realized, killing many or all of us or radically reducing our quality of life. This coming disaster is ultimately not the result of any one particular risk, but of the lack of a powerful defence against risks.

One could argue that I [and Bostrom, Rees, etc.] am blowing the issue out of proportion. We have survived so far, right? (Wrong, actually: anthropic considerations indicate that survival so far is not evidence that we will survive much longer, and technological progress indicates that risks in the future are worse than risks in the past.) Major civilizational disasters have already happened many, many times over.
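The anthropic point can be made concrete with a toy simulation (illustrative only; the per-period risk figures are invented for the example, not estimates): every surviving observer, by construction, looks back on an unbroken record of survival, so that record by itself says little about how dangerous the underlying world is.

```python
import random

random.seed(0)

def survival_fraction(p_catastrophe, periods, trials):
    """Fraction of simulated civilizations that survive all periods,
    given an independent per-period catastrophe probability."""
    survivors = 0
    for _ in range(trials):
        # A civilization survives only if no period brings catastrophe.
        if all(random.random() > p_catastrophe for _ in range(periods)):
            survivors += 1
    return survivors / trials

# Whether the per-period risk is 1% or 30%, the observers who remain
# all see the same thing: "we made it this far". Only the *number* of
# surviving worlds differs, which the survivors cannot observe.
for p in (0.01, 0.10, 0.30):
    frac = survival_fraction(p, periods=10, trials=100_000)
    print(f"per-period risk {p:.2f}: {frac:.3f} of worlds survive 10 periods")
```

The point of the sketch is that conditioning on our own existence filters out exactly the histories that would have carried the bad news, so past survival cannot reassure us about the size of `p_catastrophe`.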

Most ecosystems that ever existed were wiped out by natural means, almost all species that have ever existed have gone extinct, and without human intervention most existing ecosystems will probably be wiped out within a 100-million-year timescale. Most civilizations that ever existed collapsed. Some went really badly wrong, like communist Russia. Complex, homeostatic objects that don’t have extremely effective self-preservation systems empirically tend to get wiped out by the churning of the universe.

Our western civilization lacks an effective long-term (order of 50 years plus) self-preservation system. Hence we should reasonably expect either to build one or to get wiped out, because we observe that complex systems which seem similar to societies today, such as past societies, collapsed.

And even though our society does have short-term survival mechanisms such as governments and philanthropists, they often behave in superbly irrational, myopic, or late-responding ways. The response to the global warming problem (late-responding, weak, still failing to overcome coordination problems) and the invasion of Iraq (plainly irrational) are cases in point from recent history, and there are numerous examples from the past, such as close calls in the Cold War and the spectacular chain of failures that led from World War I to World War II and the rise of Hitler.

This article could be summarized as follows:

The systems we have for preserving the values and existence of our western society, and of the human race as a whole, are weak, and the challenges of the 21st and 22nd centuries seem likely to overwhelm them.

I originally wanted to write an article about ways to mitigate existential risks and major civilization-level catastrophes, but I decided to first establish that such serious risks and catastrophes actually exist, and that we haven’t got them handled yet. My next post will be about ways to mitigate existential risks.