Subduing Moloch

If rationality is systematized winning, then the diaspora of rationalists should be the most powerful group in the world. One way to increase individual and group power is to increase cooperation among rational agents.

As a starting point, the obvious way to increase cooperation is to increase the frequency of high-bandwidth communication among the agents. Imagine an app that each week randomly pairs you with another rationalist. You would converse with that person for an hour every day (for example, you could spend 30 minutes each discussing your respective days and how to optimize them). If you repeat the process for a year, you would get to know 52 people; in 10 years, you would know 520. If you connect particularly well with someone, you can continue talking with them elsewhere after the week is over.
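As a rough illustration of the core mechanic, here is a minimal sketch in Python (all names and the seed are hypothetical, and a real app would obviously need accounts, scheduling, and so on): shuffle the participants each week and pair adjacent entries, folding the leftover person into a group of three when the count is odd.

```python
import random

def weekly_pairs(participants, seed=None):
    """Randomly pair participants for the week.

    If the number of participants is odd, the leftover person
    joins the last pair, forming a group of three.
    """
    rng = random.Random(seed)
    pool = list(participants)
    rng.shuffle(pool)

    if len(pool) < 2:
        return [pool] if pool else []

    # Pair adjacent entries of the shuffled list.
    pairs = [pool[i:i + 2] for i in range(0, len(pool) - 1, 2)]
    if len(pool) % 2 == 1:
        pairs[-1].append(pool[-1])  # odd count: make the last group a triple
    return pairs

# Example with five hypothetical members: two groups, one of them a triple.
print(weekly_pairs(["alice", "bob", "carol", "dave", "eve"], seed=1))
```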

The purpose of this is to develop a real community of people with the shared mission of optimizing the world. This is admittedly a pretty weird idea, but once there is a sufficiently high density of edges, the group as a whole can act as a "super-agent" when the need arises.

While the benefits of forming such a community will initially be restricted to mundane things like job references and entertainment, it can eventually be used to resolve multi-player prisoner's dilemmas of the kind described by SlateStarCodex:

Bostrom makes an offhanded reference of the possibility of a dictatorless dystopia, one that every single citizen including the leadership hates but which nevertheless endures unconquered. It's easy enough to imagine such a state. Imagine a country with two rules: first, every person must spend eight hours a day giving themselves strong electric shocks. Second, if anyone fails to follow a rule (including this one), or speaks out against it, or fails to enforce it, all citizens must unite to kill that person. Suppose these rules were well-enough established by tradition that everyone expected them to be enforced.
So you shock yourself for eight hours a day, because you know if you don't everyone else will kill you, because if they don't, everyone else will kill them, and so on. Every single citizen hates the system, but for lack of a good coordination mechanism it endures. From a god's-eye-view, we can optimize the system to "everyone agrees to stop doing this at once", but no one within the system is able to effect the transition without great risk to themselves.

An example might be everyone agreeing that the current education system is completely broken (student debt in the US exceeds 1 trillion dollars), and having an organization, let's call it the Bayesian Church, fix it by supplying the coordination mechanism needed to change the system.

Much like the Catholic Church was all-powerful in the 16th century, the Bayesian Church should also end up wielding enormous power in the far future. The thought experiment above, where rationalists are regularly paired to talk with each other, could be the first step towards igniting the coordination mechanism that will subdue Moloch.

I feel rather excited about this idea; please let me know what you think in the comments.
