Modest Superintelligences

I’m skeptical about trying to build FAI, but not about trying to influence the Singularity in a positive direction. Some people may be skeptical even of the latter, because they don’t think an intelligence explosion is very likely. I suggest that even if intelligence explosion turns out to be impossible, we can still reach a positive Singularity by building what I’ll call “modest superintelligences”: superintelligent entities capable of taking over the universe and preventing existential risks and Malthusian outcomes, whose construction does not require fast recursive self-improvement or other questionable assumptions about the nature of intelligence. This helps establish a lower bound on the benefits of an organization that aims to strategically influence the outcome of the Singularity.

  • MSI-1: 10^5 biologically cloned humans of von Neumann-level intelligence, highly educated and indoctrinated from birth to work collaboratively towards some goal, such as building MSI-2 (or equivalent)

  • MSI-2: 10^10 whole brain emulations of von Neumann, each running at ten times human speed, with WBE-enabled institutional controls that increase group coherence/rationality (or equivalent)

  • MSI-3: 10^20 copies of von Neumann WBE, each running at a thousand times human speed, with more advanced (to-be-invented) institutional controls and collaboration tools (or equivalent; a rough scale comparison follows below)
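
For a rough sense of the scale jump between tiers, here is a minimal sketch that multiplies each tier’s population by its speedup. The population and speedup figures are from the list above; treating the product as “subjective von Neumann work-years per calendar year” is my own simplification, since it ignores coordination overhead and any gains from institutional controls.

```python
# Rough scale comparison of the MSI tiers defined above.
# Unit: subjective von Neumann work-years per calendar year
# (population x speedup); ignores coordination overhead.

tiers = {
    "MSI-1": {"population": 10**5,  "speedup": 1},     # biological clones
    "MSI-2": {"population": 10**10, "speedup": 10},    # WBEs at 10x speed
    "MSI-3": {"population": 10**20, "speedup": 1000},  # WBEs at 1000x speed
}

for name, t in tiers.items():
    capacity = t["population"] * t["speedup"]
    print(f"{name}: {capacity:.0e} subjective work-years per year")
```

On this crude measure, each tier sits six to twelve orders of magnitude beyond the previous one.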

(To recall what the actual von Neumann, whom we might call MSI-0, accomplished, open his Wikipedia page and scroll through the “known for” sidebar.)

Building an MSI-1 seems to require a total cost on the order of $100 billion (assuming $1 million for each clone), which is comparable to the Apollo project and about 0.25% of the annual Gross World Product. (For further comparison, note that Apple has a market capitalization of $561 billion and an annual profit of $25 billion.) In exchange for that cost, any nation that undertakes the project has a reasonable chance of obtaining an insurmountable lead in whatever technologies end up driving the Singularity, and with that a large measure of control over its outcome. If no better strategic options come along, lobbying a government to build MSI-1 and/or influencing its design and aims seems to be the least that a Singularitarian organization could do.
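
As a sanity check on the arithmetic, here is a minimal back-of-the-envelope sketch. The clone count and per-clone cost are the figures above; the “implied GWP” is simply what the 0.25% claim works out to, not an independent estimate.

```python
# Back-of-the-envelope check of the MSI-1 cost estimate above.
clones = 10**5          # MSI-1 population, from the list above
cost_per_clone = 10**6  # $1 million per clone, as assumed in the text

total = clones * cost_per_clone
print(f"total cost: ${total:,}")  # $100,000,000,000, i.e. $100 billion

# The text puts this at about 0.25% of annual Gross World Product,
# which implies a GWP of roughly $40 trillion:
print(f"implied GWP: ${total / 0.0025:,.0f}")
```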