Economic Consequences of AGI

Last edit: 3 Jun 2021 23:24 UTC by habryka

The economic consequences of artificial general intelligence (AGI) arise from its fundamentally new properties compared to the human brains currently driving the economy. Once such digital minds become intelligent enough to perform a wide range of economic functions, they are likely to bring radical change: creating great wealth, but also displacing humans from more and more types of jobs.

An important aspect of this question is economic growth. The invention of AGI or whole brain emulation (WBE) could cause a sudden increase in growth by adding machine intelligence to the pool of human innovators. Machine intelligence could be much cheaper to produce, faster, and qualitatively smarter than human talent. A feedback loop could ensue: better machine intelligence technology yields more and better machine researchers, who in turn produce still better machine intelligence technology.
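The feedback loop described above can be sketched numerically. In the following toy model (all parameter names and values are illustrative assumptions, not drawn from any of the posts below), the supply of machine researchers scales with the technology level itself, so the growth rate of technology rises over time:

```python
def simulate(steps=60, humans=1.0, machine_share=0.05, productivity=0.02):
    """Toy feedback loop: technology A grows with total research effort,
    and machine researchers are themselves proportional to A."""
    A = 1.0
    levels = [A]
    for _ in range(steps):
        researchers = humans + machine_share * A  # machines join the pool
        A += productivity * researchers * A       # innovation scales with effort
        levels.append(A)
    return levels

levels = simulate()
```

With `machine_share` set to zero the model reduces to ordinary exponential growth; the acceleration comes entirely from the feedback term.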

Robin Hanson has written extensively about the economics of whole brain emulation. In his view, the unrestricted creation of additional uploads would cause a Malthusian scenario in which upload wages fall to subsistence levels. He sees the transition to whole brain emulation as a jump to a new “growth mode” with a higher exponential growth rate, similar to the earlier transitions to agriculture and industry.
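The subsistence-wage argument can be illustrated with a standard Cobb-Douglas toy model (a sketch under assumed parameter values, not Hanson's own numbers): if emulations can be copied whenever the wage exceeds the cost of running one, the emulation population expands until the wage equals that running cost.

```python
def wage(K, L, alpha=0.3):
    """Marginal product of labor for Cobb-Douglas output Y = K**alpha * L**(1 - alpha)."""
    return (1 - alpha) * K**alpha * L**(-alpha)

def equilibrium_labor(K, run_cost, alpha=0.3):
    """Em population at which the wage has fallen to the cost of running an em.

    Solves wage(K, L) == run_cost for L."""
    return ((1 - alpha) * K**alpha / run_cost) ** (1 / alpha)

K, run_cost = 100.0, 0.01          # illustrative capital stock and hardware cost
L_star = equilibrium_labor(K, run_cost)  # wage(K, L_star) equals run_cost
```

Because the wage is strictly decreasing in the labor supply, free copying pins it to the hardware cost of running an emulation, which plays the role of the Malthusian subsistence level.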

In “The Future of Human Evolution”, Nick Bostrom argues that in an emulated-brain society where individuals live at subsistence levels, entities possessing the features we care about – which he calls flamboyant displays, or culture in general – would be outcompeted by more efficient entities that lack these costly traits. This could lead to the elimination of all forms of being that we value. He proposes that only a singleton, enforcing strict controls, could prevent culture from being competed away.

Others predict that growth will accelerate even more suddenly (up to the point where physical limits become relevant), and that it will be concentrated in a smaller and more coherent set of agents, so that instead of continued free-market competition, a singleton will emerge.
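The intuition that growth could “blow up” in finite time can be made precise with a textbook toy calculation (a sketch, not taken from any of the posts below): suppose the growth rate of output $A$ itself increases with $A$, i.e. growth is slightly super-exponential. Separating variables and integrating from $A(0) = A_0$ gives

```latex
% Super-exponential growth with constants c, \epsilon > 0:
\frac{dA}{dt} = c\,A^{1+\epsilon}
\quad\Longrightarrow\quad
A(t) = \frac{A_0}{\bigl(1 - \epsilon\, c\, A_0^{\epsilon}\, t\bigr)^{1/\epsilon}}
```

so $A(t)$ diverges at the finite time $t^* = 1/(\epsilon\, c\, A_0^{\epsilon})$. In reality, physical limits would bind before that singularity is reached, which is why such models are only taken to describe a transition rather than literal infinite output.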


Rogue AGI Embodies Valuable Intellectual Property
3 Jun 2021 20:37 UTC · 69 points · 9 comments · 3 min read

Dragon Ball’s Hyperbolic Time Chamber
gwern · 2 Sep 2012 23:49 UTC · 50 points · 65 comments · 1 min read

The impact of whole brain emulation
jefftk · 14 May 2013 19:59 UTC · 4 points · 34 comments · 2 min read

Whole Brain Emulation: Looking At Progress On C. elgans
jefftk · 29 Oct 2011 15:21 UTC · 59 points · 85 comments · 2 min read

Whole Brain Emulation & DL: imitation learning for faster AGI?
gwern · 22 Oct 2018 15:07 UTC · 15 points · 0 comments · 1 min read

[link] Whole Brain Emulation and the Evolution of Superorganisms
Wei_Dai · 3 May 2011 23:38 UTC · 25 points · 8 comments · 1 min read

Superintelligence via whole brain emulation
AlexMennen · 17 Aug 2016 4:11 UTC · 14 points · 33 comments · 3 min read

Hedging our Bets: The Case for Pursuing Whole Brain Emulation to Safeguard Humanity’s Future
inklesspen · 1 Mar 2010 2:32 UTC · 14 points · 248 comments · 3 min read

New WBE implementation
Louie · 30 Nov 2012 11:16 UTC · 27 points · 7 comments · 1 min read

Intermittent Distillations #4: Semiconductors, Economics, Intelligence, and Technological Progress.
Mark Xu · 8 Jul 2021 22:14 UTC · 81 points · 9 comments · 10 min read

Some thoughts on David Roodman’s GWP model and its relation to AI timelines
Tom Davidson · 19 Jul 2021 22:59 UTC · 30 points · 1 comment · 8 min read

The Solow-Swan model of economic growth
Matthew Barnett · 29 Aug 2021 18:55 UTC · 30 points · 6 comments · 10 min read

Phase transitions and AGI
17 Mar 2022 17:22 UTC · 44 points · 19 comments · 9 min read

Hyperbolic takeoff
Ege Erdil · 9 Apr 2022 15:57 UTC · 17 points · 8 comments · 10 min read

Embodiment is Indispensable for AGI
P. G. Keerthana Gopalakrishnan · 7 Jun 2022 21:31 UTC · 5 points · 1 comment · 6 min read