The utility-maximizing complexity of conscious beings

An AI that runs conscious beings has finite resources. Complex conscious beings consume more of these resources than do simpler conscious beings. Suppose, for purposes of discussion, that a friendly AI would maximize the aggregate utility of the beings it runs by modulating their level of complexity.

The utility-maximizing level of complexity would depend on the natural laws that govern the relation between the complexity of a being and the magnitude of the utility that it can experience.
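To make that dependence concrete, here is one toy formalization; the functional form is my assumption, not anything the argument itself establishes. Suppose the AI has total resources R, allocates c units to each being, and a being of complexity c experiences utility u(c). Running n = R/c identical beings then yields aggregate utility

$$U(c) = \frac{R}{c}\,u(c), \qquad c^* = \arg\max_c \frac{u(c)}{c}.$$

Everything turns on the shape of u: if u is concave (diminishing returns), u(c)/c is largest at the smallest viable complexity; if u grows superlinearly, it is largest at c = R; only if u(c)/c peaks at an intermediate value is a moderate complexity optimal.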

When you are uploaded to the AI, then, one of three things will happen, depending on which of these regimes obtains (a toy numerical sketch of all three follows the list):

1. You may wake up to discover that most of your mind, memory, and identity has disappeared, because most of your computational resources have been stolen to create a set of new, simple beings who, in total, enjoy more utility than you thereby lost. You may even be much less happy than you now are.

2. You may not wake up at all, because the totality of your computational resources has been stolen from you and given to a superbeing who is capable, thanks to some economy of scale in converting resources into utility, of extracting more utility from them than you could.

3. You may wake up to discover that you are much smarter and happier than you now are, because the utility-maximizing level of complexity is larger than your own, but not so large that, if all people were to upload, it would exhaust the AI's resources, thus making rationing necessary.
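A minimal numerical sketch of the three regimes, using made-up utility functions chosen only to exhibit each case (nothing here is a claim about how real minds scale):

```python
# Toy illustration: which per-being complexity c maximizes aggregate
# utility U(c) = (R/c) * u(c), for three assumed shapes of u.
import numpy as np

R = 1000.0                      # total computational resources
c = np.linspace(1.0, R, 10000)  # candidate complexity per being

cases = {
    "1. diminishing returns": lambda c: np.sqrt(c),                 # concave u
    "2. economy of scale":    lambda c: c ** 1.5,                   # superlinear u
    "3. interior optimum":    lambda c: c ** 2 * np.exp(-c / 100),  # peaked u(c)/c
}

for name, u in cases.items():
    aggregate = (R / c) * u(c)  # n = R/c beings, each experiencing u(c)
    best = c[np.argmax(aggregate)]
    print(f"{name}: optimal complexity per being ≈ {best:.0f}")
```

Diminishing returns drive the optimum to the simplest viable beings (case 1), economies of scale hand everything to a single superbeing (case 2), and a peaked u(c)/c favors a moderate complexity that may well exceed your own (case 3).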