Mahatma Armstrong: CEVed to death.

My main objection to Coherent Extrapolated Volition (CEV) is the “Extrapolated” part. I don’t see any reason to trust the extrapolated volition of humanity, but this isn’t just for self-centred reasons: I don’t see any reason to trust my own extrapolated volition. I think it’s perfectly possible that my extrapolated volition would follow some scenario like this:

  1. It starts with me, Armstrong 1. I want to be more altruistic at the next level, valuing other humans more.

  2. The altruistic Armstrong 2 wants to be even more altruistic. He makes himself into a perfectly altruistic utilitarian towards humans, and increases his altruism towards animals.

  3. Armstrong 3 wonders about the difference between animals and humans, and why he should value one of them more. He decides to increase his altruism equally towards all sentient creatures.

  4. Armstrong 4 is worried that sentience isn’t clearly defined, and seems arbitrary anyway. He increases his altruism towards all living things.

  5. Armstrong 5’s problem is that the barrier between living and non-living things isn’t clear either (e.g. viruses). He decides that he should solve this by valuing all worthwhile things: are not art and beauty worth something as well?

  6. But what makes a thing worthwhile? Is there not art in everything, beauty in the eye of the right beholder? Armstrong 6 will make himself value everything.

  7. Armstrong 7 is in turmoil: so many animals prey upon other animals, or destroy valuable rocks! To avoid this, he decides the most moral thing he can do is to try and destroy all life, and then create a world of stasis for the objects that remain.

There are many other ways this could go, maybe ending up as a negative utilitarian or completely indifferent, but that’s enough to give the flavour. You might trust the person you want to be to do the right things. But you can’t trust them to want to be the right person, especially several levels in (compare with the argument in this post, and my very old chaining god idea). I’m not claiming that such a value drift is inevitable, just that it’s possible, and so I’d want my initial values to dominate when there is a large conflict.

Nor do I give Armstrong 7’s values any credit for having originated from mine. Under torture, I’m pretty sure I could be made to accept any system of values whatsoever; there are other ways that would provably alter my values, so I don’t see any reason to privilege Armstrong 7’s values in this way.

“But,” says the objecting strawman, “this is completely different! Armstrong 7’s values are the ones that you would reach by following the path you would want to follow anyway! That’s where you would get to, if you started out wanting to be more altruistic, had control over your own motivational structure, and grew and learnt and knew more!”

“Thanks for pointing that out,” I respond, “now that I know where that ends up, I must make sure to change the path I would want to follow! I’m not sure whether to give up on being more altruistic, on having control over my own motivational structure, or on wanting to grow and learn and know more. Those all sound pretty good, but if they end up at Armstrong 7, something’s going to have to give.”