Lifeism, Anti-Deathism, and Some Other Terminal-Values Rambling

(Apologies to RSS users: apparently there’s no draft button, but only “publish” and “publish-and-go-back-to-the-edit-screen”, misleadingly labeled.)

You have a button. If you press it, a happy, fulfilled person will be created in a sealed box, and then be painlessly garbage-collected fifteen minutes later. If asked, they would say that they’re glad to have existed in spite of their mortality. Because they’re sealed in a box, they will leave behind no bereaved friends or family. In short, this takes place in Magic Thought Experiment Land, where externalities don’t exist. Your choice is between creating a fifteen-minute-long happy life or not.

Do you push the button?

I suspect Eliezer would not, because it would increase the death-count of the universe by one. I would, because it would increase the life-count of the universe by fifteen minutes.

Actually, that’s an oversimplification of my position. I believe that the important part of any algorithm is its output; that additional copies of it matter not at all; that the net utility of the existence of a group of entities-whose-existence-constitutes-utility equals the maximum of the individual utilities; and that the (terminal) utility of the existence of a particular computation is bounded below at zero. I would submit a large number of copies of myself to slavery and/or torture to gain moderate benefits for my primary copy.

(What happens to the last copy of me, of course, does affect the question of “what computation occurs or not”. I would subject N out of N+1 copies of myself to torture, but not N out of N. Also, I would hesitate to torture copies of other people, on the grounds that there’s a conflict of interest and I can’t trust myself to reason honestly. I might feel differently after I’d been using my own fork-slaves for a while.)
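
To make the aggregation rule above concrete, here is a minimal sketch of one way to read it (the function name and example numbers are hypothetical, purely for illustration): the value of a group of copies of one computation is the maximum of the individual utilities, clamped below at zero.

```python
def group_utility(individual_utilities):
    """Net terminal utility of a set of copies of the same computation,
    under the rule stated above: duplicates don't stack, so take the
    maximum rather than the sum, then floor the result at zero."""
    if not individual_utilities:
        return 0.0
    best = max(individual_utilities)
    # Terminal utility of a computation's existence is bounded below
    # at zero: a suffering copy doesn't make the world worse than that
    # computation never occurring at all.
    return max(best, 0.0)

# One flourishing primary copy plus N tortured forks still nets the
# value of the flourishing copy alone.
print(group_utility([10.0, -100.0, -100.0]))  # -> 10.0
# All copies tortured: clamped to the zero floor.
print(group_utility([-100.0]))                # -> 0.0
```

Under this reading, torturing N out of N+1 copies still nets the value of the one free copy, while torturing N out of N collapses the group’s value to the zero floor, which is why what happens to the last copy matters.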

So the only real value of pushing the button would be my warm fuzzies, and those break the no-externalities assumption; within the thought experiment as stated, I’m indifferent.

Nevertheless, even knowing about the heat death of the universe, knowing that anyone born must inevitably die, I do not consider it immoral to create a person, even assuming all else equal.