The Person As Input

I. Humans are emotion-feeling machines.

I don’t mean that humans are machines that happen to feel emotions. I mean that humans are machines whose output is the feeling of emotions—“emotion-feeling” is the thing of value that we produce.

Not just “being happy.” If that were all, wireheading would be the ultimate good, rather than the go-to utopia-horror example. But emotions must be involved, because everything else one can do is no more than a means to an end. Producing things, propagating life, even thinking. They all seem like useful endeavors, but a life of maximizing those things would suck. And the implication is that if we can create a machine that can do those things better than we can, it would be good to replace ourselves with that machine and set it to reproduce itself infinitely.

I recently saw a statement to the effect of “Art exists to produce feelings in us that we want, but do not get enough of in the course of normal life.” That’s what makes art valuable – supplementing emotional malnutrition. Such a thing exists because “to feel emotions” is the core function of humanity, and not fulfilling that function hurts like not eating does.

This is why (for many people) the optimal level of psychosis is non-zero. This is why intelligence is important – a greater level of intelligence allows a species to experience far more complex and nuanced emotional states. And the ability to experience more varieties of emotions is why it’s better to become more complex rather than simply dialing up happiness. It’s why disorders that prevent us from experiencing certain emotions are so awful (with the worst obviously being the ones that prevent us from feeling the “best” desires).

It’s why we like funny things, and tragic things, and scary things. Who wants to feel the way they feel after watching all of Evangelion?? Turns out – everyone, at some point, for at least a little bit of time!

It is why all human life has value. You do not matter based on what you can produce, or how smart you are, or how useful you are to others. You matter because you are a human who feels things.

My utility function is to feel a certain elastic web of emotions, and it varies from other utility functions by which emotions are desired in which amounts. My personality determines what actions produce what emotions.

And a machine that could feel things even better than humans can could be a wonderful thing. Greg Egan’s “Diaspora” features an entire society of uploaded humans, living rich, complex lives of substance. Loving, striving, crying, etc. The society can support far more humans than is physically possible in meat-bodies, running far faster than is possible in realspace. Since all these humans are running on computer chips, one could argue that another way of looking at it is not “a society of uploaded humans” but “a machine that feels human emotions better than meat-humans do.” And it’s a glorious thing. I would be happy to live in such a society.

II. God Mode is Super Lame

Why not just wirehead with a large and complex set of emotions?

I’m old enough to have played the original Doom when it came out (sooo old!). It had a cheat-code that made you invincible, commonly called god-mode. The first thing you notice is that it’s super cool to be invincible and just mow down all those monsters with impunity! The next thing you notice is that after a while (maybe ten minutes?) it loses all appeal. It becomes boring. There is no game anymore, once you no longer have to worry about taking damage. It becomes a task. You start enabling other cheats to get through it faster. Full-ammo cheats, to just use the biggest, fastest gun nonstop and get those monsters out of your way. Then walk-through-wall cheats, so you can just go straight to the level exit without wandering around looking for keys. Over, and over, and over again, level after level. It becomes a Kafkaesque grotesquery. Why am I doing this? Why am I here? Is my purpose just to keep walking endlessly from Spawn Point to Exit, the world passing around me in a blur, green and blue explosions obscuring all vision? When will this end?

It was a relief to be finished with the game.

That was my generation’s first brush with the difference between goal-oriented objectives and process-oriented objectives. We learned that the point of a game isn’t to get to the end; the point is to play the game. It used to be that if you wanted to be an awesome guitarist, you had to go through the process of playing guitar a LOT. There was no shortcut. So one could be excused for confusing “I want to be a rock star” with “I want to be playing awesome music.” Before cheat codes, getting to the end of the game was fun, so we thought that was our objective. After cheat codes we could go straight to the end any time we wanted, and now we had to choose – is your objective really just to get to the end? Or is it to go through the process of playing the game?

Some things are goal-oriented, of course. Very few people clean their toilets because they enjoy the process of cleaning their toilet. They want their toilet to be clean. If they could push a button and have a clean toilet without having to do the cleaning, they would.

Process-oriented objectives still have a goal. You want to beat the game. But you do not want first-order control over the bit “Game Won? Y/N”. You want first-order control over the actions that can get you there – strafing, shooting, jumping – resulting in second-order control over whether the bit finally gets flipped or not.

First-order control is god mode. Your goal is completed with full efficiency. Second-order control is indirect. You can take actions, and those actions will, if executed well, get you closer to your goal. They are fuzzier, you can be wrong about their effects, their effects can be inconsistent over time, and you can get better at using them. You can tell if you’d prefer god-mode for a task by considering whether you’d like to have it completed without going through the steps.

Do you want to:

Have Not Played The Game, And Have It Completed? or Be Playing The Game?
Have A Clean Toilet, Without Cleaning It Yourself? or Be Cleaning The Toilet?
Be At The End of a Movie? or Be Watching The Movie?

If the answer is in the first column, you want first-order control. If it is in the second column, you want second-order control.
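The two kinds of control can be sketched as a toy simulation. This is only my own illustration of the distinction: the action names, the 0.8 success probability, and the win threshold are invented for the example, not anything from the post.

```python
import random

def first_order_win(state):
    """God mode: flip the "Game Won?" bit directly, with full efficiency."""
    state["game_won"] = True
    return state

def second_order_win(state, rng):
    """Indirect control: take fuzzy actions that only tend toward the goal."""
    for action in ("strafe", "shoot", "jump"):
        if rng.random() < 0.8:  # actions can fail; effects are inconsistent
            state["progress"] += 1
    # The bit flips (or doesn't) only as a consequence of the actions taken.
    state["game_won"] = state["progress"] >= 2
    return state
```

The first function completes the goal in one step with no process; the second can only nudge the world and find out whether the goal followed.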

Wireheading, even variable multi-emotional wireheading, assumes that emotions are a goal-oriented objective, and thus takes first-order control of one’s emotional state. I contest that emotions are a process-oriented objective. The purpose is to evoke those emotions by using second-order control – taking actions that will lead to those emotions being felt. To eliminate that step and go straight to the credits is to lose the whole point of being human.

III. Removing The Person From The Output

How is the process of playing Doom without cheat codes distinguished from the process of repeatedly pushing a button connected to certain electrodes in your head that produce the emotions associated with playing Doom without cheat codes? (Or just lying there while the computer chooses which electrodes to stimulate on your behalf?)

If it’s just the emotions without the experiences that would cause those emotions, I think that’s a huge difference. That is once again just jumping right to the end-state, rather than experiencing the process that brings it about. It’s first-order control, and that efficiency and directness strips out all the complexity and nuance of a second-order experience.

See Incoming Fireball → Startled, Fear
Strafe Right → Anticipation, Dread
Fireball Dodged → Relief
Return Fire → Vengeance!!

Is strictly more complicated than just

Startled, Fear
Anticipation, Dread

The key difference being that in the first case, the player is entangled in the process. While these things are designed to produce a specific and very similar experience for everyone (which is why they’re popular with a wide player base), it takes a pre-existing person and combines them with a series of elements that is supposed to lead to an emotional response. The exact situation is unique(ish) for each person, because the person is a vital input. The output (of person feeling X emotions) is unique and personalized, as the input is different in every case.

When simply conjuring the emotions directly via wire, the individual is removed as an input. The emotions are implanted directly and do not depend on the person. The output (of person feeling X emotions) is identical and of far less complexity and value. Even if the emotions are hooked up to a random number generator or in some other way made to result in non-identical outputs, the situation is not improved. Because the problem isn’t so much “identical output” as it is that the Person was not an input, was not entangled in the process, and therefore doesn’t matter.
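The asymmetry can be made concrete with a toy sketch. The `reflexes` field and the emotion lists are hypothetical illustrations of my own: the point is only that the process-output is a function of the person, while the wirehead-output is constant no matter who is plugged in.

```python
def play_doom(person):
    # The person is an input: the same fireball yields different
    # emotional trajectories depending on who is playing.
    if person["reflexes"] > 5:
        return ["startled", "anticipation", "relief", "vengeance"]
    else:
        return ["startled", "anticipation", "dread", "fear"]

def wirehead(person):
    # The person is removed as an input: the same emotion sequence
    # is implanted directly, regardless of who receives it.
    return ["startled", "fear", "anticipation", "dread"]

veteran = {"reflexes": 9}
novice = {"reflexes": 2}
assert play_doom(veteran) != play_doom(novice)  # output is personalized
assert wirehead(veteran) == wirehead(novice)    # person doesn't matter
```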

I actually don’t have much of a problem with simulated-realities. Already a large percentage of the emotions felt by middle-class people in the first world are due to simulated realities. We induce feelings via music, television/movies, video games, novels, and other art. I think this has had some positive effects on society – it’s nice when people can get their Thrill needs met without actually risking their lives and/or committing crimes. In fact, the sorts of people who still try to get all their emotional needs met in the real world tend to be destructive and dramatic, and I’m sure everyone knows at least one person like that, and tries to avoid them.

I think a complete retreat to isolation would be sad, because other human minds are the most complex things that exist, and to cut that out of one’s life entirely would be an impoverishment. But a community of people interacting in a cyberworld, with access to physical reality? Shit, that sounds amazing!

Of course a “Total Recall” style system has the potential to become nightmarish. Right now when someone watches a movie, they bring their whole life with them. The movie is interpreted in light of one’s life experience. Every viewer has a different experience (some people have radically different experiences, as my SO and I recently discovered when we watched Birdman together. In fact, this comparing of experiences is the most fun part of my bi-weekly book club meetings. It’s kinda the whole point.). The person is an input in the process, and they’re mashed up into the product. If your proposed system would simply impose a memory or an experience onto someone else wholesale* without them being involved in the process, then it would be just as bad as the “series of emotions” process.

I have a vision of billions of people spending all of eternity simply reliving the most intense emotional experiences ever recorded, in perfect carbon copy, over and over again, and I shudder in horror. That’s not even being a person anymore. That’s overwriting your own existence with the recorded existence of someone(s) else. :(

But a good piece of art, that respects the person-as-input, and uses the artwork to cause them to create/feel more of their own emotions? That seems like a good thing.

(*this was adapted from a series of posts on my blog)