Not for the Sake of Happiness (Alone)

When I met the futurist Greg Stock some years ago, he argued that the joy of scientific discovery would soon be replaced by pills that could simulate the joy of scientific discovery. I approached him after his talk and said, “I agree that such pills are probably possible, but I wouldn’t voluntarily take them.”

And Stock said, “But they’ll be so much better that the real thing won’t be able to compete. It will just be way more fun for you to take the pills than to do all the actual scientific work.”

And I said, “I agree that’s possible, so I’ll make sure never to take them.”

Stock seemed genuinely surprised by my attitude, which genuinely surprised me.

One often sees ethicists arguing as if all human desires are reducible, in principle, to the desire for ourselves and others to be happy. (In particular, Sam Harris does this in The End of Faith, which I just finished perusing—though Harris’s reduction is more of a drive-by shooting than a major topic of discussion.)

This isn’t the same as arguing whether all happinesses can be measured on a common utility scale—different happinesses might occupy different scales, or be otherwise non-convertible. And it’s not the same as arguing that it’s theoretically impossible to value anything other than your own psychological states, because it’s still permissible to care whether other people are happy.

The question, rather, is whether we should care about the things that make us happy, apart from any happiness they bring.

We can easily list many cases of moralists going astray by caring about things besides happiness. The various states and countries that still outlaw oral sex make a good example; these legislators would have been better off if they’d said, “Hey, whatever turns you on.” But this doesn’t show that all values are reducible to happiness; it just argues that in this particular case it was an ethical mistake to focus on anything else.

It is an undeniable fact that we tend to do things that make us happy, but this doesn’t mean we should regard the happiness as the only reason for so acting. First, this would make it difficult to explain how we could care about anyone else’s happiness—how we could treat people as ends in themselves, rather than instrumental means of obtaining a warm glow of satisfaction.

Second, just because something is a consequence of my action doesn’t mean it was the sole justification. If I’m writing a blog post and I get a headache, I may take an ibuprofen. One of the consequences of my action is that I experience less pain, but this doesn’t mean it was the only consequence, or even the most important reason, for my decision. I do value the state of not having a headache. But I can value something for its own sake and also value it as a means to an end.

For all value to be reducible to happiness, it’s not enough to show that happiness is involved in most of our decisions—it’s not even enough to show that happiness is the most important consequent in all of our decisions—it must be the only consequent. That’s a tough standard to meet. (I originally found this point in a Sober and Wilson paper, not sure which one.)

If I claim to value art for its own sake, then would I value art that no one ever saw? A screensaver running in a closed room, producing beautiful pictures that no one ever saw? I’d have to say no. I can’t think of any completely lifeless object that I would value as an end, not just a means. That would be like valuing ice cream as an end in itself, apart from anyone eating it. Everything I value, that I can think of, involves people and their experiences somewhere along the line.

The best way I can put it is that my moral intuition appears to require both the objective and the subjective component to grant full value.

The value of scientific discovery requires both a genuine scientific discovery and a person to take joy in that discovery. It may seem difficult to disentangle these values, but the pills make it clearer.

I would be disturbed if people retreated into holodecks and fell in love with mindless wallpaper. I would be disturbed even if they weren’t aware it was a holodeck, which is an important ethical issue if some agents can potentially transport people into holodecks and substitute zombies for their loved ones without their awareness. Again, the pills make it clearer: I’m not just concerned with my own awareness of the uncomfortable fact. I wouldn’t put myself into a holodeck even if I could take a pill to forget the fact afterward. That’s simply not where I’m trying to steer the future.

I value freedom: When I’m deciding where to steer the future, I take into account not only the subjective states that people end up in, but also whether they got there as a result of their own efforts. The presence or absence of an external puppet master can affect my valuation of an otherwise fixed outcome. Even if people wouldn’t know they were being manipulated, it would matter to my judgment of how well humanity had done with its future. This is an important ethical issue, if you’re dealing with agents powerful enough to helpfully tweak people’s futures without their knowledge.

So my values are not strictly reducible to happiness: There are properties I value about the future that aren’t reducible to activation levels in anyone’s pleasure center; properties that are not strictly reducible to subjective states even in principle.

Which means that my decision system has a lot of terminal values, none of them strictly reducible to anything else. Art, science, love, lust, freedom, friendship...

And I’m okay with that. I value a life complicated enough to be challenging and aesthetic—not just the feeling that life is complicated, but the actual complications—so turning into a pleasure center in a vat doesn’t appeal to me. It would be a waste of humanity’s potential, which I value actually fulfilling, not just having the feeling that it was fulfilled.