Hrm… I’m pretty sure that, at least initially, losing the capacity for pain is a change I would not want. There are definite changes I would want in myself, but I don’t think that would be one of them.
I’d want more to be, well, “stronger” than I am, better able to handle it, for lack of better terminology. Not so much less pain as so much more, well, “me”, that the pain can’t fill it. (Yes, this is obviously imprecise. I’m simply trying to appeal to how I currently imagine the desired state “feeling from the inside”, as best as I can.)
Further, in the long term, I’m thinking I may want to keep the capacity because I don’t think I’d want to give up the ability to really properly “comprehend from the inside” my memories. So I think I’d want to retain some of that circuitry in some form, at least to decode ancient memories.
(But then, contrary to something you hinted at in a previous post, I think I would like, on some level, to still like cookies even by the time the last star would have burned out. I find the idea of carrying the ability to enjoy such a “simple childish” pleasure so far into deep time to be appealing in itself. That, and I hope to retain appreciation, at least in some form, for something analogous to corny puns. Again, for similar motivations.)
However, the similarity of removing pain to removing boredom or anything analogous, well… I’m not sure. I think one motivation I’d have would be more “increasing the complexity/possibility of human experience”, so it’s partly that I just don’t want to give up a “trick” I already have.
But if pleasures came in complex forms like tastes and smells and… no, cancel that. Let’s go farther: if pleasure were as complex as human vision (or, preferably, much more so, but you get the idea), then it might be different. I don’t know if this is possible, and it would be a much longer-term change, probably a more complex upgrade, but I’d eventually want to set myself up so that at no point do the pleasures translate to any simple, one-dimensional positive reinforcement. If “all the way down” the algorithm stack it’s distinct and complex, and not just differently named tokens that do the same thing, then maybe we could more or less safely eliminate suffering without giving up anything important, and without really approaching anything as downright depressing as a world of static, blissed-out wireheads.
I don’t know if it’s possible to really get something like pleasure that doesn’t, somewhere in the stack of stuff that generates experience, translate on some level to simple 1D reinforcement. I admit this notion may be an incoherent confusion; I’m really unsure here. But if it were possible, getting rid of the whole simple 1D positive-reinforcement thing might be nice, provided that on all levels of experience it was more complex than that.
Throwing away the circuitry behind tears means throwing away the circuitry that allows one to sympathize with the tears of others. To set aside, for a little while, the objection of any virtue ethicist who might be reading this: obviously, in a world without tears such sympathy may not be needed. But, as you point out, we still have our memories and great tales that we’d probably like to go on appreciating somewhat for a long time to come.