I do not object to that. I object to the idea that a eutopia is made solely of cooler existence, and I formally request that I be left the option to choose the uncooler one. That is not trivial, depending on the fun theory you use.
Would you object if (on top of making you cooler) we tied your responses to other situations? Say, looking up a simple fact would feel like scratching an itch, exchanging ideas would feel like sex, whatever environment you’re most often in would trigger your “this is home and normal and safe” response, manipulating fleets of nanobots would feel like carving, etc.
Yes, I would object to that too. To put it differently, I would object to anything that violates the integrity of the present correlation between my brain states and the external world (which includes the fact that my body is the actual mediator).
I’ve been thinking more about your stated preferences. To untangle them further:
Do you insist on keeping simple pleasures because you fear you might lose them if cooler ones were available, i.e., that raising your happiness set point would eventually lower your total fun?
I once had a discussion with my mother in which I was optimistic about the increasing influence of machines and software in our lives and suggested that in the near future we might spend most of our leisure time uploaded into advanced MMOs. She objected that she valued gardening and similar menial tasks and that she did not like this direction. What I found out after some probing was that she objected to the aesthetic. “Machines” brought up fictional evidence of a very sterile future, with heroin-addict-like wireheads hooked up to computers.
So what I’m getting at is: might that be a factor? Do you have a specific vision of what a complex future of the kind you’re opposing might look like?
If I understand you right, you think that increasing fun leads to either wireheading or increasing complexity. (I generally agree.) I see how you can have a problem with wireheading. What’s your issue with complexity? Why would you want to not want to indulge in complex fun, at the cost of less interest in simple fun along the way?
In other words, wouldn’t you always want to have as much fun as possible? (With “fun” in the fun theory sense that includes all your terminal values, not just pleasure.) It seems to me like this should be true for any properly functioning agent, although agents might disagree on what fun is. But you seem to agree that you would actually enjoy more complex fun more.
Do you insist on keeping simple pleasures because you fear you might lose them if cooler ones were available, i.e., that raising your happiness set point would eventually lower your total fun?
No, my opinion is that a cooler existence would make them meaningless. It’s not a question of fun or happiness: in a eutopia those are cheap commodities. It is more a question of identity and futility.
So what I’m getting at is: might that be a factor? Do you have a specific vision of what a complex future of the kind you’re opposing might look like?
I feel it’s important to say that I’m not opposing a future like that. I like AIs and robots and think we need more of them in our lives. What I’m saying is that for some people, increasing the complexity of existence per se is unnecessary and unwanted. I simply don’t value complexity terminally, so for me an existence built on it is simply an existence I don’t prefer.
In other words, wouldn’t you always want to have as much fun as possible? (With “fun” in the fun theory sense that includes all your terminal values, not just pleasure.) It seems to me like this should be true for any properly functioning agent, although agents might disagree on what fun is.
That, in essence, is the central ‘dogma’ of your theory of fun. I’m telling you, however, that for some people (me, for example) it is just not true. I just don’t want to have more and more fun; it strikes me as meaningless and ‘childish’ (that is not an exact description; I would need to dig deeper into the precise feeling).
I would like to add to your theory of fun that there are agents who, once a certain level of fun/happiness is reached, just need no more and can continue happily forever in that state of mind.
Thanks, now I understand you much better.

I can understand “maxing out” fun. I even suspect that my ability to experience fun is bounded and that, even without post-singularity tech, I might maximize it. I wonder, what happens then? Once all your values are fulfilled (and sustainability is not an issue), what do you do?
(Obviously, self-modify to not get bored and enjoy the ride, says the wirehead. I’m not so sure about that anymore.)
Why would you ever choose the uncool option?! I don’t intend that as a rude response; I just can’t wrap my head around it. Do you just feel that way? How do you know? Have you considered that you might be confused, or might have some kind of status-quo bias?
(Not that I have a problem with your choice. I think a utopia should totally allow people to do things I find stupid, as long as they don’t cause anyone else harm.)
Have you considered that you might be confused, or might have some kind of status-quo bias?
Even though it would be too strong to say that I would never choose the cooler existence, from what I understand of my present preferences, I’m not at all tempted by living in a simulation. By that I mean:
I’m aware there’s no difference between reality and a perfect simulation;
I don’t need any particular complexity in my life, and most importantly, I don’t need more complexity than I have now;
I would choose the simulation if it were a matter of survival, e.g., a gamma-ray burst approaching Earth;
if I were granted, in a eutopia, the state of happiness that I have in my present life, forever, that would be perfectly fine by me.
(Not that I have a problem with your choice. I think a utopia should totally allow people to do things I find stupid, as long as they don’t cause anyone else harm.)
That is very cool of you: I promise that if I were to code a friendly AI, I would let it upload people into a cooler existence, as long as they don’t cause any harm to the ‘meatballs’ ;)