Perhaps the issue is that I believe I should not care—that if I were more rational, I would not care.
That my values are based on a misunderstanding of reality, as the title of this post suggests.
In particular, my values seem to be pinned on ideas that are not true—that states of the universe matter, objectively rather than just subjectively, and that I exist forever/always.
This “pinning” doesn’t seem to be that critical—life goes on, and I eat a turkey sandwich when I get hungry. But it seems unfortunate that I should understand cerebrally (to the extent that I am capable) that my values are based on an illusion, yet my biology demands that I keep on as though my values were based on something real. To be very dramatic, it is like some concept of my ‘self’ is trapped in this nonsensical machine that keeps on eating and enjoying and caring like Sisyphus.
Put this way, it just sounds like a disconnect in the way our hardware and software evolved—my brain has evolved to think about how to satisfy certain goals supplied by biology, which often includes the meta-problem of prioritizing and evaluating those goals. The biology doesn’t care if the answer returned is ‘mu’ in the recursion, and furthermore doesn’t care if I’m at a step in this evolution where checking out of the simulation-I’m-in seems just as reasonable an answer as any other course of action.
Fortunately, my organism just ignores those nihilistic opinions. (Perhaps this ignoring also evolved, socially or more fundamentally in the hardware.) I say fortunately, because I have other goals besides Tarski, or finding resolutions to these value conundrums.
In particular, my values seem to be pinned on ideas that are not true—that states of the universe matter, objectively rather than just subjectively, and that I exist forever/always.
Well, if they are, and if I understand what you mean by “pinned on,” then we should expect the strength of those values to weaken as you stop investing in those ideas.
I can’t tell from your discussion whether you don’t find this to be true (in which case I would question what makes you think the values are pinned on the ideas in the first place), or whether you’re unable to test it because you haven’t been able to stop investing in those ideas.
If it’s the latter, though… what have you tried, and what failure modes have you encountered?
My values seem to be pinned on these ideas (the ones that are not true) because while I am in the process of caring about the things I care about, and especially when I am making a choice about something, I find that I am always making the assumption that these ideas are true—that the states of the universe matter and that I exist forever.
When it occurs to me to remember that these assumptions are not true, I feel a great deal of cognitive dissonance. However, the cognitive dissonance has no resolution. I think about it for a little while, go about my business, and discover some time later that I have forgotten again.
I don’t know if a specific example will help or not. I am driving home, in traffic, and my brain is happily buzzing with thoughts. I am thinking about all the people in cars around me, and how I’m part of a huge social network, and whether the traffic is as efficient as it could be, and civilization, and how I am going to go home and what I am going to do. And then I remember about death, the snuffing out of my awareness, and something about that just doesn’t connect. It’s like I can empathize with my own non-existence (hopefully this example is something more than just a moment of psychological disorder), and I feel that my current existence is a mirage. Or rather, the moral weight that I’ve given it doesn’t make sense. That’s what the cognitive dissonance feels like.
I want to add that I don’t believe I am that unusual. I think this need for an objective morality (objective value system) is why some people are naturally theists.
I also think that people who think wire-heading is a failure mode must be in the same boat that I’m in.
It is a fact that I care, we agree.