I strive for altruism, but I’m not sure I can believe that subjective selfishness—caring about your own future experiences—is an incoherent utility function; that we are forced to be Buddhists who dare not cheat a neighbor, not because we are kind, but because we anticipate experiencing their consequences just as much as we anticipate experiencing our own. I don’t think that, if I were really selfish, I could jump off a cliff knowing smugly that a different person would experience the consequence of hitting the ground.
It strikes me that this is a problem of comparing things at different levels of meta. You are talking about your “motivations” as if they were things you could freely choose based on your a priori determinations about how one well and truly ought to act. You sound (forgive me for saying this; I do respect you deeply) almost Kantian here.
The underlying basis for your ethical system, or even further, your motivational system, does not lie in this abstract observer that is Eliezer today and Britney Spears tomorrow. I think I’m ripping this off of some video game I’ve played, but think of the brain as a container, and this “observer-stuff” (dare I call it a soul?) as water poured into the container. Different containers have different shapes and shape the water differently, even though it is the same water. But the key point is that motivations are a property of the container, not of the water, and their referents are containers, not water. Eliezer-shaped water cares about what happens to the Eliezer-container, not about what happens to the water inside it. That’s just not how the container is built.