Can you explain to me this notion of “resources of the universe?” I would have thought that simulated brains would have the same measure as actual brains.
As for resources, I think the argument can be made in terms of entropy or energy, but I will make it in terms of energy because it's easier to understand. Suppose for the sake of argument that a perfect simulation of a brain requires the same amount of energy to run as a real brain (in actuality it would require more, because you either have to create a duplicate of the brain, which requires the same energy, or solve the field equations for the brain explicitly, which requires greater energy). In order to provide the simulated inputs to the brain, you have to spend energy to make sure that they are internally consistent, react properly to the outputs of the brain, etc. So it's impossible for a perfect simulation to require less energy than a real brain. If we are somewhere in a tower of simulated universes, either our simulation is being run imperfectly, or each universe in the tower must be smaller than the last, and probably dramatically smaller.
Now, imagine that you have a solar system's worth of energy, and that running a simulation incurs an overhead of three orders of magnitude to calculate the simulation and consistent inputs to it. Using that solar system's energy, you can either support (warning: completely made-up numbers ahead) a trillion people accepting inputs from the universe at large, or a billion people running on efficient perfect simulations (perfect with respect to their brain activity, but obviously not perfect with respect to the universe, because that's not possible without a greater-than-universe-sized source of energy).
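The made-up numbers above can be sketched as toy arithmetic. Every constant here is an illustrative assumption from the paragraph, not a real estimate:

```python
# Toy arithmetic for the (completely made-up) numbers above: a fixed
# energy budget supports fewer simulated brains because of the assumed
# simulation overhead. All quantities are illustrative assumptions.

ENERGY_BUDGET = 1e12    # budget measured in "one real brain" units (made up)
COST_REAL_BRAIN = 1.0   # energy per real brain, in the same units
SIM_OVERHEAD = 1_000    # assumed three-orders-of-magnitude overhead

real_brains = ENERGY_BUDGET / COST_REAL_BRAIN
simulated_brains = ENERGY_BUDGET / (COST_REAL_BRAIN * SIM_OVERHEAD)

print(f"real brains supported:      {real_brains:.0e}")       # 1e+12 (a trillion)
print(f"simulated brains supported: {simulated_brains:.0e}")  # 1e+09 (a billion)
```

The point survives any particular choice of constants: as long as the overhead is greater than 1, a given budget buys strictly fewer perfectly simulated brains than real ones.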
Measure with respect to minds is related to probability, so it really relates an existing consciousness to its futures. If I step into a duplicator, the measure of each of my future selves is 1/2 with respect to my current self, because I have a 50% probability of ending up in either of those future bodies, but from the perspective of my duplicates post-duplication, their measure is once again 1. Bearing this in mind, the measure of a simulation currently running is 1, from its own perspective, and the measure of any given individual is also 1 if they currently exist.
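The duplicator bookkeeping can be sketched in a few lines. `split_measure` is a hypothetical helper for illustration; the numbers just restate the 1/2 example above:

```python
# Sketch of the duplicator example: measure behaves like a probability
# distribution over an observer's futures, and renormalizes to 1 once a
# particular copy exists. Purely illustrative bookkeeping.

def split_measure(current_measure: float, n_copies: int) -> list[float]:
    """Divide an observer's measure evenly among n future copies."""
    return [current_measure / n_copies] * n_copies

# Before stepping into the duplicator, my measure from my own view is 1.
futures = split_measure(1.0, 2)
print(futures)  # [0.5, 0.5] -- each future self has measure 1/2 w.r.t. me

# Post-duplication, each copy's measure from its own perspective is 1 again.
post_duplication = [1.0 for _ in futures]
print(post_duplication)  # [1.0, 1.0]
```

Note that the split measures always sum to the original, which is what makes the "measure as probability over futures" reading consistent.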
Thinking about it more carefully, I think the statement that they don’t have the same measure is broken (not even wrong, incoherent).
So you agree with me then, that they have the same measure?
As for resources: I really don’t think that the amount of energy and matter used to compute a mind has any bearing on the measure of that mind. What matters is whether or not the energy and matter instantiates the correct program; if it does, then the mind exists there, if it doesn’t, then it doesn’t.
True, the quantity of minds matters (probably) for measure. So a mind with a trillion copies has greater measure than a mind with a billion copies. If we think that the relevant level of detail for implementation is exactly the fundamental level for our brains, then yes this would mean we should expect ourselves, other things equal, to be brains rather than simulations. But I’d say it is highly likely that the relevant level of detail for implementation is much higher—the neuron level, say—and thus simulations quite possibly outnumber brains by a great deal.
Of course, either way, it comes down to more than just the resource requirements—it also comes down to e.g. how likely it is that a posthuman society would create large numbers of ancestor simulations.