Thinking about it more carefully, I think the statement that they don’t have the same measure is broken (not even wrong, incoherent).
So you agree with me then, that they have the same measure?
As for resources: I really don't think the amount of energy and matter used to compute a mind has any bearing on that mind's measure. What matters is whether the energy and matter instantiate the correct program: if they do, the mind exists there; if they don't, it doesn't.
True, the quantity of minds probably matters for measure, so a mind with a trillion copies has greater measure than a mind with a billion copies. If the relevant level of detail for implementation were exactly the fundamental physical level of our brains, then yes, other things equal, we should expect ourselves to be brains rather than simulations. But I'd say it is highly likely that the relevant level of detail is much higher (the neuron level, say), in which case simulations quite possibly outnumber brains by a great deal.
Of course, either way, it comes down to more than just the resource requirements; it also depends on, e.g., how likely it is that a posthuman society would create large numbers of ancestor simulations.
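The copy-counting intuition above can be put as a toy calculation. This is just a sketch under the assumption (made explicit in the discussion) that measure is proportional to the number of copies, so your credence that you are a simulation is the simulated fraction of all copies; the function name and numbers are illustrative, not from the original exchange.

```python
# Toy model: if measure is proportional to copy count, then the credence
# that you are a simulation is the simulated share of all implementations.
def p_simulation(n_brains: int, n_sims: int) -> float:
    return n_sims / (n_brains + n_sims)

# A trillion simulated copies vs a billion brains:
print(p_simulation(10**9, 10**12))  # roughly 0.999
```

On this toy model, whether simulations dominate turns entirely on the copy counts, which is why the level of detail required for implementation (fundamental physics vs neuron level) does all the work in the argument.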