They are making what some consider a philosophical mistake and others don’t. The falsifiability of MWI isn’t a scientific fact if it depends on a contentious philosophical claim. By the way, computational zombies, functional duplicates of humans that lack consciousness, can’t be ruled out by the same arguments that exclude p-zombies.
No belief-theoretic mistake is considered a mistake by those who make it. To find out whether the premise is false, we should be thinking about what’s true, not about what people think. If your functional duplicate says it’s conscious, it says so for the same reasons you do, and you can no more deduce your own consciousness from your saying so than you can deduce the duplicate’s consciousness from its saying so. As the link explains.