If consciousness has any significant effect on our decisions, then the AI will have to simulate it, and therefore something will perceive itself to be in the situation depicted in the original post. It was a rough guess that, for an AI to credibly threaten you with simulated torture, in many cases it would also use this capability to obtain the most detailed data about your expected decision procedure.
If consciousness has any significant effect on our decisions, then the AI will have to simulate it, and therefore something will perceive itself to be in the situation depicted in the original post.
Only if there is no non-conscious algorithm that has the same effect on our decisions. And such an algorithm seems likely to exist; it’s certainly possible to make a p-zombie if you can redesign the original brain however you want.