If only all our knowledge of our trading partners and environment were as reliable as ‘fundamentally included in the very nature of the problem specification’.
If you’re going to make that kind of argument, you’re dismissing pretty much all LW-style thought experiments.
I think you’re reading in an argument that isn’t there. I was explaining the most common reason why human intuitions fail so blatantly when encountering transparent Newcomb. If anything, that is more reason to formalise it as a thought experiment.