I’m not sure, but it seems to relate to what Eliezer highlighted in “How an Algorithm Feels From Inside”: the way brains track concepts as separate from the components that define them. If you can imagine consciousness as something that persists as a first-order object, something separate from the brain—because it is hard to recognize “thinking” when looking at your brain—if you can see “I” as a concept distinct from the brain that you are, then it makes sense to imagine “I wake up as Eliezer”; you just take the “I” object and reassign it to Eliezer’s brain. That’s why the Sequences are so big on dissolving the question and asking what experiences the concept actually makes you anticipate.
Afaics, the problem is hard not because of some intrinsic difficulty but because it requires us to recognize “ourselves” in our brains, and consciousness is so central to our experience that it’s hard to go up against the intuitions we have about it.