Nesov, your writings are so hard to understand sometimes. Let me take this as an example and give you some detailed feedback. I hope it’s useful to you to determine in the future where you might have to explain in more detail or use more precise language.
It’s not you who should use UDT, it’s the world.
Do you mean “it’s not only you”, or “it’s the world except you”? If it’s the latter, it doesn’t seem to make any sense. If it’s the former, it doesn’t seem to answer Eliezer’s objection.
This is a salient point of departure between FAI and humanity.
Do you mean FAI should use UDT, and humanity shouldn’t?
FAI is not in the business of saying in words what you should expect.
Ok, this seems clear. (Although why not, if that would make me feel better?)
People are stuff of the world, not rules of the world or strategies to play by those rules.
By “stuff”, do you mean “part of the state of the world”? And people do in some sense embody strategies (what they would do in different situations), so what do you mean by “people are not strategies”?
Rules and strategies don’t depend on particular moves; they specify how to handle them. Plays, by contrast, consist of moves, of evidence. This very distinction between plays and strategies is the true origin of updatelessness. It is the failure to make this distinction that causes the confusion UDT resolves.
This part makes sense, but I don’t see the connection to what Eliezer wrote.
Do you mean “it’s not only you”, or “it’s the world except you”? If it’s the latter, it doesn’t seem to make any sense. If it’s the former, it doesn’t seem to answer Eliezer’s objection.
I mean the world as substrate, with “you” being implemented on the substrate of FAI. The FAI runs UDT, and you consist of the FAI’s decisions (even if only in the sense of being “influenced by” them; there seems to be no formal difference). The decisions are the output of the strategy that UDT optimizes for, two levels removed from running UDT themselves.
Do you mean FAI should use UDT, and humanity shouldn’t?
Yes, in the sense that humanity runs on an FAI-substrate that uses UDT (or something else operating at the level of strategy optimization) anyway, but humanity itself is not about optimization.
By “stuff”, do you mean “part of the state of the world”? And people do in some sense embody strategies (what they would do in different situations), so what do you mean by “people are not strategies”?
I suspect that people should be found in plays (what actually happens given the state of the world), not strategies (plans for every eventuality).