The timing problem is a problem for how well we can predict the actions of myopic agents: any agent with a myopic utility function has no instrumentally convergent reason for goal preservation.
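(To make this claim concrete, here is a minimal toy sketch — not from the paper under discussion; the scoring functions and trajectories are invented for illustration. It shows why an agent that scores only the immediate step is indifferent to a goal change that takes effect later, while a long-horizon agent is not.)

```python
# Toy illustration (hypothetical, not from the paper): a myopic agent
# evaluates only the immediate step, so a utility-function change that
# takes effect *after* this step cannot lower the value it is maximizing.
# A long-horizon agent, by contrast, loses value and so has an
# instrumental reason to resist the change.

def myopic_value(rewards):
    """A myopic agent scores a trajectory by its first-step reward only."""
    return rewards[0]

def long_horizon_value(rewards):
    """A non-myopic agent sums reward over the whole trajectory."""
    return sum(rewards)

# Two trajectories with identical first steps. In `keep_goal` the agent's
# utility function is preserved, so later steps keep earning reward by its
# current lights; in `goal_changed` its goal is overwritten after step 1,
# so later steps earn nothing by its current lights.
keep_goal = [1.0, 1.0, 1.0]
goal_changed = [1.0, 0.0, 0.0]

# The myopic agent is indifferent between the two futures ...
assert myopic_value(keep_goal) == myopic_value(goal_changed)
# ... while the long-horizon agent strictly prefers goal preservation.
assert long_horizon_value(keep_goal) > long_horizon_value(goal_changed)

print("myopic agent: no preference over goal preservation")
print("long-horizon agent: prefers goal preservation")
```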
Have you read the paper?
I did read two-thirds of the paper, and I tried my best to understand it, but apparently I failed.
The reason I suspect you haven’t is that whether an agent is “myopic” or not is irrelevant to the argument. Where we may disagree is over the nature of goal-having, as Seth Herd pointed out. If you want to find a challenge to the argument, that’s the place to look.
It is possible that we also disagree on the nature of goal-having. I reserve the right to find my own places to challenge your argument.
Ha, yes, fair enough.