Is this a general argument against precommitments?
If I say “if you do X, I will do a horrible thing Y”, I am not saying it because doing Y is smart per se. I am saying it to discourage you from doing X. The problem is that I have to follow through on my threat if you actually do X anyway. After all, if you can predict that I would change my mind and not do Y, then you have no reason to stop doing X.
Whether making the threat in the first place is rational depends on many things: the probability that the threat actually deters you from doing X, the cost to me of X happening, and the cost to me of carrying out Y when deterrence fails. In general, those are very difficult to evaluate.
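To make “depends on many things” concrete, here is a toy expected-cost comparison. This is a minimal sketch: the payoff numbers, the deterrence probability, and the assumption that you always do X when I make no threat are all made up for illustration, not taken from anything above.

```python
def expected_cost_with_threat(p_deterred, cost_of_x, cost_of_doing_y):
    """My expected cost if I commit to the threat.

    p_deterred: probability the threat stops you from doing X (assumed).
    cost_of_x: what I lose if you do X anyway.
    cost_of_doing_y: what I lose by actually carrying out Y -- which I
        must do whenever you do X, or the threat was never credible.
    """
    # With probability p_deterred you back off and I pay nothing;
    # otherwise you do X and I must follow through on Y.
    return (1 - p_deterred) * (cost_of_x + cost_of_doing_y)


def expected_cost_without_threat(cost_of_x):
    # Simplifying assumption: absent any threat, you simply do X.
    return cost_of_x


p, cx, cy = 0.9, 10.0, 50.0  # hypothetical numbers
print(expected_cost_with_threat(p, cx, cy))   # 6.0  -> threat pays off
print(expected_cost_without_threat(cx))       # 10.0
```

Note how sensitive the answer is to the deterrence probability: with the same made-up payoffs but p = 0.5, the committed threat costs an expected 30 against 10 for doing nothing, so committing would be irrational. That probability is precisely the quantity that is hard to evaluate.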
But anger evolved for a reason. It works as a natural commitment device: it makes retaliation credible precisely because the angry person visibly is not running a cost-benefit calculation in the moment. If it were a purely stupid thing, evolution would not have selected for it.