Being able to convince yourself of arbitrary belief is an anti-truth skill, and Eliezer suggests you should dis-cultivate it by telling yourself you can’t do it.
That’s an interesting example in this context. You seem to say you want to believe that “you can’t do it” because it’s useful to hold that belief and not necessarily because it’s true.
Practically, I don’t think convincing yourself of a belief because it’s useful is the same thing as convincing yourself of an arbitrary belief. The people I know whom I consider particularly skilled at adopting beliefs for their usefulness didn’t develop that skill by practicing on arbitrary beliefs.
To use an NLP term (given that’s the community where I know most people with the relevant skill set), behavior change is much easier when the belief change is ecological than when it’s random.