I don’t think the OP was using “belief” to mean Bayesian credence; more like “what your System 1 expects”. Also, you can definitely make it the case that your conditions improving is correlated with effort, and the OP gives several examples of how to do this. Changing the territory is a pretty good way to change your (approximate cached proxies for) beliefs in a predictable direction.