It doesn’t make you right. It just makes them as wrong (or lazy) as you.
If you feel afraid that incorporating a belief would change your values, that’s fine. It’s understandable that you won’t then dispassionately weigh the evidence for it; perhaps you’ll bring a motivated skepticism to bear on the scary belief. If it’s important enough that you care, then the effort is justified.
However, fighting to protect a cherished belief leads to a biased evaluation of the evidence, so refusing to engage with the scary arguments at all is just a more extreme, and more honest, version of trying to refute them.
I’d justify both practices situationally: given the chance that you weigh the evidence dispassionately and still get the answer quite wrong (even your confidence estimate may be off), you can err on the side of caution in protecting your most cherished values. That is, your objective function isn’t just to have the best Bayesian-rational track record.