I don’t think it would be too surprising if that movement on my end continues.
I’m very confused about the notion of fitting expected updating within a Bayesian framework: phenomena like the fact that a Bayesian agent should expect never to change any particular belief, even though they might have high credence that they’ll change some belief; or that a Bayesian agent can recognize a median belief change ≠ 0 but not a mean belief change ≠ 0.
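The median-vs-mean point can be made concrete with a small sketch. The setup below is hypothetical (the prior of 0.9 and the test likelihoods are assumptions chosen for illustration): a test that always says "yes" when the hypothesis is true, and says "yes" half the time when it is false. By conservation of expected evidence the mean posterior equals the prior, so the mean belief change is 0, yet the median change is positive, because the belief usually drifts up a little and only rarely crashes to zero.

```python
# Conservation of expected evidence, illustrated with assumed numbers.
# Prior P(H) = 0.9. Test says "yes" with probability 1 if H is true,
# and with probability 0.5 if H is false.

prior = 0.9
p_yes = prior * 1.0 + (1 - prior) * 0.5   # P(test says "yes") = 0.95
post_yes = prior * 1.0 / p_yes            # posterior after "yes", ~0.947
post_no = 0.0                             # "no" is impossible if H is true

# Mean posterior equals the prior, so the *mean* belief change is 0.
mean_post = p_yes * post_yes + (1 - p_yes) * post_no
print(mean_post)                          # ~0.9

# But with probability 0.95 the belief moves up by ~0.047, so the
# *median* belief change is positive, not 0.
median_change = post_yes - prior
print(median_change)
```

So an agent can coherently say "I expect (in the median sense) to become more confident" while still expecting, on average, no change at all.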
On the theoretical level, I believe that it’s consistent to say “further movement is unsurprising, but I can’t predict in which direction”.
On the practical level, it’s probably also consistent to say “If you forced betting odds out of me now, I’d probably bet that I’ll increase funding to MIRI, so if you’re trusting my view you should donate there yourself, but my process for increasing a grant size has more steps and deliberation and I’m not going to immediately decide to increase funding for MIRI—wait for my next report”.
I think I understand this a bit better now, given also Rob’s comment on FB.
On the theoretical level, that’s a very interesting belief to have, because sometimes it doesn’t pay rent in anticipated experience at all. Given that you cannot predict the direction of the change, it seems rational to act as if your belief will not change, despite being very confident that it will.
Your practical example is not a change of belief. It’s rather saying “I now believe I’ll increase funding to MIRI, but my credence is still <70%, as the formal decision process usually uncovers many surprises.”