I think this is true, but, like the Lindy effect, it is a very weak form of evidence that becomes essentially ignorable the moment any stronger evidence arrives from actually examining object-level reality.
It can become very strong for shorter-term predictions. If I claim the world will end tomorrow, that claim has a very small a priori probability, and a very large update is needed to override it.
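To put rough, purely illustrative numbers on this (the prior and the target posterior below are assumptions, not anything from the discussion): in odds form, the likelihood ratio the evidence must supply scales inversely with the prior, so a tiny prior demands an enormous update.

```python
# Purely illustrative numbers: how large a likelihood ratio is needed
# to overcome a tiny prior on "the world ends tomorrow".
prior_p = 1e-9                       # assumed prior probability (illustrative)
prior_odds = prior_p / (1 - prior_p)

target_p = 0.5                       # posterior needed to take the claim seriously
target_odds = target_p / (1 - target_p)

# Odds-form Bayes: posterior_odds = likelihood_ratio * prior_odds,
# so the evidence must supply a likelihood ratio of roughly:
required_lr = target_odds / prior_odds
print(f"required likelihood ratio ~ {required_lr:.2e}")   # ~1e9
```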
Very large updates are abundant. I'm looking out of my window right now, and I'm completely certain it's not currently raining, even though, a priori, the odds of that at any given moment are far from certain.
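A minimal sketch of why the window observation swamps the prior, with made-up numbers for the base rate of rain and the likelihoods (all assumptions for illustration):

```python
# Purely illustrative numbers: the window observation swamps the rain prior.
p_rain = 0.10                        # assumed base rate of rain at a random moment
p_obs_given_rain = 0.001             # seeing dry, sunlit streets while it rains (assumed)
p_obs_given_dry = 0.99               # seeing the same when it is in fact dry (assumed)

# Bayes' rule for P(rain | observation):
posterior_rain = (p_obs_given_rain * p_rain) / (
    p_obs_given_rain * p_rain + p_obs_given_dry * (1 - p_rain)
)
print(f"P(rain | looked out the window) ~ {posterior_rain:.5f}")  # ~0.0001
```

With even a modest prior on rain, an observation that is a thousand times more likely under "dry" than under "rain" drives the posterior to near-certainty.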
Good priors are important, but their details often get washed away by the sheer weight of concrete evidence. More often than not, a base rate is just the result of updating on most of your data, rather than something derived from first principles; the final update on a little extra data is only the cherry on top that sharpens the prediction.
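A quick sketch of that point, using a Beta-Bernoulli model and hypothetical counts (all numbers below are made up for illustration): the "base rate" is itself a posterior from the bulk of the data, and the last bit of data only nudges it.

```python
# Sketch (with made-up counts): a "base rate" is usually itself a posterior
# from the bulk of the data; the final update is a small correction on top.
# Beta-Bernoulli model: start from a flat Beta(1, 1) prior over an event frequency.
alpha, beta = 1.0, 1.0

# Step 1: the "base rate" comes from updating on most of the data.
bulk_successes, bulk_failures = 300, 700          # hypothetical historical data
alpha += bulk_successes
beta += bulk_failures
base_rate = alpha / (alpha + beta)                # ~0.30

# Step 2: the cherry on top -- update on a little case-specific data.
recent_successes, recent_failures = 4, 1          # hypothetical recent observations
alpha += recent_successes
beta += recent_failures
final_estimate = alpha / (alpha + beta)           # nudged slightly above the base rate

print(f"base rate ~ {base_rate:.3f}, after the extra data ~ {final_estimate:.3f}")
```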