I don’t understand how the examples given illustrate free-floating beliefs: they seem to have at least some predictive power, and thus shape anticipation (some comments by others below illustrate this better).
The phlogiston theory had predictive power (e.g. what kind of “air” could be expected to support combustion, and that substances would grow lighter when they burned), and it was falsifiable (and was eventually falsified). It had advantages over the theories it replaced, and it was in turn replaced by another theory that represented a better understanding. (I base this reading on Jim Loy’s page on Phlogiston Theory.)
Literary genres don’t have much predictive power if you don’t know anything about them; if you do, then they do. Classifying a writer as producing “science fiction” or “fantasy” creates anticipations that are statistically meaningful. For another comparison, saying some band plays “Death Metal” will shape our anticipation, somewhat differently for those who can distinguish Death Metal from Speed Metal than for those who merely know that “Metal” means “noise”.
I can imagine beliefs leading to false anticipations, and they’re obviously inferior to beliefs leading to more correct ones. That doesn’t mean they’re free-floating.
One example of a free-floating belief is actually about the tree falling in the forest: believing that it makes a sound does not anticipate any sensory experience, since the tree falls explicitly where nobody is around to hear it, and whether there is sound or no sound will not change how the forest looks when we enter it later. However, letting go of the belief that the tree makes a sound does not seem very useful to me either. What am I missing?
I understand that many beliefs are held not because they have predictive power, but because they generalize experiences (or thoughts) we have had into a condensed form: a sort of “packing algorithm” for the mind when we detect something common. When we understand this commonality well enough, we reach the point where we can make predictions; until then we cannot, but we may later. There is no belief or thought we can hold that we couldn’t trace back to experiences; beliefs are not anticipatory, but formed in hindsight. They organize past experience. Can you predict which of these beliefs is not going to be helpful in organizing future experiences? How?
It is bad to apply statistics when you don’t in fact have large numbers: we have just one universe (at least until the many-worlds theory is better established, and anyway, the exposition didn’t mention it).
I think the following problem is equivalent to the one posed: It is late at night, it’s dark, you’re tired, and you’re driving down an unfamiliar road. Then you see two motels, one on the right side of the road, one on the left, both advertising vacant rooms. You know from a visit years ago that one has 10 rooms and the other has 100, but you can’t tell which is which (though you do remember that the larger one is cheaper). Anyway, you’re tired, so you just pick the one on the right at random, check in, and go to sleep. When you wake up in the morning, what is the chance that you find yourself in the larger motel? Does the number of rooms come into it? (Assume both motels are 90% full.)
The paradox is that while the other motel is not counterfactual, it might as well be; the problem will play out the same. Same with the universe: there aren’t actually two universes with probabilities governing which one you’ll end up in.
For a version where the Bayesian update works, you would not go to a motel directly, but to a tourist information stall that directs visitors to either the smaller or the larger motel until both are full. In that case, expect to wake up in the larger one: at 90% occupancy, 90 of the 99 occupied rooms are in the larger motel, so the chance is 90⁄99, about 91%. In this case we have not one world but two, and then the reasoning holds.
But if there’s only one motel, because the other burnt down (and we don’t know which), we’re back to 50⁄50.
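The contrast between the two scenarios can be checked with a small Monte Carlo sketch (the 10- and 100-room motels and the 90% occupancy are the hypothetical numbers from the thought experiment above, not data from anywhere):

```python
import random

TRIALS = 100_000

# Scenario 1: the tired driver picks one of the two motels at random.
# The room counts never enter the calculation, so the answer is 50/50.
in_larger = sum(1 for _ in range(TRIALS) if random.random() < 0.5)
p_random = in_larger / TRIALS  # close to 0.5

# Scenario 2: a tourist information stall fills vacant rooms in both
# motels; at 90% occupancy that is 9 guests in the small motel and 90
# in the large one. Asking where a randomly chosen guest wakes up now
# weights by room count: 90 of 99 guests are in the larger motel.
occupied_rooms = ["small"] * 9 + ["large"] * 90
in_larger2 = sum(1 for _ in range(TRIALS)
                 if random.choice(occupied_rooms) == "large")
p_directed = in_larger2 / TRIALS  # close to 90/99, about 0.91

print(p_random, p_directed)
```

Only in the second version does “being an observer” carry information about which motel you are in, which is why the Bayesian update works there and not in the first.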
I know that “fuzzy logic” tries to mix statistics and logic, and many AIs use it to deal with uncertain assertions, but statistics can be misapplied so easily that there seems to be a problem here.