Here’s something I just posted elsewhere (in a debate concerning cryonics!) relating to this:
Sure, there are many people who know the same amount as I do and reach different numbers. But that’s hardly a unique failing of this approach: equally knowledgeable people disagree about all kinds of issues all the time, even when they don’t try to put their intuitions into numbers.
No, there isn’t any well-established, objective approach for deriving the numbers. But that’s beside the point. The difference between just throwing around verbal arguments and writing down probability estimates is like the difference between basing judgements on vague intuitions in your head and explicitly writing out the pros and cons. Putting numbers on things not only helps clarify the exact degree to which you disagree with someone else, it also forces you to be more explicit in your reasoning.
My above comment is a good example: Aleksander challenged me on two points, which led me to 1) consciously realize that I’d been basing one of my figures on two disjoint assumptions, helping me clarify my reasons for believing it, and 2) do some more research on another figure, leading me to data that made me revise my estimate down to one tenth of its previous value. If we’d just been throwing around verbal arguments and vague appeals to intuition, as we’d been doing before, I’m not sure either of those would have happened. So no, putting explicit numbers on things doesn’t guarantee we’ll ever reach the correct conclusion, but it does help work out the exact points of disagreement, and maybe make the estimates converge at least a bit.
It’s also important to realize that not putting numbers on things doesn’t mean we’re not still pulling numbers out of thin air! Your brain is still making some sort of implicit probability estimate. And while one could reasonably argue that trying to put numbers on our intuitions loses important data (as we don’t have introspective access to all our thought processes), there are also some pretty convincing lines of argument suggesting that consciously-held information has evolved as much to appear good and persuade others as to actually evaluate the truthfulness of things. So a refusal to put explicit numbers on things seems suspicious to me, as it looks like exactly the kind of trick that’d be useful for hiding inconsistencies in one’s arguments.
Note, however, that I’m most definitely not accusing anyone of intentional dishonesty, or anything along those lines. The issue is not that people are being consciously deceitful. The issue is that people’s minds are built in such a way as to trick their consciousness into making estimates based on something other than truth, and then to supply reasonable-seeming intuitions that happen to cover up the flimsiness of the reasoning. I’m just as skeptical about my own conscious thought processes as I am of those of others (or at least I try to be), which is part of the reason why I’m so eager to put down my own probability estimates for others to critique.