The B approach to Occam’s razor is just a way to think carefully about your possible preference for simplicity. If you prefer simpler explanations, you can bias your prior appropriately, and then the B machinery will handle how you should change your mind with more evidence (which might possibly favor more complex explanations, since Nature isn’t obligated to follow your preferences).
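To make that concrete, here is a minimal sketch (my own toy example, not anything from the thread): a fair coin is the "simple" hypothesis, an unknown-bias coin with a uniform prior over the bias is the "complex" one. A prior biased 70/30 toward simplicity survives balanced data, but lopsided data overwhelms it, just as described.

```python
from math import comb

# Toy Bayesian Occam's razor: fair coin (simple) vs. unknown-bias coin (complex).
# Marginal likelihood of k heads in n flips:
#   simple:  C(n, k) * 0.5**n
#   complex: integral over p of C(n, k) * p**k * (1-p)**(n-k)  =  1 / (n + 1)
def posterior_simple(k, n, prior_simple=0.7):
    like_simple = comb(n, k) * 0.5**n
    like_complex = 1.0 / (n + 1)  # uniform prior over the bias integrates to 1/(n+1)
    num = prior_simple * like_simple
    den = num + (1 - prior_simple) * like_complex
    return num / den

# Balanced data: the simplicity-biased prior holds up.
print(round(posterior_simple(10, 20), 3))  # ~0.896
# Lopsided data: evidence overrides the preference for simplicity.
print(round(posterior_simple(18, 20), 3))  # ~0.009
```

The prior encodes the preference for simplicity; the update rule decides whether Nature cooperates.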
I don’t think it’s a good idea to use B in settings other than statistical inference, or probability puzzles. Arguing with people is an exercise in xenoanthropology, not an exercise in B.
I don’t think it’s a good idea to use B in settings other than statistical inference, or probability puzzles.
I’m not sure exactly what you mean by this. Do you mean that Bayesianism is inappropriate for situations where the data points are arguments and explanations rather than quantifiable measurements or the like? Do you mean that it shouldn’t be used to prefer one person’s argument over another’s?
In any case, could you elaborate on this point? I haven’t read through much of the Sequences yet (I’m waiting for the book version to come out), but my impression was that using Bayesian-type approaches outside of purely statistical situations is a large part of what they are about.
Arguing with people is an exercise in xenoanthropology, not an exercise in B.
Not sure I understand this. Assuming you’re both trying to approach the truth, arguing with others is a chance to get additional evidence you might not have noticed before. That’s both xenoanthropology and Bayesianism.
my impression was that using Bayesian-type approaches outside of purely statistical situations is a large part of what they are about.
Yes. I disagree.
Do you mean that it shouldn’t be used to prefer one person’s argument over another’s?
Look at our good friend Scott Alexander dissecting arguments. How much actual B does he use? Usually just pointing out basic innumeracy is enough: “oh, you are off by a few orders of magnitude.” (But that’s not B, that’s just being numerate: being able to add numbers, estimate magnitudes, and so on.)
Assuming you’re both trying to approach the truth...
I think the kind of stuff folks in this community use to argue/update internally is all fine, but it usually isn’t a formal B setup, just some hacks along the lines of “X has shown herself to be thoughtful and sensible in the past and disagrees with me about Y, so I should adjust my own beliefs.”
This will not work with outsiders, since they generally play a different game than you do. I think the dominant term in arguments is understanding the social context in which the other side is operating, and learning how they use words. If B comes up at all, it’s just easy bookkeeping on top of that hard stuff.
I don’t understand what people here mean by “B.” For example, using Bayes’ theorem isn’t “B,” because everyone who accepts the chain rule of probability uses Bayes’ theorem (so, hopefully, everyone).
Seems they’re referring to Bayesian Epistemology / Bayesian Confirmation Theory, along with informal variants thereof. Bayesian Epistemology is a well-respected and popular movement in philosophy, although it is by no means universally accepted. In any case, the use of the term “Bayesian” in this sense is certainly not limited to LessWrong.