‘An objective defense of Bayesianism’

Recently, Hans Leitgeb and Richard Pettigrew have published a novel two-part defense of Bayesianism:

An Objective Justification of Bayesianism I: Measuring Inaccuracy

One of the fundamental problems of epistemology is to say when the evidence in an agent’s possession justifies the beliefs she holds. In this paper and its sequel, we defend the Bayesian solution to this problem by appealing to the following fundamental norm:

Accuracy: An epistemic agent ought to minimize the inaccuracy of her partial beliefs.

In this paper, we make this norm mathematically precise in various ways. We describe three epistemic dilemmas that an agent might face if she attempts to follow Accuracy, and we show that the only inaccuracy measures that do not give rise to such dilemmas are the quadratic inaccuracy measures. In the sequel, we derive the main tenets of Bayesianism from the relevant mathematical versions of Accuracy to which this characterization of the legitimate inaccuracy measures gives rise, but we also show that unless the requirement of Rigidity is imposed from the start, Jeffrey conditionalization has to be replaced by a different method of update in order for Accuracy to be satisfied.
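For concreteness: the best-known quadratic inaccuracy measure is the Brier score, on which the inaccuracy of a credence function at a world is the sum of the squared gaps between the credences and the actual truth values (1 for true, 0 for false) at that world. Here is a minimal sketch; the function name and toy numbers are my own illustration, not taken from the paper:

```python
def brier_inaccuracy(credences, truth_values):
    """Quadratic (Brier) inaccuracy: sum of squared gaps between
    credences and truth values (1 for true, 0 for false) at a world."""
    return sum((b - v) ** 2 for b, v in zip(credences, truth_values))

# Toy example: credences in three propositions, evaluated at the world
# in which only the first proposition is true.
print(brier_inaccuracy([0.8, 0.3, 0.5], [1, 0, 0]))  # 0.38
```

The dilemma arguments of Part I are meant to show that, among all candidate ways of measuring inaccuracy, only measures of this quadratic form escape the three dilemmas.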

An Objective Justification of Bayesianism II: The Consequences of Minimizing Inaccuracy

In this article and its prequel, we derive Bayesianism from the following norm: Accuracy—an agent ought to minimize the inaccuracy of her partial beliefs. In the prequel, we make the norm mathematically precise; in this article, we derive its consequences. We show that the two core tenets of Bayesianism follow from Accuracy, while the characteristic claim of Objective Bayesianism follows from Accuracy together with an extra assumption. Finally, we show that Jeffrey Conditionalization violates Accuracy unless Rigidity is assumed, and we describe the alternative updating rule that Accuracy mandates in the absence of Rigidity.
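For readers who want the formula behind that last sentence: Jeffrey conditionalization says that when experience shifts an agent's credences over a partition {E_i} to new values q_i, her new credence in any proposition A should be P_new(A) = Σ_i q_i · P_old(A | E_i), and Rigidity is the further assumption that the conditional credences P(A | E_i) are left untouched by the experience. A minimal sketch on a finite space, with a representation and toy numbers of my own rather than the authors' (their alternative updating rule for the non-Rigid case is not reproduced here):

```python
def conditional(prior, event_a, event_e):
    """P(A | E) on a finite space given as a dict {world: probability}."""
    p_e = sum(p for w, p in prior.items() if w in event_e)
    p_ae = sum(p for w, p in prior.items() if w in event_a and w in event_e)
    return p_ae / p_e

def jeffrey_update(prior, partition, new_probs, event_a):
    """Jeffrey conditionalization: P_new(A) = sum_i q_i * P_old(A | E_i),
    where q_i is the newly learned probability of partition cell E_i."""
    return sum(q * conditional(prior, event_a, e)
               for e, q in zip(partition, new_probs))

# Toy example: four equally likely worlds; experience raises the
# probability of E = {w1, w2} from 0.5 to 0.8.
prior = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}
E, not_E = {"w1", "w2"}, {"w3", "w4"}
A = {"w1"}
print(jeffrey_update(prior, [E, not_E], [0.8, 0.2], A))  # 0.4 (up from 0.25)
```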

Richard Pettigrew has also written an excellent introduction to probability.