Last edit: 11 Aug 2020 15:29 UTC by brook

Someone is well-calibrated if the things they predict with X% chance of happening in fact occur X% of the time. Importantly, calibration is not the same as accuracy. Calibration is about accurately assessing how good your predictions are, not making good predictions. Person A, whose predictions are marginally better than chance (60% of them come true when choosing from two options) and who is precisely 60% confident in their choices, is perfectly calibrated. In contrast, Person B, who is 99% confident in their predictions, and right 90% of the time, is more accurate than Person A, but less well-calibrated.

See also: Betting, Epistemic Modesty, Forecasting & Prediction

Being well-calibrated has value for rationalists separately from accuracy. Among other things, being well-calibrated lets you make good bets and good decisions, lets you communicate information helpfully to others if they know you to be well-calibrated (see Group Rationality), and helps you prioritize which information is worth acquiring.

Note that all expressions of quantified confidence in beliefs can be well- or poorly calibrated. For example, calibration applies to whether a person’s 95% confidence intervals capture the true outcome 95% of the time.
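The bucketed notion of calibration described above can be sketched in a few lines of Python. The `calibration_table` helper and the example data are illustrative (they are not from the original text); the Person A / Person B figures mirror the example in the opening paragraph:

```python
from collections import defaultdict

def calibration_table(predictions):
    """Group (stated_confidence, came_true) pairs by confidence level
    and report how often predictions at each level actually came true.

    A well-calibrated predictor's observed frequency should roughly
    match the stated confidence at each level."""
    buckets = defaultdict(lambda: [0, 0])  # confidence -> [hits, total]
    for confidence, came_true in predictions:
        buckets[confidence][0] += int(came_true)
        buckets[confidence][1] += 1
    return {c: hits / total for c, (hits, total) in sorted(buckets.items())}

# Person A from the text: 60% confident, right 60% of the time
person_a = [(0.6, True)] * 6 + [(0.6, False)] * 4
print(calibration_table(person_a))  # {0.6: 0.6} -- perfectly calibrated

# Person B: 99% confident but right only 90% of the time
person_b = [(0.99, True)] * 9 + [(0.99, False)] * 1
print(calibration_table(person_b))  # {0.99: 0.9} -- overconfident
```

With real forecasts you would usually round confidences into coarser bins (e.g. 50–60%, 60–70%, …) so each bucket has enough predictions for the observed frequency to be meaningful.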

Information Charts

Rafael Harth · 13 Nov 2020 16:12 UTC
28 points
6 comments · 13 min read · LW link

Calibrate your self-assessments

Scott Alexander · 9 Oct 2011 23:26 UTC
93 points
121 comments · 6 min read · LW link

The Sin of Underconfidence

Eliezer Yudkowsky · 20 Apr 2009 6:30 UTC
74 points
185 comments · 6 min read · LW link

Calibration Test with database of 150,000+ questions

Nanashi · 14 Mar 2015 11:22 UTC
54 points
31 comments · 1 min read · LW link

Suspiciously balanced evidence

gjm · 12 Feb 2020 17:04 UTC
48 points
24 comments · 4 min read · LW link

Hammertime Day 9: Time Calibration

alkjash · 7 Feb 2018 1:40 UTC
13 points
7 comments · 2 min read · LW link

Qualitatively Confused

Eliezer Yudkowsky · 14 Mar 2008 17:01 UTC
49 points
82 comments · 4 min read · LW link

Credence Calibration Icebreaker Game

Ruby · 16 May 2014 7:29 UTC
25 points
1 comment · 2 min read · LW link

Aumann Agreement Game

abramdemski · 9 Oct 2015 17:14 UTC
30 points
10 comments · 1 min read · LW link

Simultaneous Overconfidence and Underconfidence

abramdemski · 3 Jun 2015 21:04 UTC
36 points
6 comments · 5 min read · LW link

The Bayesian Tyrant

abramdemski · 20 Aug 2020 0:08 UTC
116 points
14 comments · 6 min read · LW link

A Motorcycle (and Calibration?) Accident

boggler · 18 Mar 2018 22:21 UTC
23 points
11 comments · 2 min read · LW link

Bayes-Up: An App for Sharing Bayesian-MCQ

Louis Faucon · 6 Feb 2020 19:01 UTC
51 points
9 comments · 1 min read · LW link

We Change Our Minds Less Often Than We Think

Eliezer Yudkowsky · 3 Oct 2007 18:14 UTC
63 points
117 comments · 1 min read · LW link

Placing Yourself as an Instance of a Class

abramdemski · 3 Oct 2017 19:10 UTC
33 points
5 comments · 3 min read · LW link

Lawful Uncertainty

Eliezer Yudkowsky · 10 Nov 2008 21:06 UTC
62 points
54 comments · 4 min read · LW link

Kurzweil’s predictions: good accuracy, poor self-calibration

Stuart_Armstrong · 11 Jul 2012 9:55 UTC
50 points
39 comments · 9 min read · LW link

Horrible LHC Inconsistency

Eliezer Yudkowsky · 22 Sep 2008 3:12 UTC
26 points
34 comments · 1 min read · LW link

Raising the forecasting waterline (part 1)

Morendil · 9 Oct 2012 15:49 UTC
51 points
106 comments · 6 min read · LW link

Prediction Contest 2018

jbeshir · 30 Apr 2018 18:26 UTC
9 points
4 comments · 3 min read · LW link

Prediction Contest 2018: Scores and Retrospective

jbeshir · 27 Jan 2019 17:20 UTC
28 points
5 comments · 1 min read · LW link

Say It Loud

Eliezer Yudkowsky · 19 Sep 2008 17:34 UTC
46 points
20 comments · 2 min read · LW link

Advancing Certainty

komponisto · 18 Jan 2010 9:51 UTC
42 points
110 comments · 4 min read · LW link

Calibration for continuous quantities

Cyan · 21 Nov 2009 4:53 UTC
30 points
13 comments · 3 min read · LW link

Overconfident Pessimism

lukeprog · 24 Nov 2012 0:47 UTC
35 points
38 comments · 4 min read · LW link

RFC on an open problem: how to determine probabilities in the face of social distortion

ialdabaoth · 7 Oct 2017 22:04 UTC
6 points
3 comments · 2 min read · LW link

Illusion of Transparency: Why No One Understands You

Eliezer Yudkowsky · 20 Oct 2007 23:49 UTC
115 points
51 comments · 3 min read · LW link

How to reach 80% of your goals. Exactly 80%.

Stuckwork · 10 Oct 2020 17:33 UTC
31 points
11 comments · 1 min read · LW link

Test Your Calibration!

alyssavance · 11 Nov 2009 22:03 UTC
25 points
30 comments · 2 min read · LW link

[Question] What are good ML/AI related prediction / calibration questions for 2019?

james_t · 4 Jan 2019 2:40 UTC
19 points
4 comments · 2 min read · LW link

[Question] Is there a.. more exact.. way of scoring a predictor’s calibration?

MakoYass · 16 Jan 2019 8:19 UTC
20 points
6 comments · 1 min read · LW link