
Calibration

Someone is well-calibrated if the things they predict with X% confidence in fact occur X% of the time. Importantly, calibration is not the same as accuracy. Calibration is about accurately assessing the quality of your predictions, not about making good predictions. Person A, whose predictions are only marginally better than chance (60% of them come true when choosing between two options) and who is precisely 60% confident in their choices, is perfectly calibrated. By contrast, Person B, who is 99% confident in their predictions and right 90% of the time, is more accurate than Person A but less well-calibrated.
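As a concrete illustration (the track record below is invented for the example), binary-prediction calibration can be checked by bucketing predictions by stated confidence and comparing each bucket's stated confidence with its empirical hit rate:

```python
# Hypothetical track record: (stated confidence, did it come true?).
predictions = [
    (0.6, True), (0.6, False), (0.6, True), (0.6, True), (0.6, False),
    (0.9, True), (0.9, True), (0.9, True), (0.9, False), (0.9, True),
]

def calibration_by_bucket(preds):
    """Map each stated confidence level to the empirical frequency
    with which predictions at that level came true."""
    buckets = {}
    for confidence, outcome in preds:
        buckets.setdefault(confidence, []).append(outcome)
    return {conf: sum(hits) / len(hits) for conf, hits in buckets.items()}

print(calibration_by_bucket(predictions))
# {0.6: 0.6, 0.9: 0.8} -- the 60% bucket is perfectly calibrated,
# while the 90% bucket is overconfident (right only 80% of the time).
```

A well-calibrated forecaster's buckets would each match their labels, regardless of how close those labels are to 50%.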

See also: Betting, Epistemic Modesty, Forecasting & Prediction

Being well-calibrated has value for rationalists separately from accuracy. Among other things, being well-calibrated lets you make good bets and good decisions, communicate information helpfully to others who know you to be well-calibrated (see Group Rationality), and prioritize which information is worth acquiring.

Note that any expression of quantified confidence in a belief can be well- or poorly calibrated. For example, calibration applies to whether a person's 95% confidence intervals capture the true outcome 95% of the time.

Calibrate your self-assessments

Scott Alexander
9 Oct 2011 23:26 UTC
70 points
121 comments · 6 min read · LW link

The Sin of Underconfidence

Eliezer Yudkowsky
20 Apr 2009 6:30 UTC
65 points
185 comments · 6 min read · LW link

The Bayesian Tyrant

abramdemski
20 Aug 2020 0:08 UTC
103 points
14 comments · 6 min read · LW link

Calibration Test with database of 150,000+ questions

Nanashi
14 Mar 2015 11:22 UTC
37 points
31 comments · 1 min read · LW link

Suspiciously balanced evidence

gjm
12 Feb 2020 17:04 UTC
49 points
24 comments · 4 min read · LW link

Hammertime Day 9: Time Calibration

alkjash
7 Feb 2018 1:40 UTC
29 points
2 comments · 2 min read · LW link
(radimentary.wordpress.com)

Qualitatively Confused

Eliezer Yudkowsky
14 Mar 2008 17:01 UTC
38 points
82 comments · 4 min read · LW link

Credence Calibration Icebreaker Game

Ruby
16 May 2014 7:29 UTC
15 points
1 comment · 2 min read · LW link

Aumann Agreement Game

abramdemski
9 Oct 2015 17:14 UTC
18 points
10 comments · 1 min read · LW link

Simultaneous Overconfidence and Underconfidence

abramdemski
3 Jun 2015 21:04 UTC
28 points
6 comments · 5 min read · LW link

A Motorcycle (and Calibration?) Accident

boggler
18 Mar 2018 22:21 UTC
71 points
11 comments · 2 min read · LW link

Bayes-Up: An App for Sharing Bayesian-MCQ

Louis Faucon
6 Feb 2020 19:01 UTC
45 points
9 comments · 1 min read · LW link

We Change Our Minds Less Often Than We Think

Eliezer Yudkowsky
3 Oct 2007 18:14 UTC
56 points
116 comments · 1 min read · LW link

Placing Yourself as an Instance of a Class

abramdemski
3 Oct 2017 19:10 UTC
51 points
5 comments · 3 min read · LW link

Lawful Uncertainty

Eliezer Yudkowsky
10 Nov 2008 21:06 UTC
52 points
53 comments · 4 min read · LW link

Kurzweil’s predictions: good accuracy, poor self-calibration

Stuart_Armstrong
11 Jul 2012 9:55 UTC
36 points
39 comments · 9 min read · LW link

Horrible LHC Inconsistency

Eliezer Yudkowsky
22 Sep 2008 3:12 UTC
19 points
34 comments · 1 min read · LW link

Raising the forecasting waterline (part 1)

Morendil
9 Oct 2012 15:49 UTC
33 points
106 comments · 6 min read · LW link

Prediction Contest 2018

jbeshir
30 Apr 2018 18:26 UTC
25 points
4 comments · 3 min read · LW link

Prediction Contest 2018: Scores and Retrospective

jbeshir
27 Jan 2019 17:20 UTC
29 points
5 comments · 1 min read · LW link

Say It Loud

Eliezer Yudkowsky
19 Sep 2008 17:34 UTC
33 points
20 comments · 2 min read · LW link

Advancing Certainty

komponisto
18 Jan 2010 9:51 UTC
34 points
110 comments · 4 min read · LW link

Calibration for continuous quantities

Cyan
21 Nov 2009 4:53 UTC
26 points
13 comments · 3 min read · LW link

Overconfident Pessimism

lukeprog
24 Nov 2012 0:47 UTC
25 points
38 comments · 4 min read · LW link

RFC on an open problem: how to determine probabilities in the face of social distortion

ialdabaoth
7 Oct 2017 22:04 UTC
25 points
3 comments · 2 min read · LW link

Illusion of Transparency: Why No One Understands You

Eliezer Yudkowsky
20 Oct 2007 23:49 UTC
99 points
51 comments · 3 min read · LW link