Anticipated Experiences

Last edit: 31 Dec 2020 2:49 UTC by jimrandomh

One principle of rationality is that “beliefs should pay rent in anticipated experiences.” If you believe in something, what do you expect to be different as a result? What does the belief say should happen, and what does it say should not happen? If you have a verbal disagreement with someone, how does your disagreement cash out in differing expectations?

If two people try to get specific about the anticipated experiences driving their disagreement, one method for doing so is the double crux technique. The notion that beliefs are models of what we expect to experience is also one of the basic premises of predictive processing theories of how the brain works. Beliefs that do not pay rent may be related to meaningless arguments driven by coalitional instincts.

If a tree falls in a forest and no one hears it, does it make a sound? One says, “Yes it does, for it makes vibrations in the air.” Another says, “No it does not, for there is no auditory processing in any brain.” [...]
Suppose that, after a tree falls, the two arguers walk into the forest together. Will one expect to see the tree fallen to the right, and the other expect to see the tree fallen to the left? Suppose that before the tree falls, the two leave a sound recorder next to the tree. Would one, playing back the recorder, expect to hear something different from the other? Suppose they attach an electroencephalograph to any brain in the world; would one expect to see a different trace than the other?
Though the two argue, one saying “No,” and the other saying “Yes,” they do not anticipate any different experiences. The two think they have different models of the world, but they have no difference with respect to what they expect will happen to them; their maps of the world do not diverge in any sensory detail.
— Eliezer Yudkowsky, Making Beliefs Pay Rent (in Anticipated Experiences)

Making Beliefs Pay Rent (in Anticipated Experiences)
Eliezer Yudkowsky · 28 Jul 2007 22:59 UTC · 249 points · 258 comments · 4 min read

Applause Lights
Eliezer Yudkowsky · 11 Sep 2007 18:31 UTC · 196 points · 93 comments · 2 min read

Guessing the Teacher's Password
Eliezer Yudkowsky · 22 Aug 2007 3:40 UTC · 186 points · 95 comments · 4 min read

Fake Explanations
Eliezer Yudkowsky · 20 Aug 2007 21:13 UTC · 118 points · 89 comments · 2 min read

A Sketch of Good Communication
Ben Pace · 31 Mar 2018 22:48 UTC · 116 points · 34 comments · 3 min read

Truly Part Of You
Eliezer Yudkowsky · 21 Nov 2007 2:18 UTC · 127 points · 59 comments · 4 min read

Belief in Belief
Eliezer Yudkowsky · 29 Jul 2007 17:49 UTC · 129 points · 176 comments · 5 min read

Double Crux — A Strategy for Mutual Understanding
Duncan_Sabien · 2 Jan 2017 4:37 UTC · 144 points · 104 comments · 12 min read

Ethnic Tension And Meaningless Arguments
Scott Alexander · 5 Nov 2014 3:38 UTC · 51 points · 7 comments · 22 min read

Anxiety and Rationality
[deleted] · 19 Jan 2016 18:30 UTC · 48 points · 31 comments · 4 min read

How to Measure Anything
lukeprog · 7 Aug 2013 4:05 UTC · 96 points · 52 comments · 22 min read

Urges vs. Goals: The analogy to anticipation and belief
AnnaSalamon · 24 Jan 2012 23:57 UTC · 122 points · 71 comments · 7 min read

Physical and Mental Behavior
Scott Alexander · 10 Jul 2011 20:20 UTC · 79 points · 22 comments · 3 min read

Instrumental vs. Epistemic — A Bardic Perspective
MBlume · 25 Apr 2009 7:41 UTC · 84 points · 189 comments · 3 min read

Belief in Self-Deception
Eliezer Yudkowsky · 5 Mar 2009 15:20 UTC · 80 points · 112 comments · 4 min read

What is Evidence?
Eliezer Yudkowsky · 22 Sep 2007 6:43 UTC · 109 points · 52 comments · 3 min read

[Question] What are some good examples of fake beliefs?
adamzerner · 14 Nov 2020 7:40 UTC · 18 points · 8 comments · 1 min read

Believing vs understanding
adamzerner · 24 Jul 2021 3:39 UTC · 15 points · 2 comments · 6 min read

Seek Fair Expectations of Others' Models
Zvi · 17 Oct 2017 14:30 UTC · 57 points · 17 comments · 9 min read

The Power of Positivist Thinking
Scott Alexander · 21 Mar 2009 20:55 UTC · 89 points · 56 comments · 9 min read

The Simple Truth
Eliezer Yudkowsky · 1 Jan 2008 20:00 UTC · 76 points · 6 comments · 22 min read

The Futility of Emergence
Eliezer Yudkowsky · 26 Aug 2007 22:10 UTC · 72 points · 140 comments · 3 min read

Hug the Query
Eliezer Yudkowsky · 14 Dec 2007 19:51 UTC · 75 points · 22 comments · 1 min read

My Wild and Reckless Youth
Eliezer Yudkowsky · 30 Aug 2007 1:52 UTC · 74 points · 54 comments · 3 min read

Say Not "Complexity"
Eliezer Yudkowsky · 29 Aug 2007 4:22 UTC · 61 points · 53 comments · 3 min read

So you think you understand Quantum Mechanics
shminux · 22 Dec 2012 21:16 UTC · 65 points · 64 comments · 3 min read

Belief in the Implied Invisible
Eliezer Yudkowsky · 8 Apr 2008 7:40 UTC · 50 points · 34 comments · 6 min read

Two Truths and a Lie
Psychohistorian · 23 Dec 2009 6:34 UTC · 70 points · 67 comments · 2 min read

From First Principles
[deleted] · 27 Sep 2012 19:04 UTC · 77 points · 46 comments · 5 min read

Artificial Addition
Eliezer Yudkowsky · 20 Nov 2007 7:58 UTC · 60 points · 128 comments · 6 min read

This Territory Does Not Exist
ike · 13 Aug 2020 0:30 UTC · 7 points · 199 comments · 7 min read

Magic Tricks Revealed: Test Your Rationality
Peter Wildeford · 13 Aug 2011 5:23 UTC · 42 points · 29 comments · 2 min read

Making Beliefs Pay Rent (in Anticipated Experiences): Exercises
RobinZ · 17 Apr 2011 15:31 UTC · 38 points · 13 comments · 1 min read

Belief in Belief vs. Internalization
Desrtopa · 29 Nov 2010 3:12 UTC · 41 points · 59 comments · 2 min read

How to Teach Students to Not Guess the Teacher's Password?
Petruchio · 4 Jan 2013 15:18 UTC · 35 points · 96 comments · 1 min read

[Question] Any layperson-accessible reference posts on how to operationalize beliefs?
Optimization Process · 5 Feb 2021 7:26 UTC · 17 points · 0 comments · 1 min read