Anticipated Experiences

Last edit: 13 Jul 2020 16:19 UTC by Kaj_Sotala

One principle of rationality is that “beliefs should pay rent in anticipated experiences.” If you believe in something, what do you expect to be different as a result? What does the belief say should happen, and what does it say should not happen? If you have a verbal disagreement with someone, how does your disagreement cash out in differing expectations?

If two people try to get specific about the anticipated experiences driving their disagreement, one method for doing so is the double crux technique. The notion that beliefs are models of what we expect to experience is also one of the basic premises of predictive processing theories of how the brain works. Beliefs that do not pay rent may be related to meaningless arguments driven by coalitional instincts.

If a tree falls in a forest and no one hears it, does it make a sound? One says, “Yes it does, for it makes vibrations in the air.” Another says, “No it does not, for there is no auditory processing in any brain.” [...]
Suppose that, after a tree falls, the two arguers walk into the forest together. Will one expect to see the tree fallen to the right, and the other expect to see the tree fallen to the left? Suppose that before the tree falls, the two leave a sound recorder next to the tree. Would one, playing back the recorder, expect to hear something different from the other? Suppose they attach an electroencephalograph to any brain in the world; would one expect to see a different trace than the other?
Though the two argue, one saying “No,” and the other saying “Yes,” they do not anticipate any different experiences. The two think they have different models of the world, but they have no difference with respect to what they expect will happen to them; their maps of the world do not diverge in any sensory detail.
-- Eliezer Yudkowsky, Making Beliefs Pay Rent (in Anticipated Experiences)

Making Beliefs Pay Rent (in Anticipated Experiences)
Eliezer Yudkowsky · 28 Jul 2007 22:59 UTC · 431 points · 266 comments · 4 min read

Applause Lights
Eliezer Yudkowsky · 11 Sep 2007 18:31 UTC · 317 points · 99 comments · 2 min read

Guessing the Teacher's Password
Eliezer Yudkowsky · 22 Aug 2007 3:40 UTC · 267 points · 98 comments · 4 min read

A Sketch of Good Communication
Ben Pace · 31 Mar 2018 22:48 UTC · 198 points · 35 comments · 3 min read · 1 review

Instrumental vs. Epistemic—A Bardic Perspective
MBlume · 25 Apr 2009 7:41 UTC · 89 points · 189 comments · 3 min read

Fake Explanations
Eliezer Yudkowsky · 20 Aug 2007 21:13 UTC · 156 points · 96 comments · 2 min read

Truly Part Of You
Eliezer Yudkowsky · 21 Nov 2007 2:18 UTC · 174 points · 61 comments · 4 min read

Ethnic Tension And Meaningless Arguments
Scott Alexander · 5 Nov 2014 3:38 UTC · 80 points · 8 comments · 22 min read

When is a mind me?
Rob Bensinger · 17 Apr 2024 5:56 UTC · 77 points · 62 comments · 15 min read

The Anthropic Trilemma
Eliezer Yudkowsky · 27 Sep 2009 1:47 UTC · 57 points · 232 comments · 6 min read

Anticipation vs. Faith: At What Cost Rationality?
Wei Dai · 13 Oct 2009 0:10 UTC · 11 points · 106 comments · 1 min read

Double Crux — A Strategy for Mutual Understanding
[DEACTIVATED] Duncan Sabien · 2 Jan 2017 4:37 UTC · 184 points · 108 comments · 12 min read

Anxiety and Rationality
[deleted] · 19 Jan 2016 18:30 UTC · 51 points · 31 comments · 4 min read

How to Measure Anything
lukeprog · 7 Aug 2013 4:05 UTC · 117 points · 55 comments · 22 min read

Urges vs. Goals: The analogy to anticipation and belief
AnnaSalamon · 24 Jan 2012 23:57 UTC · 126 points · 71 comments · 7 min read

Hug the Query
Eliezer Yudkowsky · 14 Dec 2007 19:51 UTC · 148 points · 22 comments · 1 min read

Physical and Mental Behavior
Scott Alexander · 10 Jul 2011 20:20 UTC · 89 points · 22 comments · 3 min read

Belief in Belief
Eliezer Yudkowsky · 29 Jul 2007 17:49 UTC · 192 points · 176 comments · 5 min read

Making Beliefs Pay Rent (in Anticipated Experiences): Exercises
RobinZ · 17 Apr 2011 15:31 UTC · 38 points · 13 comments · 1 min read

Belief in Self-Deception
Eliezer Yudkowsky · 5 Mar 2009 15:20 UTC · 92 points · 114 comments · 4 min read

What is Evidence?
Eliezer Yudkowsky · 22 Sep 2007 6:43 UTC · 159 points · 60 comments · 3 min read

[Question] What are some good examples of fake beliefs?
Adam Zerner · 14 Nov 2020 7:40 UTC · 18 points · 8 comments · 1 min read

Believing vs understanding
Adam Zerner · 24 Jul 2021 3:39 UTC · 15 points · 2 comments · 6 min read

My Wild and Reckless Youth
Eliezer Yudkowsky · 30 Aug 2007 1:52 UTC · 105 points · 53 comments · 3 min read

Say Not "Complexity"
Eliezer Yudkowsky · 29 Aug 2007 4:22 UTC · 100 points · 53 comments · 3 min read

So you think you understand Quantum Mechanics
shminux · 22 Dec 2012 21:16 UTC · 65 points · 64 comments · 3 min read

Belief in the Implied Invisible
Eliezer Yudkowsky · 8 Apr 2008 7:40 UTC · 59 points · 34 comments · 6 min read

How to Teach Students to Not Guess the Teacher's Password?
Petruchio · 4 Jan 2013 15:18 UTC · 35 points · 96 comments · 1 min read

Magic by forgetting
avturchin · 24 Apr 2024 14:32 UTC · 15 points · 30 comments · 4 min read

Two Truths and a Lie
Psychohistorian · 23 Dec 2009 6:34 UTC · 70 points · 67 comments · 2 min read

From First Principles
[deleted] · 27 Sep 2012 19:04 UTC · 79 points · 46 comments · 5 min read

Artificial Addition
Eliezer Yudkowsky · 20 Nov 2007 7:58 UTC · 76 points · 128 comments · 6 min read

This Territory Does Not Exist
ike · 13 Aug 2020 0:30 UTC · 7 points · 197 comments · 7 min read

Problem of Induction: What if Instants Were Independent
Epirito · 3 Apr 2022 0:54 UTC · 1 point · 7 comments · 3 min read

Magic Tricks Revealed: Test Your Rationality
Peter Wildeford · 13 Aug 2011 5:23 UTC · 42 points · 29 comments · 2 min read

So, just why do GPTs have to operate by continuing an existing string?
Bill Benzon · 24 Mar 2023 12:08 UTC · −4 points · 0 comments · 3 min read

Seek Fair Expectations of Others' Models
Zvi · 17 Oct 2017 14:30 UTC · 60 points · 17 comments · 9 min read

The Power of Positivist Thinking
Scott Alexander · 21 Mar 2009 20:55 UTC · 92 points · 57 comments · 9 min read

[Question] Any layperson-accessible reference posts on how to operationalize beliefs?
Optimization Process · 5 Feb 2021 7:26 UTC · 17 points · 0 comments · 1 min read

The Simple Truth
Eliezer Yudkowsky · 1 Jan 2008 20:00 UTC · 130 points · 15 comments · 22 min read

The Futility of Emergence
Eliezer Yudkowsky · 26 Aug 2007 22:10 UTC · 92 points · 142 comments · 3 min read

Belief in Belief vs. Internalization
Desrtopa · 29 Nov 2010 3:12 UTC · 42 points · 59 comments · 2 min read