
Rationalization

Last edit: 1 Oct 2020 22:47 UTC by Ruby

Rationalization is the act of finding reasons to believe what one has already decided one wants to believe. It is a decidedly terrible way to arrive at true beliefs.

“Rationalization.” What a curious term. I would call it a wrong word. You cannot “rationalize” what is not already rational. It is as if “lying” were called “truthization.” – Rationalization

Rationality starts from evidence, and then crunches forward through belief updates, in order to output a probable conclusion. “Rationalization” starts from a conclusion, and then works backward to arrive at arguments apparently favoring that conclusion. Rationalization argues for a side already selected; rationality tries to choose between sides.

Rationalization can be conscious or unconscious. It can take on a blatant, conscious form, in which you are aware that you want a particular side to be correct and you deliberately compose arguments for only that side, looking over the evidence and consciously filtering which facts will be presented. Or it can occur at perceptual speeds, without conscious intent or conscious awareness.
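The forward-versus-backward distinction can be made concrete with a toy simulation (not from the original posts; the coin, its parameters, and the heads-only filter are invented for this sketch). A rational updater applies Bayes' rule to every flip of a fair coin; a rationalizer who has already decided the coin is heads-biased feeds the same rule only the flips that agree:

```python
import random

random.seed(0)  # deterministic flips for the illustration

# Two hypotheses about a coin: fair (P(heads) = 0.5) or heads-biased
# (P(heads) = 0.8). A rational updater processes every flip; a
# rationalizer has already picked "biased" and only counts the flips
# that agree with it.

def posterior_biased(flips):
    """P(biased | flips) by Bayes' rule, starting from a uniform prior."""
    like_fair, like_biased = 1.0, 1.0
    for f in flips:
        like_fair *= 0.5
        like_biased *= 0.8 if f == "H" else 0.2
    return like_biased / (like_fair + like_biased)

flips = [random.choice("HT") for _ in range(100)]  # the coin is actually fair

rational = posterior_biased(flips)                               # all the evidence
rationalized = posterior_biased([f for f in flips if f == "H"])  # filtered evidence

print(f"updating on all flips:        P(biased) = {rational:.4f}")
print(f"updating on heads-only flips: P(biased) = {rationalized:.4f}")
```

Both agents run the identical update rule; the rationalizer's near-certainty comes entirely from the filter applied before the arithmetic starts, which is why the selectivity, not the argument, is where the error lives.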

Defeating rationalization—or even discovering rationalizations—is a lifelong battle for the aspiring rationalist.

Notable Posts


The Bottom Line - Eliezer Yudkowsky, 28 Sep 2007 17:47 UTC (88 points, 14 comments, 4 min read)

Rationalization - Eliezer Yudkowsky, 30 Sep 2007 19:29 UTC (41 points, 29 comments, 2 min read)

The Apologist and the Revolutionary - Scott Alexander, 11 Mar 2009 21:39 UTC (208 points, 100 comments, 5 min read)

Knowing About Biases Can Hurt People - Eliezer Yudkowsky, 4 Apr 2007 18:01 UTC (135 points, 80 comments, 2 min read)

Your Strength as a Rationalist - Eliezer Yudkowsky, 11 Aug 2007 0:21 UTC (135 points, 121 comments, 2 min read)

Why a New Rationalization Sequence? - dspeyer, 13 Jan 2020 6:46 UTC (30 points, 8 comments, 3 min read)

Is That Your True Rejection? - Eliezer Yudkowsky, 6 Dec 2008 14:26 UTC (93 points, 100 comments, 3 min read)

Red Flags for Rationalization - dspeyer, 14 Jan 2020 7:34 UTC (25 points, 6 comments, 4 min read)

Confabulation - lsusr, 8 Dec 2019 10:18 UTC (51 points, 6 comments, 2 min read)

Firming Up Not-Lying Around Its Edge-Cases Is Less Broadly Useful Than One Might Initially Think - Zack_M_Davis, 27 Dec 2019 5:09 UTC (96 points, 39 comments, 8 min read; 2 nominations, 2 reviews)

Maybe Lying Doesn't Exist - Zack_M_Davis, 14 Oct 2019 7:04 UTC (59 points, 57 comments, 8 min read)

"But It Doesn't Matter" - Zack_M_Davis, 1 Jun 2019 2:06 UTC (46 points, 16 comments, 1 min read)

A Case Study of Motivated Continuation - Eliezer Yudkowsky, 31 Oct 2007 1:27 UTC (34 points, 36 comments, 3 min read)

SotW: Avoid Motivated Cognition - Eliezer Yudkowsky, 28 May 2012 15:57 UTC (31 points, 83 comments, 10 min read)

[Question] Does improved introspection cause rationalisation to become less noticeable? - jacobjacob, 30 Jul 2019 10:03 UTC (27 points, 4 comments, 1 min read)

Partial summary of debate with Benquo and Jessicata [pt 1] - Raemon, 14 Aug 2019 20:02 UTC (87 points, 66 comments, 22 min read; 2 nominations, 3 reviews)

A cynical explanation for why rationalists worry about FAI - aaronsw, 4 Aug 2012 12:27 UTC (28 points, 182 comments, 1 min read)

The Value of Theoretical Research - paulfchristiano, 25 Feb 2011 18:06 UTC (50 points, 53 comments, 3 min read)

The Power of Agency - lukeprog, 7 May 2011 1:38 UTC (86 points, 78 comments, 1 min read)

Against Devil's Advocacy - Eliezer Yudkowsky, 9 Jun 2008 4:15 UTC (38 points, 60 comments, 6 min read)

Mistakes with Conservation of Expected Evidence - abramdemski, 8 Jun 2019 23:07 UTC (159 points, 21 comments, 12 min read; 2 nominations, 1 review)

Explanation vs Rationalization - abramdemski, 22 Feb 2018 23:46 UTC (11 points, 10 comments, 4 min read)

Avoiding Rationalization - dspeyer, 15 Jan 2020 10:55 UTC (15 points, 0 comments, 2 min read)

Testing for Rationalization - dspeyer, 16 Jan 2020 8:12 UTC (19 points, 0 comments, 2 min read)

Using Expert Disagreement - dspeyer, 16 Jan 2020 22:42 UTC (13 points, 1 comment, 5 min read)

Against Rationalization II: Sequence Recap - dspeyer, 16 Jan 2020 22:51 UTC (6 points, 2 comments, 1 min read)

Doing your good deed for the day - Scott Alexander, 27 Oct 2009 0:45 UTC (147 points, 57 comments, 3 min read)

Just Lose Hope Already - Eliezer Yudkowsky, 25 Feb 2007 0:39 UTC (87 points, 77 comments, 1 min read)

Whining-Based Communities - Eliezer Yudkowsky, 7 Apr 2009 20:31 UTC (71 points, 99 comments, 4 min read)

Inefficient Doesn't Mean Indifferent - Zvi, 29 Apr 2018 11:30 UTC (37 points, 17 comments, 3 min read; thezvi.wordpress.com)

A Rational Argument - Eliezer Yudkowsky, 2 Oct 2007 18:35 UTC (71 points, 41 comments, 2 min read)

One Argument Against An Army - Eliezer Yudkowsky, 15 Aug 2007 18:39 UTC (66 points, 36 comments, 2 min read)

Existential Angst Factory - Eliezer Yudkowsky, 19 Jul 2008 6:55 UTC (63 points, 101 comments, 4 min read)

If You Want to Win, Stop Conceding - Davis_Kingsley, 22 Nov 2018 18:10 UTC (43 points, 15 comments, 3 min read)

What Can We Learn About Human Psychology from Christian Apologetics? - ChrisHallquist, 21 Oct 2013 22:02 UTC (65 points, 162 comments, 8 min read)

(Ir)rationality of Pascal's wager - filozof3377@gmial.com, 3 Aug 2020 20:57 UTC (3 points, 10 comments, 4 min read)

Dispel your justification-monkey with a "HWA!" - MalcolmOcean, 24 Jan 2018 4:51 UTC (20 points, 15 comments, 5 min read)

Back Up and Ask Whether, Not Why - Eliezer Yudkowsky, 6 Nov 2008 19:20 UTC (50 points, 26 comments, 1 min read)

Fake Justification - Eliezer Yudkowsky, 1 Nov 2007 3:57 UTC (69 points, 57 comments, 3 min read)

The Genetic Fallacy - Eliezer Yudkowsky, 11 Jul 2008 5:47 UTC (42 points, 16 comments, 3 min read)

Update Yourself Incrementally - Eliezer Yudkowsky, 14 Aug 2007 14:56 UTC (59 points, 29 comments, 3 min read)

What Evidence Filtered Evidence? - Eliezer Yudkowsky, 29 Sep 2007 23:10 UTC (76 points, 43 comments, 4 min read)

Third Alternatives for Afterlife-ism - Eliezer Yudkowsky, 8 May 2007 7:41 UTC (33 points, 20 comments, 1 min read)

Escaping Your Past - Z_M_Davis, 22 Apr 2009 21:15 UTC (28 points, 51 comments, 3 min read)

A Failed Just-So Story - Eliezer Yudkowsky, 5 Jan 2008 6:35 UTC (17 points, 49 comments, 2 min read)

The Mistake Script - jimrandomh, 9 Mar 2009 17:35 UTC (12 points, 14 comments, 3 min read)

Adversarial System Hats - Johnicholas, 11 Mar 2009 16:56 UTC (8 points, 15 comments, 1 min read)

Rational Defense of Irrational Beliefs - cleonid, 12 Mar 2009 18:48 UTC (5 points, 33 comments, 1 min read)

Proverbs and Cached Judgments: the Rolling Stone - Annoyance, 1 Apr 2009 15:40 UTC (18 points, 30 comments, 2 min read)

Toxic Truth - MichaelHoward, 11 Apr 2009 11:25 UTC (16 points, 31 comments, 1 min read)

Persuasiveness vs Soundness - Patrick, 13 Apr 2009 8:43 UTC (0 points, 19 comments, 2 min read)

"Self-pretending" is not as useful as we think - pwno, 25 Apr 2009 23:01 UTC (4 points, 15 comments, 1 min read)

Replaying History - G Gordon Worley III, 8 May 2009 5:35 UTC (7 points, 19 comments, 2 min read)

Richard Dawkins TV - Baloney Detection Kit video - Roko, 25 Jun 2009 0:27 UTC (2 points, 35 comments, 1 min read)

The Fixed Sum Fallacy - cousin_it, 3 Jul 2009 13:01 UTC (5 points, 4 comments, 1 min read)