Definitions
Tag (last edit: 31 Jul 2020 6:48 UTC by Yoav Ravid)
Posts that attempt to define or clarify the meaning of a concept, a word, a phrase, or something else.
What Do We Mean By “Rationality”? · Eliezer Yudkowsky · 16 Mar 2009 22:33 UTC · 293 points · 14 comments · 6 min read · LW link
Compact vs. Wide Models · Vaniver · 16 Jul 2018 4:09 UTC · 31 points · 5 comments · 3 min read · LW link
Logical Foundations of Government Policy · FCCC · 10 Oct 2020 17:05 UTC · 2 points · 0 comments · 17 min read · LW link
What Does “Signalling” Mean? · abramdemski · 16 Sep 2020 21:19 UTC · 38 points · 18 comments · 2 min read · LW link
What’s So Bad About Ad-Hoc Mathematical Definitions? · johnswentworth · 15 Mar 2021 21:51 UTC · 169 points · 57 comments · 7 min read · LW link
When to use “meta” vs “self-reference”, “recursive”, etc. · Alex_Altair · 6 Apr 2022 4:57 UTC · 20 points · 5 comments · 5 min read · LW link
The Problem With The Current State of AGI Definitions · Yitz · 29 May 2022 13:58 UTC · 40 points · 22 comments · 8 min read · LW link
Behavioral and mechanistic definitions (often confuse AI alignment discussions) · LawrenceC · 20 Feb 2023 21:33 UTC · 20 points · 5 comments · 6 min read · LW link
Essential Behaviorism Terms · Rivka · 17 Mar 2023 17:41 UTC · 13 points · 1 comment · 10 min read · LW link
The Useful Idea of Truth · Eliezer Yudkowsky · 2 Oct 2012 18:16 UTC · 163 points · 544 comments · 12 min read · LW link
A point of clarification on infohazard terminology · eukaryote · 2 Feb 2020 17:43 UTC · 49 points · 21 comments · 2 min read · LW link (eukaryotewritesblog.com)
Disambiguating “alignment” and related notions · David Scott Krueger (formerly: capybaralet) · 5 Jun 2018 15:35 UTC · 22 points · 21 comments · 2 min read · LW link
Four kinds of problems · jacobjacob · 21 Aug 2018 23:01 UTC · 35 points · 11 comments · 3 min read · LW link
Note on Terminology: “Rationality”, not “Rationalism” · Vladimir_Nesov · 14 Jan 2011 21:21 UTC · 37 points · 51 comments · 1 min read · LW link
An Agent is a Worldline in Tegmark V · komponisto · 12 Jul 2018 5:12 UTC · 24 points · 12 comments · 2 min read · LW link
Flowsheet Logic and Notecard Logic · moridinamael · 9 Sep 2015 16:42 UTC · 44 points · 28 comments · 3 min read · LW link
In favor of tabooing the word “values” and using only “priorities” instead · chaosmage · 25 Oct 2018 22:28 UTC · 21 points · 11 comments · 2 min read · LW link
Spaghetti Towers · eukaryote · 22 Dec 2018 5:29 UTC · 147 points · 25 comments · 3 min read · LW link · 1 review (eukaryotewritesblog.com)
Technical model refinement formalism · Stuart_Armstrong · 27 Aug 2020 11:54 UTC · 19 points · 0 comments · 6 min read · LW link
Glossary of Futurology · mind_bomber · 21 Aug 2015 5:51 UTC · 3 points · 7 comments · 13 min read · LW link
Define Rationality · Marshall · 5 Mar 2009 18:25 UTC · 1 point · 14 comments · 1 min read · LW link
Seeking better name for “Effective Egoism” · DataPacRat · 25 Nov 2016 22:31 UTC · 14 points · 30 comments · 1 min read · LW link
[Question] What Belongs in my Glossary? · Zvi · 2 Nov 2020 19:52 UTC · 14 points · 8 comments · 1 min read · LW link
Beginning at the Beginning · Annoyance · 11 Mar 2009 19:23 UTC · 5 points · 60 comments · 4 min read · LW link
What’s in a name? That which we call a rationalist… · badger · 24 Apr 2009 23:53 UTC · 7 points · 92 comments · 1 min read · LW link
Rationality is winning—or is it? · taw · 7 May 2009 14:51 UTC · −8 points · 10 comments · 1 min read · LW link
My concerns about the term ‘rationalist’ · JamesCole · 4 Jun 2009 15:31 UTC · 12 points · 34 comments · 2 min read · LW link
The two meanings of mathematical terms · JamesCole · 15 Jun 2009 14:30 UTC · 0 points · 80 comments · 3 min read · LW link
ESR’s comments on some EY:OB/LW posts · Eliezer Yudkowsky · 20 Jun 2009 0:16 UTC · 7 points · 16 comments · 1 min read · LW link
Rationalists, Post-Rationalists, And Rationalist-Adjacents · orthonormal · 13 Mar 2020 20:25 UTC · 78 points · 43 comments · 3 min read · LW link
Rationality Is Expertise With Reality · Horatio Von Becker · 2 Sep 2021 20:45 UTC · −14 points · 18 comments · 1 min read · LW link
Sentience, Sapience, Consciousness & Self-Awareness: Defining Complex Terms · LukeOnline · 20 Oct 2021 13:48 UTC · 9 points · 7 comments · 4 min read · LW link
Disentangling inner alignment failures · Erik Jenner · 10 Oct 2022 18:50 UTC · 14 points · 5 comments · 4 min read · LW link
Why There Is No Answer to Your Philosophical Question · Bryan Frances · 24 Mar 2023 23:22 UTC · −12 points · 10 comments · 12 min read · LW link