Eliezer Yudkowsky (Karma: 138,904)
‘Empiricism!’ as Anti-Epistemology · Eliezer Yudkowsky · 14 Mar 2024 2:02 UTC · 137 points · 67 comments · 25 min read · LW link

My current LK99 questions · Eliezer Yudkowsky · 1 Aug 2023 22:48 UTC · 205 points · 38 comments · 5 min read · LW link

GPTs are Predictors, not Imitators · Eliezer Yudkowsky · 8 Apr 2023 19:59 UTC · 364 points · 88 comments · 3 min read · LW link

Pausing AI Developments Isn’t Enough. We Need to Shut it All Down · Eliezer Yudkowsky · 8 Apr 2023 0:36 UTC · 238 points · 38 comments · 12 min read · LW link

Eliezer Yudkowsky’s Shortform · Eliezer Yudkowsky · 1 Apr 2023 22:43 UTC · 14 points · 0 comments · 1 min read · LW link

Manifold: If okay AGI, why? · Eliezer Yudkowsky · 25 Mar 2023 22:43 UTC · 116 points · 37 comments · 1 min read · LW link · (manifold.markets)

Alexander and Yudkowsky on AGI goals · Scott Alexander and Eliezer Yudkowsky · 24 Jan 2023 21:09 UTC · 174 points · 52 comments · 26 min read · LW link

A challenge for AGI organizations, and a challenge for readers · Rob Bensinger and Eliezer Yudkowsky · 1 Dec 2022 23:11 UTC · 300 points · 33 comments · 2 min read · LW link

Don’t use ‘infohazard’ for collectively destructive info · Eliezer Yudkowsky · 15 Jul 2022 5:13 UTC · 84 points · 33 comments · 1 min read · LW link · 2 reviews · (www.facebook.com)

Let’s See You Write That Corrigibility Tag · Eliezer Yudkowsky · 19 Jun 2022 21:11 UTC · 125 points · 69 comments · 1 min read · LW link

AGI Ruin: A List of Lethalities · Eliezer Yudkowsky · 5 Jun 2022 22:05 UTC · 885 points · 690 comments · 30 min read · LW link · 3 reviews

Six Dimensions of Operational Adequacy in AGI Projects · Eliezer Yudkowsky · 30 May 2022 17:00 UTC · 299 points · 66 comments · 13 min read · LW link · 1 review

ProjectLawful.com: Eliezer’s latest story, past 1M words · Eliezer Yudkowsky · 11 May 2022 6:18 UTC · 213 points · 112 comments · 1 min read · LW link · 4 reviews

Lies Told To Children · Eliezer Yudkowsky · 14 Apr 2022 11:25 UTC · 370 points · 93 comments · 7 min read · LW link · 1 review

MIRI announces new “Death With Dignity” strategy · Eliezer Yudkowsky · 2 Apr 2022 0:43 UTC · 339 points · 543 comments · 18 min read · LW link · 1 review

Shah and Yudkowsky on alignment failures · Rohin Shah and Eliezer Yudkowsky · 28 Feb 2022 19:18 UTC · 85 points · 39 comments · 91 min read · LW link · 1 review

Christiano and Yudkowsky on AI predictions and human intelligence · Eliezer Yudkowsky · 23 Feb 2022 21:34 UTC · 70 points · 35 comments · 42 min read · LW link

Ngo and Yudkowsky on scientific reasoning and pivotal acts · Eliezer Yudkowsky and Richard_Ngo · 21 Feb 2022 20:54 UTC · 64 points · 14 comments · 35 min read · LW link

(briefly) RaDVaC and SMTM, two things we should be doing · Eliezer Yudkowsky · 12 Jan 2022 6:20 UTC · 227 points · 79 comments · 3 min read · LW link · 1 review

Ngo’s view on alignment difficulty · Richard_Ngo and Eliezer Yudkowsky · 14 Dec 2021 21:34 UTC · 63 points · 7 comments · 17 min read · LW link