Goals
Do Humans Want Things? · lukeprog · 4 Aug 2011 · 40 points · 53 comments · 5 min read
The Values-to-Actions Decision Chain · Remmelt · 30 Jun 2018 · 29 points · 6 comments · 10 min read
Complex Behavior from Simple (Sub)Agents · moridinamael · 10 May 2019 · 110 points · 13 comments · 9 min read · 1 review
Logical Foundations of Government Policy · FCCC · 10 Oct 2020 · 2 points · 0 comments · 17 min read
Distinguishing goals from chores · Amir Bolous · 10 Jan 2021 · 5 points · 1 comment · 4 min read
[Question] How to select a long-term goal and align my mind towards it? · Alexander · 24 Dec 2021 · 19 points · 8 comments · 2 min read
Unpacking “Shard Theory” as Hunch, Question, Theory, and Insight · Jacy Reese Anthis · 16 Nov 2022 · 31 points · 9 comments · 2 min read
Goal alignment without alignment on epistemology, ethics, and science is futile · Roman Leventov · 7 Apr 2023 · 20 points · 2 comments · 2 min read
[Question] What to do after a mental breakdown? (Dealing with fear of failure) · TeaTieAndHat · 13 Jul 2023 · 2 points · 0 comments · 4 min read
Ideation and Trajectory Modelling in Language Models · NickyP · 5 Oct 2023 · 15 points · 2 comments · 10 min read
AISC Project: Modelling Trajectories of Language Models · NickyP · 13 Nov 2023 · 25 points · 0 comments · 12 min read
Refinement of Active Inference agency ontology · Roman Leventov · 15 Dec 2023 · 16 points · 0 comments · 5 min read · (arxiv.org)