SE Gyges (Karma: 153)
Might An LLM Be Conscious?
SE Gyges · 9 Mar 2026 17:56 UTC · 16 points · 0 comments · 19 min read · LW link (www.verysane.ai)

Alignment Is Proven Tractable
SE Gyges · 18 Feb 2026 17:55 UTC · 10 points · 0 comments · 10 min read · LW link (www.verysane.ai)

Most Observers Are Alone: The Fermi Paradox as Default
SE Gyges · 16 Feb 2026 0:52 UTC · 29 points · 12 comments · 4 min read · LW link (segyges.leaflet.pub)

Is Rationalism a Religion
SE Gyges · 25 Nov 2025 0:07 UTC · 4 points · 18 comments · 12 min read · LW link (segyges.leaflet.pub)

AI 2027 Response Followup
SE Gyges · 23 Aug 2025 4:41 UTC · 9 points · 3 comments · 9 min read · LW link (www.lesswrong.com)