DragonGod comments on "AI alignment researchers don't (seem to) stack"

DragonGod · 21 Feb 2023 17:06 UTC · 5 points
AGI Safety Fundamentals is trying to do something somewhat similar, I think.