abramdemski comments on LLMs for Alignment Research: a safety priority?
abramdemski · 10 Apr 2024 14:57 UTC · LW: 5 AF: 5
But not *intentionally*. It was an *unintentional* consequence of training.