I’ll let this be my chance to ask whether the Alignment Newsletter Podcast is on hold or finished? I don’t think there was a publicized announcement of hibernation or termination.
A few of Scott Alexander’s blog posts (made into podcast episodes) are really good (he’s got a sequence summarising the late 2021 MIRI conversations; the Bio Anchors and Takeoff Speeds ones I found especially informative & comprehensible). These don’t make up the bulk of the content and aren’t super technical, but I thought I’d mention them anyway.
AXRP (the AI X-risk research podcast)
The Alignment Newsletter Podcast
Many entries on the Nonlinear Library
Maybe Towards Data Science
edit: also FLI’s AI alignment podcast
Note that the full 2021 MIRI conversations are also available (in robot voice) in the Nonlinear Library archive.