Tristan Cook

Karma: 295

Center on Long-Term Risk: Summer Research Fellowship 2025 - Apply Now

Tristan Cook · 26 Mar 2025 17:29 UTC
33 points
0 comments · 1 min read · LW link
(longtermrisk.org)

How important are accurate AI timelines for the optimal spending schedule on AI risk interventions?

Tristan Cook · 16 Dec 2022 16:05 UTC
27 points
2 comments · LW link

The optimal timing of spending on AGI safety work; why we should probably be spending more now

Tristan Cook · 24 Oct 2022 17:42 UTC
62 points
0 comments · LW link