I think Ozy conflates acting as if X were true with believing X (you can devote your career to AI safety while believing there’s only a 10% chance it ends up mattering, or whatever), and lists some potential costs of the new focus without attempting to compare them to the potential benefits.