i think the exodus was not literally inevitable, but it would have required a heroic effort to prevent. imo the two biggest causes of the exodus were the board coup and the implosion of superalignment (which was indirectly caused by the coup).
my guess is there will be some people who take alignment people less seriously in long timelines because of AI 2027. i would not measure this by how loudly political opponents dunk on alignment people, because they will always find something to dunk on. i think the best way to counteract this is to emphasize the principal component that this whole AI thing is a really big deal, and that there is a very wide range of beliefs in the field, but even “long” timeline worlds are insane as hell compared to what everyone else expects. i’m biased, though, because i think something like 2035 is a more realistic median world; if i believed AGI was 50% likely to happen by 2029 or something then i might behave very differently