There is no war in the run-up to AGI that would derail the project, e.g. by forcing most resources toward capabilities rather than safety research.
Assuming short timelines, I think it is likely impossible to reach the levels of safety culture I would want.
I feel obliged to note that a nuclear war, by dint of EMPs wiping out the power grid, would likely remove private AI companies as a thing for a while, thus deleting their current culture. It would also lengthen timelines.
Certainly not ideal in its own right, though.