I think the reason "we figured out how to survive in tribes" doesn't hold is that climbing the tech tree distributes actuators to individual agents complex enough to have much longer feedback loops. Acting uncooperatively in a tribe is trivially observable as bad through short feedback loops. Releasing a bioagent to hurt a rival nation or a larger interest group is not trivially traceable, and there are fewer historical analogues to extrapolate from.
Your point about an omnipresent force that selects for survival preparedness holds for things that have analogues in near-miss scenarios, e.g. pandemic preparedness. But I read the author's central thesis as "near misses are the only thing driving survival preparedness". In fact, if we zoom in on AI, I think the best chance we have civilisationally is to increase AI capability just enough that a near-miss disaster of sufficient scale happens soon enough to incentivise strong survival preparedness well ahead of ASI. That is what I see as the natural consequence of a slowdown: buy yourself time to experience a near miss at the level of social-dynamics feedback loops before you make the next capability jump. But the game is not being played with that feedback loop in mind.