If we divide the list of “10 necessary events” into two groups of five, the first five being technical achievements and the last five being ways to derail technological society… then I suppose the standard doomer view would be that once the first necessary event is achieved (AGI algorithms), the other technical achievements become 100% possible (edit: because AI figures out how to do them), and that whether or not the derailing events occur boils down to whether AI lets them happen.
edit: The implication being that algorithmic progress controls everything else.