This seems non-obvious to me (or at least not a slam dunk, is really what I think). It may be easier for misaligned AI 1 to strike a deal with humanity in which it uses humans' resources to defeat AI 2 and AI 3 in exchange for, say, 80% of the lightcone (as opposed to splitting it three ways with the other AIs).
I’m not actually sure how well this applies in the exact situation Daniel describes (I’d need to think more), but it definitely seems plausible in a bunch of scenarios with multiple misaligned ASIs.
This strikes me as a fairly strong strawman. My guess is that the vast majority of thoughtful radicals basically share your view. Indeed, at least going by your description, it's plausible my view is more charitable than yours: I think a lot of it is also endangering humanity through cowardice, following local incentives, etc.