(I mean, I think it’s a substantial reason to think that “literally everyone dies” is considerably less likely, and it makes me not want to say things like “everyone dies”. But I don’t think it implies much optimism, precisely because the chance of death still seems pretty high and the value of the future is still lost. For instance, I don’t consider “misaligned AIs have full control and 80% of humans survive after a violent takeover” to be a good outcome.)