It does happen to be the case that thinking climate change has much of a chance of being existentially bad is just wrong. Thinking that AI is existentially bad is right (at least according to me). A major confounder to address is that conditioning on a major false belief will, of course, be indicative of being worse at pursuing your goals than conditioning on a major true belief.
sure, I agree with the object-level claim, hence why I say the facts of AI are different. it sounds like you’re saying that because climate change is not that existential, if we condition on people believing that climate change is existential, then this is confounded by those people also being worse at believing true things. this is definitely an effect. however, I think there is an ameliorating factor: as an emotional stance, existential fear doesn’t have to literally be induced by human extinction; while the nuance between different levels of catastrophe matters a lot consequentially, for most people the emotional capacity to feel even more afraid caps out at catastrophes well below x-risk.
of course, you can still argue that since AGI is bigger, we should still be more worried about it. but I think rejecting “AGI is likely to kill everyone” indicts one’s epistemics a lot less than accepting “climate change is likely to kill everyone” does. so this makes the confounder smaller.