sure, I agree with the object-level claim, hence why I say the facts of AI are different. it sounds like you’re saying that because climate change is not actually existential, if we condition on people believing that climate change is existential, we select for people who are also worse at believing true things. this is definitely a real effect. however, I think there is an ameliorating factor: as an emotional stance, existential fear doesn’t have to literally be induced by human extinction; while the nuance between different levels of catastrophe matters a lot consequentially, for most people their emotional capacity for fear caps out at catastrophes much smaller than x-risk.
of course, you can still argue that since AGI risk is bigger, we should still be more worried about it. but I think rejecting “AGI is likely to kill everyone” indicts one’s epistemics a lot less than accepting “climate change is likely to kill everyone” does, so this makes the confounder smaller.