The analogy makes sense to me, and I can see both how being Bob on alignment (and many other areas) is a failure mode, and how being Alice in some cases is a failure mode.
But I don’t think it applies to what I said.
> There are some things that are really x-risks and directly cause human extinction, and there are other things like bad governance structures or cancel culture or climate change that are pretty bad indeed and generally make society much worse off.
I agree, but I was postulating that climate change increases literal extinction (where everyone dies) by 10%.
The difference between x-risk and catastrophic risk (what I think you're talking about) is not the same as the difference between first-order and n-th-order existential risk. As far as I'm concerned, the former is very large because of future generations, but the latter is zero. I don't care at all whether climate change kills everyone directly or via nuclear war, so long as everyone is dead.
Or was your point just that the two could be conflated?