i) Global warming. While not as urgent or sexy as AI-run-amok, I think it is a far more important issue for humankind.
Reading these letters so far, the experts very often make such statements. I think that either they systematically overestimate the likely risk of global warming in itself, which wouldn't be too surprising for a politicized issue (in the US at least), or they feel the need to play it up.
I think a lot of people make this mistake: treating "very bad things" as equivalently bad to extinction, or even as extinction itself. It is unlikely that large-scale nuclear war would extinguish the species; it is far beyond unlikely that global warming would extinguish humans; and it is extremely unlikely that large-scale biological weapons use by terrorists or states would do so. But because we know for a fact that these things could happen, and have even come close to happening or are beginning to happen, and because they are so terrible, it's just not really possible for most people to keep enough perspective to recognize that things which are not likely to happen soon, but will eventually be possible, are actually much more dangerous in terms of their capacity for extinction.
Or perhaps some people place a negative value on half of all humans dying that is comparable to extinction itself.