First, upvotes and kudos for asking about current attitudes and opinions before diving into specifics and explanation/exhortation on the topic. This is awesome—well done!
My general opinion is that this topic is too politicized to be a great fit for LessWrong. Objective modeling of climate change and making predictions might be OK, but somewhere before you get to “mass media attitude”, you’ve crossed the line into talking about outgroup errors and things other than truth-seeking in one’s own beliefs. Even when focusing on predictions and truth (for which other people’s actions is definitely in scope), this is hard-mode discussion, and likely to derail by confusing positive with normative elements of the analyses.
I’d keep it off of LW, or perhaps have a linkpost and see the reaction—maybe I’m wrong.
My personal opinion: climate change (and more directly, conflict caused or exacerbated by it) is the single biggest risk to human-like intelligence flourishing in the galaxy—very likely a large component of the Great Filter. And it’s caused by such deep human drives (procreation and scope-insensitive caring about our young in the short term) that it’s probably inevitable—any additional efficiency or sustainability we achieve will get used up by making more people. I’d like to see more focus on how to get truly self-sufficient Mars (and asteroid/moon) colonies of at least 100K people with a clear upward slope, and on how to get at least 0.5B people to survive the collapse of Earth, with enough knowledge and resources that the dark age lasts less than 300 years. I don’t currently see a path to either, nor to a reduction in human population and resource usage that doesn’t include more destruction in war than it’s worth.
I don’t think the idea of the Great Filter fits very well here. The Great Filter would have to be something so universal that it eliminates ~100% of all civilizations. Climate change seems conditional on a number of factors specific to Earth (e.g. carbon-based life, greenhouse-gas effects, an interdependent civilization), so it doesn’t really work as a factor that eliminates nearly all civilizations at a specific level of development.
My suspicion is that it generalizes well beyond the specific mechanisms of greenhouse gases or temperature ranges. The path from “able to manipulate a civilization’s environment at scale” to “able to modulate resource use so as not to destroy that civilization”, with the added element of over-optimization for a given environment leaving a nascent civilization extremely vulnerable to environmental change, could easily be a universal problem.
It’s the fragility that worries me most—I believe that if we could remain calm and coordinate the application of mitigations, we could make it through most of the projected changes. But I don’t believe that we CAN remain calm—I suspect (and fear) that humans will react violently to any significant future changes, and that our civilization will turn out to be much, much easier to destroy than to maintain.
Regardless of whether it’s universal, that’s the x-risk I see to our brand of human-like intelligent experience. Not climate change directly, but war and destruction over how to slow it down, and over who gets the remaining nice bits as it gets worse.
An angle that’s interesting (though only tangentially connected with climate change) is how civilizations deal with waste heat.
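To make that tangent concrete, here is a minimal back-of-envelope sketch (not from the comment above; the ~18 TW current power use, ~122,000 TW of absorbed sunlight, ~1% figure for present greenhouse forcing relative to absorbed sunlight, and the 2.3%/yr growth rate are all rough assumptions) of how quickly direct waste heat could become a planetary-scale constraint for a growing civilization, independent of which energy sources it uses.

```python
import math

# Rough, illustrative figures (assumptions, not claims from the thread):
# - Current human primary power use: ~18 TW
# - Sunlight absorbed by Earth after albedo: ~122,000 TW
# - Present anthropogenic greenhouse forcing: ~1% of absorbed sunlight
# - Assumed long-run energy growth rate: 2.3%/yr (~10x per century)
CURRENT_POWER_TW = 18.0
ABSORBED_SUNLIGHT_TW = 122_000.0
GROWTH_RATE = 0.023

def years_until(target_tw: float) -> float:
    """Years of exponential growth until power use reaches target_tw."""
    return math.log(target_tw / CURRENT_POWER_TW) / math.log(1 + GROWTH_RATE)

# Waste heat comparable to today's greenhouse forcing (~1% of absorbed sunlight)
print(f"Waste heat ~ current greenhouse forcing in ~{years_until(0.01 * ABSORBED_SUNLIGHT_TW):.0f} years")

# Waste heat comparable to all absorbed sunlight (clearly unsustainable)
print(f"Waste heat ~ total absorbed sunlight in ~{years_until(ABSORBED_SUNLIGHT_TW):.0f} years")
```

On those assumed numbers, direct waste heat rivals today’s greenhouse forcing within roughly two centuries and rivals total absorbed sunlight within roughly four, so any civilization on a sustained growth curve has to cap its energy use or move it off-planet fairly quickly, whatever its chemistry.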