it’s not inherently “smart” to sacrifice those significantly for the sake of a long term project
Your argument was that this hopeless trap might happen after a catastrophe and is so terrible that maybe it's as bad as, or worse than, everyone dying quickly. If it's so terrible, in any decision-relevant sense, then it's also smart to plot towards projects that dig humanity out of the trap.
No, sorry, I may have conveyed that wrong and mixed up two arguments. I don't think stasis is straight up worse than extinction. For better or worse, people lived in the Middle Ages too. My point was more that if your guiding principle is "can we recover", then there are more things than extinction to worry about. If you aspire to some kind of future in which humans grow exponentially, you won't get it if we're knocked back to preindustrial levels and can't recover.
I don't personally think that's a great metric or goal to adopt; I'm just following the logic to its endpoint. And I also expect that many smart people in the stasis wouldn't plot with only that sort of long term benefit in mind. They'd seek relatively short term returns.
I see. Referring back to your argument was meant more as an existence proof for this motivation. If a society forms around that motivation at any point in the billion years, and selects for intelligence to enable nontrivial long term institution design, that seems sufficient to escape stasis.