People are somewhat flexible. If they become highly optimized for a particular set of constraints, then the human race is more likely to get wiped out when those constraints change.
This seems like the least of our concerns here. I think a far-flung, spacefaring strain of highly efficient mindless replicators well-protected against all forms of existential risk is still a horrifying future for humanity.
I probably have a stronger belief in unknown unknowns than you do, but I agree that either outcome is undesirable.
Ah, I see. I did not read the original post or Yvain’s examples as necessarily resulting in the loss of flexibility, but I can see how this can be a fatal side effect in some cases. I guess this would be akin to sacrificing far mode for near mode, though not as extreme as wireheading.
Second thought: Is there any conceivable way of increasing human flexibility, or would it get borked by Goodhart’s Law?
Increase average human wealth significantly, such that a greater proportion of the total population has more ability to meaningfully try new things or respond to novel challenges in a stabilizing manner.
(The caveats pretty much write themselves.)