This has one important implication for the long-term future (on the scale of thousands of years), assuming no collapse has occurred.
Even the best-case scenario for the long-term future will be very weird, Lovecraftian, and surprisingly troubling for morality, as extremely high-end technology like whole brain emulation, genetic engineering, immortality, and nanotechnology makes concepts like personal identity very strange, very fast. This admittedly makes a whole lot of moralities look weird and Lovecraftian, but no more so in bad futures than in good ones. So the fact that morality gets weird and Lovecraftian fast is really a symptom of a larger problem: extreme scenarios genuinely do apply to the long-term future.