I agree. If I am reading Will’s point correctly, he is assuming that most people will want to permanently enshrine their beliefs, such that even if they become immortal they never want to change.
What I see online is that people get bored and crave novelty. That craving might become an editable value someday, but it is linked to a great many other traits, so editing it and only it is not guaranteed to be possible even with ASI, unless the subject stops being human. And editing it at all is not guaranteed to be safe.
Moreover, the transhumanist movement is extremely diverse in what its members want. They will go in many directions when the capabilities become available, and I see no reason why they would enshrine stability over exploration and self-discovery through ongoing transformation.
Even Will’s take on immortality is locked into the current realities of gerontocracy. One primary goal of longevity research is maintaining brain plasticity and the capacity to learn. An immortal will not remain part of elite society if they cannot use current technology. So unless Will is proposing that immortals will freeze technological growth, they will have to stay adaptive just to keep their place in society.
Now, cultural dominance? That is a short- to mid-term issue. I can see a single culture getting an early lead in AGI plus space and that becoming a lasting influence. But unless we discover something like an ansible (the faster-than-light communicator of science fiction; quantum entanglement, despite the popular trope, cannot actually transmit information) or the speed of light stops being a barrier to communication, this cannot last long term. Within a star system (especially ours), sure. But with multi-system expansion, as longtermism presupposes? I cannot see how the system stays in contact regularly enough to prevent cultures from drifting in different directions.
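To make the latency concrete, here is a minimal back-of-the-envelope sketch (the star distances are approximate published values; everything else is just arithmetic): even a single question-and-answer exchange with our nearest stellar neighbor takes over eight years.

```python
# Back-of-the-envelope: round-trip message delay to nearby stars at light speed.
# Distances in light-years are approximate published values.
stars = {
    "Proxima Centauri": 4.25,
    "Sirius": 8.6,
    "Tau Ceti": 11.9,
}

for name, distance_ly in stars.items():
    # A signal at c covers one light-year per year, so a question
    # plus its reply takes twice the distance, expressed in years.
    print(f"{name}: one-way {distance_ly:.1f} yr, round trip {2 * distance_ly:.1f} yr")
```

At those timescales, "regular contact" is measured in decades even for the closest systems, which is exactly why I don’t see how cultural uniformity could be enforced across them.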
I don’t think he failed to consider these angles, so I am curious to hear more on the topic.