Did you miss transhumanism? If being useful is truly important to you, alignment would mean that superintelligence will find a way to lift you up and give you a role.
I suppose there might be a period during which we've figured out existential security but the FASI hasn't figured out human augmentation beyond the high-priority stuff like curing aging. I wouldn't expect that period to be long.