An ASI that cares about humans, with all the necessary caveats (I wouldn’t want to remain human for 1e30 years), doesn’t seem clearly less feasible than an ASI that cares about existing minds of whatever kind continuing to live on as they choose. This requires some allocation of resources, but not necessarily more than a token amount. By current footprint I mean the current population.