An ASI that cares about humans, with all the necessary caveats (I wouldn’t want to remain human for 1e30 years), doesn’t seem clearly less feasible than an ASI that cares about existing minds of whatever kind continuing to live on as they choose. This requires some allocation of resources, but not necessarily more than a token amount. By “current footprint” I mean the current population.
I don’t see why we’d be grandfathered into the current footprint by a universally loving ASI.
I’m not saying that’s bad, just probably not what OP was hoping for.