What about an AI which cares a tiny bit about keeping humans alive?
This is quite believable in the short term (a few generations of humans). It's much less so in the long term. I model "caring about a lot of things" as an equilibrium among competing reasons for action, and almost nothing at the level of "care a little bit" will be stable in that equilibrium.

I suppose "care a lot about maintaining a small quantity of biologicals" is also possible, for inscrutable AI reasons, but that doesn't bring any civilization back.