Yes, and my reply to that (above) is that humanity has a bad track record at this, so why would AIs trained on human data do better? Think also of indigenous peoples, extinct species humans didn't care enough about, etc. The point of the Dyson sphere parable is also not merely wanting something, it's wanting something enough that it actually happens.
People do devote some effort to things like preserving endangered species, things of historical significance that are no longer immediately useful, etc. If AIs devoted a similar fraction of their resources to humans, that would be enough to preserve our existence.
OK, I see; I didn't get the connection there.
Agreed, but again, we don't get to choose what "existence" means.