“Better” is defined by us. This is the point of the metaethics sequence! A universe tiled with paperclips is not better than what we have now. Rationality is not something one values, it’s something one uses to get what they value.
You seem to be imagining FAI as some kind of anthropomorphic intelligence with some sort of “constraint” that says “make sure biological humans continue to exist”. This is exactly the wrong way to implement FAI. The point of FAI is simply for the AI to do what is right (as opposed to what is prime, or paperclip-maximising). In EY’s plan, this involves the AI first looking at human minds to discover what we mean by “right”.
Now, the right thing may not involve keeping 21st-century humanity around forever. Some people will want to be uploaded. Some people will just want better bodies. And yes, most of us will want to “live forever”. But the right thing is definitely not to immediately exterminate the entire population of Earth.