Taking it line by line:
--The rhesus macaque is definitely threatened with extinction, because humans are doing various important things that might kill it by accident: for example, runaway climate change, possible nuclear winter, and of course AGI. Similarly, once unaligned AIs take over the world and humans are no longer in control, they might later do various things that kill humans as a side effect, such as getting into some major nuclear war with each other, covering everything in solar panels, or disassembling the Earth for materials.
--At first, humans will be a precious resource for AIs, because the AIs won't yet have a self-sustaining robot economy or giant armies of robots. Then humans will be a poor source of useful atoms, because there will be better sources of useful atoms, e.g. the dirt, the biosphere, the oceans, the Moon. Then humans (or, more likely, whatever remains of their corpses) will be a good source of useful atoms again, because all the better sources in the solar system will have been exhausted.
--Humans are not adaptive enough to survive in many environments created by unaligned AIs, unless said unaligned AIs specifically care about humans and take care to design the environment accordingly. (Think “the Earth is being disassembled for materials which are used to construct space probes and giant supercomputers orbiting the Sun.”) Probably. There’s a lot more to say on this which I’m happy to get into if you think it might change your mind.
--If AIs become extinct, e.g. by killing each other, humans will continue to flourish, unless the AIs take much of the biosphere or humansphere with them (e.g. by nuclear war or biological war).