Kurzweil does say that AGI is a GCR.
Where?
In The Singularity Is Near: check the index under entries such as "risk" and "pathogen" to find the relevant chapter. There he argues that the best way to reduce AI risk is to act morally ourselves, so that our future selves and successors respond well in turn.