Especially since it seems rather incongruent to assign a 50% probability to AI being developed by 2050 and a 90% probability to self-modification leading to superintelligence within 5 years of AI being developed… and then only a 0.01% probability to AI-caused human extinction within this century.
Of course, he may be presuming that superintelligence alone won’t help much.
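To make the apparent tension concrete, here is a minimal sketch of the implied arithmetic, assuming the estimates can simply be multiplied (i.e. treating the events as chained conditionals and ignoring timing subtleties between "by 2055" and "this century"):

```python
# Stated estimates (as given in the discussion above)
p_ai_by_2050 = 0.50     # P(AI developed by 2050)
p_si_given_ai = 0.90    # P(superintelligence within 5 years | AI developed)
p_extinction = 0.0001   # P(AI-caused human extinction this century)

# Joint probability of superintelligence arriving by ~2055
p_si = p_ai_by_2050 * p_si_given_ai
print(p_si)  # 0.45

# Implied (upper-bound) conditional probability of extinction
# given that superintelligence arrives
p_ext_given_si = p_extinction / p_si
print(round(p_ext_given_si, 6))  # ~0.000222
```

So on these numbers, superintelligence is judged roughly 45% likely by mid-century, yet conditional on it arriving, extinction is held to be at most about 0.02% likely, which is the combination that reads as incongruent.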