Superintelligent AI mentioned as a possible risk by Bill Gates

“There are other potential problems in the future that Mr. Ridley could have addressed but did not. Some would put super-intelligent computers on that list. My own list would include large-scale bioterrorism or a pandemic … But bioterrorism and pandemics are the only threats I can foresee that could kill over a billion people.”

- Bill Gates, from “Africa Needs Aid, Not Flawed Theories”

One wonders where Bill Gates read that superintelligent AI could be (though, in his estimation, in fact isn’t) a global catastrophic risk (GCR). It couldn’t have been Kurzweil, because Kurzweil doesn’t say that. The most realistic possibilities are that the influence came via Nick Bostrom, Stephen Hawking, Martin Rees, or possibly Bill Joy (see comments).

It seems that Bill is also something of a Bayesian with respect to global catastrophic risk, willing to act on threats whose probabilities can’t be precisely computed:

“Even though we can’t compute the odds for threats like bioterrorism or a pandemic, it’s important to have the right people worrying about them and taking steps to minimize their likelihood and potential impact. On these issues, I am not impressed right now with the work being done by the U.S. and other governments.”