I was a philosophy PhD student, then left to work at AI Impacts, then the Center on Long-Term Risk, then OpenAI. I quit OpenAI after losing confidence that it would behave responsibly around the time of AGI. I'm now executive director of the AI Futures Project. I subscribe to Crocker's Rules and am especially interested to hear unsolicited constructive criticism. http://sl4.org/crocker.html
Some of my favorite memes:

[meme by Rob Wiblin]

[xkcd comic]

My EA Journey, depicted on the whiteboard at CLR:

[photo, h/t Scott Alexander]
You know, it does seem like maybe he was right here, sorta? Nuclear weapons do seem like the main reason why WW3 hasn’t happened yet.
That said, WW3 isn’t impossible, and if it were to happen, nuclear weapons would make it much worse than all previous wars.