Nice! I’d add human cloning, human genetic engineering, and eugenics to the list of scary technologies that humanity decided not to pursue. As far as I can tell, the reasons we aren’t pursuing them are almost entirely ethical; had the Nazis won the war, they would probably have gone full steam ahead.
If we expect significant changes to the state of the world during takeoff, it becomes harder to predict what kind of landscape the AI researchers of that time will face.
I’m not sure I’d describe eugenics as a technology, but the other two are interesting. Unlike most of the other technologies on the list, the caution around them is entirely preemptive. That’s similar to the concerns raised at the Asilomar conference, actually; it makes me wonder if there’s something in the culture of biology that makes that sort of preemptive caution more likely.
Shameless plug for this tool for speculating about what kinds of changes to the world might happen prior to advanced AGI.
That’s a very neat list, thank you!