Let me put it like this: if a dictator suddenly rose to power in a first-world country on the promise of building a Deep Learning enabled eutopia, and then used that ideology to rationalize the execution of eleven million civilians during WW3, that would be the first in a series of unlikely events needed to enable others to implement the taboo you seek, if you wanted to do it the way it was done the first time around.
There are intermediary positions. Capabilities engineers have sometimes been known to stop capabilities engineering when they realize they're going to end the world. But that particular moratorium on research was kind of a one-time thing.
I didn't think the taboo on human cloning was due to WW2, but if it is, then I'd buy your argument more. The first few results of a quick Google Scholar search for "human cloning ethics" don't seem to mention "Nazi", but I don't know, maybe that's the root of it.
Edit: So for example, what if you get the Christians very upset about the creation of minds? (Not saying this is a good idea.)
A large part of the reason biotechnologies could be banned is the fear of eugenics, which took hold after WWII. Essentially, WWII lowered resistance to regulation by instilling that fear of eugenics. In other words, it accidentally prevented us from pursuing biotech.
Could you point to a source re: cloning? What you say seems right w.r.t. eugenics, and cloning is vaguely related to eugenics in that both involve applying technology to reproduction. My quick look, though, didn't find arguments about cloning mentioning Nazis. The influence could be hidden, though, so maybe there's some social scientist who's tracked the anthropology of the taboo on cloning. As another example, consider gain-of-function research. I don't think that taboo is about WWII; I think it's about, like, don't create dangerous viruses. (Though that taboo is apparently maybe not strong enough.)
Probably not.
Agreed wholeheartedly. Even thinking about trying to pull something like this could directly cause catastrophically bad outcomes.
I wouldn’t go that far, but yes, some possible action plans seem particularly unlikely to be net-positive.
AI engineers seem like a particularly sensitive area to me; they're taken very seriously as a key strategic resource.
What? How?
Why?