That counterfactual seems like trouble. Do you mean literally impossible by the laws of physics (surely not)? Or highly improbable that humans will be able to build one? What counts as artificial intelligence—can we do human augmentation? What counts as “highly improbable”—can we really assume stupid or evil humans won’t be able to build one eventually?
It seems to me that plugging all the holes and ways of building a general intelligence to spec would require messing with the laws of physics. We may want to specify a Cosmic Censor law.
Yeah, it is trouble. That's why I offered the other formulation, though I thought that might be too vague. Basically, I just wanted to know what non-transhumanist Eliezer would be doing. I don't really care about the counterfactual so much as picking out a different topic area. Maybe the question should just be: "If the idea of intelligence augmentation had never occurred to you and no one had ever shared it with you, what would you be doing with your life?"