No—the aliens are slower (think slowly, progress slowly because they’re stupider), but can understand as much as humans given enough time. This is the whole point!
I don’t follow what you’re trying to communicate. The story isn’t about aliens and humans, it’s about an AI in a box.
The point is that aliens or an AI don’t need to be qualitatively different to be incomprehensible. One Einstein is incomprehensible to most people at 1x human speed. Thousands of Einsteins running at 1000x speed would be far more so.
Edit: Turns out I misunderstood Greg Egan, and probably Eliezer Yudkowsky. What I thought was Egan’s position is Aaronson’s unless I misunderstood him too.
Paraphrase of Greg Egan’s position (if I and XiXiDu understand correctly): “Given enough time, humans can understand anything. In practice we still get squashed by AIs, since they’re much faster, but slow them down and we’re equals.”
Paraphrase of Eliezer Yudkowsky’s position (same disclaimer): “There are things that humans simply cannot understand, ever, no matter how long it takes, but that other minds can understand.” (I’m not sure what happens if you brute-force insightspace.)
Arguments about the human mindspace in toto are silly at this juncture in our understanding.
I think that your impressions are at least implicitly inaccurate, unless your quote marks are actually indicating quotes I haven’t seen. (If not, perhaps you should paraphrase in a way that doesn’t look like direct quoting?) As far as I can tell, Greg Egan thinks that AIs are not a problem even considering (and dismissing as impossible?) their speed advantage. So, practically speaking, he thinks this uFAI alarmism is wrong and maybe contemptible, again as far as I can tell. Eliezer’s impression might be that there are things humans can never understand, but if so, that’s probably because the word ‘human’ typically refers to a structure that is defined in many ways by its boundedness. That is, maybe a human could follow a superintelligent argument if the human were upgraded with a Jupiter brain, but calling such a human a human might be stretching definitions. But maybe Eliezer does in fact have deeper objections; I’m not sure.
I don’t see anything in the story which I’d expect Egan to disagree with, so I’m not quite sure how it’s relevant here.
The OP asks what it means for something to be incomprehensible. My point was that we don’t need to resort to mysterious, non-answerable hypotheticals about rifts in mindspace to answer the question.