the physical world itself is too fuzzy to support much intelligent manipulation
I’m going to call mind projection on that one, bambi. The world looks fuzzy to us, but only when we’ve got our human hats on. Put your AI programmer hat on and there’s just ‘stuff that could be used to compute’.
Right or wrong, the real answer is that we have no idea what a superhuman, self-improving intelligence would be like. Humans have transformed the face of the world within a certain time window. An intelligence orders of magnitude greater could... well, who knows? Either way, saying that ‘recursive self improvement’ is very likely to be a lot more limited than projected is very, very dangerous indeed, even if accurate. The rest of your comment does take this into account.
I’m gobsmacked that it was a full 18 minutes before someone slam-dunked kevin’s comment. Come on people, get it together.