I’m fairly sure I know more about MNT (molecular nanotechnology) than Eliezer (I tried to make a career of it around 1997-2003), and I’m convinced it would take an FAI longer than Eliezer expects unless the FAI has very powerful quantum computers.
Estimating how long a strong AI takes to design molecular nanotechnology requires knowledge of molecular nanotechnology, knowledge of recursive artificial intelligence, and knowledge of computation. This is particularly the case since most of the computation required to go from a recursively self-improving AI seed to nanotech will be spent on the early levels of self-improvement, not on the nanotech design itself.
The “unless the FAI has very powerful quantum computers” caveat gives a rather strong indication that your appeals to your own authority are less trustworthy with respect to AI and computation than they are about MNT (for reasons alluded to by shminux).
There are some problems for which knowledge of the problem plus knowledge of computation is sufficient to estimate a minimum amount of computation needed. Are you claiming to know that MNT isn’t like that? Or that an AI could create computers powerful enough to make that irrelevant?
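To make the first question concrete, here is a minimal back-of-envelope sketch of that kind of estimate, assuming a naive brute-force conformational search; every constant (ROTATABLE_BONDS, STATES_PER_BOND, FLOPS_PER_EVAL, MACHINE_FLOPS) is an illustrative assumption of mine, not a measured value:

```python
# Back-of-envelope estimate: compute needed for a brute-force
# conformational search. All constants below are illustrative
# assumptions, not measured values.

ROTATABLE_BONDS = 60      # assumed size of the design problem
STATES_PER_BOND = 3       # assumed discrete rotamer states per bond
FLOPS_PER_EVAL = 1e6      # assumed cost of one energy evaluation
MACHINE_FLOPS = 1e18      # assumed exaflop-class classical machine

conformations = STATES_PER_BOND ** ROTATABLE_BONDS   # search-space size
total_flops = conformations * FLOPS_PER_EVAL          # total work
years = total_flops / MACHINE_FLOPS / (3600 * 24 * 365)

print(f"{conformations:.2e} conformations")           # ~4.24e28
print(f"{total_flops:.2e} FLOPs total")               # ~4.24e34
print(f"~{years:.2e} years on the assumed machine")   # ~1.3e9 years
```

Of course this bounds only the brute-force approach; a cleverer algorithm, or the quantum computers mentioned above, could sidestep it entirely, which is exactly what the question is asking about.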
Appeals to authority about AI seem unimpressive, since nobody has demonstrated expertise at creating superhuman AI.
Perhaps my token effort at politeness made me less than completely clear. That wasn’t an appeal to AI authority. That was a rejection of your appeal to your own personal authority based on the degree to which you undermined your credibility on the subject by expressing magical thinking about quantum computation.
You just appealed to your own authority about molecular nanotechnology. When can I expect you to announce your product release? (Be consistent!)
Magical thinking? I mainly intended to express uncertainty about it.
I don’t expect appeals to authority to accomplish much here. Maybe it was a mistake for me to mention it at all, but I’m concerned that people here might treat Eliezer as more of an authority on MNT than he deserves. I only claimed to have more authority about MNT than Eliezer. That doesn’t imply much—I’m trying to encourage more doubt about how an AI could take over the world.