Also, AFAICT, TC’s base rate for AI being dangerous seems to be something analogous to, “Well, past technologies have been good on net, even dangerous ones, so [by some intuitive analog of Laplace’s rule of succession] we should expect this to turn out fine.” Whereas mine is more like, “Well, evolution of new hominins has never gone well for past hominins (or great apes) so why should this new increase in intelligence go better for us?” combined with, “Well, we’ve never yet been able to write software that does exactly what we want it to for anything complex, or been able to prevent other humans from misusing and breaking it, so why should this be different?”
Yup, this is the point of Scott Alexander’s “MR Tries The Safe Uncertainty Fallacy”.
BTW, what is “TC”?
Tyler Cowen. I only used initials because the OP did the same.
And yes, I read that post, and I’ve seen similar arguments a number of times, and not just recently. They’re getting a lot sharper recently for obvious reasons, though.
TC is Tyler Cowen.
I don’t think the base rates are crazy. The new-evolution-of-hominins one is only wrong if you forget who the ‘you’ is. TC and many other people are assuming that ‘we’ will be the ‘you’ doing the evolving. (The worry among people here is that ‘they’ will have their own ‘you’.)
And the second example, writing new software that breaks: that is the same as making any new technology. We have done this before, and we were fine last time. Yes, there were computer viruses; yes, some people lost fingers in looms back in the day. But it was okay in the long run.
I think people arguing against these base rates need to do more work. The base rates are reasonable; it is the lack of updating that makes the difference. So let’s help them update!
I think updating against these base rates is the critical thing.
But it’s not really an update. The key difference between optimists and pessimists in this area is the recognition that there are no base rates for something like AGI. We have developed new technologies before, but we have never developed a new species before.
New forms of intelligence and agency are a completely new phenomenon. So if you wanted to ascribe a base rate to our surviving this with zero previous examples, you’d put it at 0.5. If you counted all of the previous hominin extinctions as relevant, you’d actually put the base rate much lower.
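For what it’s worth, the 0.5 figure is exactly what Laplace’s rule of succession gives with zero observations: P(next success) = (s + 1) / (n + 2). A quick sketch (the count of six extinct hominin lineages is just an illustrative assumption, not a claim about the actual number):

```python
def laplace_rule(successes: int, trials: int) -> float:
    """Laplace's rule of succession: after s successes in n trials,
    estimate P(next trial succeeds) = (s + 1) / (n + 2)."""
    return (successes + 1) / (trials + 2)

# With zero prior examples either way, the rule gives exactly 0.5:
print(laplace_rule(0, 0))  # 0.5

# If we instead count past hominin lineages meeting a smarter successor
# as trials with zero "survivals" (6 is a hypothetical count), the
# estimate drops well below 0.5:
print(laplace_rule(0, 6))  # 0.125
```

So which base rate you get depends entirely on which reference class you decide counts as a prior trial.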
This really seems like the relevant comparison. Tools don’t kill you, but strange creatures do. AGI will be a creature, not a tool.