I think you have misunderstood me. I was not intending to say that hard-takeoff scenarios are likely (for what it’s worth, I don’t think they are) but that they are what was being analogized to balrogs here.
(Of course a slow-takeoff controllable-by-governments superintelligence can still pose a threat—e.g., some are worried about technological unemployment, or about those who own the AI(s) ending up having almost all the world’s resources. But these are different, not very balrog-like, kinds of threat.)
Only on LW: disputes about ways in which an AI is like (or unlike) a balrog X-D
Well, we’ve had a basilisk already. Apparently we’re slowly crawling backwards through alphabetical order. Next up, perhaps, Bahamut or Azathoth.
Azathoth, check.
Is there a directory of the gods and monsters somewhere? If not, I think I’ll start one.