Do you believe that if Obama were to ask the NSA to take over Russia, the NSA could easily do so? If so, I am speechless.
Ordering the NSA to take over Russia would effectively result in WWIII.
Anyway, the assumption that an AI could understand human motivation, and become a skilled manipulator, is already too far-fetched for me to take seriously.
For what values of skill do you believe that to be true? Do you think there are reasons to believe that an AGI that is online won’t be as good at manipulation as the best humans?
For the AI-box scenario, I can understand if you think the AGI doesn’t have enough interactions with humans to train a decent model of human motivation and so become good at manipulating.