I’ll talk more about this in follow-up posts, but I don’t think the main danger is that the models will be voluntarily released. Instead, as algorithms become more efficient, it will simply get cheaper and cheaper to train models with weapons capabilities, which will eventually democratize those weapons.
Analogously, consider cryptography: it was once a government-controlled technology because of its strategic implications, but it became widespread as the computing power required to run cryptographic algorithms became extremely cheap. In my view, something very similar is likely to happen with AI systems, although given the strategic implications there will also be a more active effort to steal the weights rather than just replicate them.