That is, if you are in a position where you have the option to build an AI capable of destroying all competing AI projects, then the moment you notice this you should update heavily in favor of short timelines (zero in your case, but everyone else should be close behind) and fast takeoff speeds (since your AI has these impressive capabilities). You should also update on existing AI regulation being insufficient (since it was insufficient to prevent you).
A functioning Bayesian should probably have updated to that position long before they actually had the AI.
Destroying all competing AI projects might just mean that the AI took a month to find a few bugs in Linux and TensorFlow and create something that's basically the next Stuxnet. That doesn't sound like a particularly fast takeoff to me.
The regulation is basically non-existent and will likely continue to be so.
I mean, making superintelligent AI probably breaks a bunch of laws, technically, under a pedantic and literal-minded reading. But breathing probably technically breaks a bunch of laws too. Some laws are just overbroad, technically ban everything, and are generally ignored.
Any enforced rule that makes it pragmatically hard to build AGI would basically have to be a ban on computers (or at least on programming).