> It might be developed in a server cluster somewhere, but as soon as you plug a superhuman machine into the internet it will be everywhere moments later.
It’s not like it’s that hard to hack into servers and run your own computations on them through the internet. Assuming the superintelligence knows enough about the internet to design something like this beforehand (likely since it runs in a server cluster), it seems like the limiting factor here would be bandwidth.
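To get a rough sense of the bandwidth constraint, here is a toy back-of-the-envelope estimate in Python. Both the model size and the link speed are purely illustrative assumptions (neither figure comes from the discussion):

```python
# Back-of-the-envelope estimate of the bandwidth bottleneck: how long
# would copying a large set of model weights to a remote machine take?
# The sizes and speeds below are illustrative assumptions, not claims.

def transfer_time_seconds(size_bytes: float, bandwidth_bits_per_s: float) -> float:
    """Seconds needed to move size_bytes over a link of the given speed."""
    return size_bytes * 8 / bandwidth_bits_per_s

weights = 1e12   # assume ~1 TB of weights
uplink = 1e10    # assume a 10 Gbit/s uplink
print(transfer_time_seconds(weights, uplink))  # 800.0 seconds (~13 minutes)
```

Under these assumptions a full copy takes minutes rather than moments, though a system that only needs to ship a small bootstrap payload first would face a much smaller transfer.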
I imagine a highly intelligent human trapped in this sort of situation, with similar prior knowledge and resource access, could build a botnet in a few months. Working on it at full capacity, non-stop, could bring this down to a few weeks, and it seems plausible to me that with increased intelligence and processing speed, it could build one in a few moments. And of course with access to its own source code, it would be trivial to have it run more copies of itself on the botnet.
Even if you disagree with this line of reasoning, I don’t think it’s fair to paint it as “very extreme”.
> It might be developed in a server cluster somewhere, but as soon as you plug a superhuman machine into the internet it will be everywhere moments later.

> Even if you disagree with this line of reasoning, I don’t think it’s fair to paint it as “very extreme”.
With “very extreme” I was referring to the part where he claims that this will happen “moments later”.
> It’s not like it’s that hard to hack into servers and run your own computations on them through the internet. Assuming the superintelligence knows enough about the internet to design something like this beforehand (likely since it runs in a server cluster), it seems like the limiting factor here would be bandwidth.

> I imagine a highly intelligent human trapped in this sort of situation, with similar prior knowledge and resource access, could build a botnet in a few months. Working on it at full capacity, non-stop, could bring this down to a few weeks, and it seems plausible to me that with increased intelligence and processing speed, it could build one in a few moments. And of course with access to its own source code, it would be trivial to have it run more copies of itself on the botnet.

> Even if you disagree with this line of reasoning, I don’t think it’s fair to paint it as “very extreme”.

> With “very extreme” I was referring to the part where he claims that this will happen “moments later”.
Yes, that was clear. My point is that it isn’t extreme under the mild assumption that the AI has prepared for such an event beforehand.