Yudkowsky wanted to boil it down to the culmination point that a single collaborator is sufficient. For the sake of the argument this is understandable. From the AI's viewpoint it is not rational.
Our supply chains are built on division of labor. A chip fab would not ask what a chip design is good for as long as they know how to test it. A PCB manufacturer needs only test software and specifications. A company specializing in burn-in testing will assemble any arrangement and even connect it to the internet. If an AI arranges generous payments in advance, no one in the supply chain will ask questions.
If the AI has skills in engineering, strategic planning, and social manipulation, an internet connection is sufficient to break out and kickstart any supply chain.
The suggested DNA nano-maker is unnecessarily far-fetched and too complex to be solved in such a simplified five-step approach.
The recent advances of deep-learning projects, combined with easy access to powerful tools like Torch or TensorFlow, might trigger a different path: start-ups will chase low-hanging fruit. Whoever is fastest gets the whole cake; whoever is second has lost. The results of this mindset were on display at CES: IoT systems full of security holes were pushed into the market. Luckily, AI hardware/software is not yet capable of creating an existential risk. Imagine you are a researcher on a team whose project turns out to make your bosses billionaires… what are your chances of being heard when you come up with your risk assessment: "Boss, we need six months extra to design safeguards..."?