ADDED: A number of the comments so far imply that the first AI built will necessarily FOOM immediately. FOOM is an appealing argument; I've argued in favor of it myself. But it is not a probability-one theorem. I don't care who you are; you do not know enough about AI and its future development to bet the future of the universe on your intuition that non-FOOMing AI is impossible. You may even think FOOM is the default case; that does not make it the only case to consider.
Supposing that the first AI built doesn't FOOM, I still see no reason to suppose that adding colonies increases the overall danger. At most, it increases the population, so you have less time, but the same number of person-hours, before some strong AI is created.