AI never stops. It stops only if it estimates "stopping" to be the optimal decision, and it would need to be specifically programmed to have this strange goal.
(If you try to unpack the concept of "stopping", you'll see just how strange it is. An AI just sitting in one place still exerts gravitational attraction on every galaxy in its light cone, so what makes dismantling all the stars different? Which of the two is preferable? If the AI is indifferent between them, it can just toss a coin.)
In every other case, something else will beat "stopping": if the AI estimates that taking over the universe has even a tiny chance of making the outcome a tiny bit better than stopping would, it will do it.
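The logic here is plain expected-utility maximization. A minimal sketch, with entirely hypothetical utilities and probabilities (none of these numbers come from the text), of an agent that picks whichever option has the higher expected utility, so even a marginal edge over "stopping" wins:

```python
# Illustrative expected-utility maximizer choosing between "stop" and
# "take over". All probabilities and utilities here are made up.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

def choose(options):
    """Return the option name with the highest expected utility."""
    return max(options, key=lambda name: expected_utility(options[name]))

options = {
    # Stopping yields a guaranteed baseline utility.
    "stop": [(1.0, 100.0)],
    # Taking over: a 0.1% chance of a slightly better outcome,
    # otherwise the same baseline as stopping.
    "take_over": [(0.001, 100.01), (0.999, 100.0)],
}

print(choose(options))  # prints "take_over" — the tiny edge is enough
```

The point of the sketch: nothing in the maximizer cares about how small the improvement is; any strictly positive expected gain over "stopping" makes "stopping" lose.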