A paraphrase from Greg Egan’s “Crystal Nights” might be appropriate here: “I am going to need some workers—I can’t do it all alone, someone has to carry the load.”
Yes, if you could create a universe you could inflict our problems on other people. However, recursive solutions (in order to be solutions rather than infinite loops) still need to make progress on the problem.
Yes, and I discussed how you could alter some aspects of reality to make AI itself more difficult in the simulated universe. This would effectively push back the date at which the simulated universe develops AI of its own, avoiding wasting computational resources on pointless recursive simulation.
And as mentioned, attempting to simulate an entire alternate Earth is only one possibility. There are numerous created-world routes from science fiction you could take, which could constrain and focus the sims on particular research topics or endeavors.
Progress on what problem?
The entire point of creating AI is to benefit mankind, is it not? How is this scenario intrinsically different?
Johnicolas is suggesting that if you create a simulated universe in the hope that it will provide ill-defined benefits for mankind (e.g. a cure for cancer), you have to exclude the possibility that your AIs will make a simulated universe inside the simulation in order to solve the same problem. If they do, you're no closer to an answer.
Ah my bad—I misread him.