[Question] What if AGI had its own universe to maybe wreck?

Is there any good writing on the idea of using all this compute to first build a hugely detailed, one-way-glass model of our universe in which to contain an AGI?

From a naive standpoint, it seems like we could "incept" the intellectual work we want automated by manipulating the simulation around the agent.