[Question] Reverse-engineering the simulation

Let’s say you decide that the best way to achieve some goal is to create a simulation. You’ll almost certainly have to balance:

  • The accuracy of the simulation, i.e. how well it describes some “real” conditions you intend to model.

  • The cost of preparing and running the simulation, e.g. the necessary computing resources.

Note that this holds regardless of the purpose and complexity of the simulation, be it weather modelling or a Civ-like computer game; the toy example below illustrates the trade-off.
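To make the trade-off concrete, here is a minimal sketch in Python (the ODE, the step sizes, and the cost proxy are all made up for illustration): a finer integration step buys accuracy at the price of more compute.

```python
import math

def simulate(dt, t_end=5.0):
    """Euler integration of dx/dt = -x with x(0) = 1.
    Returns the value at t_end and the step count (our cost proxy)."""
    steps = round(t_end / dt)
    x = 1.0
    for _ in range(steps):
        x += dt * (-x)          # one cheap, inaccurate Euler step
    return x, steps

exact = math.exp(-5.0)          # the "true" answer being approximated
for dt in (0.5, 0.1, 0.01, 0.001):
    x, steps = simulate(dt)
    print(f"dt={dt:<6} cost={steps:>5} steps  error={abs(x - exact):.1e}")
```

For Euler integration the error shrinks roughly linearly with the step size, so halving the error doubles the cost. A simulation designer has to pick a point on that curve, and the assumption below says our creators did too.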

Consider the following assumption:

We live in a simulation created by entities who faced the dilemma described above and decided that perfect accuracy is not necessary.

This gives us a sort-of answer to the question “what is behind the laws of physics?”: they are approximations of some other laws that would be harder to compute. Therefore, maybe we could try to devise:

  • the “true” laws that govern the thing our world is modelling

  • the “accuracy metric” for the simulation

and observe how, together, they lead to the laws of physics we see? (A toy sketch of this idea follows.)
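Here is what that program could look like in Python. Everything in it (the “true” law, the family of cheaper approximations, the accuracy metric, the error budget) is a made-up placeholder, not a claim about actual physics: a cost-minimizing simulator settles on the cheapest law that stays within its accuracy budget, and that choice is what the inhabitants would observe as “the laws of physics”.

```python
import math

def true_law(x):
    """Stand-in for the hypothesized expensive 'true' law (pure placeholder)."""
    return math.sin(x)

def approximate_law(order):
    """A family of cheaper candidate laws: Taylor truncations of sin(x),
    where higher order means more accuracy but also more compute."""
    def law(x):
        result, term = 0.0, x
        for n in range(order):
            result += term
            term *= -x * x / ((2 * n + 2) * (2 * n + 3))
        return result
    return law

def accuracy(law, samples):
    """One possible accuracy metric: worst-case deviation from the true law."""
    return max(abs(law(x) - true_law(x)) for x in samples)

samples = [i / 10 for i in range(-20, 21)]  # region the simulation must cover
budget = 1e-3                               # made-up error tolerance
for order in range(1, 10):                  # cost proxy: lower order = cheaper
    err = accuracy(approximate_law(order), samples)
    if err <= budget:
        print(f"the simulators would ship: order={order}, error={err:.1e}")
        break
```

Reverse engineering, in this framing, means running that selection backwards: starting from the laws we observe, infer which “true” law and which accuracy metric would have picked them.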

I know how speculative this is. Nevertheless, I find it really interesting. So, the question: is there any literature that approaches the simulation hypothesis from this direction? Anything I could read?

EDIT: The meaning of “simulation” in this context differs, I think, from the usual one. In this view, the world we live in is the “hardware”, built with the purpose of simulating something even more complex. In other words, the question is not “how do they simulate our quantum mechanics?” but rather “why did they implement quantum mechanics the way they did?”
