Yes! That’s the right intuition. And the LLMs are doing the same—but we don’t know their world model, and thus, the direction of the simplification can be arbitrarily off.
Drilling down on the simplifications, as Villiam suggested, might help.