If the rules of the world preferentially destroy cultures that develop beyond the “standard fantasy technology level” (whatever that is) then I expect that over time, cultures will very strongly disfavour development beyond that level. I’m pretty sure that this will be a stable equilibrium.
If the rules are sufficiently object-level (as in a computer game), then technological progress based on exploiting finer-grained underlying rules becomes impossible. You can’t work out how to crossbreed better crops if crops never crossbreed in the first place, and likewise for other things.
If intelligence itself past some point is a serious survival risk, then it will be selected against. You may get an equilibrium where the knowledge discovered in each generation is (on long-term average) equal to the knowledge lost.
… and so on.
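The first and third mechanisms can be sketched as a toy simulation. This is a minimal model under assumed parameters (the threshold, destruction probability, and growth rate are all invented for illustration): cultures accumulate knowledge each generation, but any culture past the threshold risks being destroyed and its knowledge lost. The long-run average settles below the threshold, which is the stable equilibrium described above.

```python
import random

random.seed(0)

THRESHOLD = 10       # the "standard fantasy technology level" (assumed value)
DESTRUCTION_P = 0.9  # chance per generation that a culture above it is destroyed
GROWTH = 1           # knowledge discovered per generation
N_CULTURES = 200
GENERATIONS = 500

# Each culture is represented only by its current technology level.
cultures = [0] * N_CULTURES

for _ in range(GENERATIONS):
    next_gen = []
    for tech in cultures:
        tech += GROWTH  # knowledge discovered this generation
        if tech > THRESHOLD and random.random() < DESTRUCTION_P:
            tech = 0    # culture destroyed; its knowledge is lost
        next_gen.append(tech)
    cultures = next_gen

avg = sum(cultures) / len(cultures)
# The population's average technology level stabilises below THRESHOLD:
# on long-term average, knowledge discovered equals knowledge lost.
```

With these numbers each culture climbs safely to the threshold, then faces near-certain destruction, so the population cycles and its mean technology level hovers well under the cap regardless of how long the simulation runs.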