One way to think about it: in most of the system, the predicted quantity is not directly "sensory inputs" but the content of some layer of the modeling hierarchy further away from sensory inputs; let's call it L. If the layers above L make contradictory predictions and there isn't a way to just drop one of the models, you get prediction error.
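A minimal numerical sketch of this picture, under assumptions of mine that go beyond the comment: suppose L's content is a single scalar, and two upper-layer models each predict it via a precision-weighted squared-error term (that formulation is my choice for illustration). When the upper models agree, L can settle on a value with zero error; when they contradict and neither can be dropped, residual error remains no matter what L settles on.

```python
# Toy illustration (my assumption, not from the comment): layer L holds a
# scalar "content"; each upper-layer model predicts that content, and error
# is a precision-weighted squared difference.
def prediction_error(l_content, upper_predictions, precisions):
    return sum(p * (pred - l_content) ** 2
               for pred, p in zip(upper_predictions, precisions))

# Agreeing upper models: L can match both, error goes to zero.
print(prediction_error(1.0, [1.0, 1.0], [1.0, 1.0]))  # 0.0

# Contradictory upper models: even the least-squares compromise for L
# (the precision-weighted mean, here 1.0) leaves irreducible error.
best_l = (0.0 + 2.0) / 2
print(prediction_error(best_l, [0.0, 2.0], [1.0, 1.0]))  # 2.0
```

The point of the toy: the error is not at the sensory boundary at all; it lives at L, between competing higher-level models.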
You can check the linked PP account of cognitive dissonance for a fairly mainstream / standard view.