Where does experimental data fall in your translation matrix? Some data is well within what you describe as hyperlocal (e.g. I hear raindrops outside; it must be raining), but other data has a much longer journey. For example, if you see an image of Mars from JPL labeled "Taken by Sojourner on date… time… place… with such-and-such camera facing a heading of … degrees north, with colors calibrated against the color chips with spectral frequencies x, y, and z mounted below the camera", how would that picture do after going through your translation matrix?
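(To make the calibration step concrete: the idea is that chips of known color are imaged alongside the scene, and a correction is solved for that maps the measured chip colors back to their known values. A minimal sketch, with entirely made-up numbers and a least-squares fit standing in for whatever pipeline JPL actually uses:)

```python
import numpy as np

# Hypothetical illustration of reference-chip color correction.
# All values are invented for the sketch; this is not JPL's pipeline.

# Known "true" linear RGB of three calibration chips, measured in a lab on Earth.
true_chips = np.array([
    [0.90, 0.10, 0.10],   # red chip
    [0.10, 0.85, 0.15],   # green chip
    [0.12, 0.12, 0.88],   # blue chip
])

# The same chips as seen by the camera on Mars, shifted by dust,
# lighting, and the camera's spectral response.
measured_chips = np.array([
    [0.70, 0.15, 0.12],
    [0.15, 0.65, 0.18],
    [0.14, 0.13, 0.60],
])

# Solve for a 3x3 correction matrix M such that measured @ M ≈ true,
# in the least-squares sense.
M, *_ = np.linalg.lstsq(measured_chips, true_chips, rcond=None)

# Applying M to any raw pixel yields a calibrated color estimate.
raw_pixel = np.array([0.40, 0.20, 0.15])
calibrated_pixel = raw_pixel @ M
```

The point of the Mars example is that this whole step happens far downstream of the photons hitting the sensor: a bug in M (or in the assumptions behind it) silently recolors the planet, and nobody on the ground can eyeball the dirt to catch it.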
I’m specifically picking on Mars because (if I recall correctly) people did think it was reddish after looking at imagery for some non-negligible amount of time, until realizing that they had messed up the photo processing.
And this is not easy for anyone on Earth to verify either. There are literally no humans on Mars who can say “Stop. You are wrong. The dirt is dirt colored, not red.” It would be entirely fair to say that a mistake like that cannot be corrected at all without spending billions of dollars on another Mars mission.
I hope this example is close enough to the physical world; I am not familiar with what you mean by simulacra levels. I hope you can see how these concerns have analogs much closer to home, without my explicitly name-dropping any terrestrial historical events.
I interpret the Mars data literally: I assume they are relaying the pictures as they took them. Correcting for mechanical errors of this sort is needed, but it is a different kind of epistemic problem.
Essentially I do not expect enemy action here.
In other experiments it can be less clear-cut, but that seems easy enough to handle.