If I had never seen a glider before, I would think there was a nonzero chance that it could travel a long distance without self-propulsion. So if someone runs the experiment of “see if you can travel a long distance with a fixed-wing glider and no other innovations”, I could either observe that it works, or observe that it doesn’t.
If you can travel a long distance without propulsion, that obviously updates me very far in the direction of “fixed-wing flight works”.
So by conservation of expected evidence, observing that a glider with no propulsion doesn’t make it very far has to update me at least slightly in the direction of “fixed-wing flight does not work”; otherwise I would expect to update in the direction of “fixed-wing flight works” no matter what I observed.
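To make the conservation-of-expected-evidence point concrete, here is a minimal numeric sketch. The probabilities are made up purely for illustration; the point is that the prior-weighted average of the two possible posteriors equals the prior, so if a successful glide would push you up, a failed glide must push you down:

```python
# Illustrative numbers only: H = "fixed-wing flight works",
# E = "the glider travels a long distance".
p_h = 0.5               # prior P(H)
p_e_given_h = 0.9       # P(E | H): assumed likelihood of a long glide if flight works
p_e_given_not_h = 0.2   # P(E | not H): assumed likelihood of a long glide anyway

# Total probability of observing a long glide.
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior after each possible observation (Bayes' rule).
post_if_e = p_e_given_h * p_h / p_e                  # glider succeeds: update up
post_if_not_e = (1 - p_e_given_h) * p_h / (1 - p_e)  # glider fails: update down

# Conservation of expected evidence: the expected posterior is the prior.
expected_post = p_e * post_if_e + (1 - p_e) * post_if_not_e
print(post_if_e, post_if_not_e, expected_post)
```

With these numbers the success posterior is above the 0.5 prior and the failure posterior is below it, and their probability-weighted average comes back to exactly 0.5.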
Note that OP said “does not update me at all”, not “does not update me very much”, and the phrase “update me” implies the strong, Bayesian-evidence sense of the words. This is not a nit I would have picked if OP had said “I don’t find the failures of AutoGPT and friends to self-improve to be at all convincing that RSI is impossible”.