This might be more about miscalibration in the perceived relevance of technical exercises inspired by some question. An exercise that juggles the details but is mostly irrelevant to the question itself can still be useful, worth doing and even sharing, though mainly for improving model-building intuition and developing good framings in the long term, rather than for answering the question that inspired it, especially at a technical level.
So an obvious mistake would be to treat such an exercise as evidence that the person doing or sharing it considers it directly relevant to answering the question at a technical level. That same person can make this mistake about themselves, but even just expecting others to make it about them can echo back into their own behavior. Someone does the exercises for the right reasons, then implicitly expects others to think that they consider the exercises relevant, and from that implicitly concludes that the exercises actually are relevant, by this invalid echo argument.