thousands of other people who claim to know (and may even believe it themselves) but are wrong
Seems to me the greatest risk of this strategy is becoming one of them.
The risk can be mitigated by studying the textbooks of settled science first, and only trying to push the boundary of human knowledge later. But then, time becomes another bottleneck.
How exactly do people end up knowing little?
I’d venture it usually comes from starting with poor mental models and then confronting a huge universe of learnable information with those models. That amplifies confirmation bias and leads to consistently learning the wrong lessons. So there’s real value in refining your mental models before you even try to learn the settled knowledge, though of course the knowledge itself is the basis of most people’s models.
Perhaps there’s a happy medium: build out a set of core models before you start work in any new field, and look to people you respect in that field for pointers on which models are essential.