Blinded by Insight

Insight is often dangerous to the intellect because we may be so captivated by what we have discovered that we take it too far or bundle it with other falsehoods. We can be so focused on the strength behind this nugget of wisdom that we fail to realise other claims sneaking their way in there.


  • Postmodernism makes a few important insights—that we should be suspicious of grand narratives, that we should be very skeptical of claims that one particular model has all the answers, that society often distorts what counts as “reasonable” or “logical” or “scientific”. However, postmodernists try to universalise this to the point where, if they took their own arguments seriously, they would have to believe that throwing darts against a board is just as reliable as the top researchers running randomised controlled trials with large sample sizes.

  • Pragmatists are blinded by the idea that knowledge is an instrumental, rather than a terminal, goal. This leads them to an incoherent definition of truth—that truth and usefulness are always the same thing. In many ways, it can be understood as a reaction against the idea that we should pursue knowledge for its own sake. We should always be especially suspicious of these reactive insights.

  • Someone will discover a new philosophy or intellectual movement, discover that it is far more persuasive or insightful on the topic than they currently are, and then adopt it wholeheartedly, to the point of becoming an ideologue.

  • My last post, The Basic Object Model and Definition by Interface, was so focused on explicating the idea that we often use the same word to cover ontologically different situations when both situations share a similar “interface”, that I managed to give a definition of existence that didn’t create a divide between existence and non-existence.

  • There have been times when I’ve been really proud of something that I’ve written, because I know that it is so much better than what I could have written before. It’s possible to be so proud of implementing a particular strategy that you forget to ask whether you could have done something even better.

My main motivation behind writing this article was answering the question, “How can intelligent people believe things that are obviously stupid?”. There are a few possibilities:

  • That they are actually stupid

  • That the belief isn’t actually obviously stupid, but is instead quite reasonable if you adopt particular assumptions

  • That they are biased and are not trying particularly hard to find the truth

But nonetheless, some of these beliefs seem clearly unsupportable by anyone who is both intelligent and sincerely seeking the truth, yet there seem to be some people who support these beliefs without being stupid or a raging ideologue. How can we square this? My answer is that they are probably Blinded by Insight.

One reason why I really like this term is that it allows me to comment on the epistemic status of a person or movement in a more charitable way than simply saying that they are stupid or ideological. I think that this framing is important, as it will make it more likely that you will treat them better. Of course, there is a danger in that it is very easy to label people as Blinded by Insight, but I don’t believe that you can save people who are determined to misuse it.

Mitigating Strategies

Here are some questions that you can ask yourself to reduce the chance of falling into this trap:

  • Even if my general thrust is correct, could I be going too far? Is there an intermediate position I haven’t considered?

  • What claims am I implicitly making, other than my main claim? Are they well supported?

  • Even if my current idea is great, could there be an even better idea?

  • Could there possibly be more to the story?

  • Suppose I were blinded by insight. What would this look like from the inside?


  • Resist the Happy Death Spiral—describes the extreme case where there is a feedback loop between the strength of your belief in a theory and how much it seems to explain. I was only made aware of this link after I had finished writing. If I had known about it before, I would have differentiated this concept more. The first three examples I provide in my article are happy death spirals, while the last two refer to more moderate, temporary blindness. So being Blinded by Insight is really a broader concept.
