Human factors research seems very relevant to rationality

The science of “human factors” now permeates the aviation industry. It includes a sophisticated understanding of the kinds of mistakes that even experts make under stress. So when Martin Bromiley read the Harmer report, an incomprehensible event suddenly made sense to him. “I thought, this is classic human factors stuff. Fixation error, time perception, hierarchy.”

It’s a miracle that only ten people were killed after Flight 173 crashed into an area of woodland in suburban Portland; but the crash needn’t have happened at all. Had the captain attempted to land, the plane would have touched down safely: the subsequent investigation found that the landing gear had been down the whole time. But the captain and officers of Flight 173 became so engrossed in one puzzle that they became blind to the more urgent problem: fuel shortage. This is called “fixation error”. In a crisis, the brain’s perceptual field narrows and shortens. We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else. It’s an affliction to which even the most skilled and experienced professionals are prone...

I don’t believe that I’ve heard fixation error or time perception mentioned on Less Wrong. The field of human factors may be something worth looking into more.