...What’s a bias, again?

The availability heuristic is a cognitive shortcut humans use to reach conclusions; and where this shortcut reliably causes inaccurate conclusions, we can say that an availability bias is at work. Scope insensitivity is another example of a cognitive bias.

“Cognitive biases” are those obstacles to truth which are produced, not by the cost of information, nor by limited computing power, but by the shape of our own mental machinery. For example, our mental processes might be evolutionarily adapted to specifically believe some things that aren’t true, so that we could win political arguments in a tribal context. Or the mental machinery might be adapted not to particularly care whether something is true, such as when we feel the urge to believe what others believe to get along socially. Or the bias may be a side-effect of a useful reasoning heuristic. The availability heuristic is not itself a bias, but it gives rise to them; the machinery uses an algorithm (give things more evidential weight if they come to mind more readily) that does some good cognitive work but also produces systematic errors.

Our brains are doing something wrong, and after a lot of experimentation and/or heavy thinking, someone identifies the problem verbally and concretely; then we call it a “(cognitive) bias.” Not to be confused with the colloquial “that person is biased,” which just means “that person has a skewed or prejudiced attitude toward something.”

In cognitive science, “biases” are distinguished from errors that arise from cognitive content, such as learned false beliefs. These we call “mistakes” rather than “biases,” and they are much easier to correct, once we’ve noticed them for ourselves. (Though the source of the mistake, or the source of the source of the mistake, may ultimately be some bias.)

“Biases” are also distinguished from errors stemming from damage to an individual human brain, or from absorbed cultural mores; biases arise from machinery that is humanly universal.

Plato wasn’t “biased” because he was ignorant of General Relativity—he had no way to gather that information; his ignorance did not arise from the shape of his mental machinery. But if Plato believed that philosophers would make better kings because he himself was a philosopher—and this belief, in turn, arose because of a universal adaptive political instinct for self-promotion, and not because Plato’s daddy told him that everyone has a moral duty to promote their own profession to governorship, or because Plato sniffed too much glue as a kid—then that was a bias, whether Plato was ever warned of it or not.

While I am not averse (as you can see) to discussing definitions, I don’t want to suggest that the project of better wielding our own minds rests on a particular choice of terminology. If the term “cognitive bias” turns out to be unhelpful, we should just drop it.

We don’t start out with a moral duty to “reduce bias,” simply because biases are bad and evil and Just Not Done. This is the sort of thinking someone might end up with if they acquired a deontological duty of “rationality” by social osmosis, which leads to people trying to execute techniques without appreciating the reason for them. (Which is bad and evil and Just Not Done, according to Surely You’re Joking, Mr. Feynman, which I read as a kid.) A bias is an obstacle to our goal of obtaining truth, and thus in our way.

We are here to pursue the great human quest for truth: for we have desperate need of the knowledge, and besides, we’re curious. To this end let us strive to overcome whatever obstacles lie in our way, whether we call them “biases” or not.