The Genetic Fallacy

In lists of logical fallacies, you will find included “the genetic fallacy”—the fallacy of attacking a belief based on someone’s causes for believing it.

This is, at first sight, a very strange idea—if the causes of a belief do not determine its systematic reliability, what does? If Deep Blue advises us of a chess move, we trust it based on our understanding of the code that searches the game tree, being unable to evaluate the actual game tree ourselves. What could license any probability assignment as “rational”, except that it was produced by some systematically reliable process?

Articles on the genetic fallacy will tell you that genetic reasoning is not always a fallacy—that the origin of evidence can be relevant to its evaluation, as in the case of a trusted expert. But other times, say the articles, it is a fallacy; the chemist Kekulé first saw the ring structure of benzene in a dream, but this doesn’t mean we can never trust this belief.

So sometimes the genetic fallacy is a fallacy, and sometimes it’s not?

The genetic fallacy is formally a fallacy, because the original cause of a belief is not the same as its current justificational status, the sum of all the support and antisupport currently known.

Yet we change our minds less often than we think. Genetic accusations have a force among humans that they would not have among ideal Bayesians.
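
To make that contrast concrete, here is a minimal sketch of the update rule an ideal Bayesian reasoner would follow; the hypothesis H and the evidence items E_1, ..., E_n are symbols introduced here purely for illustration. The posterior odds on H depend only on the likelihood ratios of whatever evidence is currently known, and the genetic fact about where the idea came from enters as just one more item of evidence:

\[
\frac{P(H \mid E_1, \dots, E_n)}{P(\lnot H \mid E_1, \dots, E_n)}
= \frac{P(H)}{P(\lnot H)}
\prod_{i=1}^{n} \frac{P(E_i \mid H,\, E_1, \dots, E_{i-1})}{P(E_i \mid \lnot H,\, E_1, \dots, E_{i-1})}
\]

If E_1 is “this idea was first suggested by an untrustworthy source,” it contributes only its own likelihood ratio; it does not retroactively change the other factors. The trouble for humans is that the later E_i are often cached arguments that were never independent of E_1 in the first place, which is why genetic accusations carry a force for us that they would not for the ideal reasoner.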

Clearing your mind is a powerful heuristic when you’re faced with new suspicion that many of your ideas may have come from a flawed source.

Once an idea gets into our heads, it’s not always easy for evidence to root it out. Consider all the people out there who grew up believing in the Bible; later came to reject (on a deliberate level) the idea that the Bible was written by the hand of God; and who nonetheless think that the Bible contains indispensable ethical wisdom. They have failed to clear their minds; they could do significantly better by doubting anything the Bible said because the Bible said it.

At the same time, they would have to bear firmly in mind the principle that reversed stupidity is not intelligence; the goal is to genuinely shake your mind loose and do independent thinking, not to negate the Bible and let that be your algorithm.

Once an idea gets into your head, you tend to find support for it everywhere you look—and so when the original source is suddenly cast into suspicion, you would be very wise indeed to suspect all the leaves that originally grew on that branch...

If you can! It’s not easy to clear your mind. It takes a convulsive effort to actually reconsider, instead of letting your mind fall into the pattern of rehearsing cached arguments. “It ain’t a true crisis of faith unless things could just as easily go either way,” said Thor Shenkel.

You should be extremely suspicious if you have many ideas suggested by a source that you now know to be untrustworthy, but by golly, it seems that all the ideas still ended up being right—the Bible being the obvious archetypal example.

On the other hand… there’s such a thing as sufficiently clear-cut evidence that it no longer significantly matters where the idea originally came from. Accumulating that kind of clear-cut evidence is what Science is all about. It doesn’t matter any more that Kekulé first saw the ring structure of benzene in a dream—it wouldn’t matter if we’d found the hypothesis to test by generating random computer images, or from a spiritualist revealed as a fraud, or even from the Bible. The ring structure of benzene is pinned down by enough experimental evidence to make the source of the suggestion irrelevant.
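
As a toy numerical illustration of how clear-cut evidence screens off the source of a suggestion, here is a short Python sketch. Every number in it (the priors and the 20:1 likelihood ratios) is a made-up assumption chosen only to show the shape of the effect: once several independent experiments each strongly favor the hypothesis, the posterior is nearly the same whether the idea was first suggested by a dream or by a trusted expert.

    # Toy Bayesian update with invented numbers: how strong experimental
    # evidence swamps the contribution of the hypothesis's original source.

    def posterior_odds(prior_odds, likelihood_ratios):
        """Multiply prior odds by the likelihood ratio of each piece of evidence."""
        odds = prior_odds
        for lr in likelihood_ratios:
            odds *= lr
        return odds

    def to_probability(odds):
        return odds / (1.0 + odds)

    # Hypothesis: "benzene has a ring structure."
    # Three independent experiments, each favoring the hypothesis 20 to 1.
    experiments = [20.0, 20.0, 20.0]

    # Case 1: the idea came from a dream, so the prior odds start low.
    dream_odds = 0.05 / 0.95
    print(to_probability(posterior_odds(dream_odds, experiments)))   # ~0.998

    # Case 2: the same idea came from a respected chemist, with even prior odds.
    expert_odds = 0.5 / 0.5
    print(to_probability(posterior_odds(expert_odds, experiments)))  # ~0.9999

The exact figures are irrelevant; the point is that the multiplied likelihood ratios from the experiments dominate whatever odds the origin of the suggestion contributed, which is the sense in which the source becomes irrelevant.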

In the absence of such clear-cut evidence, you do need to pay attention to the original sources of ideas—to give experts more credence than layfolk, if their field has earned respect—to suspect ideas you originally got from suspicious sources—to distrust those whose motives are untrustworthy, if they cannot present arguments independent of their own authority.

The genetic fallacy is a fallacy when there exist justifications beyond the genetic fact asserted, but the genetic accusation is presented as if it settled the issue.

Some good rules of thumb (for humans):

  • Be suspicious of genetic accusations against beliefs that you dislike, especially if the proponent claims justifications beyond the simple authority of a speaker. “Flight is a religious idea, so the Wright Brothers must be liars” is one of the classically given examples.

  • By the same token, don’t think you can get good information about a technical issue just by sagely psychoanalyzing the personalities involved and their flawed motives. If technical arguments exist, they get priority.

  • When new suspicion is cast on one of your fundamental sources, you really should doubt all the branches and leaves that grew from that root. You are not licensed to reject them outright as conclusions, because reversed stupidity is not intelligence, but...

  • Be extremely suspicious if you find that you still believe the early suggestions of a source you later rejected.

Added: Hal Finney suggests that we should call it “the genetic heuristic”.