Is there a name for taking someone's being wrong on A as evidence that they are wrong on B? Is this a generally sound heuristic to have? In the case of crank magnetism, should I take someone's crank ideas as evidence against an idea that is new and unfamiliar to me?
It's evidence against their being a person whose opinion is strong evidence for B, which means it is evidence against B, but it's probably weak evidence, unless their endorsement of B is the main thing giving it high probability in your book.
I don’t know if there’s a name for this, but I definitely do it. I think it’s perfectly legitimate in certain circumstances. For example, the more B is a subject of general dispute within the relevant grouping, and the more closely-linked belief in B is to belief in A, the more sound the heuristic. But it’s not a short-cut to truth.
For example, suppose that you don’t know anything about healing crystals, but are aware that their effectiveness is disputed. You might notice that many of the same people who (dis)believe in homeopathy also (dis)believe in healing crystals, that the beliefs are reasonably well-linked in terms of structure, and you might already know that homeopathy is bunk. Therefore it’s legitimate to conclude that healing crystals are probably not a sound medical treatment—although you might revise this belief if you got more evidence. On the other hand, note that reversed stupidity is not truth—healing crystals being bunk doesn’t indicate that conventional medicine works well.
The place where I find this heuristic most useful is politics, because the sides are well-defined—effectively, you have a binary choice between A and ~A, regardless of whether hypothetical alternative B would be better. If I stopped paying attention to current affairs, and just took the opposite position to Bob Crow on every matter of domestic political dispute, I don’t think I’d go far wrong.
I don’t know if there is a name for it, but there ought to be one, since this heuristic is so common: the reliability prior of an argument is the reliability of the arguer. For example, one reason I am not a firm believer in the UFAI doomsday scenarios is Eliezer’s love affair with MWI.
Bayes' theorem to the rescue! Consider a crank C who endorses idea A. Then the probability that A is true, given that C endorses it, equals the probability that C endorses A given that A is true, times the prior probability that A is true, divided by the probability that C endorses A.
In symbols: P(A is true | C endorses A) = P(C endorses A | A is true) * P(A is true) / P(C endorses A).
Since C is known to be a crank, our probability for C endorsing A given that A is true is rather low (cranks have an aversion to truth), while our probability for C endorsing A in general is rather high (that is, compared to a saner person). So you are justified in being more skeptical of A, given that C endorses A.
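To make the update concrete, here is a minimal sketch of the calculation. The numbers are purely hypothetical: I assume a prior of 0.5 on A, that the crank endorses true ideas only 30% of the time, and that he endorses ideas at an overall base rate of 60% (consistent with endorsing false ideas 90% of the time).

```python
def posterior(p_a, p_endorse_given_true, p_endorse):
    """Bayes' theorem: P(A | C endorses A)
    = P(C endorses A | A) * P(A) / P(C endorses A)."""
    return p_endorse_given_true * p_a / p_endorse

# Hypothetical numbers: prior P(A) = 0.5, P(endorse | A) = 0.3,
# overall endorsement rate P(endorse) = 0.6.
# Sanity check via total probability:
# 0.3 * 0.5 + 0.9 * 0.5 = 0.6, so the numbers are consistent.
p = posterior(0.5, 0.3, 0.6)
print(p)  # 0.25
```

With these (made-up) numbers, the crank's endorsement drags our credence in A down from 0.5 to 0.25, which is the skepticism the argument above justifies.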
It’s a logical fallacy, but is something humans evolved to do (or didn’t evolve not to do), so may in fact be useful when dealing with humans you know in your group.
Yes, but in many cases it’s very weak evidence. Overweighing it leads to the “reversed stupidity” failure mode.
Somewhat related: The Correct Contrarian Cluster.
Horrifically misnamed.
ad hominem
Not that there’s anything wrong with that.