Reversed Stupidity Is Not Intelligence

“. . . then our people on that time-line went to work with corrective action. Here.”

He wiped the screen and then began punching combinations. Page after page appeared, bearing accounts of people who had claimed to have seen the mysterious disks, and each report was more fantastic than the last.

“The standard smother-out technique,” Verkan Vall grinned. “I only heard a little talk about the ‘flying saucers,’ and all of that was in joke. In that order of culture, you can always discredit one true story by setting up ten others, palpably false, parallel to it.”

—H. Beam Piper, Police Operation

Piper had a point. Pers’nally, I don’t believe there are any poorly hidden aliens infesting these parts. But my disbelief has nothing to do with the awful embarrassing irrationality of flying saucer cults—at least, I hope not.

You and I believe that flying saucer cults arose in the total absence of any flying saucers. Cults can arise around almost any idea, thanks to human silliness. This silliness operates orthogonally to alien intervention: We would expect to see flying saucer cults whether or not there were flying saucers. Even if there were poorly hidden aliens, it would not be any less likely for flying saucer cults to arise. The conditional probability P(cults|aliens) isn’t less than P(cults|¬aliens), unless you suppose that poorly hidden aliens would deliberately suppress flying saucer cults.1 By the Bayesian definition of evidence, the observation “flying saucer cults exist” is not evidence against the existence of flying saucers. It’s not much evidence one way or the other.
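
A minimal numerical sketch of that point, with purely illustrative numbers (the `posterior` helper and the specific probabilities below are assumptions introduced for this example, not figures from the text): when an observation is about equally likely with or without the hypothesis, Bayes’ theorem hands back the prior essentially unchanged.

```python
def posterior(prior, p_obs_given_h, p_obs_given_not_h):
    """Update P(H) after seeing the observation, via Bayes' theorem."""
    numerator = p_obs_given_h * prior
    return numerator / (numerator + p_obs_given_not_h * (1 - prior))

prior_aliens = 0.01          # hypothetical prior that aliens are here
p_cults_if_aliens = 0.9      # cults arise from human silliness either way,
p_cults_if_no_aliens = 0.9   # so the two likelihoods are (roughly) equal

print(round(posterior(prior_aliens, p_cults_if_aliens, p_cults_if_no_aliens), 4))
# -> 0.01: observing "flying saucer cults exist" leaves the prior where it was,
# so it is not evidence about aliens one way or the other.
```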

This is an application of the general principle that, as Robert Pirsig puts it, “The world’s greatest fool may say the Sun is shining, but that doesn’t make it dark out.”2

If you knew someone who was wrong 99.99% of the time on yes-or-no questions, you could obtain 99.99% accuracy just by reversing their answers. They would need to do all the work of obtaining good evidence entangled with reality, and processing that evidence coherently, just to anticorrelate that reliably. They would have to be superintelligent to be that stupid.
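
As a toy illustration of this paragraph (the simulation setup below is an assumption made for the example, not anything from the text): reversing an answerer who is wrong 99.99% of the time recovers the truth, while reversing a merely random answerer gains nothing.

```python
import random

random.seed(0)
truths = [random.choice([True, False]) for _ in range(100_000)]

# To be wrong 99.99% of the time, this oracle must track the truth and invert it.
anti_oracle = [(not t) if random.random() < 0.9999 else t for t in truths]

# A merely "stupid" oracle answers at random and carries no information at all.
random_oracle = [random.choice([True, False]) for _ in truths]

def accuracy(answers):
    return sum(a == t for a, t in zip(answers, truths)) / len(truths)

print(accuracy([not a for a in anti_oracle]))    # ~0.9999 once reversed
print(accuracy([not a for a in random_oracle]))  # ~0.5, reversed or not
```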

A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken.

If stupidity does not reliably anticorrelate with truth, how much less should human evil anticorrelate with truth? The converse of the halo effect is the horns effect: All perceived negative qualities correlate. If Stalin is evil, then everything he says should be false. You wouldn’t want to agree with Stalin, would you?

Stalin also believed that 2 + 2 = 4. Yet if you defend any statement made by Stalin, even “2 + 2 = 4,” people will see only that you are “agreeing with Stalin”; you must be on his side.

Corollaries of this principle:

  • To argue against an idea honestly, you should argue against the best arguments of the strongest advocates. Arguing against weaker advocates proves nothing, because even the strongest idea will attract weak advocates. If you want to argue against transhumanism or the intelligence explosion, you have to directly challenge the arguments of Nick Bostrom or Eliezer Yudkowsky post-2003. The least convenient path is the only valid one.3

  • Exhibiting sad, pathetic lunatics, driven to madness by their apprehension of an Idea, is no evidence against that Idea. Many New Agers have been made crazier by their personal apprehension of quantum mechanics.

  • Someone once said, “Not all conservatives are stupid, but most stupid people are conservatives.” If you cannot place yourself in a state of mind where this statement, true or false, seems completely irrelevant as a critique of conservatism, you are not ready to think rationally about politics.

  • Ad hominem argument is not valid.

  • You need to be able to argue against genocide without saying “Hitler wanted to exterminate the Jews.” If Hitler hadn’t advocated genocide, would it thereby become okay?

  • In Hansonian terms: Your instinctive willingness to believe something will change along with your willingness to affiliate with people who are known for believing it—quite apart from whether the belief is actually true. Some people may be reluctant to believe that God does not exist, not because there is evidence that God does exist, but rather because they are reluctant to affiliate with Richard Dawkins or those darned “strident” atheists who go around publicly saying “God does not exist.”

  • If your current computer stops working, you can’t conclude that everything about the current system is wrong and that you need a new system without an AMD processor, an ATI video card, a Maxtor hard drive, or case fans—even though your current system has all these things and it doesn’t work. Maybe you just need a new power cord.

  • If a hundred inventors fail to build flying machines using metal and wood and canvas, it doesn’t imply that what you really need is a flying machine of bone and flesh. If a thousand projects fail to build Artificial Intelligence using electricity-based computing, this doesn’t mean that electricity is the source of the problem. Until you understand the problem, hopeful reversals are exceedingly unlikely to hit the solution.4

1Read “P(cults|aliens)” as “the probability of UFO cults given that aliens have visited Earth,” and read “P(cults|¬aliens)” as “the probability of UFO cults given that aliens have not visited Earth.”

2Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance: An Inquiry Into Values, 1st ed. (New York: Morrow, 1974).

3See Scott Alexander, “The Least Convenient Possible World,” Less Wrong (blog), March 14, 2009, http://lesswrong.com/lw/2k/the_least_convenient_possible_world/.

4See also “Selling Nonapples.” http://lesswrong.com/lw/vs/selling_nonapples.