Say Wrong Things

There are many ways you might approach being less wrong.

A popular one is to make fewer wrong statements; to say fewer wrong things.

Naively it would seem this is a recipe for success, since you just say more things that are true and right and fewer things that are false and wrong. But if Goodhart has anything to say about it, and he does, you’ll find ways to maximize the measure at the expense of the original objective.

Assuming the real objective is something like “have a more complete, precise, and accurate model of the world that better predicts the outcome of subjectively unknown events”, we can quickly see the many ways Goodharting can lead us astray if we focus too much on appearing less wrong. We might:

  • make fewer claims than we could, pulling us away from completeness even as we appear less wrong;

  • make weaker claims than we could, pulling us away from precision;

  • and, a perennial favorite, filter the claims we publicly make so we appear less wrong than we really are by hiding our least confident claims.

The first two can be corrected with better calibration, that is, by making statements with confidence intervals or likelihoods that proportionally match the observed frequency of correctness of similarly confident claims. But “be better calibrated” is not a motion someone can simply make; it’s an outcome of taking actions that increase calibration. As good a place to start as any for improving calibration is the forecasting literature, if that’s what you’d like to do.
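
“Calibration” here can be made concrete. Below is a minimal sketch in Python, using a purely hypothetical track record, of the usual way to check it: bucket your past claims by stated confidence and compare each bucket’s stated confidence against the observed frequency of being correct.

```python
from collections import defaultdict

def calibration_report(claims):
    """Compare stated confidence against observed accuracy.

    claims: list of (stated_confidence, turned_out_true) pairs.
    """
    buckets = defaultdict(list)
    for confidence, correct in claims:
        # Group claims into 10%-wide buckets: 0.87 -> 0.9, 0.62 -> 0.6.
        buckets[round(confidence, 1)].append(correct)
    for stated in sorted(buckets):
        outcomes = buckets[stated]
        observed = sum(outcomes) / len(outcomes)
        print(f"stated {stated:.0%} -> observed {observed:.0%} "
              f"({len(outcomes)} claims)")

# Hypothetical track record; well calibrated means stated ~= observed.
calibration_report([
    (0.9, True), (0.9, True), (0.9, True), (0.9, False),
    (0.6, True), (0.6, True), (0.6, False), (0.6, False), (0.6, False),
])
```

A well-calibrated forecaster’s 60% bucket comes out true about 60% of the time; large gaps in either direction, like the overconfident buckets above, are the thing to train away.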

The third is trickier, though, because it’s less directly about the claims being made and their probability of correctness and more about social dynamics and how you appear to other people. And for that reason it’s what I want to focus on here.

Appearing Less Wrong

I’ve met a lot of people in my life who are experts at not looking as stupid as they are.

That’s kind of harsh. Maybe a nicer way to say it is that they are experts at appearing to be better at making correct predictions about the world than they actually are.

Some of their techniques are just normal social tricks: projecting confidence, using social status, the ever-abused “gaslighting”, and other methods of getting people to believe they are right even when a more careful examination would reveal them to be mistaken. These are people we all love to hate, and love it when we can call them on their bullshit: overconfident academics, inflated politicians, self-important internet intellectuals, and those people whose idea of social interaction is to say “well, actually...”.

But there’s a way to avoid looking stupid that is more pernicious, less amenable to calling out, and that subtly drags you towards local maxima that trap you mountains and valleys away from more complete understanding. And it’s to shut up and not tell anyone about your low-confidence beliefs.

It is extremely tempting to do this. Among the many benefits of keeping low-probability claims to yourself:

  • you have a high accuracy rate on your publicly made claims, making you look right more often when observed;

  • you say only things that, even when wrong, turn out to be wrong in conservative ways that still make you look smart;

  • and you accrue a reputation for being right, usually conferring social status, which can feel really good.

The only trouble is that this approach is too conservative, too modest. It’s easy to justify this kind of outward modesty as keeping up appearances in a way that is instrumental to some bigger goal, and to tell yourself “I’ll still make low-probability claims; I’ll just keep them to myself”, but down that path lies shadow rationality via compartmentalization. You can try it, but good luck: it’s a dark art that hopes to do what human brains cannot, or at least cannot without some sufficiently powerful magic, and that magic traditionally comes with vows not to use it.

Meanwhile, out in the light, finding models that better predict reality sometimes requires holding beliefs that appear unlikely to be true but then turn out to be right, sometimes spectacularly so, although, semper caveat, all models are wrong but some are useful. And sometimes you have to go all in, exploring the possibility that your 10% guess turns out to be 100% correct, minus epsilon. If you don’t, you’ll do no better than the medieval Scholastic holding to Aristotelian physics or the early-20th-century geologist ignoring the evidence for continental drift, forever locked away from taking the big risks necessary to find better, more accurate, precise, and complete models.

Okay, so let’s say you are convinced not to try so hard to appear more right than you are. How do you do that?

Say It Wrong

So I suppose it’s nothing much harder than just telling people your claims, even if you have low confidence in them, and seeing how they react, although depending on the circumstances you’ll probably want to adequately explain your confidence level so they can update on it appropriately. The trouble is getting yourself to do that.

I can’t change your mind for you, although thankfully some folks have developed some techniques that might help if you’re not interested in over-solving that problem. What I can do is point out a few things that might help you see where you are being too modest, nudge you towards less modesty, and create an environment where it’s safe to be less modest.

  • Look for the feeling of “pulling your punches” when you are telling people your ideas.

  • Ramble more and filter less.

  • Alternatively: more babble, less prune.

  • Worry less about how you look to others.

  • Relatedly, increase your ability to generate your own esteem so you have less need to receive it from others, giving you more freedom to make mistakes.

  • Encourage others to tell you their half-baked ideas.

  • When they do, be supportive.

  • Take a collaborative, nurturing approach to truth seeking.

  • Play party games like “what do you think is true that you think no one else here also thinks is true?” and “what’s your least popular opinion?”.

  • And my personal favorite: write up and share your immodest, lower-confidence ideas that you think deserve exploration because they have high expected value if they turn out to be right (see the sketch just after this list).
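
The expected-value arithmetic behind that last item is worth spelling out. A toy calculation, with purely made-up numbers standing in for the payoff of a right idea and the cost of airing a wrong one:

```python
# All numbers are hypothetical, chosen only to illustrate the asymmetry:
# a low-confidence claim is still worth sharing when its upside is large.
p_right = 0.10          # your stated confidence in the idea
value_if_right = 100.0  # payoff if the idea turns out correct
cost_if_wrong = 1.0     # cost (mostly reputational) of being wrong

expected_value = p_right * value_if_right - (1 - p_right) * cost_if_wrong
print(expected_value)   # 9.1 -> positive, so the idea is worth sharing
```

Filtering your public claims for appearances throws away exactly these positive-expected-value ideas, because the filter looks only at the probability of being right and never at the payoff.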

I suspect that the real reason most people try too hard to appear more right than they are is fear: fear of being wrong, fear of looking stupid, fear of losing status, fear of losing prestige, fear of losing face, fear of being ostracized, fear of being ignored, fear of feeling bad, fear of feeling lesser. I see this fear, and I honor it, but it must be overcome if one wishes to become stronger. And when you fear being more wrong, you will be too careful to ever become as less wrong as you could.

To sum it all up pithily:

To become less wrong, you must give up being most right.