Meta-Honesty: Firming Up Honesty Around Its Edge-Cases

(Cross-posted from Facebook.)

0: Tl;dr.

  • A problem with the obvious-seeming “wizard’s code of honesty” aka “never say things that are false” is that it draws on high verbal intelligence and unusually permissive social embeddings. I.e., you can’t always say “Fine” to “How are you?” This has always made me feel very uncomfortable about the privilege implicit in recommending that anyone else be more honest.

  • Genuinely consistent Glomarization (i.e., consistently saying “I cannot confirm or deny” whether or not there’s anything to conceal) does not work in principle because there are too many counterfactual selves who might want to conceal something.

  • Glomarization also doesn’t work in practice if the Nazis show up at your door asking if you have fugitive Jews in your attic.

  • If you would lie to Nazis about fugitive Jews, then absolute truthsaying can’t be the whole story, which makes “never say things that are false” feel to me like a shaky foundation in that it is literally false, and something less shaky would be nice.

  • Robin Hanson’s “automatic norms” problem suggests different people might have very different ideas about what constitutes a good person’s normal honesty, without realizing that they have very different ideas. Perceived violations of an honesty norm can blow up and cause interpersonal conflict. It seems to me that this is something that doesn’t always work well when people leave it alone.

A rule which seems to me more “normal” than the wizard’s literal-truth rule, more like a version of standard human honesty reinforced around the edges, would be as follows:

“Don’t lie when a normal highly honest person wouldn’t, and furthermore, be honest when somebody asks you which hypothetical circumstances would cause you to lie or mislead—absolutely honest, if they ask under this code. However, questions about meta-honesty should be careful not to probe object-level information.”

I’ve been tentatively calling this “meta-honesty”, but better terminology is solicited.

1: Glomarization can’t practically cover many cases.

Suppose that last night I helped hide a fugitive marijuana seller from the Feds. You ask me what I was doing last night, and I, preferring not to emit false statements, reply, “I can’t confirm or deny what I was doing last night.”

We now have two major problems here:

  • Even on an ordinary day, if you casually ask me what I was doing last night, I theoretically ought to answer “I can’t confirm or deny what I was doing last night” because some of my counterfactual selves were hiding fugitive marijuana sellers from the Feds. If I don’t do this consistently, and I actually was hiding fugitives last night, I can’t Glomarize without revealing information. But then the number of counterfactuals I have to worry about is too large for me to ever answer anything.

  • If the Feds actually ask you this question, they will not be familiar with your previous practice of Glomarization and will probably not be very impressed with your answer.
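The leak in the first bullet can be made concrete with a one-line Bayes update. All numbers here are invented for illustration, not taken from the text:

```python
# Illustrative Bayes update: how much a refusal to answer leaks,
# depending on whether you Glomarize consistently or only when hiding.

def posterior_hiding(prior, p_glom_if_hiding, p_glom_if_not):
    """P(hiding something | you said "can't confirm or deny")."""
    joint_hiding = prior * p_glom_if_hiding
    joint_not = (1 - prior) * p_glom_if_not
    return joint_hiding / (joint_hiding + joint_not)

prior = 0.001  # listener's (made-up) prior that you hid a fugitive last night

# Inconsistent policy: Glomarize only when there is something to hide.
# The refusal then acts as a confession; the posterior jumps to 1.0.
leaky = posterior_hiding(prior, p_glom_if_hiding=1.0, p_glom_if_not=0.0)

# Consistent policy: Glomarize on ordinary days too.
# The refusal carries no information; the posterior stays at the prior.
safe = posterior_hiding(prior, p_glom_if_hiding=1.0, p_glom_if_not=1.0)
```

Which is exactly the bind in the bullet: the only way the refusal carries no information is to issue it on every ordinary day as well.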

This doesn’t mean that Glomarization is never helpful. If you ask me whether my submarine is carrying nuclear weapons, or whether I’m secretly the author of “The Waves Arisen”, I think most listeners would understand if I replied, “I have a consistent policy of not saying which submarines are carrying nuclear weapons, nor whether I wrote or helped write a document that doesn’t have my name on it.” An ordinary honest person does not need to lie on these occasions because Glomarization is both theoretically possible and pragmatically practical, so one should adopt a consistent Glomarization rather than lie.

But that doesn’t work for hiding fugitives. Or any other occasion where an ordinary high-honesty person would consider it obligatory to lie, in answer to a question where the asker is not expecting evasion or Glomarization.

(I’m sure some people reading this think it’s all very cute for me to be worried about the fact that I wouldn’t tell the truth all the time. Feel free to state this in the comments so that we aren’t confused about who’s using which norms. Smirking about it, or laughing, especially conveys important info about you.)

2: The law of no literal falsehood.

One formulation of my automatic norm for honesty, the one that feels like the obvious default from which any departure requires a crushingly heavy justification, was given by Ursula K. Le Guin in A Wizard of Earthsea:

He told his tale, and one man said, “But who saw this wonder of dragons slain and dragons baffled? What if he—”

“Be still!” the Head Isle-Man said roughly, for he knew, as did most of them, that a wizard may have subtle ways of telling the truth, and may keep the truth to himself, but that if he says a thing the thing is as he says. For that is his mastery.

Or in simpler summary, this policy says:

Don’t say things that are literally false.

Or with some of the unspoken finicky details added back in: “Don’t say things that you believe to be literally false in a context where people will (with reasonably high probability) persistently believe that you believe them to be true.” Jokes are still allowed, even jokes that only get revealed as jokes ten seconds later. Or quotations, etcetera ad obviousum.

The no-literal-falsehood code of honesty has three huge advantages:

  • To the extent people observe you to consistently practice it, it is easier for you to communicate believably when you want to say a thing. They may still not be able to trust you perfectly, but the hypothetical is “Did this person break their big-deal code of honesty?” rather than “Did this person tell an ordinary lie?” One would hope this would be good for coordination and other interpersonal issues, though this might only be a fond wish on my part.

  • Most people, even most unusually honest people, wander about their lives in a fog of internal distortions of reality. Repeatedly asking yourself of every sentence you say aloud to another person, “Is this statement actually and literally true?”, helps you build a skill for navigating out of your internal smog of not-quite-truths. For that is our mastery.

  • It’s good for your soul. At least, it’s good for my soul for reasons I’d expect to generalize if I’m not just committing the typical-mind fallacy.

From Frank Herbert’s Dune Messiah, writing about Truthsayers, people who had trained to extreme heights the ability to tell when others were lying and who also never lied themselves:

“It requires that you have an inner agreement with truth which allows ready recognition.”

This is probably not true in normal human practice for detecting other people’s lies. I’d expect a lot of con artists are better than a lot of honest people at that.

But the phrase “It requires you have an inner agreement with truth which allows ready recognition” is something that resonates strongly with me. It feels like it points to the part that’s good for your soul. Saying only true things is a kind of respect for the truth, a pact that you forge with it.

3: The privilege of truthtelling.

I’ve never suggested to anyone else that they adopt the wizard’s code of honesty.

The code of literal truth only lets people navigate anything like ordinary social reality to the extent that they are very fast on their verbal feet, and can respond to the question “How are you?” by saying “Getting along” instead of “Horribly” or with an awkward silence while they try to think of something technically true. (Because often “I’m fine” is false, you see. If this has never bothered you then you are perhaps not in the target audience for this essay.)

So I haven’t advocated any particular code of honesty before now. I was aware of the fact that I had an unusually high verbal SAT score, and also, that I spend little time interfacing with mundanes and am not dependent on them for my daily bread. I thought it wasn’t my place to suggest to anyone else that they try their hand at saying only true things all the time, or to act like this conveys moral virtue. I’m only even describing the wizard’s code publicly now that I can think of at least one alternative.

I once heard somebody claim that rationalists ought to practice lying, so that they could separate their internal honesty from any fears of needing to say what they believed. That is, if they became good at lying, they’d feel freer to consider geocentrism without worrying what the Church would think about it. I do not in fact think this would be good for the soul, or for a cooperative spirit between people. This is the sort of proposed solution of which I say, “That is a terrible solution and there has to be a better way.”

But I do see the problem that person was trying to solve. One can also be privileged in stubbornness when it comes to overriding the fear of other people finding out what you believe. I can see how telling fewer routine lies than usual would make that fear even worse, exacerbating the pressure it can place on what you believe you believe; especially if you didn’t have a lot of confidence in your verbal agility. It’s one more reason not to pressure people (even a little) into adopting the wizard’s code, but then it would be nice to have some other code instead.

4: Literal-truth as my automatic norm, maybe not shared.

This set of thoughts started, as so many things do, with a post by Robin Hanson.

In particular Robin tweeted the paper: “The surprising costs of silence: Asymmetric preferences for prosocial lies of commission and omission.”

Abstract: Across 7 experiments (N = 3883), we demonstrate that communicators and targets make egocentric moral judgments of deception. Specifically, communicators focus more on the costs of deception to them—for example, the guilt they feel when they break a moral rule—whereas targets focus more on whether deception helps or harms them. As a result, communicators and targets make asymmetric judgments of prosocial lies of commission and omission: Communicators often believe that omitting information is more ethical than telling a prosocial lie, whereas targets often believe the opposite.

This got me wondering whether my default norm of the wizard’s code is something other people will even perceive as prosocial. Yes, indeed, I feel like not saying things is much more law-abiding than telling literal falsehoods. But if people feel just as wounded, or more wounded, then that policy isn’t really benefiting anyone else. It’s just letting me feel ethical and maybe being good for my own personal soul.

Robin commented, “Mention all relevant issues, even if you have to lie about them.”

I don’t think this is a bullet I can bite in daily practice. I think I still want to emit literal truths for most dilemmas short of hiding fugitives. But it’s one more argument worth mentioning against trying to make an absolute wizard’s code into a bedrock solution for interpersonal reliability.

Robin also published a blog post about “automatic norms” in general:

We are to just know easily and surely which actions violate norms, without needing to reflect on or discuss the matter. We are to presume that framing effects are unimportant, and that everyone agrees on the relevant norms and how they are to be applied.
In a relatively simple world with limited sets of actions and norms, and a small set of people who grew up together and later often enough observe and gossip about possible norm violations of others, such people might in fact learn from enough examples to mostly apply the same norms the same way. This was plausibly the case for most of our distant ancestors. They could in fact mostly be sure that, if they judged themselves as innocent, most everyone else would agree. And if they judged someone else as guilty, others should agree with that as well. Norm application could in fact usually be obvious and automatic.
Today however, there are far more people, and more intermixed, who grow up in widely varying contexts and now face far larger spaces of possible actions and action contexts. Relative to this huge space, gossip about particular norm violations is small and fragmented...
We must see ourselves as tolerating a lot of norm violation. We actually tell others about and attempt to punish socially only a tiny fraction of the violations that we could know of. When we look most anywhere at behavior details, it must seem to us like we are living in a Sodom and Gomorrah of sin. Compared to the ancient world, it must seem a lot easier to get away for a long time with a lot of norm violations...
We must also see ourselves as tolerating a lot of overeager busybodies applying what they see as norms to what we see as our own private business where their social norms shouldn’t apply.

This made me realize that the wizard’s code of honesty I grew up with is, indeed, an automatic norm for me. Which meant I was probably overestimating and eliezeromorphizing the degree to which other people even cared at all, or would think I was keeping any promises by doing it. Again, I don’t see this as a good reason to give up on emitting literally true sentences almost all of the time, but it’s one more reason I feel more open to alternatives than I would’ve been ten years ago. That said, I do expect a lot of people reading this also have something like that same automatic norm, and I still feel like that makes us more like part of the same tribe.

5: Counterargument: The problem of non-absolute rules.

A proposal like this one ought to come with a lot of warning signs attached. Here’s one of them:

There’s a passage in John M. Ford’s Web of Angels, when the protagonist has finally killed someone even after all the times his mentor taught him to never ever kill. His mentor says:

“No words can prevent all killing. Words are not iron bands. But I taught you to hesitate, to stay your hands until the weight of duty crushed them down.”

Surprise! Really the mentor just meant to try to get him to wait before killing people instead of jumping to that right away.

Humans are kind of insane, and there are all sorts of insane institutions that have evolved among us. A fairly large number of those institutions are twisted up in such a way that something explodes if people try to talk openly about how they work.

It’s a human kind of thinking to verbally insist that “Don’t kill” is an absolute rule, why, it’s right up there in the Ten Commandments. Except that what soldiers do doesn’t count, at least if they’re on the right side of the war. And sure, it’s also okay to kill a crazy person with a gun who’s in the middle of shooting up a school, because that’s just not what the absolute law “Don’t kill” means, you know!

Why? Because any rule that’s not labeled “absolute, no exceptions” lacks weight in people’s minds. So you have to perform that the “Don’t kill” commandment is absolute and exceptionless (even though it totally isn’t), because that’s what it takes to get people to even hesitate. To stay their hands at least until the weight of duty is crushing them down. A rule that isn’t even absolute? People just disregard that whenever.

(I speculate this may have to do with how the human mind reuses physical ontology for moral ontology. I speculate that brains started with an ontology for material possibility and impossibility, and reused that ontology for morality; and it internally feels like only the moral reuse of “impossible” is a rigid moral law, while anything short of “moral-impossible” is more like a guideline. Kind of like how, if something isn’t absolutely certain, people think that means it’s okay to make up their own opinion about it, because if it’s not absolutely certain it must not be the domain of Authority. But I digress, and it’s just a hypothesis. We don’t need to know exactly what is the buried cause of the surface craziness to observe that the craziness is in fact there.)

So you have to perform that the Law is absolute in order to make the actual flexible Law exist. That doesn’t mean people lie about how the Law applies to the edge cases—that’s not what I mean to convey by the notion of “performing” a statement. More like, proclaim the Law is absolute and then just not talk about anything that contradicts the absoluteness.

And when that happens, it’s one more little chunk of insanity that nobody can talk about on the meta-level without it exploding.

Now, you will note that I am going ahead and writing this all down explicitly, because… well, because I expect that in the long run we have to find a way that doesn’t require a little knot of madness that nobody is allowed to describe faithfully on the meta-level. So we might as well start today.

I trust that you, the reader, will be able to understand that “Don’t kill” is the kind of rule where you give it enough force-as-though-of-absoluteness that it actually takes a deontology-breaking weight of duty to crush down your hands, as opposed to you cheerfully going “oh well I guess there’s a crushing weight now! let’s go!” at the first sign of inconvenience.

Actually, I don’t trust that everyone reading this can do that. That’s not even close to literally true. But most of you won’t ever be called on to kill, and society frowns upon that strongly enough to discourage you anyway. So I did feel it was worth the risk to write that example explicitly.

“Don’t lie” is more dangerous to mess with. That’s something that most people don’t take as an exceptionless absolute to begin with, even in the sense of performing its absoluteness so that it will exist at all. Even extremely honest people will agree that you can lie to the Gestapo about whether you are hiding any Jews in the attic, and not bother to Glomarize your response either; and I think they will mostly agree that this is in fact a “lie” rather than trying to dance around the subject. People who are less than extremely honest think that “I’m fine” is an okay way to answer “How are you?” even if you’re not fine.

So there’s still a very obvious thing that could go wrong in people’s heads, a very obvious way that the notion of “meta-honesty” could blow up, or any other code besides “don’t say false things” could blow up. It’s why the very first description in the opening paragraphs says “Don’t lie when a normal highly honest person wouldn’t, and furthermore…” and you should never omit that preamble if you post any discussion of this on your own blog. THIS IS NOT THE IDEA THAT IT’S OKAY TO LIE SO LONG AS YOU ARE HONEST ABOUT WHEN YOU WOULD LIE IF ANYONE ASKS. It’s not an escape hatch.

If anything, meta-honesty is the idea that you should be careful enough about when you break the rule “Don’t lie” that, if somebody else asked the hypothetical question, you would be willing to PUBLICLY DEFEND EVERY ONE OF THOSE EXTRAORDINARY EXCEPTIONS as times when even an unusually honest person should lie.

(Unless you were never claiming to be unusually honest, and your pattern of meta-honest responses to hypotheticals openly shows that you lie about as much as an average person. But even here, I’d worry that anyone who lets themselves be as wicked as they imagine the ‘average’ person to be, would be an unusually wicked person indeed. After all, if Robin Hanson speaks true, we are constantly surrounded by people violating what seem to us like automatic norms.)

6: Meta-honesty, the basics.

Okay, enough preamble, let’s speak of the details of meta-honesty, which may or may not be a terrible idea to even talk about; we don’t know at this point.

The basic formulation of meta-honesty would be:

“Be at least as honest as an unusually honest person. Furthermore, when somebody asks for it and especially when you believe they’re asking for it under this code, try to convey to them a frank and accurate picture of the sort of circumstances under which you would lie. Literally never swear by your meta-honesty that you wouldn’t lie about a hypothetical situation that you would in fact lie about.”

My first horrible terminology for this was the “Bayesian code of honesty”, on the theory that this code meant your sentences never provided Bayesian evidence in the wrong direction. Suppose you say “Hey, Eliezer, what were you doing last night?” and I reply “Staying at home doing the usual things I do before going to bed, why?” If you have a good mental picture of what I would lie about, you have now definitely learned that I was not out watching a movie, because that is not something I would lie about. A very large number of possibilities have been ruled out, and most of your remaining probability mass should now be on me having stayed home last night. You know that I wasn’t on a secret date with somebody who doesn’t want it known we’re dating, because you can ask me that hypothetical and I’ll say, “Sure, I’d happily hide that fact, but that isn’t enough to force me to lie. I would just say ‘Sorry, I can’t tell you where I was last night,’ instead of lying.”

You have not, however, gained any Bayesian evidence against my hiding a fugitive marijuana seller from the Feds, where somebody’s life or freedom is at stake and it’s vital to conceal that a secret even exists in the first place. Ideally we’d have common knowledge of that, and hopefully we’d agree that it was fine to lie in that case to a friend who asks a casual-seeming question.
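This “Bayesian evidence” framing can be sketched numerically. Everything below is invented for illustration: the list of activities, the prior, and the policy of which activities get a lie, a Glomarization, or the truth:

```python
# Sketch: if you know my lying policy, my answer "stayed home" rules out
# everything I *wouldn't* lie about, but leaves the lie-worthy
# possibilities untouched relative to the truth.

prior = {
    "stayed home":     0.60,
    "watched a movie": 0.25,
    "secret date":     0.10,  # I'd Glomarize, not lie
    "hid a fugitive":  0.05,  # vital secret: I'd give the cover answer
}

# P(I answer "stayed home" | what I actually did), under that policy:
likelihood = {
    "stayed home":     1.0,  # true, so I say it
    "watched a movie": 0.0,  # not something I'd lie about
    "secret date":     0.0,  # I'd say "can't tell you" instead
    "hid a fugitive":  1.0,  # I'd lie and give the ordinary answer
}

evidence = sum(prior[a] * likelihood[a] for a in prior)
posterior = {a: prior[a] * likelihood[a] / evidence for a in prior}
# The movie and the secret date drop to zero; "hid a fugitive" keeps
# its share of probability mass relative to "stayed home".
```

So the answer is strong evidence against the innocent alternatives, and no evidence at all against the one possibility the policy says to lie about, which is the whole point.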

Let’s be clear: although this is a kind of softening of deception, it’s still deception. Even if somebody has extensively discussed your code of honesty with you, they aren’t logically omniscient and won’t explicitly have the possibility in mind every time. That’s why we should go on holding ourselves to the standard of, “Would I defend this lie even if the person I was defending it to had never heard of meta-honesty?”

“Eliezer,” you say, “if you had a temporary schizophrenic breakdown and robbed a bank and this news hadn’t become public, would you lie to keep it from becoming public?”

And this would cause me to stop and think and agonize for a bit (which itself tells you something about me, that my answer is not instantly No or Yes). I do have important work to do which should not be trashed without strong reason, and this hypothetical situation would not have involved a great deliberate betrayal on my part; but it is also the sort of thing that you could reasonably argue an unusually honest person ought not to lie about, where lies do not in general serve the social good.

I think in the end I might reply something like “I wouldn’t lie freely and would probably try to use at least technical truth or Glomarize, but in the end I might conceal that event rather than letting my work be trashed for no reason. I think I’d understand if somebody else had done likewise, if I thought they were doing good work in the first place. Except that obviously I’d need to tell various people who are engaged in positive-sum trades with me, where it’s a directly important issue to them whether I can be trusted never to have mental breakdowns, and remove myself from certain positions of trust. And if it happened twice I’d be more likely to give up. If it got to the point where people were openly asking questions I don’t imagine myself as trying to continue a lie. I also want to caveat that I’m describing my ethical views, what I think is right in this situation, and obviously enough pressure can make people violate their own ethics and it’s not always predictable how much pressure it takes, though I generally consider myself fairly strong in that regard. But if this had actually happened I would have spent a lot more time thinking about it than the two minutes I spent writing this paragraph.” And this would help give you an accurate picture of the sort of person that I am in general, and what I take into account in considering exceptions.

Insofar as you are practicing a mental discipline in being meta-honest, the discipline is to be explicitly aware of every time you say something false, and to ask yourself, “Would I be okay publicly saying, if somebody asked me the hypothetical, that this is a situation where a person ought to lie?”

I still worry that this is not the thing that people need to do to establish their inner pact with truth. Maybe you could pick some friends to whom you just never tell any kind of literal falsehood, in the process of becoming initially aware of how many false things you were just saying all the time… but I don’t actually know if that works either. Maybe that’s like trying to stop smoking cigarettes on odd-numbered days. It’d be something to notice if the experimental answer is “In reality, meta-honesty turns out not to work for practicing the respect of truth.”

Meta-honesty should be for people who are comfortable, not with absolute honesty, but with not trying to appear any more honest than they are. This itself is not the ordinary equilibrium, and if you want to do things the standard human way and not forsake a well-tested and somewhat enforced social equilibrium in pursuit of a bright-eyed novel idealistic agenda, then you should not declare yourself meta-honest, or should let somebody else try it first.

7: Consistent object-level Glomarization in meta-level honest responses.

Glomarization can be workable when restricted to special cases, such as only questions about nuclear weapons and submarines. Meta-honesty is such a special case and, if we’re doing this, we should all Glomarize it accordingly. In particular, meta-questions are not to be used to extract object-level data, and we should all respect that in our questions, and consistently Glomarize about it in our answers, including some random times when Glomarization seems silly.

Some key responses that need to be standard:

  • “That question sounds too object-level.”

  • “I think you’re doing meta-honesty wrong.”

  • “I think I’m supposed to Glomarize that sort of answer in general.”

  • “I should answer a more abstract version of that.”

  • “I worry that some of my counterfactual selves are not in a mutually beneficial situation in this discussion.”

And if you clearly say that you “irrevocably worry” about any of these things, it means the meta-honest conversation has crashed; the other person is not supposed to keep pressing you, and if they do, you can lie. Ideally, this is something you should consistently do in any case where a substantial measure of your counterfactual selves, as the other person might imagine them, would be feeling pressured to the point of maybe meta-lying. That is, you should not only say “irrevocably worry” in cases where you actually have something to conceal; you should say it in cases where the discussion would be pressuring somebody who did have something to conceal, and this seems high-enough-probability to you or to your model of the person talking to you.

For example: “Eliezer, would you lie about having robbed a bank?”

I consider whether this sounds like an attempt to extract object-level information from some of my counterfactual selves, and conclude that you probably place very little probability on my having actually robbed a bank. I reply, “Either it is the case that I did rob a bank and I think it is okay to lie about that, or alternatively, my reply is as follows: I wouldn’t ordinarily rob a bank. It seems to me that you are postulating some extraordinary circumstance which has driven me to rob a bank, and you need to tell me more about this extraordinary circumstance before I tell you whether I’d lie about it. Or you’re postulating a counterfactual version of me that’s fallen far enough off the ethical rails that he’d probably stop being honest too.”

Some additional statements that ought to be taken as praiseworthy:

  • “I only feel free to have a frank discussion about that if everyone in the room has agreed to abide by the meta-honesty code.”

  • “I notice that I’m feeling interrogated, and should not try to give a code-abiding answer to that right now.”

  • “It is either the case that this actually happened and I think it is okay to lie about it, or that my current quick guess is that I wouldn’t lie in that case.”

  • “Hold on, let me either generate a random number or pretend to generate a random number, such that if I’m actually generating a random number and it comes up as 0, I will try to seem more evasive than usual in this conversation even if I have nothing to actually hide.”

This is not supposed to be a clever way to extract information from people, and you should shut down any attempt to use it that way.
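The random-number trick in the last bullet above amounts to a tiny mixed strategy: if innocent people sometimes act evasive on purpose, evasiveness stops being a reliable tell of a real secret. A minimal sketch, with the 1-in-n rate as an assumed parameter rather than anything prescribed by the text:

```python
import random

def should_feign_evasion(n=10, rng=random.random):
    """Return True with probability 1/n. When True, act a bit more
    evasive than usual in this conversation even though there is
    nothing to hide, so that observed evasiveness can't simply be
    decoded as "this person has a secret"."""
    return rng() < 1 / n
```

Passing in `rng` keeps the policy testable; in real use you would just call `should_feign_evasion()` (or pretend to) at the start of the conversation.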

“Harry,” says HPMOR!Dumbledore, “I ask you under the code of meta-honesty (which we have just anachronistically acquired): Would you lie about having robbed the Gringotts Bank?”

Harry thinks, Maybe this is about the Azkaban breakout, and says, “Do you in fact suspect me of having robbed a bank?”

“I think that if I suspected you of having robbed a bank,” says Dumbledore, “and I did not wish you to know that, I would not ask you if you had robbed a bank. Why do you ask?”

“Because the circumstances under which you’re invoking meta-honesty have something to do with how I answer,” says Harry (who has suddenly acquired a view on this subject that some might consider implausibly detailed). “In particular, I think I react differently depending on whether this is basically about you trying to construct a new mutually beneficial arrangement with the person you think I am, or if you’re in an adversarial situation with respect to some of my counterfactual selves (where the term ‘counterfactual’ is standardly taken to include the actual world as one that is counterfactually conditioned on being like itself). Also I think it might be a good idea generally that the first time you try to have an important meta-honest conversation with someone, you first spend some time having a meta-meta-honest conversation to make sure you’re on the same page about meta-honesty.”

“I am not sure I understood all that,” said Dumbledore. “Do you mean that if you think we have become enemies, you might meta-lie to me about when you would lie?”

Harry shook his head. “No,” said Harry, “because then if we weren’t enemies, you would still never really be able to trust what I say even assuming me to abide by my code of honesty. You would have to worry that maybe I secretly thought you were an enemy and didn’t tell you. But the fact that I’m meta-honest shouldn’t be something that you can use against me to figure out whether I… sneaked into the girls’ dorm and wrote in somebody’s diary, say. So if I’m in that situation I’ve got to protect my counterfactual selves and Glomarize harder. Whereas if this is more of a situation where you want to know if we can go to Mordor together, then I’d feel more open and try to give you a fuller picture of me with more detail and not worry as much about Glomarizing the specific questions you ask.”

“I suspect,” Dumbledore said gravely, “that those who try to be honest at all will always be at something of a disadvantage relative to the most ready liars, at least if they’ve robbed Gringotts. But yes, Harry, I am afraid that this is more of a situation where I am… concerned… about some of your counterfactual selves. But then why would you answer at all, in such a case?”

“Because sometimes people are honest and have good intentions,” answered Harry, “and I think that if in general they can have an accurate picture of the other person’s honesty, everybody is on net a bit better off. Even if I had robbed a bank, for example, you and I would both still not want anything bad to happen to Britain. And some of my counterfactual selves are innocent, and they’re not better off if you think I’m more dishonest than I am.”

“Then I ask again,” said Dumbledore, “under the code of meta-honesty, whether you would lie about having robbed a bank.”

“Then my answer is that I wouldn’t ordinarily rob a bank,” Harry said, “and I’d feel even worse about lying about having robbed a bank, than having robbed a bank. And I’d know that if I robbed a bank I’d also have to lie about it. So whatever weird reason made me rob the bank, it’d have to be weird enough that I was willing to rob the bank and willing to lie about it, which would take a pretty extreme situation. Where it should be clear that I’m not trying to answer about having specifically robbed a bank, I’m trying to give you a general picture of what sort of person I am.”

“What if you had been blackmailed into robbing the bank?” inquired Dumbledore. “Or what if things crept up on you bit by bit, so that in the end you found yourself in an absurd situation you’d never intended to enter?”

Harry shrugged helplessly. “Either it’s the case that I did end up in a weird situation and I don’t want to let you know about that, or alternatively, I feel like you’re describing a very broad range of possibilities that I’d have to think about more, because I haven’t yet ended up in that kind of situation and I’m not quite sure how I’d behave… I think I’d have in mind that just telling the Headmaster the truth can prevent big problems from blowing up any further, but there’d be cases extreme enough that I wouldn’t do that either… I mean, the basic answer is, yes, there’s things that would make me lie right to your face, but I wouldn’t do that just for having stolen candy from the kitchen, I don’t think. I’d just be like ‘I consistently Glomarize when people ask me if I’ve stolen candy from the kitchen.’”

“Would you lie to protect a fugitive from the law who you believed was being unjustly pursued?” said Dumbledore. “Say that Sirius Black was innocent after all, as the Quibbler would have it; would you hide him and then lie to Minerva about what you were doing on the night you hid him?”

“Yes,” Harry said unhesitatingly. “That’s something I think is clearly right and the sort of thing I would ordinarily do.”

“What if you thought that somebody was being pursued under an unjust law?” said Dumbledore. “Is that worth lying for?”

Harry frowned. “I think I maybe shouldn’t answer questions about what in particular I think ought not to be illegal,” Harry said, “but if the basic question is whether I’d lie to my friends to keep somebody who’d only used marijuana out of Azkaban, then I would. I don’t think my friends should expect otherwise; they should want me to do that.”

“We are not being meta-honest any more, only honest,” said Dumbledore. “Were you involved in the breakout from Azkaban?”

Harry stared at Dumbledore a moment, and said, “I think I have to consistently not confirm or deny that, but my counterfactual and possibly actual self who wasn’t involved would really like to know at this point what the heck you think might’ve happened.”

If somehow the Gestapo has heard of meta-honesty and they ask you if you would lie to hide Jews in your attic, they should be expecting a law-abiding meta-honest Nazi citizen to Glomarize, abstract the question in such a way as not to reveal the object-level information that might be sought, or say “I irrevocably worry that some of my counterfactual selves are not in a positive-sum situation.” That, I now proclaim, ought to be done by Nazis and resistance fighters alike, when you are faced with a question that might be meant to reveal object-level information about what happened.

“Eliezer,” says the hypothetical Gestapo officer who has somehow heard about my meta-honesty code, “it happens that I’m a person who’s heard of meta-honesty. Now, are you the sort of person who would lie about having Jews hidden in your attic?”

This hypothetical Gestapo officer has a gun. Most people asking you meta-honest questions won’t have a gun. In fact I bet this will literally never happen until the end of the world. Let’s suppose he has a gun anyway.

“I am the following sort of person,” I reply. “If I were hiding the Führer in my attic to protect him from Jewish assassins, I’d lie about that to the assassins. It’s clear you know about my code of meta-honesty, so you should understand that is a very innocent thing to say. But these circumstances and the exact counterfactual you are asking about make me nervous, so I’m afraid to utter the words I think you may be looking for, namely the admission that if I were the kind of person who’d hide Jews in his attic then I’d be the kind of person who would lie to protect them. Can I say that, with respect to your question as you mean it, I believe that is no more and no less true of me than it is true of you?”

“My, you are fast on your verbal feet,” says the Gestapo officer. “If somebody were less fast on their verbal feet, would you tell them that it was acceptable for a meta-honest person to just meta-lie to the Jewish assassins in order to hide the Führer?”

“If they didn’t feel that their counterfactual loyal Nazi self would think that their counterfactual disloyal self was being pressured and clearly state that fact irrevocably,” I say, “I’d say that, just like their counterfactual loyal self, they should make some effort to reveal the general limits of their honesty without betraying any of their counterfactual selves, but say they irrevocably couldn’t handle the conversation as soon as they thought their alternate loyal self would think their alternate’s counterfactual disloyal self couldn’t handle the conversation. It’s not as if the Jewish assassins would be fooled if they said otherwise. If the Jewish assassins do continue past that point, which is blatantly forbidden and everyone should know that, they may lie.”

“I see,” says the Gestapo officer. “If you are telling me the truth, I think I have grasped the extent of what you claim to be honest about.” He turns to his subordinates. “Go search his attic.”

“Now I’m curious,” I say. “What would you have done if I’d sworn to you that I was an absolutely loyal German citizen, and that my character was such that I would certainly never lie about having Jews in my attic even if I were the sort of disloyal citizen who had Jews in his attic in the first place?”

“I would have detailed twice as many men to search your house,” says the Gestapo officer, “and had you detained, for that is not the response I would expect from an honest Nazi who knew how meta-honesty was supposed to work. Now I ask you meta-meta-honestly, why haven’t you said that you are irrevocably worried that I am abusing the code? Obviously I put substantial probability on you being a traitor, meaning I am deliberately pressuring you into a meta-conversation and trying to use your code of honesty against those counterfactual selves. Why didn’t you just shut me down?”

“Because you do have a gun, sir,” I say. “I agree that it’s what the rules called for me to say, but I thought over the situation and decided that I was comfortable with saying that, in general, this is the sort of situation where that rule could be bent so that I would not end up being shot—and I tell you meta-meta-honestly that I do believe the situation has to be that extreme in order for that rule to even be bent.”

Really the principle is that it is not okay to meta-ask what the Gestapo officer is meta-asking here. This kind of detailed-edge-case-checking conversation might be appropriate for shoring up the edges of an interaction intended to be mutually beneficial, but absolutely not for storming in looking for Jews in the attic of a person who in your mind has a lot of measure on having something to hide.

But I do want to have trustworthy foundations somewhere.

And I think it’s reasonable to expect that over the course of a human lifetime you will literally never end up in a situation where a Gestapo officer who has read this essay is pointing a gun at you and asking overly-object-level-probing meta-honesty questions, and will shoot you if you try to Glomarize but will believe you if you lie outright, given that we all know that everyone, innocent or guilty, is supposed to Glomarize in situations like that. Up until today I don’t think I’ve ever seen any questions like this being asked in real life at all, even hanging out with a number of people who are heavily into recursion.

So if one is declaring the meta-honesty code at all, then one shouldn’t meta-lie, period; I think the rules have been set up to allow that to be absolute. I don’t want you to have to worry that maybe I think I’m being pressured, or maybe I thought you meta-asked the wrong thing, so now I think it’s okay to meta-lie even though I haven’t given any outward sign of that. To that end, I am willing to sacrifice the very tiny fraction of the measure of my future selves who will end up facing an extremely weird Gestapo officer. To me, for now, there doesn’t seem to be any real-life circumstance where you should lie in response to a meta-honesty question—rather than consistently Glomarize that kind of question, consistently abstract that kind of question, consistently answer in an analogy rather than the original question, or consistently say “I believe some counterfactual versions of me would say that cuts too close to the object level.” (It being a standard convention that counterfactuals may include the actual.)

I also think we can reasonably expect that from now until the end of the world, honest people should literally absolutely never need to evade or mislead at all on the meta-meta-level, like if somebody asks whether you feel the meta-level conversation has abided by the rules. (And just as meta-honesty doesn’t excuse object-level dishonesty, by saying that meta-meta-honesty seems like it could be everywhere open and total, I don’t mean to excuse meta-level lies. We should all still regard meta-lies as extremely bad and a Code Violation and You Cannot Be Trusted Anymore.)

If there’s a meta-honest discussion about someone’s code of honesty, and a discussion of what they think about the current meta-meta conditions of how the meta-honesty code is being used, and it sounds to you like they think things are fine… then things should be fine, period. If you ask whether they think any pressure strong enough to potentially shake their meta-honesty is around, and whether they think the overall situation here would have treated any of their plausible counterfactual selves in a negative-sum way, and they say no, it’s all fine—then that is supposed to be absolute under the code. That ought to establish a foundation that’s as reliable as the person’s claim to be meta-honest at all.

If you go through all that and lie and meta-lie and meta-meta-lie after saying you wouldn’t, you’ve lied in some of the kindest environments that were ever set up on this Earth to let people not lie, among people who were trying to build trust in that code so we could all use it together. You are being a genuinely awful person as I’d judge that, and I may advocate for severe social sanctions to apply.

Assuming this ends up being a thing, that is. I haven’t run it past many people yet and this is the first public discussion. Maybe there’s some giant hole in it I haven’t spotted.

If anybody ever runs into an actual real circumstance where it seems to them that meta-honesty as they tried to use it was giving the essay-reading Gestapo too much power or too much information, maybe because they weren’t fast enough on their verbal feet, please email me about it so I can consider whether to modify or backtrack on this whole idea. I will try to protect your anonymity under all circumstances up to and including the end of the world unless you say otherwise. The previous sentence is not the sort of thing I would lie about.

8: Counterargument: Maybe meta-honesty is too subtle.

I worry that the notion of meta-honesty is too complicated and subtle. In that it has subtleties in it, at all.

This concept is certainly too subtle for Twitter. Maybe it’s too subtle for us too.

Maybe “meta-honesty” is just too complicated a concept to be able to make it be part of a culture’s Law, compared to the standard-twistiness-compliant performance of saying “Always be honest!” and waiting for the weight of duty to crush down people’s hands, or saying “Never say anything false!” and just-not-discussing all the exceptions that people think obviously don’t count.

(But of course that system also has disadvantages, like people having different automatic norms about what they think are obvious exceptions.)

I’ve started to worry more, recently, about which cognitive skills have other cognitive skills as prerequisites. One of the reasons I hesitated to publish Inadequate Equilibria (before certain persons yanked it out of my drafts folder and published it anyway) was that I worried that maybe the book’s ideas were useless or harmful without mastery of other skills. Like, maybe you need to have developed a skill for demotivating cognition, and until then you can’t reason about charged political issues or your startup idea well enough for complicated thoughts about Nash equilibria to do more good than harm. Or maybe unless you already know a bunch of microeconomics, you just stare at society and see a diffuse mass of phenomena that might or might not be bad equilibria, and you can’t even guess non-wildly in a way that lets you get started on learning.

Maybe meta-honesty contains enough meta, in that it has meta at all, that it just blows up in most people’s heads. Sure, people in our little subcommunity tend to max out the Cognitive Reflection Test and everything that correlates with it. But compared to scoring 3 out of 3 on the CRT, the concept of meta-honesty is probably harder to live by in real life—stopping and asking yourself “Would I be willing to publicly defend this as a situation in which unusually honest people should lie, if somebody posed it as a hypothetical?” Maybe that just gets turned into “It’s permissible to lie so long as you’d be honest about whether you’d tell that lie if anyone asks you that exact question and remembers to say they’re invoking the meta-honesty code,” because people can’t process the meta-part correctly. Or maybe there’s some subtle nonobvious skill that a few people have practiced extensively and can do very easily, and that most people haven’t practiced extensively and can’t do that easily, and this subskill is required to think about meta-honesty without blowing up. Or maybe I just get an email saying “I tried to be meta-honest and it didn’t work because my verbal SAT score was not high enough, you need to retract this.”

If so, I’m not sure there’s much that could be done about it, besides me declaring that Meta-Honesty had turned out to be a terrible idea as a social innovation and nobody should try that anymore. And then that might not undo the damage to the law-as-absolute performance that makes something be part of the Law.

But I’d outright lie to the Gestapo about Jews in my attic. And even to friends, I can’t consistently Glomarize about every point in my life where one of my counterfactual selves could possibly have been doing that. So I can’t actually promise to be a wizard, and I want there to exist firm foundations somewhere.

Questions? Comments?