The Parable of the Dagger

Once upon a time, there was a court jester who dabbled in logic.

The jester presented the king with two boxes. Upon the first box was inscribed:

“Either this box contains an angry frog, or the box with a false inscription contains an angry frog, but not both.”

On the second box was inscribed:

“Either this box contains gold and the box with a false inscription contains an angry frog, or this box contains an angry frog and the box with a true inscription contains gold.”

And the jester said to the king: “One box contains an angry frog, the other box gold; and one, and only one, of the inscriptions is true.”

The king opened the wrong box, and was savaged by an angry frog.

“You see,” the jester said, “let us hypothesize that the first inscription is the true one. Then suppose the first box contains gold. Then the other box would have an angry frog, while the box with a true inscription would contain gold, which would make the second statement true as well. Now hypothesize that the first inscription is false, and that the first box contains gold. Then the second inscription would be—”

The king ordered the jester thrown in the dungeons.

A day later, the jester was brought before the king in chains, and shown two boxes.

“One box contains a key,” said the king, “to unlock your chains; and if you find the key you are free. But the other box contains a dagger for your heart, if you fail.”

And the first box was inscribed:

“Either both inscriptions are true, or both inscriptions are false.”

And the second box was inscribed:

“This box contains the key.”

The jester reasoned thusly: “Suppose the first inscription is true. Then the second inscription must also be true. Now suppose the first inscription is false. Then again the second inscription must be true. So the second box must contain the key, if the first inscription is true, and also if the first inscription is false. Therefore, the second box must logically contain the key.”

The jester opened the second box, and found a dagger.

“How?!” cried the jester in horror, as he was dragged away. “It’s logically impossible!”

“It is entirely possible,” replied the king. “I merely wrote those inscriptions on two boxes, and then I put the dagger in the second one.”

• Did the dagger have ‘pwned’ inscribed on it?

• And if the king wanted to be particularly nasty the other box would also contain a dagger :)

• No, if the king REALLY wanted to be a dick, he would have put the key and the dagger in the same box, and then said “one box contains a key, and one box contains a dagger.”

• > And if the king wanted to be particularly nasty the other box would also contain a dagger

No, the king specified that couldn’t happen. One of the morals of the parable is that the king didn’t lie.

• What, it doesn’t count as a lie if it’s in writing? That’s a hell of a system of contract law they’ve got in this allegorical kingdom.

• > What, it doesn’t count as a lie if it’s in writing? That’s a hell of a system of contract law they’ve got in this allegorical kingdom.

I have a different answer to this than what has been given so far:

It’s a question of implicit conventions. The king’s challenge follows and mimics the jester’s challenge. In the jester’s challenge, the jester makes a statement about the truth value of the inscriptions on the boxes. By doing this, he sets the precedent that the inscriptions on the boxes are part of the game and do not engage the honesty of the game maker. The inscriptions can be true or false, and it’s part of the challenge to guess which is which. Only the jester’s own words engage his honesty. If he lied, the challenge would be rigged.

The king mimics the jester’s setup, but makes no statement about the truth value of the inscriptions on the boxes. That difference should have sounded suspicious to the jester. He should have asked the king if the statements were logical. The king could have lied, but at that point if the king was ready to lie then he’d probably kill the jester even if he found the key.

• Definitions matter. If you define a lie as an intentional deception attempt, then the king lied; if you define it as uttering a falsehood, then he didn’t. The modern legal tradition is hazy on this point, and intentional deception without actually making false statements sometimes invalidates a contract, and sometimes doesn’t.

• I could make up a new language for every sentence I utter, and claim that 2/3 of the words I am merely speaking to myself in an unrelated monologue.

Communication is so context-dependent that I see the utterance of “it was assumed, not implied” as an admission of deceit.

• A statement that’s neither true nor false can’t be false...

• It’s a dressed up version of “This sentence is a lie”. It’s only self-referential, so its truth value can’t be determined in any meaningful, empirical sense.

Jester should’ve remembered the primary rule of logic: Don’t make somebody look like an idiot if they can kill you.

• I’m having some trouble with the logic here. I wonder if the parable got a bit garbled.

> “You see,” the jester said, “let us hypothesize that the first inscription is the true one.”

The first inscription says, “Either this box contains an angry frog, or the box with a false inscription contains an angry frog, but not both.” Now we are hypothesizing that this is the true one. Therefore “the box with a false inscription” means “the second box”. So, “Either the 1st box contains an angry frog, or the 2nd box contains an angry frog, but not both”.

The jester goes on, “Then suppose the first box contains an angry frog.”

So we know (by assumption) that the 1st clause of the inscription is true, the 1st box contains an angry frog. Since “not both” clauses are true, it means the 2nd clause is false, and so the 2nd box does not contain an angry frog—it must contain gold.

But the jester claims that this is a contradiction: “Then the other box would contain gold and this would contradict the first inscription which we hypothesized to be true.” For this to be a contradiction, the 1st inscription would have had to say that the 2nd box should contain an angry frog, but we just saw that it doesn’t say that.

I can’t make much progress with the 2nd inscription either. I’m getting pretty confused now!

• Bx is true if box x has gold, false if frog. One contains a frog, the other gold → B1 == ~B2. Only one inscription is true → Bf == ~Bt

We know:

B2 && Bf || Bt && B1 (I1)

B2 && Bt || B1 && Bt (I2)

Bt == B1 && Bf == B2 && I1 && ~I2 || Bf == B1 && Bt == B2 && ~I1 && I2 # only one inscription is true

From this:

((B2 && B2 || B1 && B1) && ~(B2 && B1 || B1 && B1)) || (~(B2 && B1 || B2 && B1) && (B2 && B2 || B1 && B2))

((B2 || B1) && ~(false || B1)) || (~(false || false) && (B2 || false))

(true && (true && B2)) || ((true && true) && B2)

B2 || B2

B2 # so, Box 2 contains gold
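The same conclusion can be checked by brute force over the four cases. A minimal sketch (the variable and function names here are my own, not from the thread), taking the jester’s spoken claim as given: one box has the frog, the other the gold, and exactly one inscription is true.

```python
from itertools import product

def jester_cases():
    """Enumerate (frog position, which inscription is true) and keep
    the cases where the inscriptions' actual truth values match."""
    consistent = []
    for frog1, t1 in product([True, False], repeat=2):
        frog = {1: frog1, 2: not frog1}   # exactly one frog...
        gold = {1: not frog1, 2: frog1}   # ...and one gold
        false_box = 2 if t1 else 1        # exactly one inscription is true
        true_box = 1 if t1 else 2

        # Inscription 1: "Either this box contains an angry frog, or the box
        # with a false inscription contains an angry frog, but not both."
        i1 = frog[1] != frog[false_box]
        # Inscription 2: "Either this box contains gold and the box with a
        # false inscription contains an angry frog, or this box contains an
        # angry frog and the box with a true inscription contains gold."
        i2 = (gold[2] and frog[false_box]) or (frog[2] and gold[true_box])

        # Consistent iff the computed truth values match the assumed ones.
        if i1 == t1 and i2 != t1:
            consistent.append((1 if frog1 else 2, 1 if t1 else 2))
    return consistent

print(jester_cases())
```

Every consistent case this finds puts the frog in box 1 and the gold in box 2, agreeing with the B2 conclusion above (though more than one truth assignment for the inscriptions survives, which is the dispute raised further down the thread).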

• Rationality is choosing to acknowledge that candlelight is fire, instead of preserving your dignity by maintaining the search.

Now Eliezer has cleverly gotten us to turn down a certain $1,000 by telling us lies about how the other box will contain $1,000,000 if we choose only it! Wasn’t that clever of him?

• The simplest way to solve the jester’s puzzle is to make a table of the four cases (where the frog is, where the true inscription is), then determine for each case whether the inscriptions are in fact true or false as required for that case. (All the while making la-la-la-can’t-hear-you noises at any doubts one might have about whether self-reference can be formalised at all.) The conclusion is that the first box has the frog and the true inscription. That assumes that the jester was honest in stating his puzzle, but given his shock at the outcome of the king’s puzzle, that appears to be so.

But can self-reference be formalised? How, for example, do two perfect reasoners negotiate a deal? In general, how can two perfect reasoners in an adversarial situation ever interpret the other’s words as anything but noise?

“Are you the sort of man who would put the poison into his own goblet or his enemy’s? Now, a clever man would put the poison into his own goblet because he would know that only a great fool would reach for what he was given. I am not a great fool so I can clearly not choose the wine in front of you...But you must have known I was not a great fool; you would have counted on it, so I can clearly not choose the wine in front of me.” …etc.

Or consider a conversation between an FAI that wants to keep the world safe for humans, and a UFAI that wants to turn the world into paperclips.

• I understand that the “turn the world into paperclips” thing comes from a writing of Eliezer, but it is shorthand for a very unlikely scenario. Moreover, this site has gotten really far away from actually dealing with the problems that an unfriendly AGI is likely to cause. Instead, it seems to deal with stupid human problems, foibles, unreason, etc.

The problem with this, as I see it, is that humans are a diverse group, and what’s rational for those without much brainpower is totally irrational for those with a lot of brainpower.

If you have the capacity to develop a functional AGI with mirror neurons, then that’s what you should be doing. If you have the capacity to develop a part of such an AGI, then that’s what you should be doing.

If you don’t have such a capacity (in brains, or in some other necessary capital, such as monetary/material capital), then you shouldn’t waste your time trying to shape the post-singularity future.

Most of this site is word games that point out that words are inadequate communicators, and generally only used to signal status of one primate to another. True enough, but not related to the domain in question: how to stop (or “make less likely”) homo economicus (var. sociopathicus) from killing/displacing the lesser primates?

First, we must realize that sociopathy is our primary problem. We are entering the singularity using sociopath-defined social systems, sociopath-controlled social systems, and sociopath-populated social systems. The tools of liberal democracy have been abandoned, incrementally, due to the former facts / situation(s). Now it’s true that I’ve used a lot of what Marvin Minsky (more succinctly than this site) called “suitcase words.”

You can either debate me in what Ray Kurzweil calls “slow, serial, and imprecise” language (again, more succinctly than this site, in his book “The Age of Spiritual Machines”), or you can “most favorably” interpret what I say, and realize I’m right, and make your way to the fire escape.

Time is short. Human stupidity is long. We will all likely perish. Make haste.

Just to clarify: I think it’s smart to build AGI right now that starts off not knowing much, build it with a weak robot body that can interact with the world towards a goal, allow unlimited self-improvement, and raise the child with love and respect. I think it’s also good to have multiple such “Mind Children.” The more there are, the more likelihood that the non-sociopaths will be able to ameliorate the damage from the sociopaths, both by destroying them, and by designing systems that reward them enough so that the destructiveness of their sociopathy is not fully realized (as humans have tried and failed to do with their own systems).

• We note that the king did not say one thing the jester did: “… one, and only one, of the inscriptions is true.”

• The Jester never assumed that. He showed that if the first inscription is true, it must be false, so he assumed it was false.

• Unlike the jester’s riddle, the king never claimed there was any correlation between the contents of the boxes and the inscriptions on those boxes. The jester merely assumed that there was.

• The jester assumed that the inscriptions on the boxes were either true or false, and nothing else.

• For the inscriptions to be either true or false, they would have to correlate with the contents of the boxes. If he didn’t assume this correlation existed, why would he have bothered trying to solve the implied riddle, and then believe upon solving it that he could choose the correct box?

The assumption that one of the inscriptions is true is also the assumption that the contents of the boxes correlate with the truthfulness of the inscriptions. And the key point is that neither inscription need be true, because the contents of the boxes don’t correlate with the truthfulness of the inscriptions. And in fact, neither inscription was true.

In other words, I don’t understand why you’re arguing a simple clarification of essentially the same point you made.

• He assumed something that implied the correlation, but he did not assume the correlation. He also assumed something that implied that the key was in the second box, but if he assumed that the key was in the second box, he wouldn’t have even bothered reading the inscriptions.

• I’m still not getting the difference. He chose the second box because he deduced that the key must be there based on the assumption that one of the inscriptions was true. There is no equivalence between assuming a key in the second box and deducing a key in the second box based on a false premise.

However, assuming one of the inscriptions is true and assuming a correlation between the inscriptions and the contents of the box seem the same to me. He can’t deduce a correlation between them, because the only basis for such a correlation is the existence of the inscriptions and the basic format of the king’s challenge (which was not identical to the jester’s own riddle). There is nothing in the first inscription to suggest a correlation exists, particularly if he determined that the inscription must be false! It has to be a faulty assumption, and I don’t see how it is different than assuming one of the inscriptions must be true, other than semantically.

I’m not trying to be obtuse here, I’m just not seeing the difference between what you’ve said and what I’ve said.

• > based on the assumption that one of the inscriptions was true.

He did not assume either of the inscriptions was true. He assumed that each was either true or false.

He never assumed a correlation. He deduced a correlation. He was wrong because the deduction hinged on a false assumption.

Edit: Looking back on this, I guess he did assume a correlation. He implicitly assumed that the position of the dagger did not cause the liar paradox. This is still a lot less of an assumption than assuming that either inscription was true.
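That implicit assumption can be made concrete with a small search over truth assignments to the king’s two inscriptions, for each possible location of the dagger. A sketch of my own (the function name is made up):

```python
from itertools import product

def consistent_assignments(key_in_box_2):
    """Truth assignments (t1, t2) for the king's two inscriptions that are
    stable under what the inscriptions actually say."""
    results = []
    for t1, t2 in product([True, False], repeat=2):
        # Inscription 1: "Either both inscriptions are true, or both are false."
        says1 = (t1 == t2)
        # Inscription 2: "This box contains the key."
        says2 = key_in_box_2
        if t1 == says1 and t2 == says2:
            results.append((t1, t2))
    return results

print(consistent_assignments(True))   # key in box 2: two stable assignments
print(consistent_assignments(False))  # dagger in box 2: no stable assignment
```

With the dagger in the second box there is no consistent truth value for the first inscription at all (the liar paradox), yet nothing stops the king from putting the dagger there anyway.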

• In the explanation for the puzzle this is adapted from (Puzzle 70 in What is the Name of this Book?, in the “Portia’s Caskets” chapter), Raymond Smullyan raises both points: “The suitor should have realized that without any information given about the truth or falsity of the sentences, nor any information given about the relation of their truth-values, the sentences could say anything, and the object (portrait or dagger, as the case may be) could be anywhere. Good heavens, I can take any number of caskets that I please and put an object in one of them and then write any inscriptions at all on the lids; these sentences won’t convey any information whatsoever. So Portia was not really lying; all she said was that the object in question was in one of the boxes, and in each case it really was. … Another way to look at the matter is that the suitor’s error was to assume that each of the statements was either true or false.”

The given puzzle (the boxes are labeled “the portrait is not in here” and “exactly one of these two statements is true”, where the portrait is the desired object) is contrasted with an earlier problem, where there are two boxes saying “the portrait is not in here” and “exactly one of these two boxes was labeled by someone who always tells the truth” (and it’s given that the only other box-maker always lies). The distinction the author draws is that the second box in the earlier problem really does have to be true or false, since “it is a historic statement about the physical world”, but there’s no such guarantee with purely self-referential labels.

• > The distinction the author draws is that the second box in the earlier problem really does have to be true or false, since “it is a historic statement about the physical world”

If one of the boxes says that exactly one of them was written by Alice, and you know from another source that Alice always tells the truth, Bob always lies, and both boxes were inscribed by one of them, and Alice and Bob never say anything self-referential, then this is correct.

If it says that one of the boxes was labelled by someone who always tells the truth, then it’s not just talking about the person who labelled that box. It’s also talking about every aspect of reality that they’ve ever referenced, and if they were the one to write that inscription, then it’s self-referential.

• Good point—in the original wording, it says it was inscribed by “Bellini”, who is established earlier to always tell the truth.

• In which case, if Bellini ever references anything self-referential, the idea that he always tells the truth is not a statement about the physical world. It’s likely that the origin of the paradox is that the claim that Bellini always tells the truth and the rest of the scenario are contradictory.

• I notice we’re somehow not debating what Bellini always telling the truth means for the truth value of the inscribed text which may have had no meaning to him?

• > But can self-reference be formalised?

Yes. Gödel demonstrated this.

• If this material conditional is true, you should give me a hundred dollars. ;)

• The King DID lie, because he wrote the inscriptions. What is written on the inscriptions is inaccurate if the dagger is not in the second box.

• Given that it’s strongly implied, and logically necessary, that both inscriptions not be true, I don’t think it could be considered a lie.

• So, if someone came up to you and told you something that couldn’t possibly be true, you’d say they weren’t lying?

• It’s not dishonest anyway. The king did not suggest that all inscriptions he wrote were true, nor did the jester assume that.

• The king did, however, count on the Jester’s assumption that the content of the boxes could be deduced from the inscriptions.

• The King counted on the Jester making a deductive error in the second puzzle (namely inferring that the content of the boxes could be deduced from the inscriptions given what the King said), just like the Jester counted on the King making a deductive error in the first puzzle.

• In this situation, it is still a correct deduction to say “if the statements are true or false, then the content of the boxes is....” With these contents, these statements aren’t true or false.

• Sorry, it’s not clear to me why you wrote this reply. Are you trying to dispute something I said, or are you bringing up an interesting observation for discussion, or what?

• It sounds like Jiro was saying that the Jester really does not assume that “The content of the boxes can be deduced from the inscriptions.” He just assumes “The inscriptions are either true or false,” and it logically follows from what the inscriptions say that he can deduce the contents. So the problem wasn’t making an assumption about how the contents could be discovered, but making an assumption that the inscriptions had to be either true or false.

• If someone came up to you with a puzzle involving inscriptions where there is an expectation that some of the inscriptions are true and some of the inscriptions are false, and nothing the person actually utters is false, then that person was not lying.

In contrast, if someone came up to me and gave me something that looks like a legal notice—a scenario where there is NOT an expectation that the notice might be false—and it turns out that the notice makes a false claim, then that person is indeed “lying”, especially if, when I take the notice and say “Thank you” and start to close my door, the guy says “Actually, you have to pay the fine immediately; you can’t just mail it to the police station later” or whatever.

• > The simplest way to solve the jester’s puzzle is to make a table of the four cases … then determine for each case whether the inscriptions are in fact true or false as required for that case. The conclusion is that the first box has the frog and the true inscription.

If you do this, the case where the second inscription is true and the first box contains a frog is also consistent.

• > If you do this, the case where the second inscription is true and the first box contains a frog is also consistent.

No, because in that case the first inscription would also be true. Both inscriptions cannot be true.

• Markdown syntax. Asterisks give italics. > at start of paragraph for block quotes. Help link just below comment box. Welcome. Etc.

• Interestingly enough, I just mapped this whole problem out carefully in a spreadsheet, and it appears to agree with zzz2. I’ll have to check it now that I’ve seen your comment.

• I must have edited this parable into an inconsistent state at some point—should’ve double-checked it before reprinting it. I’ve rewritten the jester’s explanation to make sense.

• Eliezer will think that this statement is false.

i.e. the above statement.

Of course, when he does, that will make it true, and without paradox, so he will be wrong. On the other hand, if he thinks it is true, it will be false, and without paradox, so he will be wrong.

• He will not be wrong, just ignorant. Hypothetically:

Unknown: Eliezer, do you think that the statement in my comment is false?
Eliezer: Let me see… No, I do not.
U: Aha! Then it is false! Do you think so now?
E: No.
U: Do you think it’s true?
E: No. I understand that I cannot be correct in assigning a truth value to it. Not every sequence of words has a truth value. Moreover, the truth value of some sentences can never be known to me.
U: This makes me so much more confident that the sentence is false.

So we all know something Eliezer cannot ever know. He may even read these lines, and it’ll still be the little secret of humanity-minus-Eliezer.

• So, the king put the dagger in the second box that he touched, without regard for whether the jester can find it—right? Is that what the last sentence means?

• The last sentence is the King pointing out to the Jester that all the reasoning in the world is no good if it is based on false premises, in this case the false presumption was that the text on the boxes was truthful.

• Ian, no, the jester didn’t presume the text was true: he simply presumed the first inscription was either true or false, and the problem arose from this presumption.

In my example, on the other hand, the statement is actually true or false, but Eliezer can never know which (if he doesn’t decide, then it is false, but he will never know this, since he will be undecided.)

• I always thought that the statement “You can never know that this statement is true” illustrates the principle most clearly.

• You’re right, zzz. Proof, if I needed it, that I am not yet a perfect reasoner.

Caledonian: While Gödel formalised some sorts of self-reference, it’s not clear to me how his work applies to puzzles like these, or to the question of how hostile perfect reasoners can communicate. Barwise and Etchemendy’s “The Liar” has other approaches to the problem, but I don’t think they solve it either.

• > the question of how hostile perfect reasoners can communicate

Hostile reasoners are rarely interested in communicating with each other. When they are, they use language—just like everyone else.

• Oh, I get it, the other box couldn’t contain a dagger as well, because the king explicitly said that only one box has a dagger in it. But he never claimed that the writings on boxes are in any way related to the contents of the boxes. Is that it? Or is it that if the “both are true or both are false” sign is false, basically anything goes?

This reminds me strongly of a silly Russian puzzle. In the original it is about turtles, but I sort of prefer to translate it using bulls. So, three bulls are walking single file across the field. The first bull says “There are two bulls behind me and no bulls in front of me.” The second one says “There is a bull in front of me and a bull behind me.” The third one says “There are two bulls in front of me and two bulls behind me.”

• > The third one says “There are two bulls in front of me and two bulls behind me.”

Sorry, don’t you mean, “0 in front / 2 behind”? (third bull is walking backwards)

• JonathanG,

Actually, the third bull is just straight up lying. (That’s why Dmitriy called the puzzle silly.)

• Oh, I assumed that they were walking in a circle and the third bull was counting both ahead of him and behind him, even though those bulls are both the same, on the assumption that ‘single file’ =/= ‘straight line’.

• Using the jester’s reasoning, it’s possible to make him believe that the earth is flat by writing down “either this inscription is true and the earth is flat, or this inscription is false and the earth is not flat, but not both”. This makes an unflat earth logically impossible!

I wonder what this says about the law of the excluded middle; I guess it slides if self-reference is involved.

• It’s not the law of the excluded middle that’s the problem, it’s the jester’s assumption that the entire statement “either this …, or this..., but not both” is true. The jester reasons correctly under his assumptions, but fails to realize that he still has to discharge those assumptions before reaching reality.
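The flat-earth inscription has the same structure as the king’s first box, and the point about discharging assumptions can be checked mechanically. A tiny sketch of my own that looks for a stable truth value for the inscription under each state of the world:

```python
# The inscription S says: "Either S is true and the earth is flat,
# or S is false and the earth is not flat, but not both."
def stable_values(earth_is_flat):
    """Truth values for S that match what S actually says."""
    return [s for s in (True, False)
            if s == ((s and earth_is_flat) != ((not s) and (not earth_is_flat)))]

print(stable_values(True))   # flat earth: S can consistently be True or False
print(stable_values(False))  # round earth: no consistent value at all
```

On a round earth the inscription isn’t false, it is paradoxical: there is no truth value it can consistently take. The excluded middle only seems violated because the inscription was assumed to have one.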

• “One box contains a key,” said the king, “to unlock your chains; and if you find the key you are free. But the other box contains a dagger for your heart, if you fail.”

And the Jester opened both boxes, successfully finding (that is, not failing to find) the key. Of course, the King could declare “you know what I meant to say” and kill him anyway but that does change the intended moral somewhat.

• Well, I’m certainly not going to object to that moral.

• > And the Jester opened both boxes

… and was first set free from his chains, and then stabbed through the heart with the dagger.

• Nope. The dagger is only if he fails to find the key, NOT if he succeeds in finding the dagger.

• A problem with self-reference which I find nearly as amusing but which is much more terse:

“This sentence is false, and Santa Claus does not exist.”

• I have created an exercise that goes with this post. Use it to solidify your knowledge of the material.

• It took me a while to understand this one because there’s a lot of assumptions within it. They are:

• that the king isn’t lying

• that the king isn’t mistaken

• that the inscription isn’t lying

• that there is in fact one key or dagger.

All of which have to be taken on faith. Which my brain obviously couldn’t handle.

But if you believe all of that, then you should find that: as the king told you one box contained the key, there is only one key; and if the other box is to be believed, “that both boxes contain the same mystery item”, then that’s a contradiction, which means the opposite box is more likely to be true.

However this is wrong.

If the king is to be believed, then there’s a 50/50 chance no matter what box you pick. But if the box is to be believed, then the other box is the container, but it could be lying. Therefore the chance is still an irreducible 50/50. Furthermore, believing either claim would require an assumption that the game was set up fairly or unfairly. And we know assumptions to be fallacies and never to make them.

The answer to the box question can only be worked out once the box is opened and the evidence is found. The validity of the claims can only be tested by using them.

As this is used as a proof of the core sequence “37 ways words can be wrong”, item “A word fails to connect to reality in the first place”:

I must say that it in no way supports this conclusion.

• The only necessary assumptions are that the King isn’t lying, and that he isn’t mistaken. Once you know this, you can deduce that there is one key and one dagger.

The jester made an additional, incorrect, assumption that everything on the first box was either “true” or “false”.

• > And the first box was inscribed: “Either both inscriptions are true, or both inscriptions are false.” And the second box was inscribed: “This box contains the key.”

Suppose the second inscription is false. In that case, the first inscription must also be false, in which case the king can put whatever he damn well pleases in the boxes.

• The first inscription says that the inscriptions have the same truth value. If the second one is false then the first one implies that it is false which, in turn, implies that the first one is true. Contradiction. So the premise that “the second inscription is false” is false. So the second inscription is true.

The Jester’s logical inference is right. The point isn’t that the Jester’s logic was wrong—it wasn’t. It’s that the Jester assumed that the locations of the key and the dagger would follow the logic when there really was no good reason to assume so. This is meant to illustrate that making unwarranted assumptions about reality isn’t a good idea.

• That would make the first inscription true. (And therefore false, and therefore paradoxical, etc)

• Was there enough information around for the Jester to correctly determine the box? I guess he could have figured that the more obvious solution was the key being in the box labelled as having the key in it, and the king was mad at him, so that probably wasn’t it.

That doesn’t seem like all that strong evidence.

• The parable implied the disconnect between inscriptions and box content, so no, there couldn’t have been enough information.

• Do I read this correctly—that there was no key?

• That’s incorrect—the king’s uttered words (“One box contains a key, to unlock your chains; and if you find the key you are free. But the other box contains a dagger for your heart, if you fail.”) were still completely true. The key was in the first box, the dagger in the second.

It’s just that the jester’s reasoning about the supposed logical impossibility of the statements inscribed on the boxes was utter nonsense. He knew that neither of the statements inscribed need have been true, but he still foolishly argued himself into thinking that whether true or false they ‘proved’ the key being in the second box.

• So then the actual correct solution, per the king's description of events, would be to ignore the inscriptions and just open both boxes?

The King didn't say that he'd be killed if he found the dagger, only that the dagger would be employed if he failed to find the key. Opening both boxes means finding the key; therefore, open both boxes.

(Bonus points for chutzpah if he opens the box with the knife first, says "cool! this will make opening the other box MUCH easier!" and then uses that to get the key out of the second box.)

• King: Very clever. (To the guards) Set him free from the top of the tallest tower.

• I suppose the message here is that though the inscriptions (literally) labeled the boxes as X and Y, this does not conform to reality. The words do not make it true, and the Jester made the mistake of presuming that his strict logic meant that reality had to follow the labels that were given. His last words, sadly, were "It's logically impossible!" One should reconsider calling things logical impossibilities when they are occurring right in front of you. Who knows what other logical impossibilities you were missing?

If I were a man of literature, I would also comment on the juxtaposition of the Jester and the King. The Jester, who is a fan of logic, lives in the court. His devotion to logical reasoning plays itself out in entertainment form, whether privately in his bedroom or by sticking an angry frog onto a king. The King, on the other hand, lives in a world of politics, diplomacy, and war. He does not have the luxury of syllogisms, as he is surrounded by flatterers, rivals and enemies. He cannot presume that anything presented to him is not an exaggeration, an inaccuracy, or an outright lie.

The final moral: do not stick angry frogs on someone who has the ability and the potential disposition to kill you. Or more generally, do not stick angry frogs onto people; it is just bad behavior. Just don't do it.

• Or, as P.T. Barnum put it… there is a sucker born every minute.

Even for jesters, it's never a good idea to humiliate the King...

• There are a lot of comments here that say that the jester is unjustified in assuming that there is a correlation between the inscriptions and the contents of the boxes. This is, in my opinion, complete and utter nonsense. Once we assign meanings to the words true and false (in this case, "is an accurate description of reality" and "is not an accurate description of reality"), all other statements are either false, true or meaningless. A statement can be meaningless because it describes something that is not real (for example, "This box contains the key" is meaningless if the world does not contain any boxes) or because it is inconsistent (it has at least one infinite loop, as with "This statement is false"). If a statement is meaningful it affects our observations of reality, and so we can use Bayesian reasoning to assign a probability for the statement being true. If the statement is meaningless, we cannot assign a probability for it being true without violating our assumption that there is a consistent underlying reality to observe, in which case we cannot trust our observations. Halt, Melt and Catch Fire.

The statement "This box contains the key" is a description of reality, and is either false or true. The statement "Both inscriptions are true" is meaningful if there exists another inscription: true if the second description is true and false if the second description is false or meaningless. The statement "Both inscriptions are false" is meaningless because it is inconsistent—we cannot assign a truth-value to it. The statement "Either both inscriptions are true, or both inscriptions are false" is therefore either true (both inscriptions are true, implying that the key is in box 2) or meaningless. In the latter case, we can gain no information from the statement—the jester might as well have been given only the second box and the second inscription. The jester's mistake lies in assuming that both inscriptions must be meaningful—"one is meaningless and the other is false" is as valid an answer as "both are true", in that both of those statements are meaningful—the latter is true if the second box contains the key, and the former is true if the second box does not contain the key. The jester should have evaluated the probability that the problem was meant to be solvable and the probability that the problem was not meant to be solvable, given that the problem is not solvable—which is an assessment of the king's ability at puzzle-devising and the king's desire to kill the jester.

It is also provable that we cannot assign a probability of 1 or 0 to any statement's truth (including tautologies), since we must have some function from which truth and falsity are defined, and specifying both an input and an output (a statement and its truth value) changes the function we use. If a statement is assigned a truth-value except by the rules of whatever logical system we pick, the logical system fails and we cannot draw any inferences at all. A system with a definition of truth, a set of truth-preserving operations and at least one axiom must always be meaningless—the assumption of the axiom's truth is not a truth-preserving operation, and neither is the assumption that our truth-preserving operations are truth-preserving. Axiomatic logic works only if we accept the possibility that the axioms might be false and that our reasoning might be flawed—you can't argue based on the truth of A without either allowing arguments based on ~A or including "A" in your definition of truth. In other words, axiomatic logic can't be applied to reality with certainty—we would end up like the jester, asserting that reality must be wrong. As a consequence of the above, defining "true" as "reflecting an observable underlying reality" implies that all meaningful statements must have observable consequences.

The argument above applies to itself. The last sentence applies to itself and the paragraph before that. The last sentence… (If I acquire the karma to post articles, I'll probably write one explaining this in more detail, assuming anyone's interested.)

• All of these comments on the jester wrongly assuming the box inscriptions related to the world seem overwrought to me. I created this account just to make this point (and because this site looks amazing!):

The jester's only mistake was discounting the possibility of both inscriptions being false.

That's it... the inscriptions (both) 'being false'. Not 'pertaining to the real world', not 'having truth values'... just 'being false'.

He figured out that it could not be the case that both inscriptions were true—so far so good. He then assumed that it must be the case that one was true and the other false, which allowed for only one of the two remaining possibilities (one true and one false, or two false). He was modelling his solution after the earlier problem he had constructed (with the frog and the gold), or he was essentially trying to maximize the number of true inscriptions, or both. Neither was warranted.

(I mostly agree with the poster above (Chrysophylax), or at least the first two paragraphs of that long post, in that the inscriptions certainly did have truth values pertaining to the world and specifically to the contents of the boxes. That is mostly the point I wanted to make. I disagree with her or him about this part, though: "The statement 'Both inscriptions are false' is meaningless because it is inconsistent—we cannot assign a truth-value to it." I see that statement as false, not meaningless. So I actually take slightly more possible statements as pertaining to the world and having actual truth values than does Chrysophylax (which in turn is far more than most other commenters here seem to be reporting)… basically anything that does not match Chrysophylax's other examples of meaningless statements. I could even go so far as saying that the statement "The invisible unicorn is happy." is false while maybe also being 'meaningless' (maybe because it demands the acceptance of the false statement "An invisible unicorn exists." and could be translated as "There exists an invisible unicorn, and it is happy."). I'd love to hear opinions on that, though!)

• That's it... the inscriptions (both) 'being false'. Not 'pertaining to the real world', not 'having truth values'... just 'being false'.

If they were both false, that would make the first inscription true.

• If they were both false, that would make the first inscription true.

Indeed. So the inscription is both true and false. You got a problem with that? ;)

• If something is both true and false, then it becomes trivial to prove that any given fact is true. This is called the principle of explosion.

If it's neither true nor false, that doesn't happen.

To make definitions clear, I am using "X is false" to mean "Not X is true", rather than "X is something other than true".
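The principle of explosion can also be seen semantically: no valuation makes a contradiction true, so "contradiction implies Q" holds vacuously for any Q whatsoever. A small sketch, assuming classical two-valued logic:

```python
from itertools import product

# (P and not P) is false under every valuation, so the material
# implication (P and not P) -> Q is true for every P and every Q:
# from a contradiction, anything follows.
explosion_holds = all((not (P and not P)) or Q
                      for P, Q in product([True, False], repeat=2))

print(explosion_holds)  # True
```

If the first inscription is instead treated as neither true nor false, there is no contradiction to explode, which is the point being made above.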

• ...and then the first inscription would be false, etc.

If you are pointing out that it would be unstable in that way, or 'meaningless', then OK, good point.

(I did specify that I see the statement "Both inscriptions are false" as false rather than just meaningless, though, and the first inscription would be of that same form if the second one were false.)

In any case I still defend the jester's impression that statements have truth values (excluding 'meaningless' ones, as necessary), while still faulting him for something else entirely:

He was (still) modelling his solution after the earlier problem he had constructed (with the frog and the gold), or he was assuming a situation in which none of the statements were 'meaningless'. Neither was warranted.

(That is one step closer to what many commenters have mentioned, but "This box contains the key." is plainly just false, not unconnected to the world.)

• ...but could not the Jester rattle the boxes before opening one, and then update his beliefs upon that evidence? I mean, it would not be much to go by, but it's better than nothing… 'But Sire, whatever I find, you lose a Jester! What can ever reconcile you to such a lamentable tragedy?' 'A goblet from your skull?' 'In that case, the important thing for me is not to find the dagger, for which the best choice is not to choose any box.' 'Then you fail by default.' 'Then take the box with the dagger, since I failed by default, and I shall pick the other one.'

• Regarding the correlation between inscriptions and contents being merely assumed: are the spoken claims any different? I don't see them being called into question the same way.

• There isn't a correlation between these inscriptions and the implied contents (since he could have put the key and dagger in either box), but there is a correlation between {the inscriptions and contents} and the king's honesty. The king didn't lie, and he wouldn't have put inscriptions and contents into an arrangement that would make it true that he lied. This puts a constraint on how he could arrange the inscriptions and contents.

• Salient point: why do you mention arrangements of inscriptions and contents at all? That is what confuses me. Either the arrangements matter at some point—such as at inscribing—in which case there had been a lie when the king labeled an (apparently?) empty box with "This box contains the key." (not "this box doesn't contain the dagger", which would have been true), or not at all, in which case I reiterate my previous question.

• Assume not that it is true or false; assume that it's a paradox (i.e. both true and false), and from that it follows that the king didn't lie.

But, still, that's not the only moral of the story. A moral of the story is also that we shouldn't start by assuming some statements are either true or false, and then see what that implies about the referents, unless those statements are entangled with their referents. If statements aren't entangled with their referents, then no logical reasoning from those statements can tell you anything about the referents.

• The king wrote "This box contains the key." on the 2nd box before putting the dagger in. Did the second box contain the key as well as the dagger?

• I can't speak for Eliezer's intentions when he wrote this story, but I can see an incredibly simple moral to take away from it. And I can't shake the feeling that most of the commenters have completely missed the point.

For me, the striking part of this story is that the Jester is shocked and confused when they drag him away. "How?!" he says. "It's logically impossible!" The Jester seems not to understand how it is possible for the dagger to be in the second box. My explanation goes as follows, and I think I'm just paraphrasing the king here:

1. If a king has two boxes and a means to write on them, then he can write any damn thing on them that he wants to.
2. If a king also has a dagger, then he can place that dagger inside one of the two boxes, and he can place it in whichever box he decides to place it in.

That's it. That's the entire explanation for how the dagger could "possibly" be inside the second box. It's a very simple argument, one a five-year-old could understand, and no amount of detailed consideration by a logician is going to stop this simple argument from being true.

The jester, however, thought it was impossible for the dagger to be in the second box. Not just that it wasn't there, but that it was IMPOSSIBLE. That's how I read the story, anyway. He used significantly more complicated logic, and he thought that he'd proven it impossible. But it only takes a moment's reflection to see that he's wrong.

Some of the comments above have tried to work out what was wrong with the Jester's logic, and they've explained the detailed and subtle flaws in his reasoning. That's great—if you want to develop a deep understanding of logic, self-referential statements, and mathematical truth values (and let's be fair, I suppose most of us do)—but in the context of the sequences on rationality, I think there's a much better lesson to learn.

Remember: rationalists are supposed to WIN. We're supposed to develop reasoning skills that give us a better and more useful understanding of reality. So the lesson is this: don't be seduced by complex and detailed logic, if that logic is taking you further and further away from an accurate description of reality. If something is already true, or already false, then no amount of reasoning will change it.

Reality is NOT required to conform to your understanding or your reasoning. It is your reasoning that should be required to conform to reality.

• Breaking #24 of the Evil Overlord List makes me wince, too, even if it's a jester doing it. I'm not sure that's the main point, though. But then, none of the proposed explanations for how the king could pull his "riddle" off without at any point lying feel entirely right to me. So, unless someone offers to help me, I shall have to take your advice and not let myself get entangled in the "complex and detailed logic", when the answer might as well be "BS".

• There's a lot of value in that. Sometimes it's best not to go down the rabbit hole.

Whatever the technicalities might be, the jester definitely followed the normal, reasonable rules of this kind of puzzle, and by those rules he got the right answer. The king set it up that way, and set the jester up to fail.

If he'd done it to teach the jester a valuable lesson about the difference between abstract logic and real life, then it might have been justified. But he's going to have the jester executed, so that argument disappears.

I think we can all agree: the King is definitely a dick.

• I'll somewhat echo what CynicalOptimist wrote. I think the message is one any first-semester logic student should have been taught: a valid argument does not necessarily have a true conclusion. Validity is all about the form of the argument. The truth of the conclusion is an external fact, existing completely independent of the argument's structure.

• I'm trying to stay levelheaded about King Richard. What I meant was that there seem to be extraneous details here—about the order things were done in (first inscribe "key is here" on an empty(?) box, then put the dagger in), or that it was written, not spoken. Many comments only reinforce the importance of that.

The "real" answer seems to be one that effectively makes all kinds of communication useless, and what I've spent so much time on was trying to pin down the borders of this insanity—some marker saying "abstract logic application to real life* not allowed past this point".

*) the use of physical boxes binding the riddle to "real life"

• The jester should have seen this coming.

"Either both inscriptions are true, or both inscriptions are false."

If this statement is true, then the second box must hold the key, by the jester's reasoning. However, if this statement is false, then it doesn't require that the second statement be true. In his testing, the jester negated only half of the statement at a time. If this statement is entirely false, then it could simply mean that the true-false values of the statements on either box have no relationship to each other. Which did indeed turn out to be the case.

In other words: if the statement on the second box is false, then the statement on the first box claiming that the statements on the two boxes are in any way related is also false, and the fact that both being false would cause a paradox if the first statement were true is not relevant.

The jester made the mistake of assuming "at least one of the statements is true" and confused validity with soundness, and therefore deserved to be stabbed.
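The validity/soundness split can be made concrete: the jester's argument form checks out, but the premise that the inscriptions are tied to the boxes' contents does not. A rough sketch (variable names are mine), where the key's location is an independent variable the king controls:

```python
from itertools import product

# key_in_2: where the king actually put the key (True = second box).
# s1, s2: truth values the jester assigns to the inscriptions.
# We keep the jester's constraint on s1 (it must correctly report
# whether s1 == s2), but drop the assumption that s2 tracks the key.
possible = {key_in_2
            for key_in_2, s1, s2 in product([True, False], repeat=3)
            if s1 == (s1 == s2)}

print(possible)  # both True and False survive: the key can be in either box
```

Every surviving assignment still has s2 true, so the argument is valid; but since nothing forces s2 to match key_in_2, the conclusion about the key is unsound. The dagger goes wherever the king damn well pleases.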