Karate Kid and Realistic Expectations for Disagreement Resolution

There’s an essay that periodically feels deeply relevant to a situation:

Someday I want to write a self-help book titled “F*k The Karate Kid: Why Life is So Much Harder Than We Think”.

Look at any movie with a training montage: The main character is very bad at something, then there is a sequence in the middle of the film set to upbeat music that shows him practicing. When it’s done, he’s an expert.

It seems so obvious that it actually feels insulting to point it out. But it’s not obvious. Every adult I know—or at least the ones who are depressed—continually suffers from something like sticker shock (that is, when you go shopping for something for the first time and are shocked to find it costs way, way more than you thought). Only it’s with effort. It’s Effort Shock.

We have a vague idea in our head of the “price” of certain accomplishments, how difficult it should be to get a degree, or succeed at a job, or stay in shape, or raise a kid, or build a house. And that vague idea is almost always catastrophically wrong.

Accomplishing worthwhile things isn’t just a little harder than people think; it’s 10 or 20 times harder. Like losing weight. You make yourself miserable for six months and find yourself down a whopping four pounds. Let yourself go at a single all-you-can-eat buffet and you’ve gained it all back.

So, people bail on diets. Not just because they’re harder than they expected, but because they’re so much harder it seems unfair, almost criminally unjust. You can’t shake the bitter thought that, “This amount of effort should result in me looking like a panty model.”

It applies to everything. [The world] is full of frustrated, broken, baffled people because so many of us think, “If I work this hard, this many hours a week, I should have (a great job, a nice house, a nice car, etc). I don’t have that thing, therefore something has corrupted the system and kept me from getting what I deserve.”

Last time I brought this up, it was in the context of realistic expectations for self-improvement.

This time it’s in the context of productive disagreement.

Intuitively, it feels like when you see someone being wrong, and you have a simple explanation for why they’re wrong, it should take you, like, 5 minutes of saying “Hey, you’re wrong, here’s why.”

Instead, Alice and Bob might debate and doublecrux for 20 hours, making a serious effort to understand each other’s viewpoint… and the end result is a conversation that still feels like moving through molasses, with both Alice and Bob feeling like the other is missing the point.

And if 20 hours seems long, try years.

AFAICT the Yudkowsky/Hanson Foom Debate didn’t really resolve. But, the general debate over “should we expect a sudden leap in AI abilities that leaves us with a single victor, or a multipolar scenario?” has actually progressed over time. Paul Christiano’s Arguments About Fast Takeoff seemed most influential in reframing the debate in a way that helped some people stop talking past each other, and focus on the actual different strategic approaches that the different models would predict.

Holden Karnofsky initially had some skepticism about MIRI’s (then SIAI’s) approach to AI Alignment. Those views changed over the course of years.

On the LessWrong team, we have a lot of disagreements about how to make various UI tradeoffs, which we still haven’t resolved. But after a year or so of periodically chatting about them, I think we at least have better models of each other’s reasoning, and in some cases we’ve found third options that resolved the issue.

I have observed myself taking years to really assimilate the worldviews of others.

When you have deep frame disagreements, I think “years” is actually just a fairly common timeframe for processing a debate. I don’t think this is a necessary fact about the universe, but it seems to be the status quo.


The reasons a disagreement might take years to resolve vary, but a few include:

i. Complex Beliefs, or Frame Differences, that take time to communicate

Where the blocker is just “dedicating enough time to actually explaining things.” Maybe the total process only takes 30 hours, but you have to actually do the 30 hours, and people rarely dedicate more than 4 at a time, and then don’t prioritize finishing it that highly.

ii. Complex Beliefs, or Frame Differences, that take time to absorb

Sometimes it only takes an hour to explain a concept explicitly, but it takes a while for that concept to propagate through your implicit beliefs. (Maybe someone explains a pattern in social dynamics, and you nod along and say “okay, I could see that happening sometimes”, but then over the next year you start to see it happening, and you don’t “really” believe in it until you’ve seen it a few times.)

Sometimes it’s an even vaguer thing, like “I dunno man, I just needed to relax and not think about this for a while for it to subconsciously sink in somehow.”

iii. Idea Inoculation + Inferential Distance

Sometimes the first few people explaining a thing to you suck at it, giving you the impression that anyone advocating the thing is an idiot, and causing you to subsequently dismiss people who pattern-match to those bad arguments. Then it takes someone who puts a lot of effort into an explanation that counteracts that initial bad taste.

iv. Hitting the right explanation / circumstances

Sometimes it just takes a specific combination of “the right explanation” and “being in the right circumstances to hear that explanation” to get a magical click, and unfortunately you’ll need to try several times before the right one lands. (And, like reason i above, this doesn’t necessarily take that much time, but nonetheless takes years of intermittent attempts before it works.)

v. Social pressure might take time to shift

Sometimes it just has nothing to do with good arguments and rational updates – it turns out you’re a monkey whose window of possible beliefs depends a lot on what other monkeys around you are willing to talk about. In this case it takes years for enough people around you to change their minds first.

Hopefully you can take actions to improve your social resilience, so you don’t have to wait for that, but I bet it’s a frequent cause.

Optimism and Pessimism

You can see this glass as half-empty or half-full.

Certainly, if you’re expecting to convince people of your viewpoint within a matter of hours, you may sometimes have to come to terms with that not always happening. If your plans depend on it happening, you may need to re-plan. (Not always: I’ve also seen major disagreements get resolved in hours, and sometimes even 5 minutes. But, “years” might be an outcome you need to plan around. If it is taking years, it may not be worthwhile unless you’re actually building a product together.)

On the plus side… I’ve now gotten to see several deep disagreements actually progress. I’m not sure I’ve seen a years-long disagreement resolve completely, but I have definitely seen people change their minds in important ways. So I now have an existence proof that this is even possible to address.

Many of the reasons listed above seem addressable. I think we can do better.