Strong stances

I. The question of confidence

Should one hold strong opinions? Some say yes. Some say that while it’s hard to tell, it tentatively seems pretty bad (probably).

A quick review of purported or plausible pros:

  1. Strong opinions lend themselves to revision:

    1. Nothing will surprise you into updating your opinion if you thought that anything could happen. A perfect Bayesian might be able to deal with myriad subtle updates to vast uncertainties, but a human is more likely to notice a red cupcake if they have claimed that cupcakes are never red. (Arguably—some would say having opinions makes you less able to notice any threat to them. My guess is that this depends on topic and personality.)

    2. ‘Not having a strong opinion’ is often vaguer than having a flat probability distribution, in practice. That is, the uncertain person’s position is not, ‘there is a 51% chance that policy X is better than policy -X’, it is more like ‘I have no idea’. Which again doesn’t lend itself to attending to detailed evidence.

    3. Uncertainty breeds inaction, and it is harder to run into more evidence if you are waiting on the fence than if you are out there making practical bets on one side or the other.

  2. (In a bitterly unfair twist of fate) being overconfident appears to help with things like running startups, or maybe all kinds of things.
    If you run a startup, common wisdom advises going around saying things like, ‘Here is the dream! We are going to make it happen! It is going to change the world!’ instead of things like, ‘Here is a plausible dream! We are going to try to make it happen! In the unlikely case that we succeed at something recognizably similar to what we first had in mind, it isn’t inconceivable that it will change the world!’ Probably some of the value here is just a zero-sum contest to misinform people into misinvesting in your dream instead of something more promising. But some is probably real value—suppose Bob works full time at your startup either way. I expect he finds it easier to dedicate himself to the work, and has a better time, if you are more confident. It’s nice to follow leaders who stand for something, which tends to go with having at least some strong opinions. Even alone, it seems easier to work hard on a thing if you think it is likely to succeed. If being unrealistically optimistic just generates extra effort to put toward your project’s success, rather than stealing time from something more promising, that is a big deal.

  3. Social competition
    Even if the benefits of overconfidence in running companies and such were all zero-sum, everyone else is doing it, so what are you going to do? Fail? Only employ people willing to work at less promising-looking companies? Similarly, if you go around being suitably cautious in your views while other people are unreasonably confident, then onlookers who trust both of you will be more interested in what the other people are saying.

  4. Wholeheartedness
    It is nice to be the kind of person who knows where they stand and what they are doing, instead of always living in an intractable set of place-plan combinations. It arguably lends itself to energy and vigor. If you are unsure whether you should be going North or South, having reluctantly evaluated North as a bit better in expected value, for some reason you often still won’t power North at full speed. It’s hard to passionately be really confused and uncertain. (I don’t know if this is related, but it seems interesting to me that the human mind feels as though it lives in ‘the world’—this one concrete thing—though its epistemic position is in some sense most naturally seen as a probability distribution over many possibilities.)

  5. Creativity
    Perhaps this is the same point, but I expect my imagination for new options kicks in better when I think I’m in a particular situation than when I think I might be in any of five different situations (or worse, in any situation at all, with different ‘weightings’).

A quick review of the con:

  1. Pervasive dishonesty and/or disengagement from reality
    If the evidence hasn’t led you to a strong opinion, and you want to profess one anyway, you are going to have to somehow disengage your personal or social epistemic processes from reality. What are you going to do? Lie? Believe false things? These both seem so bad to me that I can’t consider them seriously. There is also this sub-con:

    1. Appearance of pervasive dishonesty and/or disengagement from reality
      Some people can tell that you are either lying or believing false things, due to your boldly claiming things in this uncertain world. They will then suspect your epistemic and moral fiber, and distrust everything you say.

  2. (There are probably others, but this seems like plenty for now.)

II. Tentative answers

Can we have the pros without the devastatingly terrible con? Some ideas that come to mind or have been suggested to me by friends:

1. Maintain two types of ‘beliefs’. One set of play beliefs—confident, well understood, probably-wrong—for improving in the sandpits of tinkering and chatting, and one set of real beliefs—uncertain, deferential—for when it matters whether you are right. For instance, you might have some ‘beliefs’ about how cancer can be cured by vitamins that you chat about and ponder, and read journal articles to update, but when you actually get cancer, you follow the expert advice to lean heavily on chemotherapy. I think people naturally do this a bit, using words like ‘best guess’ and ‘working hypothesis’.

I don’t like this plan much, though admittedly I basically haven’t tried it. For your new fake beliefs, either you have to constantly disclaim them as fake, or you are again lying and potentially misleading people. Maybe that is manageable through always saying ‘it seems to me that…’ or ‘my naive impression is…’, but it sounds like a mess.

And if you only use these beliefs on unimportant things, then you miss out on a lot of the updating you were hoping for from letting your strong beliefs run into reality. You get some though, and maybe you just can’t do better than that, unless you want to be testing your wacky theories about cancer cures when you have cancer.

It also seems like you won’t get a lot of the social benefits of seeming confident, if you still don’t actually believe strongly in the really confident things, and have to constantly disclaim them.

But I think I actually object because beliefs are for true things, damnit. If your evidence suggests something isn’t true, then you shouldn’t be ‘believing’ it. And also, if you know your evidence suggests a thing isn’t true, how are you even going to go about ‘believing it’? I don’t know how to.

2. Maintain separate ‘beliefs’ and ‘impressions’. This is like 1, except impressions are just claims about how things seem to you. e.g. ‘It seems to me that vitamin C cures cancer, but I believe that that isn’t true somehow, since a lot of more informed people disagree with my impression.’ This seems like a great distinction in general, but it seems a bit different from what one wants here. I think of this as a distinction between the evidence that you received and the total evidence available to humanity, or perhaps between what is arrived at by your own reasoning about everyone’s evidence vs. your own reasoning about what to make of everyone else’s reasoning about everyone’s evidence. However, these are about ways of getting a belief, and I think what you want here is actually just some beliefs that can be got in any way. Also, why would you act confidently on your impressions if you thought they didn’t account for others’ evidence, say? Why would you act on them at all?

3. Confidently assert precise but highly uncertain probability distributions: “We should work so hard on this, because it has like a 0.03% chance of reshaping 0.5% of the world, making it a 99.97th percentile intervention in the distribution we are drawing from, so we shouldn’t expect to see something this good again for fifty-seven months.” This may solve a lot of problems, and I like it, but it is tricky.
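
As a sanity check, the mock-precise numbers above do hang together under simple assumptions. A minimal sketch, purely illustrative; the draws-per-month figure is my own back-calculation, not something from the text:

```python
# Illustrative arithmetic only: how the mock-precise claims above fit together.
p_top = 0.0003                    # "0.03% chance" of the big outcome

# If only 0.03% of draws from the distribution are at least this good,
# the intervention sits at the (1 - 0.0003) = 99.97th percentile.
percentile = (1 - p_top) * 100    # -> 99.97

# For independent draws, the expected number of draws until the next one
# at least this good is 1 / p_top.
expected_draws = 1 / p_top        # -> ~3,333 draws

# "Fifty-seven months" then implicitly assumes a draw rate (hypothetical):
implied_rate = expected_draws / 57    # -> ~58 draws per month

print(f"{percentile:.2f}th percentile; ~{expected_draws:.0f} draws; "
      f"~{implied_rate:.0f} draws/month implied")
```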

4. Just do the research so you can have strong views. To do this across the board seems prohibitively expensive, given how much research it seems to take to be almost as uncertain as you were on many topics of interest.

5. Focus on acting well rather than on your effects on the world. Instead of trying to act decisively on a 1% chance of this intervention actually bringing about the desired result, try to act decisively on a 95% chance that this is the correct intervention (given your reasoning suggesting that it has a 1% chance of working out). I’m told this is related to Stoicism.

6. ‘Opinions’
I notice that people often have ‘opinions’, which they are not very careful to make true, and do not seem to straightforwardly expect to be true. This seems to be commonly understood by rationally inclined people as some sort of failure, but I could imagine it being another solution, perhaps along the lines of 1.

(I think there are others around, but I forget them.)

III. Stances

I propose an alternative solution. Suppose you might want to say something like, ‘groups of more than five people at parties are bad’, but you can’t because you don’t really know, and you have only seen a small number of parties in a very limited social milieu, and a lot of things are going on, and you are a congenitally uncertain person. Then instead say, ‘I deem groups of more than five people at parties bad’. What exactly do I mean by this? Instead of making a claim about the value of large groups at parties, make a policy choice about what to treat as the value of large groups at parties. You are adding a new variable, ‘deemed large group goodness’, between your highly uncertain beliefs and your actions. I’ll call this a ‘stance’. (I expect it isn’t quite clear what I mean by a ‘stance’ yet, but I’ll elaborate soon.) My proposal: to be ‘confident’ in the way that one might be from having strong beliefs, focus on having strong stances rather than strong beliefs.

Strong stances have many of the benefits of confident beliefs. With your new stance on large groups, when you are choosing whether to arrange chairs and snacks to discourage large groups, you skip over your uncertain beliefs and go straight to your stance. And since you decided it, it is certain, and you can rearrange chairs with the vigor and single-mindedness of a person who knows where they stand. You can confidently declare your opposition to large groups, and unite followers in a broader crusade against giant circles. And if at the ensuing party people form a large group anyway and seem to be really enjoying it, you will hopefully notice this the way you wouldn’t if you were merely uncertain-leaning-against regarding the value of large groups.

That might have been confusing, since I don’t know of good words to describe the type of mental attitude I’m proposing. Here are some things I don’t mean by ‘I deem large group conversations to be bad’:

  1. “Large group conversations are bad” (i.e. this is not about what is true, though it is related to that.)

  2. “I declare the truth to be ‘large group conversations are bad’” (i.e. this is not of a kind with beliefs. It is not directly about what is true about the world, or empirically observed, though it is influenced by these things. I do not have power over the truth.)

  3. “I don’t like large group conversations”, or “I notice that I act in opposition to large group conversations” (i.e. it is not a claim about my own feelings or inclinations, which would still be a passive observation about the world.)

  4. “The decision-theoretically optimal value to assign to large groups forming at parties is negative”, or “I estimate that the decision-theoretically optimal policy on large groups is opposition” (i.e. it is a choice, not an attempt to estimate a hidden feature of the world.)

  5. “I commit to stopping large group conversations” (i.e. it is not a commitment, and does not directly claim anything about my future actions.)

  6. “I observe that I consistently seek to avert large group conversations” (this would be an observation about a consistency in my behavior, whereas here the point is to make a new thing (assign a value to a new variable?) that my future behavior may consistently make use of, if I want.)

  7. “I intend to stop some large group conversations” (perhaps this one is closest so far, but a stance isn’t saying anything about the future or about actions—if it doesn’t get changed in the meantime, and in the future I want to take an action, I’ll probably call on it, but it isn’t ‘about’ that.)

Perhaps what I mean is most like: ‘I have a policy of evaluating large group discussions at parties as bad’, using ‘policy’ for a choice about an abstract variable that might apply to action, but not in the sense of a commitment.

What is going on here more generally? You are adding a new kind of abstract variable between beliefs and actions. A stance can be a bit like a policy choice on what you will treat as true, or on how you will evaluate something. Or it can be its own abstract thing that doesn’t directly mean anything understandable in terms of the beliefs or actions nearby.
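
To make the shape of this concrete, here is a toy sketch of my own, with hypothetical names and numbers; the proposal itself prescribes no implementation. The point is that the stance is set by decision and consulted directly, while the uncertain belief sits to one side, available for revising the stance later:

```python
from dataclasses import dataclass

@dataclass
class Stance:
    """A chosen variable sitting between beliefs and actions: set by
    decision, informed by (but not identical to) uncertain beliefs."""
    claim: str
    deemed_value: float  # the value you choose to act on, not an estimate

# An uncertain belief, for contrast: barely leaning against large groups.
p_large_groups_bad = 0.55

# The stance is chosen, informed by that belief but not equal to it.
stance = Stance(claim="large groups at parties are bad", deemed_value=-1.0)

def arrange_party(s: Stance) -> str:
    # Decisions consult the stance directly, skipping the uncertain belief.
    return "many small circles of chairs" if s.deemed_value < 0 else "one big circle"

print(arrange_party(stance))  # -> many small circles of chairs
```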

Some ideas we already use that are pretty close to stances are ‘X is my priority’, ‘I am in the dating market’, and arguably, ‘I am opposed to dachshunds’. X being your priority is heavily influenced by your understanding of the consequences of X and its alternatives, but it is your choice, and it is not dishonest to prioritize a thing that is not important. To prioritize X isn’t a claim about the facts relevant to whether one would want to prioritize it. Prioritizing X also isn’t a commitment regarding your actions, though the purpose of having a ‘priority’ is for it to affect your actions. Your ‘priority’ is a kind of abstract variable added to your mental landscape to collect up a bunch of reasoning about the merits of different things, and package them for easy use in decisions.

Another way of looking at this is as a way of formalizing and concretifying the step where you look at your uncertain beliefs, decide on a tentative answer, and then run with it.

One can be confident in stances, because a stance is a choice, not a guess at a fact about the world. (Though my stance may contain uncertainty if I want, e.g. I could take a stance that large groups have a 75% chance of being bad on average.) So while my beliefs on a topic may be quite uncertain, my stance can be strong, in a sense that does some of the work we wanted from strong beliefs. Nonetheless, since stances are connected with facts and values, my stance can be wrong in the sense of not being the stance I should want to have, on further consideration.

In sum, stances:

  1. Are inputs to decisions in the place of some beliefs and values

  2. Integrate those beliefs and values—to the extent that you want them to be—into a single reusable statement

  3. Can be thought of as something like ‘policies’ on what will be treated as the truth (e.g. ‘I deem large groups bad’), or as new abstract variables between the truth and action (e.g. ‘I am prioritizing sleep’)

  4. Are chosen by you, not implied by your epistemic situation (until some spoilsport comes up with a theory of optimal behavior)

  5. Therefore don’t permit uncertainty in one sense, and don’t require it in another (you know what your stance is, and your stance can be ‘X is bad’ rather than ‘X is 72% likely to be bad’), though you should be uncertain about how much you will like your stance on further reflection.

I have found having stances somewhat useful, or at least entertaining, in the short time I have been trying them, but this is more of a speculative suggestion, with no other evidence behind it, than trustworthy advice.