# So8res comments on The trouble with Bayes (draft)

• Sure! I would like to clarify, though, that by “logically omniscient” I also meant “while being way larger than everything else in the universe.” I’m also readily willing to admit that Bayesian probability theory doesn’t come anywhere near solving decision theory; that’s an entirely different can of worms where there’s still lots of work to be done. (Bayesian probability theory alone does not prescribe two-boxing, in fact; that requires the addition of some decision theory which tells you how to compute the consequences of actions given a probability distribution, which is way outside the domain of Bayesian inference.)

Bayesian reasoning is an idealized method for building accurate world-models when you’re the biggest thing in the room; two large open problems are (a) modeling the world when you’re smaller than the universe and (b) computing the counterfactual consequences of actions from your world model. Bayesian probability theory sheds little light on either; nor is it intended to.
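[Editor’s note: a minimal sketch of the idealized updating being contrasted here, on a toy two-hypothesis world. The function name and the numbers are illustrative, not from the comment; the point is only that exact conditioning is straightforward when the reasoner can enumerate its whole hypothesis space.]

```python
def posterior(prior, likelihood_h, likelihood_not_h):
    """Bayes' rule: P(H|E) = P(E|H)P(H) / P(E), for a binary hypothesis."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# A reasoner "bigger than the room" can condition exactly like this;
# the open problems above arise precisely when it can't.
p = posterior(prior=0.5, likelihood_h=0.8, likelihood_not_h=0.2)
print(round(p, 2))  # 0.8
```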

I personally don’t think it’s that useful to consider cases like “but what if there are two logically omniscient reasoners in the same room?” and then demand a coherent probability distribution. Nevertheless, you can do that, and in fact we’ve recently solved that problem (Benya and Jessica Taylor will be presenting it at LORI V next week); the answer, assuming the usual decision-theoretic assumptions, is “they play Nash equilibria,” as you’d expect :-)
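[Editor’s note: the result referred to concerns logically omniscient reasoners and is not reproduced here; as a reminder of what “they play Nash equilibria” means, this sketch just checks, by brute-force best-response comparison, which pure-strategy profiles of a toy 2×2 game (Prisoner’s-Dilemma payoffs, chosen for illustration) are Nash equilibria.]

```python
import itertools

# payoffs[(row, col)] = (row player's payoff, column player's payoff)
payoffs = {
    ("C", "C"): (2, 2), ("C", "D"): (0, 3),
    ("D", "C"): (3, 0), ("D", "D"): (1, 1),
}

def is_nash(a, b):
    """A profile is Nash iff neither player gains by unilaterally deviating."""
    u_a, u_b = payoffs[(a, b)]
    return (all(payoffs[(a2, b)][0] <= u_a for a2 in "CD")
            and all(payoffs[(a, b2)][1] <= u_b for b2 in "CD"))

equilibria = [p for p in itertools.product("CD", repeat=2) if is_nash(*p)]
print(equilibria)  # [('D', 'D')]
```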

• Cool, I will take a look at the paper!