Can someone explain what “subjectively objective” is supposed to mean?
“Subjectively objective” means this:
When you calculate a probability, the value that you get depends on your prior, and hence on the kind of mind that you have. A different kind of mind with a different prior would arrive at a different value.
But suppose that you made a probabilistic inference about some system that doesn’t include your mind, such as a lottery machine. Then the calculation that you performed did not refer to your mind. Which calculation you performed was determined by the mind that you had, but the calculation so determined did not include your mind as one of its inputs.
In other words, the calculation itself did not take into consideration which mind you had. The only facts taken into consideration were objective facts about how lottery machines work.
As a result, when your mind calculated the probability, this value was not tagged as “depending on which mind you have”. On the contrary, you conclude that someone with a different kind of mind that assigns a different probability-value (given the same information) is objectively wrong. You consider his wrongness to be objective in the sense that, were he to bet according to his probability-value, he would lose money on average, regardless of what kind of mind you or anyone else had.
Nonetheless, the fact that you consider him to be objectively wrong — that fact depends on your mind. You would have reached a different conclusion had you had a different mind. His own mind, for example, concludes that he is objectively right. Thus, the fact that you consider him to be objectively wrong is a subjective fact. This is the sense in which probability is subjectively objective.
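The point above can be sketched numerically. This is a minimal illustration, not anything from the original discussion: the priors (0.2 and 0.6) and the likelihoods are made-up numbers standing in for "two kinds of mind" observing the same objective facts.

```python
def bayes_update(prior, lik_if_true, lik_if_false):
    """Posterior P(H | E) via Bayes' rule."""
    numerator = lik_if_true * prior
    return numerator / (numerator + lik_if_false * (1 - prior))

# Two "kinds of mind": the same updating rule, but different priors for H.
prior_a, prior_b = 0.2, 0.6

# Both observe the same evidence E, with the same objective likelihoods:
# P(E | H) = 0.9 and P(E | not H) = 0.3. The calculation itself never
# refers to either mind -- only the prior differs.
posterior_a = bayes_update(prior_a, 0.9, 0.3)  # ~0.4286
posterior_b = bayes_update(prior_b, 0.9, 0.3)  # ~0.8182
```

Each agent's arithmetic uses only the objective likelihoods, yet the two posteriors differ, and each agent would call the other's value objectively wrong.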
I don’t understand your intended meaning for “kind of mind.”
Assume that I believe proposition P with probability x. Then I acquire new evidence E. Like a good rationalist, I update my belief to y.
Then I meet Bob. Bob claims:
1) His degree of belief in P is z.
2) He also just received new evidence E, and no other evidence.
3) His prior probability was x.
4) He is a rationalist like me.
Assume I cannot come to agreement with Bob about the correct probability of P.
By Aumann’s agreement theorem, something is wrong. One of Bob’s statements might be a lie, or I might not be a rationalist.
Or E might not be objective evidence, or not universal (which is technically different from objective). And, in the category of not particularly interesting problems: our senses might not be sufficiently reliable.
I’ve always taken the “subjectively objective” as addressing this problem somehow. Am I missing the point?
I don’t understand your intended meaning for “kind of mind.”
For the purposes of my comment, two minds are “of the same kind” if and only if they began with the same prior. But all the minds under consideration perform standard Bayesian updating on their prior when they encounter evidence.
Assume that I believe proposition P with probability x. Then I acquire new evidence E. Like a good rationalist, I update my belief to y.
Then I meet Bob. Bob claims: 1) His degree of belief in P is z. 2) He also just received new evidence E, and no other evidence. 3) His prior probability was x. 4) He is a rationalist like me.
Assume I cannot come to agreement with Bob about the correct probability of P.
By Aumann’s agreement theorem, something is wrong. One of Bob’s statements might be a lie, or I might not be a rationalist.
If two Bayesian agents started with the same prior probability x for P, and both updated on the same evidence E, then you don’t need Aumann’s agreement theorem to know that they assign the same posterior probability y. That’s just an immediate consequence of both agents using Bayesian updating on the same prior and evidence.
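As a sketch of that immediate consequence (the shared prior x = 0.5 and the likelihoods 0.8 / 0.2 are made-up numbers for illustration):

```python
def bayes_update(prior, lik_if_true, lik_if_false):
    """Posterior P(P | E) via Bayes' rule."""
    numerator = lik_if_true * prior
    return numerator / (numerator + lik_if_false * (1 - prior))

x = 0.5                           # shared prior probability of P
lik_true, lik_false = 0.8, 0.2    # P(E | P) and P(E | not P)

# Both agents apply the same update to the same prior and evidence,
# so they necessarily reach the same posterior y -- no appeal to
# Aumann's theorem is needed.
y_me = bayes_update(x, lik_true, lik_false)
y_bob = bayes_update(x, lik_true, lik_false)
assert y_me == y_bob
```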
Under the “common-knowledge” conditions of Aumann’s theorem, the agents agree even when they didn’t update on exactly the same evidence E. That is what makes Aumann’s theorem remarkable. Even though their evidence doesn’t exactly coincide, they still agree.
But Aumann’s theorem still requires that the agents started with the same prior. The “different kinds of minds” that I was talking about started with different priors, so Aumann’s theorem doesn’t apply to them. They might disagree even if they have seen exactly the same evidence.