I don’t see any evidence that Person B won’t defect just as readily, just that they haven’t yet realized that other people are wrong. Maybe Person B is wrong simply out of an easily cured ignorance, and will happily become a “Person A” once that ignorance is cured.
In short, I actually know more about the behavior of Person A, and therefore I trust them more. All I know about Person B is that they’re ignorant.
Remember, person A is the odd one out in his society. He doesn’t share most other people’s map of reality, so other people have very good reasons to doubt his rationality. Person B’s easily cured ignorance is nothing of the sort.
Certainly a particular person B might just not have gotten around to realizing this. But I think you are generally missing what was implied in the comparison: being person B seems to have greater fitness in certain circumstances, and we know there are mechanisms developed in our own minds that help us stay person B.
I think we actually know more about the typical person B than just that he is ignorant. For starters, we de facto know he is less rational than A. Secondly, it’s much more likely than with person A that the mentioned mechanisms are doing their job properly.
Remember, person A is the odd one out in his society. He doesn’t share most other people’s map of reality, so other people have very good reasons to doubt his rationality.
But by assumption, his society is irrational, so their reasons for doubting his rationality are themselves irrational. Needless to say, all socially desirable beliefs in our society are of course wonderfully beneficent, but let us instead suppose the society is Soviet Russia, Nazi Germany, or any society that no longer meets our highly enlightened stamp of approval. Are you better off associated with the fake Nazi or the sincere Nazi?
Clearly, you are better off associating with the fake Nazi.
The society given in the example is wrong, but that’s not exactly the same as being irrational. I do think, however, that it’s probably fair to say that person A is more rational than the society as a whole. That may be a high or a low standard, mind you.
Now, again, I dislike highly charged examples, since they narrow the scope of thinking, but I suppose you do make a vivid case.
But by assumption, his society is irrational, so their reasons for doubting his rationality are themselves irrational.
But how can they know this? If they know this, why don’t they change? All else being equal, an individual being mistaken seems more likely than the societal consensus being wrong. I don’t think you realize just how much human societies agree on. Also, just because society is wrong doesn’t mean the individual is right.
Are you better off associated with the fake Nazi or the sincere Nazi?
The answer for the typical person living in Nazi Germany would be? Mind you, a Nazi Germany where we don’t have the benefit of hindsight that the regime will be short-lived.
But how can they know this? If they know this why don’t they change?
They don’t change because their beliefs are politically convenient. Because their beliefs justify the elite exercising power over the less elite. Because their beliefs justify behavior by the elite that serves the interests of members of the elite but destroys society.
Searching for an example of suicidal delusions that is not unduly relevant to either today’s politics or yesterday’s demons is difficult; unfortunately, such examples are necessarily obscure.
The nineteenth-century British belief in benevolent enlightened imperialism justified a transfer of power and wealth from the unenlightened and piratical colonialists to members of the British establishment more closely associated with the government, the elite, and the better schools. Lots of people predicted this ideology would wind up having the consequences that it did have, and that the pirates actually governed better, but they were, of course, ignored.
Now, again, I dislike highly charged examples, since they narrow the scope of thinking, but I suppose you do make a vivid case.
If I reference beliefs in our society that might cause harmful effects were they not so wise and enlightened, that also makes a vivid case. Indeed, any reference to strikingly harmful effects makes a vivid case.
The answer for the typical person living in Nazi Germany would be? Mind you, a Nazi Germany where we don’t have the benefit of hindsight that the regime will be short-lived.
But some people did have the foresight that the regime was going to be short-lived, at least towards the end. Nazi strategy was explained in Hitler’s widely read book. The plan was to destroy France (done), force a quick peace settlement with the anglophones (failed), and then invade and ethnically cleanse a large part of Russia. The plan was for short wars against a small set of enemies at any one time. When the British sank the Bismarck in May 1941, the plan was in trouble: anglophone air and sea superiority made it unlikely that Germany could force a quick peace, or force the anglophones to do anything they did not feel like doing, or stop them from doing anything they did feel like doing. It was apparent that the anglophones could reach Germany, and Germany could not effectively reach them. At that point all type As should have suspected that Germany had lost the war. At Stalingrad, the plan sank without a trace, and every type A must have known that the war was lost.
In general, a type A will predict the future better than a type B, since false beliefs lead society to unforeseen consequences.
For starters, we de facto know he is less rational than A
Ignorance does not imply unintelligence, irrationality, etc., much less make a de facto case for them. There’s nothing irrational about honestly believing the group-consensus if you don’t have the skill foundation to see how it could be wrong. Sure, one should be open about one’s ignorance, but you still have to have anticipations to function, and Bayesian evidence suggests “follow the leader” is better than “pick randomly”. Especially since, not having the background knowledge in the first place, one would be hard pressed to list choices to pick randomly amongst :)
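The “follow the leader beats pick randomly” point can be made concrete with a toy simulation. All the numbers below are illustrative assumptions, not anything from this discussion: so long as the consensus is right more often than chance (here, 40% versus a 1-in-5 baseline), deferring to it beats guessing.

```python
import random

random.seed(0)

K = 5              # assumed number of candidate answers
P_CONSENSUS = 0.4  # assumed chance the consensus answer is correct; note 0.4 > 1/K
TRIALS = 100_000

follow_correct = 0
random_correct = 0
for _ in range(TRIALS):
    truth = random.randrange(K)
    # The consensus lands on the truth with probability P_CONSENSUS,
    # otherwise it settles on some particular wrong answer.
    consensus = truth if random.random() < P_CONSENSUS else (truth + 1) % K
    follow_correct += (consensus == truth)
    # An ignorant agent with no other evidence guesses uniformly at random.
    random_correct += (random.randrange(K) == truth)

print(follow_correct / TRIALS)  # ≈ 0.40
print(random_correct / TRIALS)  # ≈ 0.20, i.e. 1/K
```

Of course, if the consensus were right less often than 1/K, the inequality would flip; the sketch only shows that “follow the leader” wins under the (usually reasonable) assumption that the crowd beats a dart board.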
There’s nothing irrational about honestly believing the group-consensus if you don’t have the skill foundation to see how it could be wrong.
If someone does not have the skill foundation to see how the group-consensus is wrong, he is ignorant or stupid. Such people are, quite inadvertently, dangerous and harmful. There is no con man worse or more dangerous than a con man who sincerely believes his own scam, and is therefore quite prepared to go down with his ship.
There is no con man worse or more dangerous than a con man who sincerely believes his own scam, and is therefore quite prepared to go down with his ship.
This is true in a big way that I haven’t mentioned before, though: type Bs seem to me more likely than type As to cause trouble for anyone attempting to implement solutions that might avert tragedy-of-the-commons situations caused by a false society-wide belief.
There’s nothing irrational about honestly believing the group-consensus if you don’t have the skill foundation to see how it could be wrong.
Actually, he is right. Just because you can’t find a flaw in the common consensus doesn’t mean you are ignorant or stupid: it’s perfectly possible that there is no flaw in the common consensus on a particular subject, or that the flaw is too difficult to detect by the means available to you. Perhaps it’s too difficult to detect the flaw with the means the entire society has available to it!
A rational agent is not an omniscient agent after all!
I think you may be letting yourself get slightly adversarial in your thinking here, because you perceive this as a fight over a specific thing you estimate society is delusional about. It’s not, it really isn’t. Chill, man. :)
Edit: Considering the downvotes, I just want to ask what I’m missing in this comment? Thanks for any help!
Yes, but the odds of A getting the right answer by picking randomly are even lower. ;)
Remember, person A was defined in this example as having a better map of this little spot, though I suppose most of the analysis done by people so far works equally well for someone who thinks he has a better map and is hiding it.
So Person A believes in MWI because they read the Quantum Mechanics sequence, and Person B never thought about it beyond an article in Discover Magazine saying all the top scientists favor the Copenhagen interpretation. They’re both being entirely rational about the information they have, even if Person A has the right answer :)
I suppose they are in a sense, but what exactly are the rewards/lack of benefit for a layman, even an educated one, believing or not in MWI? I think a major indicator is that I haven’t heard in recent years of anyone being outed as an MWIist and losing their job as a consequence :P
Nitpick: the average person who has read the QM sequence is likely above average in rationality.
but what exactly are the rewards/lack of benefit for a layman, even an educated one, believing or not in MWI?
Everyone is avoiding realistic examples, for fear that if they should disturb any of the several large elephants in the living room, they will immediately be trampled.
Substitute a relevant example as needed, I’m simply trying to make the point that ignorance != irrationality. Someone who simply has more information on a field is going to reach better conclusions, and will thus need to hide controversial opinions. Someone with less information is generally going to go with the “follow the herd” strategy, because in the absence of any other evidence, it’s their best bet. Thus, just based on knowledge (not rationality!) you’re going to see a split between A and B types.
There doesn’t have to be a correlation of 1 between ignorance and irrationality. There just has to be some positive correlation for us to judge, in the absence of other information, that A is probably more rational than B.
And if there isn’t a correlation greater than 0 between rationality and a proper map of reality, um, what is this rationality thing anyway?
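A minimal Bayes-rule sketch of this point, with entirely made-up numbers: if irrational people are more often ignorant than rational people are (the assumed positive correlation), then learning only that B is ignorant raises our estimate that B is the less rational one.

```python
# Toy Bayesian update; every probability here is an illustrative assumption.
p_irrational = 0.5                 # prior: a random person is irrational
p_ignorant_given_irrational = 0.8  # assumed: irrational people are usually ignorant here
p_ignorant_given_rational = 0.4    # assumed: rational people are sometimes ignorant too

# Law of total probability: P(ignorant)
p_ignorant = (p_ignorant_given_irrational * p_irrational
              + p_ignorant_given_rational * (1 - p_irrational))

# Bayes' rule: P(irrational | ignorant)
posterior = p_ignorant_given_irrational * p_irrational / p_ignorant

print(round(posterior, 3))  # 0.667, up from the 0.5 prior
```

The update is weak or strong depending entirely on how far apart the two likelihoods are; with a tiny correlation the posterior barely moves, which is consistent with the “weak evidence from a single situation” caveat above.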
For starters, we de facto know he is less rational than A
Ahhh, you’re meaning “we have Bayesian evidence that Person B is less likely to be rational than Person A”? I’d agree, but I still think it’s weak evidence if you’re only looking at a single situation, and
I’d still feel I therefore know more about Person A (how they handle these situations) than I do about Person B (merely that they are either ignorant or irrational). How someone handles a situation strikes me as a more consistent trait, whereas most people seem to have enough gaps in their knowledge that a single gap is very little evidence for other gaps.
Ahhh, you’re meaning “we have Bayesian evidence that Person B is less likely to be rational than Person A”?
Yeah I should have been more explicit on that, sorry for the miscommunication!
I’d agree, but I still think it’s weak evidence if you’re only looking at a single situation, and
I’d still feel I therefore know more about Person A (how they handle these situations) than I do about Person B (merely that they are either ignorant or irrational). How someone handles a situation strikes me as a more consistent trait, whereas most people seem to have enough gaps in their knowledge that a single gap is very little evidence for other gaps.
Perhaps for convenience we can add that persons A and B are exposed to the same information? It doesn’t change the spirit of the thought experiment. I was originally implicitly operating with that as a given, but since we started discussing it I’ve noticed I never explicitly mentioned it.
Basically I wanted to compare what kinds of things person A/B would signal in a certain set of circumstances to others.
But by assumption, his society is irrational, so their reasons for doubting his rationality are themselves irrational. Needless to say, all socially desirable beliefs in our society are of course wonderfully beneficent, but let us instead suppose the society is Soviet Russia, Nazi Germany, or any society that no longer meets our highly enlightened stamp of approval. Are you better off associated with the fake Nazi or the sincere Nazi?
Clearly, you are better off associating with the fake Nazi.
I eat babies.
(Translation: Please don’t ask rhetorical questions that make me choose between agreeing with you and affiliating with sincere Nazis.)
Upvoted since I strongly agree. Arguments shouldn’t require using such strongly emotionally biasing examples to be persuasive.
And you made your point so wonderfully concisely.
I think the question was mostly intended to be about fake and sincere creationists rather than fake and sincere Nazis.
Yeah I should have been more explicit on that, sorry for the miscommunication!
No worries. I think part of it was on me as well :)