Here is a cluster of things. Does this cluster have a well-known name?
A voter has some radical political preferences X, but the voting system where they live is FPTP, and their first preference has no chance of winning. So they vote for a person they like less who is more likely to win. The loss of the candidate who supported X is then cited as evidence that supporting X means you can’t win.
A pollster goes into the field and gets a surprising result. They apply some unprincipled adjustment to move towards the average before publishing. (this example has a name—it’s herding)
A blogger believes unpopular position Y. They know that writing an argument for Y would be bad for their reputation. So they write a softened version, arguing for something less unpopular. This then gets added to the mound of evidence that position Y is unpopular.
Some related concepts: self-fulfilling prophecy, herding, preference falsification
I don’t know a standard name. I call it “fallacy of the revealed preferences”, because these situations have in common “you do X, someone concludes that X is what you actually wanted because that’s what you did, duh”.
More precisely, the entire concept of “revealed preferences” is prone to the motte-and-bailey game, where the correct conclusion is “given the options and constraints that you had at the moment, you chose X”, but it gets interpreted as “X is what you would freely choose even if you had no constraints”. (People usually don’t state it explicitly like this, they just… don’t mention the constraints, or even the possibility of having constraints.)
Is the thing you’re trying to label the peculiar confirmation bias where people interpret evidence to conform not to what they prefer or would like to be true, but only to what they already believe to be true, even when that belief is, from their own perspective, pessimistic?
Or are you looking for a label for “this is unpopular therefore it can’t win” as a specific kind of self-fulfilling prophecy? Like an inverted Keynesian beauty contest?