> The majority of those who best know the arguments for and against thinking that a given social movement is the world’s most important cause… are presumably members of that social movement.
Knowing the arguments for and against X being the World’s Most Important Cause (WMIC) is fully compatible with concluding that X is not the WMIC, even a priori. And deeply engaging with arguments about whether any X is the WMIC is an unusual activity, characteristic of Effective Altruism. If you do that activity a lot, then you likely know the arguments for and against many causes, which makes it unlikely that you are a member of every cause whose arguments you know.
> If they decide to hear out a first round of arguments but don’t find them compelling enough, they drop out of the process.
The simple hurdle model presented by the OP implies that there is tremendous leverage in coming up with just one more true argument against a flawed position: presented with it, a substantial fraction of the few remaining true believers will accept it and change their minds. My perception is that this is not at all what we typically assume when arguing with a true believer in some minority position; we expect them to be especially resistant to changing their mind.
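To make that implication concrete, here is a minimal sketch of the hurdle model as I read it (the per-argument persuasion probability is my illustrative assumption, not a figure from the OP): each true argument independently convinces each remaining believer with some fixed probability, so the marginal argument always converts the same share of whoever is left.

```python
# Minimal sketch of the hurdle model as described above. The persuasion
# probability p is an illustrative assumption, not a number from the OP.

def remaining_believers(initial: float, p: float, k: int) -> float:
    """Fraction of the population still believing after hearing k true arguments."""
    return initial * (1 - p) ** k

initial = 1.0  # everyone starts out believing the flawed position
p = 0.3        # assumed probability that one more true argument persuades a believer

for k in range(6):
    before = remaining_believers(initial, p, k)
    after = remaining_believers(initial, p, k + 1)
    print(f"argument {k + 1}: believers {before:.3f} -> {after:.3f} "
          f"({p:.0%} of those remaining are converted)")
```

Under these assumptions the sixth argument converts just as large a share of the remaining believers as the first one did, which is exactly the tremendous-leverage implication that seems to clash with everyday experience of arguing with true believers.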
I think the commonsense view is that true believers in flawed positions got there under the influence of systematic biases that dramatically increased the likelihood of their adopting a flawed view. Beliefs in a range of conspiracy theories and pseudoscientific views appear to be correlated, both across social groups and within individuals, which supports the hypothesis that systematic biases account for the existence of minority groups holding a common flawed belief. Possibly their numbers are swelled by a few unlucky reasoners who are relatively unbiased but made a series of unfortunate reasoning mistakes, and who will hopefully see the light when presented with the next accurate argument.
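A toy two-type mixture makes this view precise (all parameters here are illustrative assumptions of mine): suppose believers are either "biased" (nearly immune to further argument) or "unlucky but unbiased" (readily persuaded). The unlucky type is quickly argued out of the position, so the survivors are increasingly the biased type and the marginal argument converts almost no one.

```python
# Toy mixture model for the commonsense view above. All parameters are
# illustrative assumptions: "biased" believers are nearly immune to argument,
# "unlucky" believers are unbiased reasoners who are readily persuaded.

biased, unlucky = 0.2, 0.8        # assumed initial population shares of believers
p_biased, p_unlucky = 0.02, 0.5   # assumed per-argument conversion probabilities

for k in range(6):
    total = biased + unlucky
    converted = biased * p_biased + unlucky * p_unlucky
    print(f"argument {k + 1}: believers {total:.3f}, "
          f"marginal conversion rate {converted / total:.1%}, "
          f"biased share {biased / total:.1%}")
    biased *= 1 - p_biased   # biased believers rarely convert
    unlucky *= 1 - p_unlucky # unlucky reasoners convert quickly
```

With these numbers the first argument converts about 40% of believers, but by the sixth the marginal rate has fallen below 10% and the survivors are overwhelmingly the biased type: selection, rather than a shortage of good arguments, explains why the remaining true believers are so hard to move.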