Note: Although this post cites specific real-life examples, the discussion is intended to be entirely at the meta level.
Scott Alexander’s definition is worth citing:
The straw man is a terrible argument nobody really holds, which was only invented so your side had something easy to defeat. The weak man is a terrible argument that only a few unrepresentative people hold, which was only brought to prominence so your side had something easy to defeat.
Also instructive is Bryan Caplan’s gradation:
OK, what about “collective straw manning” – questionably accusing a group for its painfully foolish positions? Now we have:
3. Criticizing a viewpoint for a painfully foolish position no adherent holds.
4. Criticizing a viewpoint for a painfully foolish position some adherents hold.
5. Criticizing a viewpoint for a painfully foolish position many adherents hold.
6. Criticizing a viewpoint for a painfully foolish position most adherents hold.
What Caplan describes as “collective straw manning” works well as a scale for weakmanning. Lastly, consider Julian Sanchez’s disclaimer:
With a “weak man,” you don’t actually fabricate a position, but rather pick the weakest of the arguments actually offered up by people on the other side and treat it as the best or only one they have. As Steve notes, this is hardly illegitimate all the time, because sometimes the weaker argument is actually the prevalent one. Maybe the best arguments for Christianity are offered up by Thomas Aquinas or St. Augustine, but I doubt there are very many people who are believers because they read On Christian Doctrine. Probably this will be the case with some frequency, if only because the less complex or sophisticated an argument is, the easier it is for lots of people to be familiar with it. On any topic of interest, a three-sentence argument is unlikely to be very good, but it’s a lot more likely to spread.
At least in theory, I think weakmanning should be avoided, but I struggle with exactly where to draw the line. If your goal is to avoid weakmanning, there are at least two axes you must consider:
All the possible arguments for position X, ranked on a spectrum from least to most defensible.
All the possible arguments for position X, ranked on a spectrum from least to most representative of believers in X.
Weakmanning is not much of an issue if you’re arguing against a single individual, because they either endorse the particular arguments or they don’t. You can’t showcase the error of someone’s ways by refuting arguments they never made.
But generally we argue over positions endorsed by many different people, each of whom may differ in which arguments they advance or prioritize, so what should count as “representative”?
For example, many people believe in the theory of evolution, but some believers do so under the erroneous belief that evolutionary change occurs within an individual organism’s lifespan. If you use a crude heuristic and only poll relevant experts (e.g. biology professors) you’re not likely to encounter many adherents of the “change-within-lifespan” argument, so this could be a decent filter to narrow your focus on what should count as “representative” for a given position. This is generally an effective tactic, since it helps you avoid prematurely declaring victory at Wrestlemania just because you trounced some toddlers at the playground.
But sometimes you get a crazy position believed by crazy people based on crazy arguments, with a relatively tiny minority within/adjacent to the community of believers aware of the problems and doing the Lord’s work coming up with better arguments. InverseFlorida coined the term “sanewashing” to describe how the meaning of “defund the police” (DTP) shifted to something much more neutered and, correspondingly, much more defensible:
So, now say you’re someone who exists in a left-adjacent social space, who’s taken up specific positions that have arrived to you through an “SJW” space, and now has to defend them to people who don’t exist in any of your usual social spaces. These are ideas that you don’t understand completely, because you absorbed them through social dynamics and not by detailed convincing arguments, but they’re ones you’re confident are right because you were assured, in essence, that there’s a mass consensus behind them. When people are correctly pointing out that the arguments behind the position people around your space are advancing fail, but you’re not going to give up the position because you’re certain it’s right, what are you going to do? I’m arguing you’re going to sanewash it. And by that I mean, what you do is go “Well, obviously the arguments that people are obviously making are insane, and not what people actually believe or mean. What you can think of it as is [more reasonable argument or position than people are actually making]”.
Keep in mind that this is not an object-level discussion on the merits of DTP. Assume arguendo that the “sanewashed” arguments are much more defensible than the “crazy” ones they replaced. If you were to take a position against DTP by arguing against the now-obsolete arguments, one of the sanewashers would be technically correct in accusing you of weakmanning for daring to bring up that old story again. It fits the literal definition, after all.
As Sanchez noted above, for most people on most positions, intuition predates rationality. They stumble around in the dark looking for any sort of foothold, then work backwards to fill in the necessary arguments. The sanewashers and the crazies rely on each other. Without the sanitization from the hygiene-minded sanewashers, the position would lack the fortification required to resist erosion; without the crazy masses delivering the bodies and zeal, the position would fade into irrelevance. The ratio varies, but this dynamic is present to some degree in any given position. You have very likely already experienced the embarrassment that comes from a compatriot, purportedly on your side, making an ass of both of youse with their nonsensical arguments.
If your ultimate goal is truth-seeking, weakmanning will distract you into hacking away at worthless twigs rather than striking at the core. But sometimes the goal isn’t seeking truth (either because it’s irrelevant or otherwise already beyond reasonable dispute) and instead the relevant topic is the collective epistemological dynamics. InverseFlorida’s insightful analysis would not have been possible without shining a spotlight on the putative crazies — the very definition of weakmanning in other words.
Here’s the point, at last. Normally, someone holding a belief for the wrong reasons is not enough to negate that belief. But wherever a sanewasher faction appears to be spending considerable effort cleaning up the mess their crazy neighbors keep leaving behind, it should arouse some suspicion about the belief, at least as a heuristic. Any honest and rational believer needs to grapple with how the crazies managed to all be accidentally right despite being outfitted — by definition — with erroneous arguments. Such a scenario is implausible enough to command a curious inquiry into its origin.
It’s possible that this inquiry unearths just another fun episode in the collective epistemological dynamics saga; it’s also possible the probe exposes a structural flaw in the belief itself. In either circumstance, a weakmanning objection is made in bad faith and intended to obfuscate. Its only purpose is to get you to ignore the inconvenient and the annoying. Pay no heed to this protest and continue deploying the magnifying glass; don’t be afraid to focus the sun’s infernal rays into a burning pyre of illumination.
I know some smartass in the comments will pipe up about some endangered tropical beetle or whatever that does demonstrate “change-within-lifespan” evolutionary changes. Just remember that this is not an object-level discussion.
TracingWoodgrains described the same dynamic with the gentrification of r/antiwork. Credit also to him for most of the arborist-themed metaphor in this post.
I dare you to use this phrase at a dinner party without getting kicked out.
In some cases, your goal is to figure out the “correct” political position to hold (based on your ethics, goals, and other beliefs). In that scenario, what any particular person believes is, logically speaking, irrelevant. (If you’re debating a certain person, then it’s probably rude and possibly against the rules of the debate (if there are any) for you to spend all your time talking about positions your opponent doesn’t hold, and never engage with their actual views; so that’s a reason to talk about their actual views, but not a reason to believe them any more than other potential views you find equally plausible.)
In other cases, your goal is to decide if some political movement is a good one, or whether to support a political party or coalition or group. In that case, questions like “How many of you actually support position X vs position Y” are relevant. (And “How much better is X than Y” is also relevant, if there are enough supporters of those positions to be worth considering.)
Truthseekers do well to bear in mind the difference between these goals, and when questions bear on one but not the other.
Except to the extent that person X believing Y is taken as evidence that Y is true. That would apply where X is known to be an expert on Y.