Power vs Precision

I’ve written a couple of posts (1,2) about General Semantics and precision (i.e., non-equivocation). These posts basically argue for developing superpowers of precision. In addition, I suspect there are far more pro-precision posts on LessWrong than the opposite (although most of these would be about specific examples, rather than general calls for precision, so it’s hard to search for this to confirm).

A comment on my more recent general-semantics-inspired posts applied the law of equal and opposite advice, pointing out that there are many advantages to abstraction/​vagueness as well; so, why should we favor precision? At the very least, we should Chesterton-fence the default level of precision before assuming that turning the dial up higher is better.

I totally agree that there are advantages to fuzzy clusters as well; what I actually think is that we should level up both specificity-superpowers and vagueness-superpowers. However, it does seem to me like there’s something special about specificity. I think I would prefer a variant of LessWrong which valued specificity more to one which valued it less.

To possibly-somewhat explain why this might be, I want to back up a bit. I’ve been saying “specificity” or “precision”, but what I actually mean is specificity/​precision in language and thought. However, I think it will be useful to think about physical precision as an analogy.

Physical Power vs Precision

Power comes from muscles, bones, lungs, and hearts. Precision comes from eyes, brains, and fingers. (Speaking broadly.)

Probably the most broadly important determinant of what you can accomplish in the physical world is power, by which I mean the amount of physical force you can bring to bear. This determines whether you can move the rock, open the jar, etc. In the animal kingdom, physical power determines who will win a fight.

However, precision is also incredibly important, especially for humans. Precision starts out being very important for the use of projectiles, a type of attack which is nearly, though not entirely, unique to humans. It then becomes critical for crafting and tool use.

Precision is about creating very specific configurations of matter, often with very little force (it is often harder to exercise precision when applying large amounts of force).

We could say that power increases the range of accessible states (physical configurations which we could realize), while precision increases the optimization pressure which we can apply to select from those states. A body-builder has all the physical strength necessary to weave a basket, but may lack the precision to bring it about.
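The accessible-states framing can be made concrete with a toy model. This is my own illustrative formalization, not something from the post: suppose power determines how many configurations are within reach, and precision is measured in bits of selection pressure, each bit cutting the set of outcomes you can reliably hit in half.

```python
# Toy model (hypothetical, for illustration only): power sets the size of
# the reachable state space; precision is bits of selection applied to it.

def reachable_states(power):
    """More power -> exponentially more physical configurations in reach."""
    return 2 ** power

def smallest_hittable_target(power, precision_bits):
    """Each bit of precision halves the set of outcomes you can reliably
    land in; the smallest target you can hit is the reachable set divided
    by 2**precision_bits (floored at a single state)."""
    return max(1, reachable_states(power) // (2 ** precision_bits))

# A body-builder: lots of power, little precision.
print(smallest_hittable_target(power=20, precision_bits=4))   # -> 65536

# A basket-weaver: modest power, lots of precision.
print(smallest_hittable_target(power=10, precision_bits=10))  # -> 1
```

On this toy picture, the body-builder can reach far more states than the weaver, but can only guarantee landing somewhere in a huge region of them; the weaver can steer all the way down to one particular configuration, i.e. the basket.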

(There are of course other factors than just precision which go into basket-weaving, such as knowledge; but I’m basically ignoring that distinction here.)

Generalizing

We can analogize the physical precision/​power dichotomy to other domains. Social power would be something like the number of people who listen to you, whereas social precision would be the ability to say just the right words. Economic power would simply be the amount of money you have, while economic precision would involve being a discerning customer, as well as managing the logistics of a business to cut costs through efficiency gains.

In the mental realm, we could say that power means IQ, while precision means rationality. Being slightly more specific, we could say that “mental power” means things like raw processing power, short-term memory (working memory, RAM, cache), and long-term memory (hard drive, episodic memory, semantic memory). “Mental precision” is like running better software: having the right metacognitive heuristics, deliberately approximating probability theory and decision theory, and so on.

(This is very different from what I meant by precision in thought, earlier, but this post itself is very imprecise and cluster-y! So I’ll ignore the distinction for now.)

Working With Others

In Conceptual Specialization of Labor Enables Precision, Vaniver hypothesizes that people in the past who came up with “wise sayings” actually did know a lot, but were unable to convey their knowledge precisely. This results in a situation where the wise sayings convey relatively little of value to the ignorant, but later in life, people have “aha” moments where they suddenly understand a great deal about what was meant. (This can unfortunately result in a situation where the older people mistakenly think the wise sayings were quite helpful in the long run, and therefore, propagate these same sayings to younger people.)

Vaniver suggests that specialized fields avoid this failure mode, because they have the freedom to invent more specialized language to precisely describe what they’re talking about, and the incentive to do so.

I think there’s more to it.

Let’s think about physical precision again. Imagine a society which cares an awful lot about building beautiful rock-piles.

If you’re stacking rocks with someone of low physical precision, then it matters a lot less that you have high precision yourself. You may be able to place a precisely balanced rock, but your partner will probably knock it out of balance. You will learn not to exercise your precision. (Or rather, you will use your precision to optimize for structures that will endure your partner’s less-precise actions.)

So, to a first approximation, the precision of a team of people can only be a little higher than the precision of its least precise member.

Power, on the other hand, is not like this. The power of a group is basically the sum of the power of its members.
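The asymmetry between these two aggregation rules can be sketched as a tiny model. The functions and the `slack` factor here are my own hypothetical formalization of “a little higher than the least precise member”, not anything claimed precisely in the text:

```python
# Toy aggregation rules (hypothetical formalization): group power adds,
# while group precision is bottlenecked by the least precise member.

def team_power(powers):
    """Power aggregates additively: everyone can push on the rock at once."""
    return sum(powers)

def team_precision(precisions, slack=1.1):
    """Precision is roughly min-limited: the clumsiest member knocks over
    whatever the others balance. `slack` > 1 models being "a little higher"
    than the minimum, e.g. by routing delicate steps around the clumsy."""
    return min(precisions) * slack

movers = [10, 10, 10, 2]
print(team_power(movers))      # -> 32: adding anyone helps
print(team_precision(movers))  # -> 2.2: dominated by the weakest link
```

Notice the design consequence: to raise team power you can recruit anyone, but to raise team precision you must either replace the weakest member or filter membership, which is exactly what specialized fields do.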

I think this idea also (roughly) carries over to non-physical forms of precision, such as rationality and conceptual/​linguistic precision. In hundreds of ways, the most precise people in a group will learn to “fuzz out” their language to cope with the less-precise thinking/​speaking of those around them. (For example, when dealing with a sufficiently large and diverse group, you have about five words. This explains the “wise sayings” Vaniver was thinking of.)

For example, in Policy Debates Should Not Appear One-Sided, Eliezer laments the conflation of a single pro or con with claims about the overall balance of pros/​cons. But in an environment where a significant fraction of people are prone to this mistake, voicing a single consideration becomes actively harmful. If I say “wearing sunscreen is correlated with skin cancer”, many people will automatically confuse this with “you should not wear sunscreen”.

So, conflation can serve as the equivalent of a clumsy person tipping over carefully balanced rocks: it will incentivize the more precise people to act as if they were less precise (or rather, use their optimization-pressure to avoid sentences which are prone to harmful conflation, which leaves less optimization-pressure for other things).

This, imho, is one of the major forces behind “specialization enables precision”: your specialized field will filter for some level of precision, which enables everybody to be more precise. Everyone in a chemistry lab knows how to avoid contamination and how to avoid being poisoned, so together, chemists can set up a more fragile and useful configuration of chemicals and tools. They could not accomplish this in, say, a public food court.

So, imho, part of what makes LessWrong a great place is the concentration of intellectual precision. This enables good thinking in a way that a high concentration of “good abstraction skill” (the skill of making good fuzzy clusters) does not.

Conversations about good fuzzy abstract clusters are no more or less important in principle. However, those discussions don’t require a concentration of people with high precision. By its nature, fuzzy abstract clustering doesn’t get hurt too much by conflation. This kind of thinking is like moving a big rock: all you need is horsepower. Precise conversations, on the other hand, can only be accomplished with similarly precise people.

(An interesting nontrivial prediction of this model is that clustering-type cognitive labor, like moving a big rock, can easily benefit from a large number of people; mental horsepower is easily scaled by adding people, even though mental precision can’t be scaled like that. I’m not sure whether this prediction holds true.)