The Parable of the Boy Who Cried 5% Chance of Wolf

Epistemic status: a parable making a moderately strong claim about statistics

Once upon a time, there was a boy who cried “there’s a 5% chance there’s a wolf!”

The villagers came running, saw no wolf, and said “He said there was a wolf and there was not. Thus his probabilities are wrong and he’s an alarmist.”

On the second day, the boy heard some rustling in the bushes and cried “there’s a 5% chance there’s a wolf!”

Some villagers ran out and some did not.

There was no wolf.

The wolf-skeptics who stayed in bed felt smug.

“That boy is always saying there is a wolf, but there isn’t.”

“I didn’t say there was a wolf!” cried the boy. “I estimated the probability as low, but high enough to act on. A false alarm is much less costly than a missed detection when it comes to dying! The expected value is good!”

The villagers didn’t understand the boy and ignored him.

On the third day, the boy heard some sounds he couldn’t identify, but they seemed wolf-y. “There’s a 5% chance there’s a wolf!” he cried.

No villagers came.

It was a wolf.

They were all eaten.

Because the villagers did not think probabilistically.

The moral of the story is that we should expect a large number of false alarms before a catastrophe hits, and that a run of false alarms is not strong evidence against an impending but improbable catastrophe.

Each time somebody put a low but high enough probability on a pandemic being about to start, they weren’t wrong when it didn’t pan out. H1N1 and SARS and so forth didn’t become catastrophes on the scale people feared. But they could have. The probability was low, but high enough to raise alarms.
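
To see why a long run of false alarms is exactly what well-calibrated warnings predict, here is a minimal simulation sketch. The 5% figure and the setup are illustrative assumptions from the parable, not claims about any particular forecaster:

```python
import random

# Illustrative assumption: each alarm independently has a 5% chance
# of corresponding to a real wolf (or pandemic, or other catastrophe).
P_WOLF = 0.05
random.seed(0)

def false_alarms_before_real_wolf(p=P_WOLF):
    """Count how many false alarms occur before the first real event."""
    false_alarms = 0
    while random.random() >= p:
        false_alarms += 1
    return false_alarms

trials = [false_alarms_before_real_wolf() for _ in range(100_000)]
# On average, roughly 19 false alarms per real wolf ((1 - p) / p).
print(sum(trials) / len(trials))
```

So a forecaster who keeps saying “5%” and keeps being “wrong” is behaving exactly as a correct 5% forecast should.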

The problem is that people then thought to themselves, “Look! People freaked out about those last ones and it was fine, so people are terrible at predictions and alarmist and we shouldn’t worry about pandemics.”

And then COVID-19 happened.

This will happen again for other things.

People will raise the alarm about something, and in the media, the nuanced thinking about probabilities will be washed out.

You’ll hear people saying that X will definitely fuck everything up very soon.

And it doesn’t.

And when the catastrophe doesn’t happen, don’t over-update.

Don’t say, “They cried wolf before and nothing happened, thus they are no longer credible.”

Say “I wonder what probability they or I should put on it? Is that high enough to set up the proper precautions?”
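
One way to make that question concrete is a simple expected-cost comparison: take the precaution whenever probability × harm exceeds the cost of the precaution. This is a minimal sketch with made-up numbers, not a recommendation about any specific risk:

```python
def should_take_precautions(p_event: float, cost_of_harm: float, cost_of_precaution: float) -> bool:
    """Act when the expected harm avoided exceeds the cost of acting."""
    return p_event * cost_of_harm > cost_of_precaution

# Illustrative numbers only: a 5% chance of a very costly outcome
# easily justifies a cheap precaution.
print(should_take_precautions(p_event=0.05, cost_of_harm=1_000, cost_of_precaution=10))  # True
```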

When somebody says that nuclear war hasn’t happened yet despite all the scares, or reminds you of the AI winters when nothing came of all the hype, remember the boy who cried a 5% chance of wolf.

Originally posted on my Twitter and personal blog.
