If the argument thus far is correct, we need to ask why reasonable people endorse that principle. If precautions themselves create risks, and if no course of action lacks significant worst-case scenarios, it is puzzling why people believe that the Precautionary Principle offers real guidance. The simplest answer is that a weak version is doing the real work. The more interesting answer is that the principle seems to give guidance because people single out a subset of risks that are actually involved. In other words, those who invoke the principle wear blinders. But what kind of blinders do they wear, and what accounts for them? I suggest that two factors are crucial. The first, emphasized in Chapter 1, is availability; the second, which we have not yet encountered, involves loss aversion.
Availability helps to explain the operation of the Precautionary Principle for a simple reason: Sometimes a certain risk, said to call for precautions, is cognitively available, whereas other risks, including the risks associated with regulation itself, are not. For example, everyone knows that nuclear power is potentially dangerous; the associated risks, and the worst-case scenarios, are widely perceived in the culture, because of the Chernobyl disaster and popular films about nuclear catastrophes. By contrast, a relatively complex mental operation is involved in the judgment that restrictions on nuclear power might lead people to depend on less safe alternatives, such as fossil fuels. In many cases where the Precautionary Principle seems to offer guidance, the reason is that some of the relevant risks are available while others are barely visible.
But there is another factor. Human beings tend to be loss averse, which means that a loss from the status quo is seen as more distressing than a corresponding gain is seen as desirable. Because we dislike losses far more than we like corresponding gains, opportunity costs, in the form of forgone gains, often have a small impact on our decisions. When we anticipate a loss of what we already have, we often become genuinely afraid, in a way that greatly exceeds our feelings of pleasurable anticipation when we anticipate some addition to our current holdings.
The implication in the context of danger is clear: People will be closely attuned to the potential losses from any newly introduced risk, or from any aggravation of existing risks, but far less concerned about future gains they might never see if a current risk is reduced. Loss aversion often helps to explain what makes the Precautionary Principle operational. The status quo marks the baseline against which gains and losses are measured, and a loss from the status quo seems much more “bad” than a gain from the status quo seems good.
This is exactly what happens in the case of drug testing. Recall the emphasis, in the United States, on the risks of insufficient testing of medicines as compared with the risks of delaying the availability of those medicines. If there is a great deal of testing, people may get sicker, and even die, simply because medicines are not made available. But if the risks of delay are off-screen, the Precautionary Principle will appear to give guidance notwithstanding the objections I have made. At the same time, the forgone benefits sometimes present a devastating problem for the use of the Precautionary Principle. The genetic modification of food is very much a case in point: Many people focus on the risks of genetic modification without also attending to the benefits that might be lost through regulation or prohibition. We can find the same problem when the Precautionary Principle is invoked to support bans on nonreproductive cloning. For many people, the possible harms of cloning register more strongly than the potential therapeutic benefits that would be made unattainable by a ban on the practice.