Since the late fifties the regulation of risks to health and safety has taken on ever-greater importance in public policy debates and actions. In its efforts to protect citizens against hard-to-detect hazards such as industrial chemicals and against obvious hazards in the workplace and elsewhere, Congress has created or increased the authority of the Food and Drug Administration, the Environmental Protection Agency, the Occupational Safety and Health Administration, the Consumer Product Safety Commission, and other administrative agencies.
Activists in the pursuit of a safer society decry the damage that industrial progress wreaks on unsuspecting citizens. Opponents of the "riskless society," on the other hand, complain that government is unnecessarily proscribing free choice in the pursuit of costly protection that people do not need or want. This article describes some facts about risk, along with some academic theories about why people on both sides of the risk debate take the positions they do.
The health of human beings is a joint product of their genetic inheritance (advice: choose healthy and long-lived parents), their way of life (the poor black person who eats regularly and in moderation, exercises, does not smoke or drink, is married, and does not worry overly much is likely to be healthier than the rich white person who does the opposite), and political economy (live in a rich, democratic, and technologically advanced society). Contrary to common opinion, living in a rich, industrialized, technologically advanced country that makes considerable use of industrial chemicals and nuclear power is a lot healthier than living in a poor, nonindustrialized nation that uses little modern technology or industrial chemicals. That individuals in rich nations are far healthier, live far longer, and can do more of the things they want to do at corresponding ages than people in poor countries is a rule without exception.
Prosperous also means efficient. The most polluted nations in the world, many times more polluted than the democratic and industrial societies, are the former communist countries of Central Europe and the Soviet Union. To produce one unit of output, communist countries use two to four times the energy and material used in capitalist countries. Individuals unfortunate enough to live in such an inefficient economy therefore die younger and suffer more serious illnesses than those in the Western industrial democracies. A little richer is a lot safer. As Peter Huber demonstrated in Regulation magazine, "For a 45-year-old man working in manufacturing, a 15 percent increase in income has about the same risk-reducing value as eliminating all hazards—every one of them—from his workplace."
Among the many facts that might be observed from tables 1 and 2 is that longevity has increased dramatically (with only occasional downturns) since the middle of the last century. Black Americans and other minorities lag behind white Americans, but their life expectancy has also nearly doubled, albeit from a lower starting point. The most unequal relationship, though seldom commented upon, is the far greater longevity of females of all races compared to males (a 6.7-year advantage to white females, a 7.9-year advantage to black females). This female advantage is far greater than the lead in longevity of white men over nonwhite men (4.9 years) or of white women over nonwhite women (3.7 years).
Turning to death rates, note the decline by half since 1900 of deaths from all forms of accidents, and the spectacular declines in deaths from all sorts of diseases. The sixfold drop in deaths from pneumonia and influenza is par for the course. On the other side of the ledger, cancer continues to rise, though its rise has slowed, and major cardiovascular diseases remain high. Why these discrepancies? Cancer is largely a disease of old age. When people died at roughly half the present life expectancy, they died before having an opportunity, if one may call it that, to get cancer. Of course, people must die of something. Lacking other information, it is usual to classify deaths as due to heart failure, since heart stoppage is one of the signs of death.
The most dangerous activities are precisely what we might think they are—sports such as motorcycling and parachuting and occupations such as fire fighting and coal mining. On the other hand, many of the risks that people have begun to worry about in recent years are far smaller than generally perceived. However low the risk of being killed by lightning (see table 3), the risk of getting cancer from drinking tap water (chlorine forms chloroform, which is a weak carcinogen) is less than one-third of that, and the harm done by pesticides in food, based largely on animal studies, is even less. The lowest risk that statisticians have measured—getting killed by a falling meteorite—measures in at six-millionths of 1 percent.
In its regulations specifying maximum discharges of potentially harmful substances from factories, the Environmental Protection Agency (EPA) sets a safety threshold of one additional death in a million. How, we might ask, did the EPA arrive at one in a million? Well, let's face it, no real man tells his girlfriend that she is one in a hundred thousand. But the real root of "one in a million" can be traced to the Food and Drug Administration's (FDA) efforts to find a number that was essentially equivalent to zero.
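To see what "essentially zero" means in practice, consider a back-of-the-envelope sketch; the population figure below is an illustrative assumption for the period, not a number from this article:

```python
# What a one-in-a-million lifetime risk threshold implies nationwide.
# The population figure is an illustrative assumption (~U.S., early 1990s).

population = 250_000_000   # assumed population
per_million = 1_000_000    # EPA threshold: one additional death per million

# One expected death per million exposed people over a lifetime:
expected_deaths = population // per_million
print(expected_deaths)  # 250
```

Even at a threshold meant to stand in for zero, the expected national toll is not literally zero, which is part of why the number is contested from both directions.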
Many experts argue that insisting on essentially zero risk is going too far. As Professor John D. Graham, director of the Harvard School of Public Health's Center for Risk Analysis, wrote, "No one seriously suggested that such a stringent risk level should be applied to a hypothetical maximally exposed individual." This mythical "maximally exposed" human being is created by assuming that he or she lives within two hundred meters of the offending industrial plant, stays there for a full seventy years, remains outdoors day and night or at least all day, and will get cancer at the same rate as rodents or other small animals that are bred to be especially susceptible to such cancers and are given doses thousands of times larger than any person would ever receive, except those with lifetime occupational exposures on the job.
The other assumption is that cancer causation is a linear process, meaning that there is no safe dose and that damage rises in direct proportion to exposure. Yet scientific evidence increasingly shows that there are, indeed, threshold effects, and that the cancers animals develop as a result of being subjected to huge doses in short periods of time tell us essentially nothing about the reactions of human beings. To go from mouse to man, for instance, requires statistical adjustments for the hugely different weights of the two creatures and for the hugely different doses. Many statistical models fit the data that scientists have about risks, and the risk estimates these models produce differ from one another by factors of thousands. Yet there is no scientifically approved way of choosing among them; only if the mechanism by which a chemical causes cancer were well understood would it be possible to choose the right model. In short, current measures of risk from low-level exposures to industrial technology have no true validity. This explains why health keeps getting better and better while government estimates of risk keep getting worse and worse.
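The point about competing models can be made concrete with a toy sketch. The functions and parameters below are invented for illustration only; they show how two models that agree at the high doses actually given to test animals can disagree without bound at realistically low human exposures:

```python
# Illustrative dose-response models (parameters are made up, not from data).

def linear_no_threshold(dose, slope=0.2):
    """Linear model: no safe dose; risk rises in proportion to dose."""
    return slope * dose

def threshold_model(dose, threshold=5.0, slope=0.2):
    """Threshold model: zero risk below the threshold, linear above it."""
    return max(0.0, slope * (dose - threshold))

high_dose = 100.0   # the sort of dose given to test animals
low_dose = 0.001    # a realistically low human exposure

# Both models fit the tested high-dose range about equally well...
print(linear_no_threshold(high_dose))  # 20.0
print(threshold_model(high_dose))      # 19.0

# ...but at the low dose, one predicts some risk and the other predicts none,
# an infinitely large ratio between the two estimates.
print(linear_no_threshold(low_dose))   # ~0.0002
print(threshold_model(low_dose))       # 0.0
```

No observation in the tested range can settle which extrapolation is right, which is the sense in which the choice of model, not the data, drives the low-dose risk estimate.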
Why are some people frightened of risks and others not? Surveys of risk perception show that knowledge of the known hazards of a technology does not determine whether or to what degree an individual thinks a given technology is safe or dangerous. This holds true not only for laymen but also for experts in risk assessment. The most powerful factors related to how people perceive risk apparently are "trust in institutions" and "self-rated liberal and conservative identification." In other words, these findings strongly suggest that people interpret riskiness through a framework built on their confidence in institutions.
According to one cultural theory, people choose what to fear as a way to defend their way of life. The theory hypothesizes that adherents of a hierarchical culture will approve of technology, provided it is certified as safe by their experts. Competitive individualists will view risk as opportunity and, hence, be optimistic about technology. And egalitarians will view technology as part of the apparatus by which corporate capitalism maintains inequalities that harm society and the natural environment.
One recent study sought to test this theory by comparing how people rate the risks of technology compared to risks from social deviance (departures, such as criminal behavior, from widely approved norms), war, and economic decline. The results are that egalitarians fear technology immensely but think that social deviance is much less dangerous. Hierarchists, by contrast, think technology is basically good if their experts say so, but that social deviance leads to disaster. And individualists think that risk takers do a lot of good for society and that if deviants don't bother them, they won't bother deviants; but they fear war greatly because it stops trade and leads to conscription. Thus, there is no such thing as a risk-averse or risk-taking personality. People who take or avoid all risks are probably certifiably insane; neither would last long. Think of a protester against, say, nuclear power. She is evidently averse to risks posed by nuclear power, but she also throws her body on the line—i.e., takes risks in opposing it.
Other important literature pursues risk perception through cognitive psychology. This work, featuring preeminently the path-breaking research of Daniel Kahneman and Amos Tversky, relies mainly on small-group experiments in which individuals are given tasks involving gambling, and it demonstrates that individuals are very poor judges of probability. More important, perhaps, is their general conservatism: large proportions of people care more about avoiding losses than about making gains. They will therefore go to considerable lengths to avoid losses, even at the cost of forgoing a high probability of considerable gains.
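Loss aversion of this kind can be sketched numerically. The gamble below is a made-up illustration; the coefficient 2.25 is a loss-aversion estimate from Kahneman and Tversky's later work, used here only as a plausible value:

```python
# A minimal sketch of loss aversion (gamble and linear value function are
# illustrative; 2.25 is a commonly cited Kahneman-Tversky estimate).

LOSS_AVERSION = 2.25  # losses loom roughly 2.25x larger than equal gains

def subjective_value(outcome):
    """Linear value function with loss aversion (curvature omitted)."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

# A 50/50 gamble: win $150 or lose $100. In money terms it is worth +$25...
expected_money = 0.5 * 150 + 0.5 * (-100)
print(expected_money)  # 25.0

# ...but to a loss-averse chooser its subjective value is negative,
# so the favorable gamble is declined.
felt_value = 0.5 * subjective_value(150) + 0.5 * subjective_value(-100)
print(felt_value)  # -37.5
```

The sketch shows the experimental pattern in miniature: a bet with a clearly positive expected money value is rejected once losses are weighted more heavily than gains.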
In regard to the consequences of technological risk, there are two major strategies for improving safety: anticipation versus resilience. The risk-averse strategy seeks to anticipate and thereby prevent harm from occurring. In order to make a strategy of anticipation effective, it is necessary to know the quality of the adverse consequence expected, its probability, and the existence of effective remedies. The knowledge requirements and the organizational capacities required to make anticipation an effective strategy—to know what will happen, when, and how to prevent it without making things worse—are very large.
A strategy of resilience, on the other hand, requires reliance on experience with adverse consequences once they occur in order to develop a capacity to learn from the harm and bounce back. Resilience, therefore, requires the accumulation of large amounts of generalizable resources, such as organizational capacity, knowledge, wealth, energy, and communication, that can be used to craft solutions to problems that the people involved did not know would occur. Thus, a strategy of resilience requires much less predictive capacity but much more growth, not only in wealth but also in knowledge. Hence it is not surprising that systems, like capitalism, based on incessant and decentralized trial and error accumulate the most resources. Strong evidence from around the world demonstrates that such societies are richer and produce healthier people and a more vibrant natural environment.
Aaron Wildavsky, who died in 1993, was the Class of 1940 Professor of Political Science and Public Policy at the University of California at Berkeley. He wrote Cultural Theory (with Richard Ellis and Michael Thompson), The Rise of Radical Egalitarianism, and Searching for Safety.
Dake, Karl, and Aaron Wildavsky. "Theories of Risk Perception: Who Fears What and Why?" Daedalus 119, no. 4 (Fall 1990): 41-61.
Dietz, Thomas, and Robert Rycroft. The Risk Professionals. 1987.
Douglas, Mary, and Aaron Wildavsky. Risk and Culture. 1982.
Kahneman, Daniel, and Amos Tversky. "Variants of Uncertainty." Cognition 11, no. 2 (March 1982): 143-57.
Kahneman, Daniel, Paul Slovic, and Amos Tversky, eds. Judgment under Uncertainty: Heuristics and Biases. 1982.
Keeney, Ralph L. "Mortality Risks Induced by Economic Expenditures." Risk Analysis 10, no. 1 (1990): 147-59.
Morone, Joseph G., and Edward J. Woodhouse. Averting Catastrophe. 1986.
Schwarz, Michael, and Michael Thompson. Divided We Stand: Redefining Politics, Technology and Social Choice. 1990.
Wildavsky, Aaron. Searching for Safety. 1988.