Households, businesses, and government policy makers all face the challenge of managing risk. This is the topic tackled by journalist Greg Ip in his book Foolproof: Why Safety Can Be Dangerous and How Danger Makes Us Safe.1 Ip, an intelligent and independent thinker, identifies a tendency to deceive ourselves into believing that we are safer than we really are. Indeed, one of his central themes is that a period of success in avoiding bad outcomes can lead to complacency, and then to catastrophe.

Ip distinguishes between two philosophies for handling risk.

One, which I call the engineers, seeks to use the maximum of our knowledge and ability to solve problems and make the world safer and more stable; the other, which I call the ecologists, regards such efforts with suspicion, because given the complexity and adaptability of people and the environment, they will always have unintended consequences that may be worse than the problem we are trying to solve. (page 19)

Engineers prefer to try to prevent risk with careful design and management. Ecologists are more inclined to treat risk as inevitable and to emphasize resilience rather than prevention. I like to say that instead of trying to make the banking system hard to "break," we should try to make it easier to "fix." I would note that mainstream economists tend to be engineers, while F.A. Hayek and other Austrian school economists tend to be ecologists.

Ip identifies several flaws in the way that we handle risk. One is simple forgetfulness. He writes,

Memory and experience shape our behavior. The more vivid our sense of danger, the greater care we take. On Wall Street, those who take risks can reap spectacular rewards. Those with longer memories hang back; their performance and profits suffer, and customers go elsewhere. Thus, trading is a young person’s profession. (page 8)

This reminded me of a vignette in the 1960s best-seller The Money Game,2 by George Goodman (under the pseudonym ‘Adam Smith’). In it, the author converses with a trader nicknamed the Great Winfield. Winfield tells the author that this is a “kids’ market,” because only young people with no memory would buy the then-popular “story” stocks—many of which were soon to plummet in value.

Short memories are particularly dangerous after a long period of stability. Ip credits Hyman Minsky3 with taking a contrarian view that the stable macroeconomic performance of the late 1980s and early 1990s was not a new era but only the prelude to a major crisis. Minsky, of course, is known for the catch-phrase, “stability breeds instability.”

I believe that Ip correctly captures the thinking of bank regulators in the 1980s and 1990s. Far from wanting to turn banks loose to take risks, regulators thought that they were driving risk out of the banking system.

There was plenty of both regulation, such as increased capital requirements, and deregulation, such as the repeal of Glass-Steagall. Some deregulation, to be sure, was meant to better enable banks to compete. But regulators, in the process, thought they were correcting the flaws in earlier rules that made banks more fragile…

Higher capital requirements and tougher regulation meant that traditional deposit gathering and loans were becoming more expensive. The financial system became adept at coming up with alternative ways to gather up savings and lend them, the practice later dubbed “shadow banking.”

… Regulators considered this a good thing. Since prior crises had always involved banks, a bank sector that was better regulated and that had farmed out much of its riskier activity to the capital markets ought to be safer. (pages 43-46)

See the EconTalk podcast episode Peltzman on Regulation, November 2006.

Ip points out that another challenge with risk management is moral hazard. People who believe themselves protected from risk tend to take more chances. In the area of automobile safety, this is known as the Peltzman effect. Economist Sam Peltzman pointed out that safety requirements such as motorcycle helmets, car seatbelts, and anti-lock brakes tended to induce offsetting increases in risky driving.

Another problem Ip describes is uncompensated risk transfer. If an upstream town builds a levee along a river, this reduces the risk of a flood in that town. However, by leaving more water in the river, the levee adds to the risk of flooding in downstream towns. Ip suggests that some financial practices, such as portfolio insurance, can have similar effects. (Think of portfolio insurance as consisting of “stop-loss” orders, which are orders to sell stock that are triggered automatically if the stock price falls to a certain level. My “upstream” stop-loss order can drive the price of the stock down further, triggering “downstream” stop-loss orders, effectively creating a panic.)
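The cascade described above can be illustrated with a toy simulation. All of the numbers and the fixed price-impact rule below are my own illustrative assumptions, not anything from Ip's book; the point is only to show how one "upstream" stop-loss order can drag the price through the "downstream" ones.

```python
# Toy model of cascading stop-loss orders ("downstream" risk transfer).
# The price-impact rule (each forced sale knocks a fixed amount off the
# price) and all numbers are illustrative assumptions.

def run_cascade(price, stop_levels, impact_per_sale=2.0):
    """Trigger any stop-loss at or above the current price; each forced
    sale pushes the price down, possibly triggering further stops."""
    triggered = []
    remaining = sorted(stop_levels, reverse=True)  # highest stops fire first
    changed = True
    while changed:
        changed = False
        for level in list(remaining):
            if price <= level:            # stop is hit
                remaining.remove(level)
                triggered.append(level)
                price -= impact_per_sale  # forced sale depresses the price
                changed = True
    return price, triggered

# A small dip to 99 hits the "upstream" stop at 99, and the forced
# selling then drags the price through all three "downstream" stops,
# leaving the price at 91.
final_price, fired = run_cascade(price=99.0, stop_levels=[99, 97.5, 96, 94.5])
print(final_price, fired)
```

Note that the dip that starts the cascade is far smaller than the eventual decline; the bulk of the fall is generated by the stop orders themselves, which is the panic dynamic described above.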

Ip argues that Goldman Sachs’ buildup of liquid assets to protect itself from problems in the mortgage securities market also had a downstream effect. For example, by requiring AIG to put up additional collateral as the value of the securities that AIG was insuring came down, Goldman and other counterparties to AIG brought that company down, even though AIG had not yet been required to make any payouts on the securities it insured.

On the other hand, sometimes people are too reluctant to accept risk. As a result, they overpay for insurance and undertake attempts at prevention that are inordinately costly. For example,

Former NASA scientist James E. Hansen, one of the loudest voices in the scientific community warning of global warming, and his colleague Pushker A. Kharecha have tried to quantify the relationship between nuclear power and mortality. They figured that between 1971 and 2009, the use of nuclear power had prevented 1.84 million deaths by avoiding the burning of coal and natural gas and the resulting air pollution. (page 168)

Nonetheless, the risk of nuclear power is more salient to the public than is the risk of burning coal.


In fact, I would argue that the theme that unites many of the flaws in people’s judgment about risk is a focus on the salient at the expense of the less salient. Consider:

  • The engineers become overconfident because they believe that they can design systems to protect against risks. However, such systems only address risks that are currently salient to engineers. They do not address risks that are currently hidden or will emerge in the future.
  • Towns that build levees are dealing with the risk that is salient to them while ignoring the risk that they are transferring downstream.
  • People who overpay for small risks (buying extended warranties for small appliances, preferring comprehensive health insurance to catastrophic coverage or long-term care insurance) do so because the smaller potential losses are more salient.
  • Regulators who impose safety requirements are conscious of the intended reduction in risk, not of the unforeseen changes in behavior due to moral hazard and the Peltzman effect.

For more on these topics, see “Why Financial Regulation Is Doomed to Fail,” by Philip Maymin, Econlib, March 7, 2011, and the EconTalk podcast episode Lars Hansen on Risk, Ambiguity, and Measurement, June 2014.

Overall, the reader of Foolproof comes away with a sense that we need to learn to think more astutely about risk. This includes learning from history. I believe it also includes thinking more carefully about what is seen and what is not seen.4


1. Greg Ip, Foolproof: Why Safety Can Be Dangerous and How Danger Makes Us Safe. Little, Brown and Company, 2015.

2. Adam Smith (George Goodman), The Money Game. Vintage, 1976.

3. For more on Minsky’s work, see my column from last month, “A Wray of Light on Hyman Minsky.”

4. The notion of “what is seen and what is not seen” was famously introduced by Frédéric Bastiat in his essay “What Is Seen and What Is Not Seen,” reprinted in Selected Essays on Political Economy, available at the Library of Economics and Liberty.


*Arnold Kling has a Ph.D. in economics from the Massachusetts Institute of Technology. He is the author of five books, including Crisis of Abundance: Rethinking How We Pay for Health Care; Invisible Wealth: The Hidden Story of How Markets Work; and Unchecked and Unbalanced: How the Discrepancy Between Knowledge and Power Caused the Financial Crisis and Threatens Democracy. He contributed to EconLog from January 2003 through August 2012.

For more articles by Arnold Kling, see the Archive.