The main finding in Philip Tetlock’s awe-inspiring Expert Political Judgment is that open-minded “foxes” are better predictors than theory-driven “hedgehogs.” But toward the end of the book, he includes a fascinating chapter about a notable exception.

Background: There’s a whole industry of “scenario consultants” who claim to mitigate overconfidence – and the attendant financial embarrassment – by making businesspeople imagine reasons why they might be wrong. As Tetlock explains:

They appeal to clients to stretch their conceptions of the possible, to imagine a wider range of futures than they normally would, and then to construct full-fledged stories that spell out the “drivers” that, under the right conditions, could propel our world into each alternative future.

Tetlock tried out this technique on his political experts. He made them imagine all sorts of possible scenarios about (a) Canadian disintegration, and (b) the Japanese economy. His findings:

1. Before the exercise, the experts’ probabilities for mutually exclusive outcomes at least summed to roughly 100%, as they should. The exercise led participants to violate this elementary logical principle: after thinking through the different scenarios, every outcome seemed more likely. For the Canadian problem, the average expert’s probabilities summed to 158% (a numerical sketch follows this list). The error, in essence, was that merely imagining possibilities, however outlandish, made people take them seriously. As Tetlock explains:

One takes a vague abstraction, all possible paths to Canada’s disintegration, and explores increasingly specific contingencies. Quebec secedes and the rest of Canada fragments: the Maritimes – geographically isolated – clings to Ontario, but Alberta flirts with the United States… [T]he images are vivid, the plotlines plausible, and it becomes increasingly taxing to keep all the other logical possibilities in focus.

2. The exercise led people to raise their probabilities for events that didn’t happen. In other words, more openness led to less accuracy.

3. The people most subject to this error were foxes who specialized in the topic at hand! Hedgehogs’ “dogmatic” scoffing at odd hypotheticals saved them from a serious intellectual embarrassment.
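
To make the additivity point concrete, here is a minimal sketch in Python. The individual scenario probabilities are hypothetical, invented purely for illustration; only the 158% total comes from Tetlock’s Canadian results.

```python
# Illustrative sketch (not Tetlock's actual data): probabilities judged
# separately for mutually exclusive, exhaustive scenarios should sum to 1.0,
# but scenario-by-scenario elicitation tends to inflate each judgment.

# Hypothetical per-scenario judgments after the scenario exercise;
# only the ~1.58 total reflects the figure Tetlock reports.
elicited = {
    "status quo holds": 0.55,
    "Quebec secedes, rest of Canada stays intact": 0.48,
    "Quebec secedes and Canada fragments further": 0.35,
    "other paths to disintegration": 0.20,
}

total = sum(elicited.values())
print(f"Sum of judged probabilities: {total:.2f}")  # 1.58, not 1.00

# Coherent judgments would renormalize so the partition sums to 1.
coherent = {k: v / total for k, v in elicited.items()}
print({k: round(v, 2) for k, v in coherent.items()})
```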

One upshot is that the scenario consulting industry purports to solve the problem of overconfidence, yet seems rather overconfident about the value of its own method.

But for me, the more fundamental lesson is that – the whole literature on overconfidence notwithstanding – it’s not that hard to make people suffer from excessive open-mindedness. All of which backs up one of my favorite sayings, often attributed to Richard Feynman: “Keep an open mind, but not so open that your brain falls out.”