Joe Nocera has a nice piece on risk management in last Sunday’s New York Times magazine. A great line:
“The old adage, ‘garbage in, garbage out’ certainly applies,” Groz said. “When you realize that VaR is using tame historical data to model a wildly different environment, the total losses of Bear Stearns’ hedge funds become easier to understand. It’s like the historic data only has rainstorms and then a tornado hits.”
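For readers who haven't seen it, historical-simulation VaR (the flavor Groz is describing) is nothing more than a percentile of past returns, which is exactly why a tame sample produces a tame number. A minimal sketch, using made-up toy returns rather than real data:

```python
# Historical-simulation VaR: a hedged sketch with illustrative numbers.
# The 1-day 99% VaR is just the 1st percentile of past daily returns,
# so the model can only "see" losses as bad as the worst day on record.

def historical_var(returns, confidence=0.99):
    """Loss threshold exceeded on roughly (1 - confidence) of historical days."""
    ordered = sorted(returns)
    cutoff = int((1 - confidence) * len(ordered))
    return -ordered[cutoff]

# 1,000 calm days: small moves, worst day about -2%.
calm_history = [0.001 * ((i % 41) - 20) for i in range(1000)]
print(historical_var(calm_history))  # the VaR reflects only rainstorms

# A -30% "tornado" day is invisible to this calculation
# until it is already in the sample.
```

Note that adding a single tornado day barely moves the 99% percentile of a long calm sample, which is part of the trap: the metric stays reassuring even as evidence of catastrophe trickles in.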
The article quotes Nassim Nicholas Taleb, author of The Black Swan, at length. Interesting and well-done. Great paragraph:
Eventually, though, you do start to get the point. Taleb says that Wall Street risk models, no matter how mathematically sophisticated, are bogus; indeed, he is the leader of the camp that believes that risk models have done far more harm than good. And the essential reason for this is that the greatest risks are never the ones you can see and measure, but the ones you can’t see and therefore can never measure. The ones that seem so far outside the boundary of normal probability that you can’t imagine they could happen in your lifetime — even though, of course, they do happen, more often than you care to realize. Devastating hurricanes happen. Earthquakes happen. And once in a great while, huge financial catastrophes happen. Catastrophes that risk models somehow always manage to miss.
I agree with all the above.
Taleb, though, as is typical, overstates:
And that’s the point. Because we don’t know what a black swan might look like or when it might appear and therefore don’t plan for it, it will always get us in the end. “Any system susceptible to a black swan will eventually blow up,” Taleb says.
But precisely because a black swan is, by definition, a rare event, it won’t always get us. He’s literally correct that any system susceptible to a black swan will eventually blow up, but the truth hinges on the word “eventually.” What if “eventually” means “in one hundred years”?
HT to Jeff Hummel.
READER COMMENTS
Publius
Jan 8 2009 at 2:06pm
Taleb should have remembered his economese and substituted “in the long run” for all the eventually’s.
tom
Jan 8 2009 at 2:37pm
Also see Kling on Nocera’s article: http://www.econlib.org/archives/2009/01/todays_reading.html
Gary Rogers
Jan 8 2009 at 2:38pm
The thing about black swans is that there is not just one of them. The probability of any single low-probability event can be discounted, but if you add up the probabilities of all the unlikely events that could possibly occur, there is a real risk that needs to be considered. As an example, consider the situation where I flip a coin 100 times and it comes up heads each time. You would be correct to predict beforehand that all heads will not happen. But if it does, you have to consider that all heads has exactly as much chance of happening as any other specific sequence. So if I get all heads, all tails, 50 heads followed by 50 tails, etc., I will be amazed, but it is not as amazing as it appears. The danger of black swans is not in predicting that a specific event will happen, but that any of a large number of unspecified possibilities might happen.
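Gary's arithmetic can be made concrete in a couple of lines. The event count and per-event probability below are hypothetical numbers chosen purely for illustration:

```python
# One rare event is ignorable; the union of many is not.
# Assumed for illustration: 50 independent tail events, each 0.5% per year.
p_single = 0.005
n_events = 50

p_any = 1 - (1 - p_single) ** n_events
print(f"P(at least one tail event this year) = {p_any:.1%}")  # roughly 22%

# His coin-flip point: all-heads is exactly as likely as any other
# specific 100-flip sequence, yet "some striking pattern" is far likelier.
p_specific_sequence = 0.5 ** 100
```

The same reversal drives the coin example: any one named sequence is astronomically unlikely, but the union of all the sequences that would amaze us is not.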
dWj
Jan 8 2009 at 2:39pm
It’s worth noting, also, that every system is susceptible to black swans. The United States could be invaded; how do you hedge for that? I think a lot of Wall Streeters are actually more sophisticated than Taleb is a lot of the time; they know that their tools are imperfect, but they know that all tools are imperfect. They’ll do the best they can, and the smart ones will try to keep down unnecessary exposure to unmeasurables, and then they cross their fingers.
Mike Rulle
Jan 8 2009 at 3:46pm
I agree with dWj. This whole VAR and Black Swan issue is a red herring. I never met a risk person in the last decade who did not understand the “fat tail” problem. Anyone who thinks the whole chain of events which made up the mortgage bubble could have been mitigated or avoided by a better understanding of “randomness” and “black swans” is at least as naive as the Straw Men they attack.
H
Jan 8 2009 at 8:13pm
dWj and Mike, you are both right. I think this is one of the most overlooked aspects of the crisis. People are too busy assuming greed was at the heart of what we now know to be excess risk taking. I would counter that it was not greed, but an inability to predict an event (a nationwide fall in home prices) that no one had seen in their lifetimes. There is more reason to blame Katrina victims for not provisioning for Katrina than there is to blame risk managers for not provisioning for a fall in housing prices across all major markets. Using historical data, this seemed like a very, very low-probability event.
John Thacker
Jan 9 2009 at 12:53am
Bah, he should have said “as t goes to infinity” or “infinitely often.”
Using historical data, it seemed like a fairly high probability event to me. That’s from judging the California experience, which did have declining housing values in two periods in the 80s and 90s; the California economy is large enough that anything that happened there could happen in the entire nation. Also from judging the UK, Spain, and other nations that had already seen housing bubbles pop shortly before the US one.
Of course, California (and Hawaii, and a few others) had already adopted various regional growth planning that made bubbles and crashes inevitable. See Ed Glaeser’s research, among others. When the elasticity of housing supply is decreased, you have bubbles and price declines, since the price of housing changes dramatically in response to demand, instead of the supply changing.
Since many other major markets (TX and NC excepted) had adopted regional growth planning in the 80s and 90s like that in CA, HI, and in Europe, a nationwide crash seemed inevitable.
Of course, as a separate issue, there’s no reason that a housing price decline must lead to such a crisis.
dWj, Mike Rulle, H, you are all massively wrong. VaR makes the martingale betting strategy (double your bet each time you lose until you win, then start over) look risk-free. That is unsurprising, since most primitive mathematical analysis makes it look risk-free. But that is profoundly dangerous, since this idiocy has been reinvented many times over the years. Inventing a metric that ignores the most common probabilistic fallacy in betting, and then mandating its use via regulation as the sole measure of risk, is insane, and an obvious error that would lead to problems down the road. Doob’s optional sampling/stopping theorem is relevant here: no stopping rule can turn a fair game into a favorable one.
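The martingale point can be sketched in a short simulation. The bankroll, stake sizes, and round count below are made-up parameters, and the session structure is a simplification, but the shape of the result is the point: small steady wins almost every time, with rare, catastrophic losses that a short "tame" sample of history would never show.

```python
import random

# A martingale bettor doubles the stake after every loss, resetting after
# a win. Most sessions end with a small gain, so a risk metric fit to a
# typical sample looks benign; the blowup lives in the rare long streak.
# All parameters here are illustrative assumptions.

def martingale_session(rng, bankroll=1000, base_bet=1, p_win=0.5, rounds=200):
    """Return the session P&L for one run of the doubling strategy."""
    wealth, bet = bankroll, base_bet
    for _ in range(rounds):
        if bet > wealth:          # cannot cover the doubled stake: bust
            return wealth - bankroll
        if rng.random() < p_win:
            wealth += bet
            bet = base_bet        # streak recovered, reset the stake
        else:
            wealth -= bet
            bet *= 2              # the fallacy: chase the loss
    return wealth - bankroll

rng = random.Random(0)
pnl = [martingale_session(rng) for _ in range(10_000)]
wins = sum(1 for x in pnl if x > 0)
worst = min(pnl)
print(f"{wins / len(pnl):.0%} of sessions end ahead; worst session P&L: {worst}")
```

With a 1,000-unit bankroll, a streak of about ten losses is enough to bust, so the great majority of sessions show a modest profit while a small minority lose most of the bankroll at once.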
There are metrics that are not vulnerable to black swans, or at least give you an accurate measure of what you’re risking, and how you’re leveraged. But VaR ignores leverage (in tail events), and can completely misstate liquidity requirements.
Justin H.
Jan 9 2009 at 5:31pm
Recall also that Taleb isn’t advocating a methodology to avoid black swans, and he isn’t even saying you “shouldn’t” engage in activities susceptible to them. He stays out of the normative debate entirely, choosing instead to complain about people being unaware that they are susceptible to such blowups, eventually. In practice, eventually comes sooner than you think, because you are probably underestimating the odds anyway. Without a diligent and ongoing questioning of our practices, we will convince ourselves that our operations are safer than they really are. The message isn’t that existing risk models are terrible (though they are); it is that in many lines of work, if you feel safe, then you are deluding yourself and are likely exposed.
This means that playing in a game with black swan potential may make you better off, and may make you worse off. Taking the gamble based on odds you’ve calculated is folly, so just know going in that you’re sitting on the bomb. And hope that eventually is when someone else has taken your place…
Comments are closed.