Testing for the Avalanche


As Nassim Nicholas Taleb famously explained in The Black Swan, it is the unexpected that is most unexpected. For compliance professionals, testing is one of the tools that tries to expose the unexpected.

I was thinking about testing as I was out in the snowpack in my front yard. I tried out some of the avalanche tests I remembered from my mountaineering days.  As you can see from the cracks above, the snow failed the test and had a slab release. Fortunately, it was only my snowblower that fell victim to the unstable snow.

The six feet of snow in the last 30 days is pushing the infrastructure limits in Greater Boston. The subway system is failing, the roads are clogged with snow, and nearly every roof has ice dams.

Perhaps the avalanche of snow is not a true Black Swan event. Huge amounts of snow are not unprecedented. Boston was subject to five feet of snow in 30 days in 1978. (Of course, that event crippled Greater Boston for weeks.) It is more of a statistical anomaly than an unexpected and unforeseeable event.

How robust, or Antifragile, do you design the infrastructure to deal with an event that only happens every 30 years? How do you test your systems for an event that only happens once every few decades?

I’m not sure I have an answer. I’m sure that I have a sore back from shoveling so much snow.

Antifragile – Things That Gain from Disorder


Taleb is back, and he is even more brash and brilliant. Nassim Nicholas Taleb’s fame grew from his second book, The Black Swan, released just before the 2008 financial crisis. Antifragile continues his narrative on probability and risk. The Black Swan advanced the theory that the highly improbable is a lot more probable than we think. Antifragile takes the next step and proposes that it’s good to be subject to black swan events and other disorder.

He coins the term “antifragile” to describe something that improves as it is subjected to disorder or stress. It goes beyond robust. I thought the best explanation was through the use of mythology.

The sword of Damocles is fragile. All it takes is a small amount of damage to sever the horsehair and send the sword plummeting into the person sitting beneath it.

The phoenix is robust. Once it dies, it is reborn from its ashes to live again.

The hydra is antifragile. If you manage to cut off one of its heads, two sprout in its place. The more damage it takes, the more powerful it becomes.

The next step is to realize that the effects of fragility are not linear. As the overload increases, the damage increases dramatically. Getting hit by one rock weighing 50 pounds will hurt a lot more than getting hit by 1,000 rocks weighing 0.05 pounds each. There is an acceleration of harm. The black swan comes when you get hit by the 50-pound rock when you had been expecting another pebble.
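The acceleration of harm is just convexity, and you can sketch it in a few lines. The quadratic harm function below is a made-up assumption for illustration, not a physical model of rock impacts; the point is only that a convex curve makes one big shock far worse than many small ones of the same total weight.

```python
# Toy convex harm function. The quadratic shape is an assumption
# chosen for illustration, not a model of actual rock impacts.
def harm(weight_lbs):
    return weight_lbs ** 2

one_big_rock = harm(50)            # a single 50-pound rock
many_pebbles = 1000 * harm(0.05)   # 1,000 pebbles, the same 50 pounds in total

# The single rock does roughly 1,000 times the damage of all the pebbles
# combined, even though the total weight delivered is identical.
print(one_big_rock, many_pebbles)
```

Any convex curve gives the same qualitative result; the steeper the curve, the more the damage concentrates in the single large shock.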

The book is an excellent reading choice for anyone involved in compliance or risk management.

The publisher sent me a free copy of the book in hopes of a review.

Six Mistakes Executives Make in Risk Management


Nassim N. Taleb, Daniel G. Goldstein, and Mark W. Spitznagel discuss risk management and the shortcomings of current approaches in the October 2009 issue of the Harvard Business Review (subscription required).

They offer up six mistakes in the way we think about risk:

1.  We think we can manage risk by predicting extreme events.
2.  We are convinced that studying the past will help us manage risk.
3.  We don’t listen to advice about what we shouldn’t do.
4.  We assume that risk can be measured by standard deviation.
5.  We don’t appreciate that what’s mathematically equivalent isn’t psychologically so.
6.  We are taught that efficiency and maximizing shareholder value don’t tolerate redundancy.

Black Swan events – low-probability, high-impact events that are almost impossible to forecast – are increasingly dominating the economic environment. The world is a complex system, made up of a tangled web of relationships and other interdependent factors. Complexity makes forecasting even ordinary events impossible. So complexity increases the incidence of Black Swan events, because we have a harder time seeing the relationships and connections. All we can predict is that Black Swan events will occur and that we won’t expect them.
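Mistake number four on the list – measuring risk by standard deviation – can be shown in a small sketch. The two return series below are invented for illustration: they have nearly the same standard deviation, yet one of them hides a catastrophic day that the summary statistic barely registers.

```python
import statistics

# Two hypothetical daily return series (made-up numbers, not market data).
steady = [0.01, -0.01] * 500                 # plus/minus 1% every day, 1,000 days
quiet_then_crash = [0.0] * 999 + [-0.3162]   # flat for years, then one huge crash

# Nearly identical standard deviations (both about 0.01)...
print(statistics.pstdev(steady))
print(statistics.pstdev(quiet_then_crash))

# ...but wildly different worst days: -1% versus roughly -32%.
print(min(steady))
print(min(quiet_then_crash))
```

Standard deviation averages away the tail; a risk measure that cannot distinguish these two series is exactly the blind spot the authors are describing.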

The authors propose a different approach to risk management:

“Instead of trying to anticipate low-probability, high-impact events, we should reduce our vulnerability to them. Risk management, we believe, should be about lessening the impact of what we don’t understand—not a futile attempt to develop sophisticated techniques and stories that perpetuate our illusions of being able to understand and predict the social and economic environment.”

The authors end up equating risk to ancient mythology:

“Remember that the biggest risk lies within us: We overestimate our abilities and underestimate what can go wrong. The ancients considered hubris the greatest defect, and the gods punished it mercilessly. Look at the number of heroes who faced fatal retribution for their hubris: Achilles and Agamemnon died as a price of their arrogance; Xerxes failed because of his conceit when he attacked Greece; and many generals throughout history have died for not recognizing their limits. Any corporation that doesn’t recognize its Achilles’ heel is fated to die because of it.”

That is a bit lofty for my tastes. After all, the danger of the black swan is that you don’t know that you don’t know about the risk. If you know about a risk, you can deal with it. If you know that you don’t know about a risk, you can manage that as well. It’s hard to be a victim of hubris when you don’t even know the danger exists.

Nassim N. Taleb is the Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute and a principal of Universa Investments, a firm in Santa Monica, California. He is the author of several books, including The Black Swan: The Impact of the Highly Improbable. Daniel G. Goldstein is an assistant professor of marketing at London Business School and a principal research scientist at Yahoo. Mark W. Spitznagel is a principal of Universa Investments.

Ten Principles for a Black Swan-Proof World

Nassim Nicholas Taleb penned an opinion piece in the Financial Times: Ten principles for a Black Swan-proof world.

Check out the piece for details behind each item:

1. What is fragile should break early while it is still small.
2. No socialisation of losses and privatisation of gains.
3. People who were driving a school bus blindfolded (and crashed it) should never be given a new bus.
4. Do not let someone making an “incentive” bonus manage a nuclear plant – or your financial risks.
5. Counter-balance complexity with simplicity.
6. Do not give children sticks of dynamite, even if they come with a warning.
7. Only Ponzi schemes should depend on confidence. Governments should never need to “restore confidence”.
8. Do not give an addict more drugs if he has withdrawal pains.
9. Citizens should not depend on financial assets or fallible “expert” advice for their retirement.
10. Make an omelette with the broken eggs.

If you have not read The Black Swan yet, you should. It was one of those few books that changed the way I view the world.

Book Review: The Black Swan

I just finished reading The Black Swan by Nassim Nicholas Taleb. The title of the book comes from the long-held European observation that all swans are white. Much to their surprise, Europeans arrived in Australia and found their first black swan. The book starts with this story to illustrate the “limitations to our learning from observations or experience and the fragility of our knowledge.” As Taleb points out, it is very different to think there is evidence of no possible black swans than to think there is no evidence of the possibility of black swans.

Taleb has received lots of press and many admirers given the recent meltdown in the financial markets. The book was published in 2007. Taleb seems to have foreseen the coming collapse (and probably reaped a rich financial reward from his strategy).

His supreme self-confidence (arrogance) shines brightly through in his writing. He has little time for shallow thinking and those who think they understand risk or the financial markets.

Another example running through the book is the first 1,001 days of a turkey’s life. For 1,000 days the farmer brings food to the turkey every morning. On the last day, things change dramatically: the farmer shows up with an axe instead of food. A surprising and horrible change in circumstance for the turkey. But all the historical evidence available to the turkey indicated that the farmer would show up with food, not an axe. Of course, on the flip side, the farmer saw the axe day coming.

As Yogi Berra philosophized: “It’s tough to make predictions, especially about the future.”

It is the unknown unknown that is most dangerous. We spend too much time focusing on what we know. We need to spend some energy recognizing what we do not know, and that there are things we do not know we do not know.

As a compliance and risk professional I was particularly intrigued by the story of the four largest losses by casinos. As you might expect, casinos run very thorough security programs, compliance programs and risk management programs. The four largest losses fell completely outside the casinos’ models. One was the white tiger’s attack on Roy, the second was a disgruntled contractor who attempted to dynamite the casino, the third was the kidnapped daughter of a casino owner, and the fourth was an incompetent employee who failed to file the 1099 reports with the Internal Revenue Service.

It is also important to draw the distinction between positive contingencies and negative contingencies. The black swan can be one that brings unexpected destruction or one that brings an unexpected windfall. His philosophy is to play it safe while hedging for both disastrous losses and spectacular windfalls. Mitigate the unexpected consequences.
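Taleb’s “barbell” strategy captures this posture: keep most of your exposure very safe, and devote a small sliver to bets with open-ended upside. The allocation and return figures below are hypothetical, chosen only to show the asymmetry of the payoff.

```python
# A sketch of the barbell idea with made-up numbers: 90% in something
# very safe (assume roughly 0% return) and 10% in small, convex bets.
SAFE, RISKY = 0.90, 0.10

def portfolio_return(risky_return):
    # Downside is capped at losing the risky sliver entirely;
    # upside is open-ended if the risky side pays off.
    return SAFE * 0.0 + RISKY * risky_return

print(portfolio_return(-1.0))  # risky side wiped out: portfolio loses only 10%
print(portfolio_return(9.0))   # a 10x windfall on the risky side: roughly +90%
```

The negative black swan costs at most the sliver; the positive black swan is captured nearly in full.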

I expected to get a lot of insight from the book. It turned out to be one of the few books that changed my outlook on my profession.

Risk Mismanagement

I really enjoyed the story by Joe Nocera in the New York Times: Risk Mismanagement. The author focuses on the failures of risk management during the most recent financial crisis.

The author starts with the failure of the VaR (Value at Risk) model used by many companies. He then moves on to the theories of Taleb captured in his book The Black Swan (next on my reading list).

“Taleb says that Wall Street risk models, no matter how mathematically sophisticated, are bogus; . . . . And the essential reason for this is that the greatest risks are never the ones you can see and measure, but the ones you can’t see and therefore can never measure. . . . Because we don’t know what a black swan might look like or when it might appear and therefore don’t plan for it, it will always get us in the end.”
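A toy historical VaR calculation shows the blind spot Taleb describes. The return numbers below are invented; the point is that VaR only reads off a percentile cutoff, so a catastrophic day beyond that cutoff can leave the reported number unchanged.

```python
# Toy historical 95% VaR: the loss threshold exceeded on the worst 5% of
# days in a sample of hypothetical daily returns (made-up, not real data).
history = [0.02, 0.01, 0.0, -0.01, -0.01,
           0.005, -0.005, 0.015, -0.02, 0.01] * 10   # 100 hypothetical days

def var_95(returns):
    worst_first = sorted(returns)                     # worst losses first
    cutoff = max(int(len(returns) * 0.05) - 1, 0)     # 5th-percentile index
    return -worst_first[cutoff]

print(var_95(history))            # a 2% one-day loss threshold

# Now slip in one catastrophic -40% day: the VaR number barely moves,
# yet the worst realized loss is twenty times larger than before.
with_crash = history + [-0.40]
print(var_95(with_crash))
print(min(with_crash))
```

This is the sense in which the model is “bogus”: the statistic is honest about the body of the distribution and silent about the tail that actually does the damage.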

The key for a compliance professional is to handle the currently known risks to your company while keeping an eye out for the unknown risks.