Testing for the Avalanche

[Photo: cracks in the front-yard snowpack after a slab release]

As Nassim Nicholas Taleb famously explained in The Black Swan, it is the unexpected that does the most damage. For compliance professionals, testing is one of the tools that tries to expose the unexpected.

I was thinking about testing as I was out in the snowpack in my front yard. I tried out some of the avalanche tests I remembered from my mountaineering days.  As you can see from the cracks above, the snow failed the test and had a slab release. Fortunately, it was only my snowblower that fell victim to the unstable snow.

The six feet of snow in the last 30 days is pushing the limits of Greater Boston's infrastructure. The subway system is failing, the roads are clogged with snow, and nearly every roof has ice dams.

Perhaps the avalanche of snow is not a true Black Swan event. Huge amounts of snow are not unprecedented; Boston saw five feet of snow in 30 days in 1978. (Of course, that event crippled Greater Boston for weeks.) It is more of a statistical anomaly than an unexpected and unforeseeable event.

How robust, or Antifragile, do you design infrastructure to be for an event that happens only once every 30 years? How do you test your systems for an event that occurs once every few decades?
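
Part of the framing is simple arithmetic. Here is a minimal sketch (the 1-in-30 annual odds and the 50-year design life are my assumptions for illustration) of how likely a "rare" event becomes over an asset's lifetime:

```python
# Chance of seeing at least one "1-in-30-year" storm over a design life.
# The 1/30 annual probability and the 50-year life are illustrative assumptions.
annual_p = 1 / 30        # probability the event occurs in any given year
design_life = 50         # years the infrastructure is expected to serve

p_at_least_once = 1 - (1 - annual_p) ** design_life
print(f"P(event during {design_life}-year life) = {p_at_least_once:.0%}")
# Prints roughly 82% -- the "rare" event is close to a sure thing
# over the lifetime of the system.
```

Over a long enough design life, the once-in-a-generation storm is the expected case, not the exception.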

I’m not sure I have an answer. I’m sure that I have a sore back from shoveling so much snow.

Nobody Expects The Spanish Inquisition

The most dangerous parts of managing risk are the risks you don't expect. Looking back at my old four-box analysis, there are really two types of unexpected risks: the risk that you know that you don't know, and the risk that you don't know that you don't know. In the first case, you know there is an unexpected risk. In the second, you missed that there was even a risk.

The second case has been labeled the Black Swan. Those types of risks are what Taleb writes about so well.

While staying up late and watching some Monty Python, I came to the conclusion that the first case is the “Spanish Inquisition.” You know the Spanish Inquisition is out there roaming the countryside. You just don't know when or where they will appear.

You also don’t know the danger. Their two weapons are fear and surprise…and ruthless efficiency.

The Drunkard’s Walk, The Butterfly Effect and The Black Swan

[Image: a drunkard's walk]

The “drunkard's walk” refers to Brownian motion, the seemingly random movement of particles suspended in a fluid. The original thought was that you might be able to calculate the movement by measuring and modeling each interaction. It proved impossible. There are too many factors and too many interactions.
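
To see why, it helps to simulate rather than calculate. A minimal sketch of a one-dimensional random walk (the step and trial counts are arbitrary choices of mine): identical starting conditions, and every run still ends somewhere different.

```python
import random

def drunkards_walk(steps: int) -> int:
    """Sum of random +1/-1 steps: a one-dimensional drunkard's walk."""
    return sum(random.choice((-1, 1)) for _ in range(steps))

# Identical starting conditions, a different endpoint every time.
for trial in range(5):
    print(f"walk {trial}: ended {drunkards_walk(1_000):+d} steps from the start")
```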

Small changes in a system can dramatically affect the outcome. This is the butterfly effect. The term came from a meteorologist, Edward Lorenz, who was using a computer model to rerun a weather prediction; one of the numbers he entered was shortened from six decimal places to three. The result was a completely different weather scenario. It's not that a butterfly can cause the problem. It's that a seemingly inconsequential random event can lead to a big change in an outcome.
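
That rounding experiment is easy to reproduce with the logistic map, a standard toy model of chaos (my choice of illustration, not the meteorologist's actual weather model). Truncating the starting value from six decimal places to three sends the trajectories apart within a few dozen iterations:

```python
def logistic(x: float, steps: int, r: float = 3.9) -> float:
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

x_full = 0.506127    # six decimal places (the value from the Lorenz anecdote)
x_rounded = 0.506    # the same value truncated to three

for steps in (10, 30, 50):
    a, b = logistic(x_full, steps), logistic(x_rounded, steps)
    print(f"after {steps:3d} steps: {a:.4f} vs {b:.4f}  (diff {abs(a - b):.4f})")
```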

Leonard Mlodinow addressed this topic in The Drunkard's Walk: How Randomness Rules Our Lives. (I mentioned the book previously in Criticism and Praise.) There is much more randomness in our lives than we give it credit for.

We poorly understand the effect of randomness.

He explores his concepts using the backdrop of Pearl Harbor. In hindsight there were many signs pointing to the eventual attack. “In any complex string of events in which each event unfolds with some element of uncertainty, there is a fundamental asymmetry between past and future.” It’s nearly impossible to predict before the fact, but relatively easy to understand afterward. We have seen the same 20/20 hindsight with the 9/11 attacks.

That's why it is easy to explain the weather from three days ago, but hard to get the forecast right three days into the future.

Mlodinow never mentions it, but for me the next step is the theory of the Black Swan. How do you end up with high-impact, hard-to-predict, and rare events that are beyond the realm of normal expectations?

Combining the Black Swan with the Drunkard's Walk and the Butterfly Effect, you see that a combination of small events can lead to an outsized outcome. We get used to being able to calculate and measure so many things. There will always be factors that we miss, overweight, or underweight.

Not to be depressing, but The Drunkard's Walk leaves you feeling in less control than when you started. There is, however, a factor you can control: the number of chances that you take. “Even a coin weighted toward failure will sometimes land on success.” Keep flipping the coin.
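
That advice is easy to check with a quick simulation (the 30% success rate and the player counts are my own illustrative numbers, not Mlodinow's):

```python
import random

# A coin weighted toward failure still lands on success if you keep flipping.
# The 30% success rate and 10,000 simulated players are illustrative assumptions.
P_SUCCESS, PLAYERS = 0.30, 10_000

for attempts in (1, 5, 10, 20):
    winners = sum(
        any(random.random() < P_SUCCESS for _ in range(attempts))
        for _ in range(PLAYERS)
    )
    print(f"{attempts:2d} attempts: {winners / PLAYERS:.0%} succeed at least once")
```

Persistence converts a coin weighted toward failure into near-certain eventual success.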

The Drunkard’s Walk is worth reading if you deal with risk.

Six Mistakes Executives Make in Risk Management

[Image: cover of the October 2009 Harvard Business Review]

Nassim N. Taleb, Daniel G. Goldstein, and Mark W. Spitznagel discuss risk management and the shortcomings of common approaches in the October 2009 issue of the Harvard Business Review (subscription required).

They offer up six mistakes in the way we think about risk:

1.  We think we can manage risk by predicting extreme events.
2.  We are convinced that studying the past will help us manage risk.
3.  We don’t listen to advice about what we shouldn’t do.
4.  We assume that risk can be measured by standard deviation.
5.  We don’t appreciate that what’s mathematically equivalent isn’t psychologically so.
6.  We are taught that efficiency and maximizing shareholder value don’t tolerate redundancy.

Black Swan events, the low-probability, high-impact events that are almost impossible to forecast, are increasingly dominating the economic environment. The world is a complex system, made up of a tangled web of relationships and other interdependent factors. Complexity makes forecasting even ordinary events difficult. So complexity increases the incidence of Black Swan events, since we have a harder time seeing the relationships and connections. All we can predict is that Black Swan events will occur and that we won't expect them.
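
Mistake 4 is easy to demonstrate with a simulation. In this minimal sketch (the two distributions and their parameters are my own illustrative choices, not the authors'), the thin-tailed and fat-tailed series have similar standard deviations, but their worst outcomes are worlds apart:

```python
import random

random.seed(1)

def std_dev(xs):
    mean = sum(xs) / len(xs)
    return (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5

# Thin tails: plain normal draws.
normal = [random.gauss(0, 1) for _ in range(100_000)]

# Fat tails: mostly calm days, occasionally extreme ones (a toy mixture model).
fat = [random.gauss(0, 0.8) if random.random() < 0.99 else random.gauss(0, 8)
       for _ in range(100_000)]

for name, xs in (("normal", normal), ("fat-tailed", fat)):
    print(f"{name:10s} std dev = {std_dev(xs):.2f}   worst draw = {min(xs):.1f}")
```

A risk measure built on standard deviation treats those two worlds as nearly the same.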

The authors propose a different approach to risk management:

“Instead of trying to anticipate low-probability, high-impact events, we should reduce our vulnerability to them. Risk management, we believe, should be about lessening the impact of what we don’t understand—not a futile attempt to develop sophisticated techniques and stories that perpetuate our illusions of being able to understand and predict the social and economic environment.”

The authors end up equating risk to ancient mythology:

“Remember that the biggest risk lies within us: We overestimate our abilities and underestimate what can go wrong. The ancients considered hubris the greatest defect, and the gods punished it mercilessly. Look at the number of heroes who faced fatal retribution for their hubris: Achilles and Agamemnon died as a price of their arrogance; Xerxes failed because of his conceit when he attacked Greece; and many generals throughout history have died for not recognizing their limits. Any corporation that doesn’t recognize its Achilles’ heel is fated to die because of it.”

That is a bit lofty for my tastes. After all, the danger of the black swan is that you don't know that you don't know about the risk. If you know about a risk, you can deal with it. If you know that you don't know about a risk, you can manage that too. It's hard to be a victim of hubris when you don't know that the danger behind your downfall even exists.

Nassim N. Taleb is the Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute and a principal of Universa Investments, a firm in Santa Monica, California. He is the author of several books, including The Black Swan: The Impact of the Highly Improbable. Daniel G. Goldstein is an assistant professor of marketing at London Business School and a principal research scientist at Yahoo. Mark W. Spitznagel is a principal of Universa Investments.

The Four Areas of Risk and Knowledge

[Chart: the four quadrants of risk and knowledge]

When thinking about risk, I break things into four quadrants. There are things we know and there are things we don't know as individuals. I then slice that again by the things we know and the things we don't know as part of the larger organization or its collective awareness.

Our sweet spot is the things we know that we know. (The green area on my chart.) Those are our operations. Those are the things we have in the realm of compliance. We may not be fully compliant or fully dealing with the risk, but the risk is known.

At the opposite corner are the things that we don’t know that we don’t know. This is the black swan territory. This is an area of danger for an organization. This is a knowledge void and a compliance void. These are risks that we don’t know about. We don’t know the magnitude of the risk and we don’t know it even exists. Our models miss this factor. Our organizations are not paying attention to these risks.


The other two areas are also interesting.

The things we know that we don't know is an area we know we can improve. (The orange quadrant on my chart.) This is the area of known ignorance or accepted unknowns. We can manage these risks because we know about them. They have been identified, although not quantified. They may be on the list of things to address. Or we may just be willing to run naked in this area and not worry about the risk.

The last area, the things that we don't know we know, is an area of opportunity. (The purple quadrant on my chart.) This is risk that the organization is managing, even if it doesn't know the risk exists. Often this will be a risk associated with another risk, either through causation or correlation. If an organization realizes it has this knowledge, it may be able to create a new opportunity for itself by discovering it. You do need to realize that the causation or correlation may sever at some point, pushing this risk down into the territory of the black swan.

There is also an element of danger in the opportunity area when it comes to records management. These may be the pieces of information that get unearthed during litigation and get an organization in trouble.

It’s important to realize and accept that there are things we don’t know. The key to bettering the organization is to continually try to reduce the amount of stuff that we don’t know.

I want to credit Liam Fahey, a professor at Babson College and co-founder of the Leadership Forum, for the origins of this matrix. He gave a presentation using this analysis to a group of law firm knowledge management leaders in October 2008.

Ten Principles for a Black Swan-Proof World

Nassim Nicholas Taleb penned an opinion piece in the Financial Times: Ten principles for a Black Swan-proof world.

Check out the piece for details behind each item:

1. What is fragile should break early while it is still small.
2. No socialisation of losses and privatisation of gains.
3. People who were driving a school bus blindfolded (and crashed it) should never be given a new bus.
4. Do not let someone making an “incentive” bonus manage a nuclear plant – or your financial risks.
5. Counter-balance complexity with simplicity.
6. Do not give children sticks of dynamite, even if they come with a warning.
7. Only Ponzi schemes should depend on confidence. Governments should never need to “restore confidence”.
8. Do not give an addict more drugs if he has withdrawal pains.
9. Citizens should not depend on financial assets or fallible “expert” advice for their retirement.
10. Make an omelette with the broken eggs.

If you have not read The Black Swan yet, you should. It was one of those few books that changed the way I view the world.

Book Review: The Black Swan

I just finished reading The Black Swan by Nassim Nicholas Taleb. The title of the book comes from Europeans' observation that all swans are white. Much to their surprise, they came to Australia and found their first black swan. The book starts with this story to illustrate the “limitations to our learning from observations or experience and the fragility of our knowledge.” As Taleb points out, thinking there is evidence of no possible black swans is very different from recognizing there is merely no evidence of the possibility of black swans.

Taleb has received lots of press and many admirers given the recent meltdown in the financial markets. The book was published in 2007. Taleb seems to have foreseen the coming collapse (and probably reaped a rich financial reward from his strategy).

His supreme self-confidence (arrogance) shines brightly through in his writing. He has little time for shallow thinking and those who think they understand risk or the financial markets.

Another example running through the book is the first 1,001 days of a turkey's life. For 1,000 days the farmer brings food to the turkey every morning. On the last day, things change dramatically: the farmer shows up with an axe instead of food. A surprising and horrible change in circumstance for the turkey. But all the historical evidence available to the turkey indicated that the farmer would show up with food, not an axe. Of course, on the flip side, the farmer saw the axe day coming.

As Yogi Berra philosophized: “It’s tough to make predictions, especially about the future.”

It is the unknown unknown that is most dangerous. We spend too much time focusing on what we know. We need to spend some energy recognizing what we know we do not know, and acknowledging that there are things we do not know we do not know.

As a compliance and risk professional, I was particularly intrigued by the story of the four largest losses by casinos. As you might expect, casinos run very thorough security programs, compliance programs, and risk management programs. The four largest losses fell completely outside the casinos' models. One was the white tiger's attack on Roy Horn, the second was a disgruntled contractor who attempted to dynamite the casino, the third was the kidnapped daughter of a casino owner, and the fourth was an incompetent employee who failed to file the 1099 reports with the Internal Revenue Service.

It is also important to draw the distinction between positive contingencies and negative contingencies. The black swan can be one that brings unexpected destruction or one that brings an unexpected windfall. His philosophy is to play it safe, but to hedge against disastrous losses and position for spectacular windfalls. Mitigate the unexpected consequences.

I expected to get a lot of insight from the book. I got more: it was one of the few books that changed my outlook on my profession.

Risk Mismanagement

I really enjoyed the story by Joe Nocera in the New York Times: Risk Mismanagement. Nocera focuses on the failures of risk management during the most recent financial crisis.

He starts with the failure of the VaR (Value at Risk) model used by many companies. He then moves on to the theories Taleb captured in his book The Black Swan (next on my reading list).

“Taleb says that Wall Street risk models, no matter how mathematically sophisticated, are bogus. . . . And the essential reason for this is that the greatest risks are never the ones you can see and measure, but the ones you can’t see and therefore can never measure. . . . Because we don’t know what a black swan might look like or when it might appear and therefore don’t plan for it, it will always get us in the end.”
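
The VaR critique becomes concrete when you compute one. A minimal sketch of historical 99% VaR, using a simulated return series (all parameters are my illustrative assumptions, not anything from the article):

```python
import random

random.seed(7)

# Simulated daily returns: mostly quiet days with rare crash days.
# A toy stand-in for market data -- all parameters are illustrative assumptions.
returns = [random.gauss(0.0005, 0.01) if random.random() < 0.995
           else random.gauss(-0.08, 0.03)
           for _ in range(2_000)]

# Historical 99% VaR: the loss exceeded on only 1% of past days.
losses = sorted(-r for r in returns)
var_99 = losses[int(0.99 * len(losses))]

print(f"99% one-day VaR:  {var_99:.1%}")
print(f"worst actual day: {max(losses):.1%}")
# VaR tells you the loss you rarely exceed; it says nothing about
# how much worse the exceedances are -- exactly Taleb's complaint.
```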

The key for a compliance professional is to handle the current known risks to your company, while at the same time keeping an eye out for unknown risks.