Near Misses, Catastrophes, and Compliance

The theme of the April edition of the Harvard Business Review is “Failure.” As scary as that term is in the world of compliance, “catastrophe” is even scarier. A catastrophe means the failure resulted in real, significant damage.

But you can learn from failures. You can especially learn from others’ failures.

In How to Avoid Catastrophe, Catherine H. Tinsley, Robin L. Dillon, and Peter M. Madsen look at “unremarked small failures that permeate day-to-day business but cause no immediate harm.” Their research has revealed a pattern: “Multiple near misses preceded (and foreshadowed) every disaster and business crisis we studied, and most of the misses were ignored or misread.” (Sorry, you need a subscription to read the entire HBR article.)

They come up with seven strategies that can help an organization recognize near misses and root out the error behind them. These seem very applicable to compliance programs.

  1. Heed high pressure. The greater the pressure to meet performance goals, the greater the likelihood that managers will discount near-miss signals.
  2. Learn from deviations. Don’t just recalibrate operations; focus on the significance of the change.
  3. Uncover root causes. Don’t just correct the symptoms.
  4. Demand accountability. Require managers to justify their assessments of near misses.
  5. Consider worst-case scenarios. People tend not to think through the possible negative consequences of a near miss. Walk through a situation where the near miss does not miss.
  6. Evaluate projects at every stage. Take a look at your successful projects, not just your failures.
  7. Reward owning up. Don’t punish errors, but reward those who uncover near misses.

Failure and Compliance

The theme of the April edition of the Harvard Business Review is “Failure.” That’s a scary term in the world of compliance. Generally, that means you’ve got government regulators or enforcement personnel sitting in your offices. And they are not happy. Failure and compliance can mean disciplinary action, fines, or jail time.

But you can learn from failures. You can especially learn from others’ failures.

Ethical Breakdowns by Max H. Bazerman and Ann E. Tenbrunsel takes an insightful look at ethical failures and identifies five barriers to an ethical organization. (Sorry, you need a subscription to read the entire HBR article.)

  • Ill-conceived goals
  • Motivated blindness
  • Indirect blindness
  • Slippery slope
  • Overvaluing outcomes

An ill-conceived goal is the classic failure seen in sales targets, revenue projections, and stock price targets. If you give mechanics a sales goal of $147 per hour, they can very easily lapse into fixing things that are not broken rather than becoming more efficient. Sears encountered this problem in the 1990s.

The authors lump a few things into the motivated blindness category, but most notably included are conflicts of interest. They use the failure of rating agencies during the financial collapse as one example. Since the rating agencies are paid by the issuer instead of the buyer of securities, they have a misalignment of motivation. They end up serving the one who pays them, leading to lax ratings and competition for business. That means they may have rated something higher than they should have. (That’s a big understatement.)

Indirect blindness is when third parties are involved. What caught my eye was an experiment examining perceptions of an increase in the price of a pharmaceutical drug. In the first scenario, the drug company raises the price from $3 to $9. In the second scenario, the drug company sells the rights to a smaller company, which then increases the price to $15. The first scenario was judged more harshly, even though it resulted in a lower price.

We’ve all been concerned about the slippery slope. Little lapses lead to a culture of lapses, eventually leading to a big failure. The authors present some interesting research showing how this works and that it is a real problem. From a compliance perspective, they focus on auditors and how good accountants can do bad audits.

The final category is the one I found the most intriguing: overvaluing outcomes. The authors’ research showed an inclination to judge actions based on their outcomes rather than on the behavior itself. One example is a research failure.

In scenario A, a researcher adds four subjects back into the results after they were removed on technicalities, believing their data is appropriate. When they are added back in, the results shift, allowing the drug to go to market. Unfortunately, the drug ends up killing six people and is pulled from the shelves.

In scenario B, a researcher makes up four data points based on how he believes the subjects would likely have behaved. The drug goes to market and proves both profitable and effective.

The participants in the authors’ experiment judged the researcher in scenario A much more critically than the researcher in scenario B. The problem is that the researcher in scenario B had the bigger ethical lapse and the worse behavior. It’s just that the outcome, largely by luck, was worse in A than in B.

They extrapolate the findings to the situation where a manager overlooks unethical behavior when outcomes are good, unconsciously helping to undermine the ethical culture of the organization.

The Case for Professional Boards

If you want to improve governance at a corporation, do you need professional directors? Did SOX merely add a layer of legal obligations on boards and do little to improve the quality of those serving as directors?

Robert C. Pozen makes the case in The Case for Professional Boards in the December issue of the Harvard Business Review.

Pozen starts by limiting the size of the board to seven people: the CEO plus six independent directors. He points to research showing that groups of this size are optimal for decision-making. Bigger groups can result in “social loafing,” with members relying on others to take the lead and ceding decision-making. Six also gives you enough people to populate the three key committees: nominating, compensation, and audit.

The greatest need on a board is expertise. Pozen expects an accounting expert to head the audit committee. He also allows for one generalist to provide a broad perspective on the company’s strategy. But the rest should be experts in the company’s main line of business. That is not easy. Independent experts are most likely working for the company’s competitors. He expects that most professional directors would be retired executives from the company’s industry. That would also require eliminating mandatory retirement ages for directors.

Pozen makes a strong case. “To improve corporate oversight we need not more legal procedures but a culture of governance in which directors commit to the role as their primary occupation.” It’s just a very radical strategy for companies that have grown and gathered their directors organically.

Power Corrupts – So Does Powerlessness

Rosabeth Moss Kanter points out another reason that the “tone at the top” is only one factor for corporate compliance in Powerlessness Corrupts.

“Power corrupts, as Lord Acton famously said, but so does powerlessness. Though powerlessness might not result in the egregious violations associated with arrogant officials who feel they are above the law, it is corrosive.”

  • Managers spread powerlessness by limiting information.
  • They compound the insult by sneaking unpopular decisions through when they think no one’s looking.
  • Powerlessness burgeons in blame cultures.
  • The powerless retaliate through subtle sabotage. They slow things down by failing to take action.
  • Negativity and low aspirations show up in behaviors psychologists call defensive pessimism, learned helplessness, and passive aggression.

Those are a lot of points for targeting the tone at the middle and the tone at the bottom.

Dilbert, being the epitome of powerlessness, captures some of this in today’s strip.

Keeping Your Colleagues Honest

Mary C. Gentile put together a great piece on how to challenge unethical behavior at work in the March issue of the Harvard Business Review: Keeping Your Colleagues Honest.

She starts with four rationalizations for staying silent when encountering an ethical problem:

  • It’s standard practice.
  • It’s not a big deal.
  • It’s not my responsibility.
  • I want to be loyal.

The meat of the article is about helping a manager to speak up when confronted with an ethical problem.

  • Treat the conflict as a business matter.
  • Recognize that this is part of your job.
  • Be yourself.
  • Challenge the rationalizations.
  • Turn newbie status into an asset.
  • Expose faulty either/or thinking.
  • Make long-term risks more concrete.
  • Present an alternative.

I particularly liked her use of the rationalization argument.

“If people make the point that an issue is not your responsibility, you are in a strong position to press ahead—in using this rationalization, they have already conceded that the behavior is wrong, or at least questionable. They are not arguing with your assessment; they’re looking for a way to avoid the conversation.”

She also pulls out the New York Times technique on rationalization: “If it is expected, are we comfortable being public about it?” I usually amplify this to ask, “Would you be comfortable with this being told in a story on the front page of the New York Times?”

The full article is behind the paywall at HBR.org.

Mary C. Gentile is a senior research scholar at Babson College in Wellesley, Massachusetts. Her book Giving Voice to Values is forthcoming from Yale University Press in September 2010.

Six Mistakes Executives Make in Risk Management

Nassim N. Taleb, Daniel G. Goldstein, and Mark W. Spitznagel discuss the shortcomings in common approaches to risk management in the October 2009 issue of the Harvard Business Review (subscription required).

They offer up six mistakes in the way we think about risk:

1.  We think we can manage risk by predicting extreme events.
2.  We are convinced that studying the past will help us manage risk.
3.  We don’t listen to advice about what we shouldn’t do.
4.  We assume that risk can be measured by standard deviation.
5.  We don’t appreciate that what’s mathematically equivalent isn’t psychologically so.
6.  We are taught that efficiency and maximizing shareholder value don’t tolerate redundancy.

Black Swan events – low-probability, high-impact events that are almost impossible to forecast – are increasingly dominating the economic environment. The world is a complex system, made up of a tangled web of relationships and other interdependent factors. Complexity makes forecasting even ordinary events difficult. So complexity increases the incidence of Black Swan events, since we have a harder time seeing the relationships and connections. All we can predict is that Black Swan events will occur and that we won’t expect them.
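Mistake number four – assuming that risk can be measured by standard deviation – lends itself to a quick illustration. Here is a small simulation (my own sketch, not from the article; all the numbers are hypothetical): two return series can carry nearly the same standard deviation while one of them hides far larger tail losses, exactly the kind of exposure a Black Swan exploits.

```python
import random
import statistics

random.seed(42)

# Series A: steady "normal" returns with a standard deviation of 1.
normal_returns = [random.gauss(0, 1.0) for _ in range(10_000)]

# Series B: mostly calm, but roughly 1% of the time a large shock hits
# (a fat tail). Its overall standard deviation still looks ordinary.
fat_tail_returns = [
    random.gauss(0, 0.5) if random.random() > 0.01 else random.gauss(0, 10.0)
    for _ in range(10_000)
]

sd_a = statistics.stdev(normal_returns)
sd_b = statistics.stdev(fat_tail_returns)
worst_a = min(normal_returns)
worst_b = min(fat_tail_returns)

print(f"std dev A: {sd_a:.2f}, worst loss A: {worst_a:.2f}")
print(f"std dev B: {sd_b:.2f}, worst loss B: {worst_b:.2f}")
```

Both series look similar under a standard-deviation test, yet the second one occasionally produces losses many times larger than the first ever will.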

The authors propose a different approach to risk management:

“Instead of trying to anticipate low-probability, high-impact events, we should reduce our vulnerability to them. Risk management, we believe, should be about lessening the impact of what we don’t understand—not a futile attempt to develop sophisticated techniques and stories that perpetuate our illusions of being able to understand and predict the social and economic environment.”

The authors end up equating risk to ancient mythology:

“Remember that the biggest risk lies within us: We overestimate our abilities and underestimate what can go wrong. The ancients considered hubris the greatest defect, and the gods punished it mercilessly. Look at the number of heroes who faced fatal retribution for their hubris: Achilles and Agamemnon died as a price of their arrogance; Xerxes failed because of his conceit when he attacked Greece; and many generals throughout history have died for not recognizing their limits. Any corporation that doesn’t recognize its Achilles’ heel is fated to die because of it.”

That is a bit lofty for my tastes. After all, the danger of the Black Swan is that you don’t know that you don’t know about the risk. If you know about a risk, you can deal with it. If you know that you don’t know about a risk, you can manage that as well. It’s hard to be a victim of hubris when you don’t even know the danger exists.

Nassim N. Taleb is the Distinguished Professor of Risk Engineering at New York University’s Polytechnic Institute and a principal of Universa Investments, a firm in Santa Monica, California. He is the author of several books, including The Black Swan: The Impact of the Highly Improbable. Daniel G. Goldstein is an assistant professor of marketing at London Business School and a principal research scientist at Yahoo. Mark W. Spitznagel is a principal of Universa Investments.

Ethics and the Sales Relationship in World-Class Bull

The May issue of the Harvard Business Review offers up an ethics problem in its monthly case study: World-Class Bull (subscription required for full article). The three commentaries offer very different reactions to the fact pattern presented in the case study. John Humphreys, Zafar U. Ahmed, and Mildred Pryor penned the fact pattern.

The case study revolves around the acquisition of a new customer. The existing sales agent was having no luck. A hotshot salesman took on the challenge, using the customer’s love of livestock to build the relationship and close the sale.

On one hand, you have to applaud the salesman for learning more about the customer and finding a way to engage him in a relationship. The ethical issue arises from the apparent subterfuge the salesman used in developing that relationship. The issue is raised to a higher level when the sales manager sends an email to the entire sales team applauding the salesman and describing all of the subterfuge in detail.

James Borg, author of Persuasion: The Art Of Influencing People, lauds the salesman for taking the steps to engage the customer on a personal basis. However, he thinks the sales manager should “be hauled in front of the company’s Idiocy Review Board for sending an ill-advised, potentially damaging e-mail.”

Don Peppers and Martha Rogers, the coauthors of Rules to Break and Laws to Follow: How Your Business Can Beat the Crisis of Short-Termism, flat out declare the salesman’s tactics unethical. They think the company should immediately fire the sales manager, discipline the salesman, send a message to all employees firmly asserting that deceiving customers or prospects is not the company’s way of doing business, and rewrite the ethics code.

Kirk O. Hanson, the University Professor of Organizations and Society and the executive director of the Markkula Center for Applied Ethics at Santa Clara University in California, thinks the company should publicly reprimand the salesman and doubts that the sales manager is salvageable.

I see a problem with the salesman’s tactics, but I would not be so harsh as to pass judgment without an investigation and without reviewing the company’s code of ethics. There is a flat statement in the case study by the salesman that he didn’t violate a single item in the ethics code. To me it seems hard to punish the salesman if he didn’t violate the company’s policies or ethics code. Since the conduct seems questionable, perhaps there is a flaw in the ethics code. I agree with Peppers and Rogers that the company may need to rewrite the ethics code. Of course, it could also be that the salesman did not know the content of the code of ethics.

Like the other commentators, I have a bigger problem with the sales manager for not recognizing the ethical problem and for sending out the laudatory email without a review or investigation. That is the bigger failure. For a company to maintain high ethical standards, front-line managers like the sales manager are key. They must understand how the actions of the people they manage affect the long-term success of the company. The sales manager failed this test.

To Lead, Create a Shared Vision

The January 2009 issue of the Harvard Business Review includes a short Forethought piece on the importance of leaders creating vision: To Lead, Create a Shared Vision.

James M. Kouzes and Barry Z. Posner emphasize the importance of leaders creating a vision for their organization and developing a forward-looking capacity. But rather than leaders thinking that they themselves need to be the visionary, the authors think it is more important to get input from the people in the organization to develop the vision.

Too many leaders act as “emissaries from the future, delivering the news of how their markets and organizations will be transformed.” Instead, “constituents want visions of the future that reflect their own aspirations. They want to hear how their dreams will come true and their hopes will be fulfilled.” The best way to lead people into the future is to connect with them in the present.

What does this mean for compliance?

When putting together and maintaining your compliance program, you need to seek input from as many people as possible. It is too late to get buy-in after the policy is already drafted. Send early drafts to a wide population of the organization for review and comment. They may surprise you by pointing out weaknesses and ambiguity in the policy draft.

By sending out drafts, you also emphasize the importance of the policy and its existence. Many studies have shown that people need to be exposed to a policy several times before they can even remember that it exists. Circulating drafts can accomplish some of that awareness-building.

Ways Companies Mismanage Risk

René M. Stulz put together Six Ways Companies Mismanage Risk for the March issue of the Harvard Business Review. Professor Stulz summarizes his thesis: “conventional approaches to risk management present many pitfalls. Even in the best of times, if you are to manage risk effectively, you must make extremely good judgment calls involving data and metrics, have a clear sense of how all the moving parts work together, and communicate that well.” Risk management is a new discipline, moving from the domain of the quant geeks to the boardroom. It is hard to pull it all together.

Based on the recent downfalls of financial companies, it is clear that they lost a sense of how the pieces of their risk management worked together. (See my earlier post: The Risk Management Formula That Killed Wall Street.) You need to understand the data, understand the weaknesses of the formulas that manipulate the data, and understand what is missing from the end result. Most of the danger comes from what you don’t know that you don’t know. To avoid that, you need to continually learn so there is less you don’t know, and continually stay cognizant that there is still much you don’t know.
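As an illustration of the first two pitfalls – lack of appropriate data and narrow measures of risk – here is a toy historical Value-at-Risk calculation (my own sketch, not from Stulz’s article; the volatility figures and the helper function are hypothetical). A 99% VaR estimated from a calm history is breached far more often than the model promises once market conditions shift:

```python
import random

random.seed(7)

def historical_var(returns, confidence=0.99):
    """Historical VaR: the loss exceeded on (1 - confidence) of past days."""
    losses = sorted(-r for r in returns)  # express losses as positive numbers
    index = int(confidence * len(losses))
    return losses[min(index, len(losses) - 1)]

# Two years of calm daily returns (about 0.5% daily volatility).
calm_history = [random.gauss(0, 0.005) for _ in range(500)]
var_99 = historical_var(calm_history)

# A regime shift: daily volatility quadruples to 2%.
crisis_days = [random.gauss(0, 0.02) for _ in range(100)]
breaches = sum(1 for r in crisis_days if -r > var_99)

print(f"99% VaR estimated from calm history: {var_99:.2%}")
print(f"VaR breaches in 100 crisis days: {breaches} (the model predicts ~1)")
```

The model is not wrong about the past; it is wrong about the present, because the historical data no longer describes the environment and a single daily number cannot capture the shifting exposure.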

Here are the six ways from Professor Stulz:

  • Lack of appropriate data. The rapid financial innovation of recent decades has made historical data less useful.
  • Narrow measures of risk. Traditional daily measures of risk can’t capture a company’s full exposure when market fundamentals are shifting.
  • Overlooked risks. Hedge funds that bought high-yielding Russian debt in the 1990s failed to properly account for counterparty risk.
  • Hidden risks. Unreported risks have a tendency to expand in financial institutions.
  • Poor communication. Complex and expensive risk-management systems can induce a false sense of security when their output is poorly communicated to top management.
  • Rate of change. The risk characteristics of securities may change too quickly to enable managers to properly assess and hedge risks.

“If you live in Florida or Louisiana, you shouldn’t spend a lot of time thinking about how likely it is that you’ll be hit by a hurricane. Rather, you should think about what would happen to your organization if it was hit by one and how you would deal with the situation. Instead of focusing on the fact that the probabilities of catastrophic risks are extremely small, risk managers should build scenarios for such risks, and the organization should design strategies for surviving them.”

René M. Stulz is the Everett D. Reese Chair of Banking and Monetary Economics at The Ohio State University’s Fisher College of Business in Columbus.

The Unexpected Benefits of Sarbanes Oxley

The April 2006 issue of the Harvard Business Review has an article by Stephen Wagner and Lee Dittmar on The Unexpected Benefits of Sarbanes Oxley.

Although the article is somewhat dated when it talks about the second year under Sarbanes Oxley, it foretells some of the current thinking in compliance: compliance is good for business. Two and a half years later, the Madoff scandal illustrates the need to be more transparent with your investors and for investors to look closer at their investments. Documenting business processes and putting controls in place will make your business run better.

Good governance is a mixture of the enforceable and the intangible. Organizations with strong governance provide discipline and structure; instill ethical values in employees and train them in the proper procedures; and exhibit behavior at the board and executive levels that the rest of the organization will want to emulate.