Estimated Reading Time: 5 minutes
By Glenn Mott
“You don’t understand anything until you learn it more than one way.” – Marvin Minsky
The other night, I happened to catch the documentary program “Retro Report” on PBS. Retro Report is a nonprofit news organization that produces mini documentaries examining today’s news stories through the lens of historical context. Executive Producer Kyra Darnton describes “Retro Report’s” mission as providing “context and perspective by going back and re-reporting and reanalyzing older stories, or stories that we think of as not relevant anymore.” In many ways, this is akin to what risk managers are capable of, and tasked with doing, after a catastrophic risk event.
Not every “Retro Report” is about risk management, of course, but many of them involve some aspect of risk, in retrospect and in context. The segment that caught my attention is titled “Risks after Challenger.” According to Diane Vaughan, author of the book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, in the period leading up to the Challenger launch, NASA was still riding high on clout earned during the Apollo missions, which “gave it an aura of invincibility. We were taken as the international leader in the space race. And no one was really expecting anything to go wrong.”
Many of us recall that tragic 1986 episode vividly, along with the Presidential Commission charged with investigating the disaster and the public hearings that followed. The PBS documentary details how NASA managers not only knew about the risk of O-ring failure at temperatures similar to those on that fateful January launch day, but how NASA had willfully disregarded the advice of its contractor Morton Thiokol, whose managers, in turn, capitulated to NASA over the objections of their own engineers.
As Vaughan relates the story, “At Thiokol the vice president was asking those engineers to stand up for what they said. Roger Boisjoly [a mechanical engineer] took the lead in the objections. He said, ‘I can’t prove it to you. All I know is that it’s away from goodness in our experience base.’ But the engineers at Thiokol didn’t have the data. So the vice president took the decision-making away from the engineers and asked the managers to decide.” Thiokol managers voted to reverse the recommendation and to launch the Challenger as planned. The subsequent Rogers Commission report contained the memorable phrase that the Challenger disaster was, “An accident rooted in history.”
Many of you may remember, as I do, the testimony of Nobel Laureate and theoretical physicist Richard Feynman during the televised hearings. To prove his point, he did something so simple that all the complexity of data and analysis that had blinded NASA’s managers to the problem was immediately clear to anyone watching from home. He submerged a sample of the O-ring material in a glass of ice water. The material lost its resilience at cold temperatures. As Feynman’s demonstration with a clamp and a glass of water made crystal clear, the cognitive, systemic, and practical disconnect between NASA’s engineers and executives, and between Thiokol’s managers and engineers, was far greater than anyone had supposed.
Years later, NASA and the American space program again suffered a catastrophe with the loss of Columbia on reentry, after debris shed during launch damaged the left wing. As Vaughan notes, “The similarity between Challenger and Columbia was the falling back on routine under uncertain circumstances.”
As I watched the documentary and the timeline of the shuttle disaster unfold, I recalled how Melanie Herman and Katharine Nesslage previewed this dilemma in a recent NRMC eNews column. In that column they write, “After many years of advising nonprofits across a colorful spectrum of missions, we’ve concluded that trying to find a completed set of risk management blueprints or best practices that can simply be followed like breadcrumbs is a costly and frustrating exercise that is fraught with risk.”
Again, Diane Vaughan: “This happens in many different kinds of organizations . . . their [NASA managers’] behavior was to a great deal determined by working in a very rule-oriented organization.” Vaughan continues, “We can never resolve the problem of complexity, but you have to be sensitive to your organization and how it works. While a lot of us work in complex organizations, we don’t really realize the way that the organizations we inhabit, completely inhabit us.”
As the above-mentioned NRMC column makes clear, risk managers must constantly monitor how the environment in which they operate is changing. They must be open to insights and ideas from different sectors, industries, and even different disciplines. To quote NRMC’s own best advice, “Keep in mind that while you may believe your quest is for best practices, a best practice in one organization could be a complete disaster in your environment. Be open to hearing about missteps as well as success stories.”
When reviewing the Challenger disaster, Diane Vaughan, a sociologist, coined the term “normalization of deviance,” and went on to apply the theory to organizations beyond the space agency. She says that it helps explain how, over time, organizations come to accept risky practices as normal. “It’s widespread, with Katrina, with British Petroleum [when] early warning signs [were] ignored [in the Deepwater Horizon disaster in the Gulf of Mexico], the 2008 financial failure. You have a lot of heavy technology, derivatives and formulas, and there is a fine line between what is devious and what is a good business decision.” In 2017, Admiral William Moran of the U.S. Navy cited Vaughan’s term to explain two deadly collisions in the western Pacific (the USS Fitzgerald and USS McCain) just over two months apart that summer.
Tearing down the risk function in your agency and rebuilding it differently may be the hardest thing you will need to do after a catastrophic risk event. Yet every risk manager needs to be prepared to answer the question: “What would I do differently?” The alternative is adhering to a blueprint, following the rules of the system while ignoring the environment, and dismissing any data to the contrary.
For more, see these resources:
The “Retro Report” documentary on PBS
“It’s Time to Banish Blueprints and Best Practices,” RISK eNews, Nonprofit Risk Management Center