Minor issues, if left unattended, can snowball into larger and more complicated problems. And the longer it takes to address an issue, the harder it will become to do so effectively.
This is not a revelatory idea. And yet, all too often leaders and organizations choose to ignore, diminish, or underestimate seemingly small problems when they see them.
What we know from studying crises is that in high-stakes environments, even the smallest oversight can evolve into a catastrophic failure. A prime example is NASA's Space Shuttle Columbia disaster in 2003.
On January 16th, 2003, the Space Shuttle Columbia lifted off from the Kennedy Space Center in Florida on mission STS-107, NASA's 113th space shuttle flight. During the 16-day mission, the seven astronauts on board performed a range of scientific experiments, from studying how microgravity affects the human body to testing new technologies. As the mission ended, seemingly a success from a scientific standpoint, the shuttle prepared to return to Earth.
On February 1st, the shuttle disintegrated upon re-entry into Earth's atmosphere, scattering debris over several U.S. states. All seven astronauts on board perished. The disaster marked one of the most significant tragedies in the history of space exploration.
So, what exactly caused the shuttle to disintegrate upon re-entry?
There were two core problems that led to the disaster. The first was physical damage sustained during launch: a piece of foam insulation broke off from the external fuel tank and struck the shuttle's left wing. The foam insulated the tank and prevented ice from forming on its surface before launch. The strike damaged the shuttle's thermal protection system, the heat shield designed to protect the spacecraft from the extreme heat of re-entry. The damage to the wing allowed hot gases to penetrate the shuttle during its descent, ultimately leading to its disintegration in the atmosphere.
The second problem was not a physical issue with the shuttle but a decision-making and cultural issue at NASA. During the mission, the astronauts on board Columbia and NASA's ground engineers raised concerns about the debris strike, as well as other anomalies with the shuttle. Despite these warnings, NASA's mission control team did not treat the concerns as serious. Because foam strikes had occurred on previous flights without major consequence, mission control concluded this one was also a minor issue; those past experiences became the lens through which the strike was viewed. As a result, mission control underestimated the danger the foam strike posed and dismissed the astronauts' and ground engineers' concerns.
NASA's internal culture also played a major role in mission control's analysis and decision-making. At the time, NASA's leadership prioritized efficiency and meeting deadlines over investigating every potential issue, especially when a concern seemed minor. This created a culture of downplaying risks in favor of staying on schedule. Taking time to investigate damage from a seemingly minor foam strike ran contrary to leadership's mandate and NASA's operating culture at the time. There were also major communication breakdowns within the agency, and as a result, concerns about the foam strike were never escalated effectively.
Because of these factors, no serious steps were taken to investigate the damage to Columbia while it was still in orbit, and the shuttle and the seven astronauts on board never made it home.
The Columbia disaster could have been prevented had NASA's mission control taken the risk posed by the foam strike seriously and addressed the problem early. This reflects a core principle of effective crisis response: take risks seriously and do everything possible to mitigate them.
The lessons of the Columbia disaster extend far beyond space exploration. In the business world, when early warning signs are ignored, small issues can become big problems that ultimately cause significant harm to the organization. Whether it's a minor operational glitch, a small customer complaint, or an internal team raising concerns about a project, these incidents may signal a much larger problem that can be prevented or addressed early.
By taking risks seriously and addressing issues promptly, organizations can ensure small issues do not snowball into major problems and prevent potential crises from occurring. In our fast-paced and interconnected world, organizations that recognize and respond quickly to early warnings of bigger issues are far better equipped to maintain or enhance their competitive position.