In the publishing rhythm of this site, the topics for the “Unplugged” series of posts that run every Friday, Saturday, and Sunday are intentionally unconstrained. The weekends are reserved for whatever happens to be on my mind at the time, without any requirement that the topic connect directly to tennis. Most of the time, it does, and the link is immediately apparent. At other times, the relationship to the sport is more subtle. Today’s post definitely falls into that second category. This is a deep dive into a set of events that shaped my mental models and attitudes around decision-making, systems thinking, responsibility, and the social dynamics of institutions.

On January 28, 1986, the Space Shuttle Challenger broke apart just 73 seconds after liftoff, killing all seven crew members aboard. The launch was carried live on television, in part because the mission included Christa McAuliffe, a civilian teacher selected for NASA’s Teacher in Space program. What was intended to be a routine launch quickly turned into a visible catastrophe as the shuttle disintegrated against a clear Florida sky. The images were replayed repeatedly in the hours and days that followed, making the failure impossible to look away from and embedding the event in the public consciousness in a way few technical accidents ever have been.

For many people, the Space Shuttle Challenger disaster is one of those moments that comes with a fixed memory of place. People remember where they were, what they were doing, and who they were with when the news broke. It sits in the same category as other shared national shocks, events that freeze a moment in time and permanently attach it to a broader realization that something had gone terribly wrong. Even decades later, the memory is often recalled with surprising clarity, not because of the technical details, but because of how abruptly assumptions about the safety of the Space Shuttle program were shattered.

I was a freshman in college when the Challenger disaster occurred. That morning, the launch was not carried on any of the three over-the-air television stations available in my dorm room. However, within minutes, every station cut to live coverage of the disaster. I made the personal, and in hindsight very questionable, decision to skip my Calculus II class and stay with the coverage throughout the morning. Like so many others, I was transfixed. The unfolding analysis, the repeated replays, and the growing realization that this was not a simple accident made it hard to pull myself away.

The events that followed the Challenger disaster fundamentally changed my engineering education. As the independent investigations and congressional hearings unfolded, their findings quickly impacted undergraduate and graduate engineering curricula. By the time I reached my upper-division engineering courses, the Challenger was a contemporary case study that reshaped course content, discussions, and expectations. It reframed engineering not simply as a technical discipline, but as one embedded in organizational structures, decision processes, and human judgment. That shift in emphasis altered the lens through which the rest of my formal engineering education was delivered.

Across nearly every upper-division course, there was a clear and consistent emphasis on systems-oriented thinking and personal responsibility for decision-making. Risk was no longer treated as optional, but rather as a foundational element that engineers were expected to see, surface, and actively consider throughout their work. That emphasis was most explicit in my engineering ethics course, a class students could not even take until they had accumulated enough credits to be classified as a junior. The course was built around a freshly revised textbook that returned to the Challenger disaster again and again as its central case study. The connection between technical decisions, organizational behavior, and real-world consequences was emblazoned on my soul.

For those of us who graduated from engineering programs around that same period, one of the most enduring lessons centered on what later emerged about the engineers at Morton Thiokol. We were taught how serious concerns about the O-rings’ reliability in cold weather had been repeatedly raised, including the night before the launch itself. The risk of catastrophic failure was debated, softened, and ultimately suppressed as it moved up the decision chain. Those specific concerns never reached NASA’s executive leadership in a form that accurately reflected the level of risk involved. The takeaway for our generation of engineering students was explicit. Identifying risk was not enough. Articulating it clearly, preserving its severity as it moved through an organization, and being willing to push back against authority when risks became unacceptable were framed as core professional responsibilities, not optional acts of courage.

For the first thirteen years of my career, I worked at a major telecommunications firm where the consequences of most engineering decisions were not particularly catastrophic. Dropping a wireless mobile call, especially in an era when those devices were largely limited to well-paid doctors and corporate executives, was inconvenient but rarely consequential. Even so, the Challenger-era lessons occasionally resurfaced as my company pushed aggressively toward the goal of “five nines” availability. That metric, 99.999% uptime, translates to roughly five minutes of allowable downtime per year. Even in a relatively low-stakes domain, it required disciplined systems thinking, careful consideration of tradeoffs, and a growing appreciation for how seemingly small design and operational decisions could compound in unexpected ways.
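For anyone who wants to check that figure, the arithmetic behind the “nines” is simple enough to sketch in a few lines of Python. This is purely illustrative back-of-the-envelope math, not anything from my telecom days:

    # Downtime budget implied by an "N nines" availability target.
    MINUTES_PER_YEAR = 365.25 * 24 * 60  # about 525,960 minutes in an average year

    for nines in range(3, 6):
        availability = 1 - 10 ** -nines  # e.g. 0.99999 for five nines
        downtime_minutes = MINUTES_PER_YEAR * 10 ** -nines
        print(f"{nines} nines ({availability:.5f}): "
              f"{downtime_minutes:,.1f} minutes of downtime per year")

Five nines works out to about 5.3 minutes of allowable downtime per year, which is why the target demanded such discipline even in a comparatively forgiving domain.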

That changed decisively when I moved into the aerospace domain, where engineering decisions directly affect safety and the country’s national defense interests. In that environment, the margin for error narrows quickly, and the consequences of failure are no longer abstract or recoverable. My education, grounded in the lessons from the Challenger, left me well prepared to quickly climb the learning curve of safety-critical systems. What also became increasingly apparent was that other engineers who had completed their degrees around the same period tended to share a heightened sensitivity to risk, an instinct to ask harder questions, and a willingness to challenge assumptions when the stakes demanded it.

Tennis, of course, is not a safety-critical domain. No lives are at stake if a league rule is poorly written, a process is inconsistently applied, or a decision is handled clumsily. The risks are modest, often limited to frustration, lost trust, or diminished enjoyment of the sport. And yet, the institutional dynamics are remarkably familiar. Tennis is governed by systems. Decisions are made within organizations. Information moves up and down chains of authority, and incentives shape behavior in ways that are not always visible or intentional. For that reason, the same lessons about clear risk identification, disciplined decision-making, and well-designed processes remain relevant, even when the stakes are far lower.

My sport could benefit greatly from adopting a more systems-oriented approach to its governance. That means designing organizational structures, communication flows, and decision processes that account for inevitable human error, provide mechanisms to surface concerns effectively, and reduce the likelihood that seemingly small issues escalate into larger failures. Clear rules are important, but so are clear pathways for feedback, challenge, and correction when those rules interact with real people and real situations. The goal is not perfection or zero controversy. Rather, it is to build resilient processes and guardrails that make bad outcomes less likely, trust easier to sustain, and the sport healthier over the long run.

That perspective is why topics like this keep surfacing in my Unplugged posts, even when they seem far removed from tennis. Once you have been trained to see how small decisions, muted concerns, and fragile processes can align in unhealthy ways, it becomes hard not to notice the same patterns elsewhere. The stakes may be lower, but the lessons still matter.


One thought on “What the Challenger Disaster Taught Me About Systems, Risk, and Tennis”

  1. Teresa Merklin says:

    The Challenger disaster has been back in the public conversation recently as the anniversary of the accident passed another milestone. Those anniversaries tend to do that. They pull events out of history and back into the present, not as abstractions, but as lived moments that still carry unresolved lessons. That renewed attention is part of why this story has been front of mind for me again.

    I also recently began reading Challenger: A True Story of Heroism and Disaster on the Edge of Space by Adam Higginbotham, a deeply researched and carefully constructed account of the disaster and the years that led up to it. Even early in the book, it is clear that this is not simply a retelling of a familiar tragedy, but a fuller examination of the people, pressures, and decisions that shaped the outcome. It reinforces why Challenger has endured as more than a historical event. It remains a case study in how institutions behave under stress, how warnings are managed or ignored, and how easily well-intentioned systems can drift toward failure. That combination of timing and perspective is what brought this topic back to the surface for me now.
