Why You’ve Never Been In A Plane Crash

Kyra Dempsey

The United States leads the world in airline safety. That’s because of the way we assign blame when accidents do happen.

What is the worst imaginable consequence of making a mistake? For some, it might be this:

“I really thought I was going to die,” said USAir passenger Laurel Bravo, speaking to the Associated Press. “The row ahead of us just disappeared. The seats all went flying downward…”

Illustrations: Mike McQuade

For the 89 passengers and crew aboard USAir flight 1493, it had been an unremarkable flight from takeoff in Columbus, Ohio, to touchdown at Los Angeles International Airport. That evening, the first of February 1991, the weather was perfectly clear, the Boeing 737 had no outstanding mechanical faults, and with the runway in sight from 25 miles out, the approach was uneventful. Flying the plane from the right seat, First Officer David Kelly pulled the nose up to ease the touchdown and greased it onto runway 24 Left. He then brought the nose back down, smooth and steady. And then, without time to even shout, all hell broke loose.

Kelly would later recall seeing the brief flash of a propeller before he instinctively pushed the brake pedal to the floor, but it was much too late. With a powerful jolt and a metallic screech, the 737 plowed into a small commuter plane, crushing the hapless turboprop beneath and propelling it forward down the runway at over 90 miles an hour. In the cabin of the 737, the lights went out and fire billowed past the windows. Locked together, the two planes veered left, crossed a grass strip and a taxiway, and collided at freeway speed with an abandoned airport fire station.

The impact instantly killed all 12 people aboard the commuter plane, as well as the USAir captain, Colin Shaw, but the ordeal for the survivors was far from over. The small plane’s fuel tanks burst during the crash. A pool of fuel ignited underneath the USAir 737, sending plumes of smoke into the cabin. Passengers fought their way to the exits amid choking fumes, braving long drops and roaring flames in their frantic rush to escape. Not everyone made it: firefighters would later discover the bodies of 19 passengers and a flight attendant lying in the aisle, where they had collapsed. Two more badly burned passengers would later die in the hospital, bringing the final death toll to 35.

But that was yet to come. In fact, in the first few minutes after the accident, airport officials didn’t even know that two planes were involved. Air traffic controllers told fire crews that a Boeing 737 had crashed, but as firefighters worked the scene, they reported discovering a propeller in the wreckage. The Boeing 737 is a jet; it has no propellers. Clearly another airplane was involved, but which one — and why was it there?

***

At the LAX control tower, local controller Robin Lee Wascher was taken off duty — as is standard practice after a crash. After hearing about the propeller, she knew she must have cleared USAir flight 1493 to land on an occupied runway. As tower supervisors searched for any sign of a missing commuter flight, Wascher left the room. Replaying the events in her mind, she realized that the missing plane was SkyWest flight 5569, a 19-seat Fairchild Metroliner twin turboprop bound for Palmdale. Several minutes before clearing the USAir jet to land, she had told flight 5569 to “taxi into position and hold” on runway 24L. But she could not recall having cleared it for takeoff. The plane was probably still sitting “in position” on the runway waiting for her instructions when the USAir 737 plowed into it from behind. It was a devastating realization, but an important one, so in an act of great bravery, she returned to the tower, pointed to flight 5569, and told her supervisor, “This is what I believe USAir hit.”

***

According to the timeline later reconstructed by the National Transportation Safety Board, the pilot of SkyWest 5569 asked for permission to take off from a location partway down the runway called Intersection 45. Wascher cleared the flight to enter the runway at this location, but she didn’t give takeoff clearance because another Metroliner from a rival commuter airline already had permission to cross the runway ahead of it. 

Several events then occurred in quick succession. First, the second Metroliner tuned to the wrong frequency, and Wascher had to track it down and reissue its clearance to cross the runway. Then a Southwest flight announced that it was ready to enter runway 24L, so she told it to hold short. And finally, another SkyWest flight took off on runway 24R and had to be handed off to the next controller. Amid these distractions, she simply forgot that SkyWest 5569 was still sitting on runway 24L, awaiting takeoff clearance. Moments later, she cleared USAir flight 1493 to land, unaware that she was making a catastrophic error.

That Wascher made a mistake was self-evident, as was the fact that her mistake led, more or less directly, to the deaths of 35 people. The media and the public began to ask what should become of her. Should she be punished? Should she lose her job? Had she committed a crime?

How the authorities choose to handle such a mistake says a lot about a society’s conceptions of justice, culpability, agency, empathy, and even vengeance. The moral dilemma of what to do about Robin Wascher is a struggle between diverging values, indeed diverging value systems, rooted in the relative priority given to individual and systemic responsibility.

To cut to the chase: Wascher was not punished in any way. After she was escorted, inconsolable, from the tower, her colleagues took her to a hotel and stood guard outside her room to keep the media at bay. Months later, Wascher testified at the NTSB’s hearings, providing a faithful and earnest account of the events as she recalled them. She was even given the opportunity to return to the control tower, but she declined. No one was ever charged with a crime.

As the aviation industry has learned through hard-won experience, that’s usually how it should be.

A brief history of blame

In the aftermath of a disaster, our immediate reaction is often to search for some person to blame. Authorities frequently vow to “find those responsible” and “hold them to account,” as though disasters happen only when some grinning mischief-maker slams a big red button labeled “press for catastrophe.” That’s not to say that negligence ought to go unpunished. Sometimes there really is a malefactor to blame, but just as often there isn’t, and the result is that ordinary people who simply made a mistake are caught up in the dragnet of vengeance, like the infamous case of six Italian seismologists who were charged with manslaughter for failing to predict a deadly 2009 earthquake. But when that happens, what is actually accomplished? Has anything been made better? Or have we simply kicked the can down the road?

It’s often much more productive to ask why than to ask who. In some industries, this is called a “blameless postmortem,” and in aviation, it’s a long-standing, internationally formalized tradition. In the mid-20th century, when technical investigations of aircraft accidents were first being standardized, an understanding emerged that many crashes were not the result of any particular person’s actions. Most famously, after two airliners collided over the Grand Canyon in 1956, the Civil Aeronautics Board’s Bureau of Aviation Safety, the predecessor to today’s NTSB, concluded that no one was at fault because the two crews likely could not have seen each other coming until it was too late. The cause of the accident, they determined, was the lack of any positive means of preventing midair collisions.

The exact origins of this norm are debatable, but we might speculate that it arose from several factors: the lack of survivors or witnesses in many early aircraft accidents, which left scant evidence with which to assign fault; the high status pilots held in society, which made many reluctant to blame them in the absence of such evidence; and the presumption that flying was dangerous, so that disaster was an expected hazard rather than an aberration. These realities likely predisposed aeronautical experts to think in terms other than blame.

The end result was that aviation became one of the first industries to embrace the blameless postmortem as a legally codified principle underpinning all investigations. In 1951, compelled by the reality that their industry was not widely regarded as safe, aviation experts from around the world gathered to compose Annex 13 to the Convention on International Civil Aviation (the “Chicago Convention”). This seminal document aimed to standardize the conduct of air accident investigations among all member states of the International Civil Aviation Organization.

Annex 13 holds that the primary purpose of an aircraft accident investigation is to prevent future accidents — a decision that implicitly privileged prevention above the search for liability. Conducting a police-style investigation that faults a deceased pilot does nothing to affect the probability of future accidents. To follow the spirit of Annex 13, investigators must ask how others could be prevented from making the same mistakes in the future. This document, and in particular this provision, formed the basis for the modern practice of aircraft accident investigation. Most aircraft accident reports around the world today open with some variant of the principle, such as the NTSB’s disclaimer:

The NTSB does not assign fault or blame for an accident or incident; rather, as specified by NTSB regulation, “accident/incident investigations are fact-finding proceedings with no formal issues and no adverse parties…and are not conducted for the purpose of determining the rights or liabilities of any person.”

When liability is not a concern, an investigation has leeway to draw more meaningful conclusions. In the case of the disaster in Los Angeles, if you listen to the tower tapes, you can easily identify the moment Wascher cleared two planes to use the same runway. But if you remove her from the equation, you haven’t made anything safer. That’s because there was nothing special about Wascher — she was simply an average controller with an average record, who came into work that day thinking she would safely control planes for a few hours and then go home. That’s why, in interviews with national media, her colleagues hammered home a fundamental truth: that what happened to her could have happened to any of them. And if that was the case, then the true cause of the disaster lay somewhere higher, with the way air traffic control was handled at LAX on a systemic level.

Lessons in Los Angeles

If 35 people can die because a single controller made a single mistake, that’s not a system in which we can place our trust. Humans are fallible creatures who make poor decisions, misinterpret data, and forget things. In a system where lives may depend on the accuracy of a single person, disaster is not only probable but, given enough time, inevitable. Barring cases of anomalous recklessness or incompetence, it won’t matter who is sitting in the controller’s chair when the collision happens. And the only way to fix such a system is to end the reliance on individuals by putting in place safeguards against error.

That’s where the NTSB steps in to uncover the overarching circumstances that made disaster possible. Why was the system dependent on one controller’s accuracy? What factors increased the probability of a mistake? The agency ultimately wrote a lengthy report on these questions, but the findings can be boiled down to the following:

1. LAX was equipped with ground radar that helped identify the locations of airplanes on the airport surface. However, the system was custom-built and spare parts were hard to find, so it was frequently out of service. The ground radar display at Wascher’s station was not working on the day of the accident.

2. It was difficult for Wascher to see Intersection 45, where the SkyWest plane was located, because lights on a newly constructed terminal blocked her view.

3. After clearing the USAir plane to land, Wascher failed to recognize her mistake because she became distracted searching for information about another plane. This information was supposed to have been passed to her by another controller but was not. The facility’s information-handling hierarchy was such that the task of resolving missing data fell to Wascher rather than to intermediate controllers whose areas of responsibility were less safety-critical.

4. Although it’s inherently risky to instruct a plane to hold on the runway at night or in low visibility, it was legal to do so, and this was done all the time.

5. Although there was an alarm system to warn of impending midair collisions, it could not warn controllers about traffic conflicts on the ground.

6. Pilot procedure at SkyWest was to turn on most of the airplane’s lights only after receiving takeoff clearance. Since SkyWest flight 5569 was never cleared for takeoff, most of its lights were off, rendering it almost impossible for the USAir pilots to see.

With these facts in mind, the events of that night begin to make a lot more sense. In fact, it becomes plain that Wascher’s mistake was only one factor among many, a slight jolt that toppled a house of cards rather than an earthquake that brought down a solidly constructed edifice. And as a result of these findings, genuine safety improvements have been made, including more reliable ground radar at more airports, automated ground collision alerting technologies, and a national ban on clearing planes to hold on the runway in low visibility. None of these improvements would have been made if the inquiry had stopped at who instead of asking why.

The key takeaway from the success of this approach is that safety improvements are best achieved when an honest mistake is treated as such, regardless of the consequences. This principle underpins what is known in several safety-critical industries as the “just culture” concept. A just organizational culture recognizes that a high level of operational safety can be achieved only when the root causes of human error are examined; who made a mistake is far less important than why it was made.

A just culture encourages self-reporting of errors in order to gather as much data about those errors as possible. In contrast, an organization without a just culture will be left unaware of its own vulnerabilities because employees hide their mistakes for fear of retribution. Such an organization will discover those vulnerabilities only when they result in consequences that are impossible to hide.

Sometimes disasters happen anyway, and when they do, it’s equally critical that the just culture be upheld. Although it can be hard to accept that a mistake that led to loss of life might go unpunished, just culture doesn’t permit us to discriminate based on the magnitude of the consequences — only on the attitude of the person who committed the error. If they were acting in good faith when the mistake occurred, then a harsh reaction would undermine the trust between employees and management that makes a just culture possible. But even more importantly, it would undermine the blameless investigative process that makes modern aviation so safe. Investigative agencies like the NTSB rely on truthful statements from those involved in an accident to determine what happened and why, and the truth can’t be obtained when individuals fear punishment for speaking it. Indeed, if Wascher had been charged with a crime, her lawyers would have been obliged to mount a defense, and the investigative waters would have been forever muddied.

Sticking the landing

Examples of this problem can be found throughout aviation history. For instance, in 1983 an Air Canada Boeing 767 famously landed on a drag strip at a former air force base in Gimli, Manitoba, after running out of fuel in flight, an incident known as the “Gimli Glider.” The captain of the flight was found to have taken off without working fuel gauges, in direct contravention of airworthiness requirements, which stipulated that at least one gauge be working.

At that time, accident investigations in Canada were assigned to a commission of inquiry led by a judge who possessed the power to recommend criminal prosecution of anyone involved (a power that the NTSB does not wield). During the investigation, the flight’s captain told investigators that he had examined the master minimum equipment list (MMEL), which describes the systems that may be inoperative when dispatching an airplane. He allegedly observed that at least one fuel gauge was required, but then learned from a maintenance technician that Air Canada “Maintenance Central” had cleared the plane to fly in that condition, overriding the MMEL.

However, everyone else who was present, including the first officer and two maintenance technicians, denied that anyone had mentioned a clearance from Maintenance Central or that the MMEL was ever consulted. Indeed, not only had Maintenance Central not given the claimed clearance, it lacked the authority to do so even if it had wanted to. Considering this testimony, it seems likely that the captain simply failed to check whether dispatch with two inoperative fuel gauges was allowed. But because admitting to such a lapse could have exposed him to the threat of prosecution, he (or his lawyers) may have come up with the alternative story to head off that possibility. As a result, the exact reason he decided to take off with no working fuel gauges could never be conclusively determined.

Fortunately, in many countries (including present-day Canada), investigators no longer wield such powers. Not only does the NTSB have no law enforcement power, but its findings are legally inadmissible as evidence of liability. In the United States, Robin Wascher is a shining example of bravery in truth-telling, but plenty of other pilots and controllers who made mistakes have done the same, because assurances exist that what they tell the NTSB will not be used in court.

Sometimes, employers independently decide to fire pilots involved in accidents, but the practice is subject to heavy criticism from pilots’ unions and just culture experts. Just as often, pilots who make honest mistakes get to keep their jobs, such as the first officer aboard American Airlines flight 1420, whose failure to arm the ground spoilers before landing in Little Rock in 1999 contributed to a runway overrun accident that killed 11 people. As of 2019 he was still flying for American Airlines — and had been promoted to captain — because his mistake was attributed to deeper cultural issues in the airline industry. And besides, given what he went through, it’s hard to imagine that he would ever forget to arm the spoilers again.

***

The efficacy of just culture and the blameless postmortem is hardly in doubt. The United States has achieved the safest airline industry in the world through rigorous root cause analysis made possible only by a commitment to transparency, justice, and truth. While nothing humanity builds is invincible, the safeguards that we have erected against human error are so formidable that in the 33 years since the crash at LAX, there hasn’t been another fatal runway collision at any U.S. airport with a control tower, even as the media breathlessly reports every near miss. Globally, airline accidents of all causes have been almost eliminated, even as global air traffic increases year over year. In 1972, by most measures the nadir of global aviation safety, approximately one in 200,000 airline passengers worldwide did not reach their destination alive. Half a century later, in 2022, that figure was one in 17 million, roughly an 85-fold improvement. In the U.S., where airline safety has always led the global average, no scheduled passenger airline has had a fatal crash in 15 years.
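Those two rates imply the improvement factor directly; as a quick sanity check, here is a minimal back-of-envelope sketch (using the article’s round figures, not exact passenger counts):

```python
# Back-of-envelope check of the fatality rates cited above.
# Both rates are the article's round figures, not exact counts.
rate_1972 = 1 / 200_000       # ~1 fatal outcome per 200,000 passengers (1972)
rate_2022 = 1 / 17_000_000    # ~1 fatal outcome per 17 million passengers (2022)

print(f"Improvement factor: {rate_1972 / rate_2022:.0f}x")  # -> 85x
```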

The laws of the universe dictate that this unbroken record will one day end. But it’s also true that if we insisted that responsibility for human error lay with the individual, many more lives would be senselessly sacrificed — not only those of the passengers aboard our planes but also those of the pilots and air traffic controllers who are responsible for them. Recognizing that mistakes are inevitable has made us all safer by directing our collective energy toward the cause rather than the symptoms — because the cause of the Los Angeles disaster was not Robin Wascher forgetting about an airplane, but rather an unforgiving system that required her to act with inhuman consistency. Our own humanity compels us to withhold judgment, because doing so makes flying safer, because justice demands it, and because empathy is rewarded in kind.

Kyra Dempsey is a Seattle-based aviation writer who publishes accident and incident breakdowns for professional and lay audiences under the name Admiral Cloudberg.

Published February 2024

Have something to say? Email us at letters@asteriskmag.com.
