Interview with Capt. Chesley Sullenberger
As the 737 MAX returns to the skies after being grounded for nearly two years, many people want to know whether automation has gone too far. We talk to Capt. Chesley Sullenberger about automation, training, safety, and why we can't send him a copy of our latest book.
Catching Up With An American Hero On Our Safety… and His
by Roger Rapoport and Shem Malmquist, authors of Grounded: How to Solve the Aviation Crisis.
The MCAS software system has been redesigned, and new pilot training, which Boeing had opposed before the fast-selling aircraft was grounded, is now under way worldwide. Nonetheless, families of the 346 people who died in the two crashes have been asking whether the industry is addressing the wrong problem. Could it be that flight training is no longer built on the premise that a person is in control of the aircraft?
The argument that flying is still the safest way to travel raises an obvious question: why have so many preventable accidents happened in highly automated aviation environments?
Today even expert system designers do not fully understand what is actually happening in real time with all the automation on their complex airplanes, nuclear power plants, chemical manufacturing lines, self-driving cars and military defense systems.
To gain perspective I spoke with Captain Chesley Sullenberger, best known for landing a US Airways Airbus A320 on the frigid Hudson River in January 2009 after a bird strike shut down both engines. Immortalized by Tom Hanks in the Clint Eastwood movie Sully, Sullenberger has devoted much of his career to this concern, one now on the minds of many. He certainly appreciates all the good things automation makes possible, but he will be the first to tell you that more automation is not the answer to every safety challenge.
Our conversation began with a brief look at his training ahead of the now-legendary US Airways flight 1549.
“We were in a situation we hadn’t trained for. All we had to go on was a brief classroom discussion and a few paragraphs in a training manual. We had to come up with something that would work in less than three and a half minutes.”
In his pas de deux with co-pilot Jeff Skiles, Sullenberger opted to quickly turn on the auxiliary power unit: “By starting the auxiliary power unit I ensured an uninterrupted source of electrical power so that normal law would not be lost.”
In other words they had to use their own creativity to come up with a solution the plane’s designers had not foreseen.
“People believe,” says Sullenberger, “that it’s easier to fly a plane when the flight controls are automated. Actually it requires much more training and experience, not less, to fly highly-automated planes. Pilots need to be armed with the knowledge, skill and effective practice to be able to be the absolute master of highly-automated airplanes in every situation.”
Software safety experts like MIT’s Dr. Nancy Leveson are quick to point out that there is no way a human can understand exactly what an automated system is doing every second. Computers create situations where the person using a machine does not understand how a task is being performed.
This is especially true when flight automation is lost and the aircraft is being flown in a degraded manual mode that makes the aircraft harder to fly than an old 707 or DC-8. This happened on Air France 447 in 2009 and on some of the aircraft lost since. These so-called computer crashes are, in some cases, attributable to an automation surprise that can defeat experienced flight crews.
“Pilots should be offered and trained to use varying levels of technology based on the situation, workload and phase of flight, not just an all-or-nothing dichotomy. They should be able to easily and seamlessly transition from level to level, up or down,” says Sullenberger.
“Pilots must always be flying the airplane with their minds. The critical question they must ask themselves is how many layers of technology they want to interpose between their minds and the flight control surfaces. The answer must be the level that is most appropriate for the situation.”
Pilots have only a short time to recover when flight automation shuts down after the failure of an angle-of-attack sensor, which supplies data the aircraft's flight-control computers depend on.
Boeing was so confident in the new 737 MAX design that it didn't bother mentioning the autonomous MCAS system in the aircraft's flight manual, much less provide any training. The flight crew on the ill-fated October 2018 Lion Air MAX flight didn't even know MCAS existed. Five months later, MCAS functioned exactly as designed on an Ethiopian Airlines plane that went down minutes after takeoff.
Those parallel crashes had multiple causes. They were a result of the engineers assuming, first, that a MAX computer receiving faulty information would be a rare event, and second, that even if that did happen, pilots would be able to manage it using existing training. Both of these assumptions were wrong.
More training is critical because computers create unique situations that the system designer often has not envisioned and the pilot cannot fully understand. One reason this happens is that key decisions have often been made by designers and engineers who have never flown a big jet themselves.
Sullenberger believes “pilots should be involved with system design from the beginning.”
This actually happened with the redesign of the MAX MCAS software after the two crashes. Not only were pilots able to create a better design, they helped develop new training procedures, which are also being implemented for older Boeing aircraft such as the 737 NG.
“We found a better way to teach manual trimming of the aircraft while also improving on the 737 checklists,” says American Airlines 737 Captain Dennis Tajer who also represents the Allied Pilots Association. “Including pilots in the regulatory conversation is important. When you disconnect the human being from the airplane it can have serious consequences.”
Sullenberger agrees: “Automation has made aviation safer but it is not a panacea. It is a mixed blessing. At least for now, automation can only do what is foreseen and for which it has been programmed. While humans are often the least predictable part of any system, they are the most adaptable and resilient.
“Automation can be brittle. Humans are the ones who can face a situation they have never seen, take what they know and adapt it to solve the problem. Too much dependence on automation means the industry has let its guard down in many important areas beyond training.”
In Sullenberger’s view improved flight automation will never make aircraft failsafe. During 2020, when total air traffic was down 42 percent from the previous year, 299 lives were lost, compared with 257 in 2019. Although there has not been a fatal American passenger airline crash since 2009—a remarkable record—he believes we can “no longer define safety solely as the absence of accidents.”
“There can’t be a checklist for everything. Procedural compliance is a necessary but not sufficient condition for safety.”
Grandfathering the existing design of a 50-year-old aircraft such as the 737 can also lead to problems during a major upgrade.
“There needs to be a system-wide holistic nose-to-tail hazard analysis of every design in the last 10 to 20 years. If this approach had been adopted the MAX accidents might have been prevented.”
The pandemic’s impact on aviation has created yet another challenge.
“I’m concerned that a reduction in flying, with more airplanes parked for longer periods of time and less currency of experience, increases a variety of risks. Flying fewer hours means crews can be distracted and lose the rhythm of the job. Reduced passenger loads mean planes are lighter and climbing faster. We are already seeing more precursor incidents. It’s an obvious but pernicious risk.”
A key answer to these challenges is “making it psychologically safe for the most junior member of the crew to speak openly to a senior captain. Creating a shared sense of responsibility opens channels of communication and makes sure we all have each other’s back.”
Airlines currently enforce a variety of gag orders that make it challenging for pilots to openly share critical safety information. Pilots who do try to speak out publicly in interviews, at conferences or on social media have been reprimanded, grounded, subject to psychiatric exams and, in some cases, fired.
A notable example is Delta Airbus A350 First Officer Karlene Petitt, who won a record $500,000 OSHA whistleblower judgment against her employer last December. After writing a PhD thesis challenging the company’s safety culture, Dr. Petitt was grounded by the carrier for 21 months on the basis of a psychiatric exam, ultimately rejected, that falsely alleged she was “bipolar.”
Sullenberger believes this ruling, which criticized FAA Administrator Steve Dickson (a former Delta vice-president), makes a strong case for strengthening aviation whistleblower laws. The changes he advocates would have made it easier for Boeing insiders to speak out on their MCAS safety concerns during the design and flight-testing phase.
“There is some critical safety information that only can be gleaned from the person involved and that requires absolute trust,” says Sullenberger. “If that trust is absent it can never happen. It has to come from the person to whom it happened. It has to be psychologically safe and there has to be an incentive to do it.”
A good example is the abnormal-airspeed event, triggered by unexpected high-altitude icing, which led to the 2009 Air France 447 crash that took the lives of 228 people. Before that tragedy, numerous pilots in America and Europe filed prophetic warnings about a potential aerodynamic stall. Unfortunately, those reports were never shared with other Airbus pilots. Had they aired their concerns in public, these “whistleblowers” could have been subject to disciplinary action.
This is one of Sullenberger’s key concerns: “How many red flags and precursor incidents do there have to be before the industry separates the signal from the noise? All incentives for pilots should be aligned to the public good. We should be begging pilots to tell us the truth.”
“There is a fundamental misunderstanding about information and bad news. The only bad news is news that is buried and not communicated and acted on in an effective fashion. Another key problem that automation can’t solve is the toxic impact of industry donations to the political campaigns of elected officials on both sides of the aisle tasked with governing aviation policy. Their work needs to reflect the public good.”
Despite these challenges Sullenberger believes the industry now has a unique opportunity to address many of them: “Given the fact that so many aircraft are mothballed and so many pilots aren’t flying, the airlines can do much more training beyond the confines of a simulator. We should also be modifying the planes parked in the desert to make them safer.”
“We also need to change the way we process people through the airport. Covid-19 is not the last biological agent we need to be worried about. We need to make adjustments while planes are not flying all day, creating more touchless surfaces and changing materials so they are less likely to harbor germs. Perhaps we can set up new sensors for cleaning devices and also create more robust airflow.”
Sullenberger is also concerned about crew safety on planes: “If I were the FAA administrator the first thing I would do would be to furnish airline crews with the power of law for dealing with abusive passengers.”
A final note. We were not able to have this conversation in person because Sullenberger is living in an undisclosed location following a death threat last September. This threat on his life came after he denounced President Trump’s alleged characterization of 1,800 Marines who died in a French World War I battle as “suckers and losers.” The FBI was informed and investigated.
“I am not alone in this situation,” says the pilot who saved so many lives with Jeff Skiles by landing their crippled Airbus on the Hudson. “Many people I respect who are courageous whistleblowers, such as Lt. Col. Alexander Vindman (who spoke out against Trump) and parents of the Sandy Hook school shooting victims, have received multiple death threats.”
After our interview I sent Captain Sullenberger, a voracious reader (libraries waived the late fees on the four soaked books he had with him on Flight 1549), a note asking for a P.O. Box where I could send him a copy of Grounded, a book I wrote with Captain Shem Malmquist. Here’s the reply I received from his office:
“Capt. Sullenberger thanks you for the offer of the book, but unfortunately in the current environment, he does not have a secure way to receive mail.”
Roger Rapoport is the Muskegon-based co-author, with Shem Malmquist, of Angle of Attack on Air France 447 and Grounded: How to Solve the Aviation Crisis. He is also the producer of the award-winning feature film Pilot Error.
This article was updated to correct a few minor errors at 12:30pm, March 19th.