The crashes of two Boeing 737 Max jets that took the lives of 346 people in less than five months were preceded by a complex series of engineering, economic, corporate and regulatory decisions whose interplay contributed to tragic unintended consequences.

The fallout from the two disasters in Indonesia and Ethiopia is no less complex. Government oversight responsibilities, corporate practices and the impact of intense airline business competition are under scrutiny as the investigations of the accidents continue.

Boeing 737 Max planes remain grounded worldwide. The process used to certify the plane is the subject of congressional inquiries, a Department of Transportation audit and a criminal probe by the Department of Justice. And aviation experts, policy makers and laymen are re-examining old questions about our individual, cultural and societal responses to rapid technological advances.

The last loss of life in a US air disaster happened in February 2009, when Colgan Air Flight 3407 crashed near Buffalo, NY, killing all 49 people on board and one person in a house on the ground. In 2013 the Federal Aviation Administration (FAA) increased training and experience requirements for pilots, augmenting an already robust safety system. In the last 10 years, US commercial airlines have transported some eight billion passengers without a fatal accident, according to CNBC.

Marc Narkus-Kramer, a retired engineer who worked for MITRE, an engineering company that provided technical research and advice for the FAA and other government agencies, recently wrote about the modernization in past decades of the Air Traffic Control (ATC) system. In addition to looking at human interactions with technology, he examines how changes in organizational structure and behavior affected the development and adoption of new technologies. He presents his view through a complexity science lens, and his piece is available here. One of the key management practices he reviews in retrospect is the participation of all stakeholders in projects, and the voices of pilots around the world have emphasized the importance of that issue today.

On October 29, 2018, a Boeing 737 Max 8 operated by Lion Air, an Indonesian carrier, crashed into the Java Sea, killing 189 people. On March 10, 2019, a Boeing 737 Max 8 operated by Ethiopian Airlines crashed, killing 157 people. Reports in The New York Times and The Washington Post noted similarities in the two crashes. Both occurred within minutes of takeoff, and aviation officials said the flight paths of both planes showed similar “vertical fluctuations” and “oscillations” before they went down. Investigators say available data suggests a newly installed automated stall prevention system known as MCAS* may have been involved. Investigations remain unfinished, but aviation experts think that in both cases faulty sensor readings may have activated MCAS, pushing the nose of the plane down. Pilots, apparently lacking a full understanding of the malfunction or how to address it, lost control of the planes.

The 737 Max, an upgraded version of Boeing’s workhorse 737, was born of a frantic competition with rival manufacturer Airbus, which announced in late 2010 that it was introducing a more fuel-efficient version of its best-selling A320 passenger plane. Boeing decided to update its 737 rather than spend a decade developing a new plane. The first 737 Max was finished in 2015. The Times interviewed Boeing engineers and other workers who described intense pressure to get the new plane ready fast, though they did not think safety was compromised. But there was a ground rule for engineers: keep design changes minimal so that pilot retraining would also be minimal, and make no changes that would require pilots to spend training time in an expensive flight simulator. That was a big selling point for the airlines.

To compensate for the changed aerodynamics caused by the bigger, more fuel-efficient engines that competition required, Boeing created the MCAS software. But because the system was supposed to work in the background, Boeing believed it didn’t need to brief pilots on it, and regulators at the FAA agreed. Many pilots were furious about having been kept in the dark.

Policy makers and aviation watchdogs have criticized the FAA for its cozy relationship with Boeing, and its practice of allowing Boeing to designate its own employees to work on the FAA certification process. Boeing has defended its self-regulatory role, but has announced upgraded software for all planes and upgraded training for pilots.

What if pilots had known about the purposes and limits of new software, understood how it worked, been taught to recognize when data might be faulty, and practiced how to respond to malfunctions?  Could these tragedies have been prevented?  If automation does most of the work, will people have the knowledge and skill to take over if it fails? Those questions have worried Narkus-Kramer and other automation experts in many fields for decades.

In a Vanity Fair article, “The Human Factor,” William Langewiesche examines the 2009 crash of Air France Flight 447, in which 228 people died. In that accident, a series of small errors, brief technological malfunctions and major human misunderstandings turned a state-of-the-art Airbus plane into a death trap. Automation has made air travel far safer than it was in past decades, Langewiesche writes, but it brings its own hazards. He quotes the late Earl Wiener, an engineer at the University of Miami, who wrote in the 1980s about some of the unexpected risks that emerge with even the best technologies. As Wiener saw it, every device creates its own opportunity for human error; exotic devices create exotic problems; digital devices tune out small errors while creating opportunities for large errors; and whenever you solve one problem, you usually create another, and you can only hope that the one you created is less critical than the one you eliminated.

So questions remain. Narkus-Kramer says a successful safety culture has evolved in the airline industry because of stakeholder involvement, rigorous standards, thorough testing, non-punitive data collection, and correction of errors. Will the industry’s safety record continue, or are pressures leading to dangerous shortcuts? Does failure to fully inform pilots erode a safety environment? Does the acceptance of a system modification that relied on a single sensor suggest a new pattern in engineering or design?

*MCAS was originally designed to activate based on data from a single angle-of-attack sensor, which measures the angle of the jet’s nose relative to oncoming air. Air-safety experts, as well as former employees at Boeing and at the supplier that made the sensor, have expressed concern that the system had this single point of failure, a rarity in aviation.

Contributed by Prucia Buscell, a freelance writer and editor who was formerly communications director at Plexus Institute. Before joining Plexus in 2001, she was a newspaper reporter and copy editor. Prucia can be contacted at prucia@gmail.com