Opinion

What’s the Best Way to Stop Tragic Accidents?

Tragic accidents have been in the news recently. The actor Alec Baldwin unintentionally shot and killed someone on a movie set after firing a gun that he was told was unloaded. A port explosion in Beirut killed more than 200 people last year. And a new documentary by Frontline, in partnership with The New York Times, examines the two crashes of Boeing 737 Max jets that killed 346 people less than five months apart in 2018 and 2019.

What can be done? Nancy Leveson has an answer. Leveson, an engineering professor at the Massachusetts Institute of Technology, has developed a distinctive approach to accident prevention. She doesn’t focus on identifying individual faulty components or singling out blundering people. Instead she looks at how accidents can be caused by unforeseen interactions between various components of a complex system.

Leveson’s approach, which is often described as “systems thinking,” is drawing a lot of interest. In June, about 2,300 people from 85 countries attended a virtual workshop she conducted. A handbook she created for people interested in using her method has been downloaded more than 100,000 times and translated into several languages, including Chinese, Japanese and Korean. In 2019 she was quoted in congressional testimony on aviation safety by Chesley Sullenberger III, the pilot who heroically landed a disabled passenger jet on the Hudson River in 2009.

Leveson, whom I interviewed this week, is pleased by the recent surge in interest in her work. “It’s very exciting,” she says. “I just assumed this was not going to happen until after I was gone.” She has been at this for a while: She helped write official accident reports for the space shuttle Columbia disaster in 2003, the Texas City refinery explosion in 2005 and the Deepwater Horizon oil spill in 2010.

One way to understand Leveson’s work is to think about blame. A traditional post-mortem after an accident is all about determining who or what was at fault. You work backward through the chain of events preceding the accident until you get to what you identify as the cause.

But pinpointing the “true” cause can be tricky. In every chain of events there’s always an earlier link in the chain: If someone fell asleep at the controls, why? Was the company working people too hard?

“I don’t believe in blame,” Leveson says. “When it’s about blame, you just find someone to blame and then you go on.” Instead, she emphasizes making systems mistake-resistant, if not mistake-proof. “You need to design your system to prevent accidents, not depend on the operator,” she says.
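To make that philosophy concrete, here’s a tiny sketch in Python of what it can mean to design a system that doesn’t depend on the operator. The scenario and every name in it are my own illustrative inventions, not anything from Leveson’s work:

# A minimal sketch: the machine will not run on an operator's
# say-so alone; a hardware interlock has to agree.

def start_cycle(operator_says_clear: bool, guard_closed: bool) -> bool:
    # The operator's go-ahead is necessary but never sufficient.
    return operator_says_clear and guard_closed

# An operator who wrongly believes the guard is closed cannot
# start the machine, so the mistake is caught by design:
print(start_cycle(operator_says_clear=True, guard_closed=False))  # False

The design choice is the point: the safety check lives in the system itself, so a tired or misinformed operator can’t single-handedly cause the accident.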

Leveson contends that too many systems today are designed so that “we’re guaranteed the operator is going to make a mistake of some kind.” For example, a 2010 investigation of radiation oncology accidents by The New York Times found that while new technology helped doctors better attack tumors, the complexity of the technology also “created new avenues for error through software flaws, faulty programming, poor safety procedures or inadequate staffing and training.”

Avoiding accidents has gotten harder in the computer era because software has become too complex for people to fully comprehend, Leveson says. As far back as 2009, a premium car contained some 100 million lines of code, according to Manfred Broy, a professor at the Technical University of Munich. It’s almost impossible to foresee every possible interaction among all those lines of code.

Leveson gives as an example the failure of the $165 million Mars Polar Lander mission in 1999. When the spacecraft’s landing legs were extended during descent, the onboard software incorrectly interpreted the vibrations from the deployment as an indication that the landing had occurred and shut down the engines prematurely, causing the lander to crash. “The landing legs and the software performed correctly — as specified in their requirements, that is, neither ‘failed,’” Leveson wrote in a 2019 paper, “but the accident occurred because the system designers did not account for all interactions between the leg deployment and the descent-engine control software.”
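Here’s a deliberately simplified sketch, in Python, of that failure mode. The class, numbers and signals are illustrative inventions, not the actual flight software, but they show how two components can each meet their specifications while their interaction dooms the mission:

# Hypothetical sketch of the Mars Polar Lander failure mode.

class DescentController:
    def __init__(self):
        self.touchdown_latched = False

    def read_leg_sensor(self, signal: bool) -> None:
        # Works as specified: latch any touchdown indication.
        if signal:
            self.touchdown_latched = True

    def should_cut_engines(self, altitude_m: float) -> bool:
        # Works as specified: below the handoff altitude,
        # trust the latched sensor reading.
        return altitude_m < 40.0 and self.touchdown_latched

controller = DescentController()

# High above the surface, the legs deploy. The jolt makes the
# sensor read True for an instant; neither component has "failed."
controller.read_leg_sensor(signal=True)   # transient vibration
controller.read_leg_sensor(signal=False)  # the signal settles

# Near the surface, the controller sees the stale latch and cuts
# the engines while the lander is still falling.
print(controller.should_cut_engines(altitude_m=39.0))  # True -> crash

A systems-level remedy, at least in this toy version, would be to clear the latch after leg deployment or to ignore touchdown signals until the final approach. The unsafe behavior lives in the interaction, not in either component.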

The field of medicine is especially ripe for systems thinking, Leveson says. In 2000 the Institute of Medicine (now the National Academy of Medicine) issued a report on health care safety that found that at least 44,000 patients were dying annually because of medical errors. In 2019 one of the authors of that report, Mark Chassin, wrote, “it’s been 20 years, and we haven’t moved the quality and safety needle as much as we had hoped.”

Leveson agrees. “Doctors don’t understand engineering and they don’t like systems thinking at all,” she says. “They think if you just get rid of bad doctors. It’s not the bad doctors. It’s the doctors who are put into situations where they are bound to have more problems.”

Lest you think forcing people to follow the rules would solve the problem, I leave you with a quotation from one of Leveson’s early papers on the topic, which appeared in the journal Safety Science in 2004, before she was drawing international crowds to her seminars:


Elsewhere

The chart below is a bit of a puzzle. It shows, for each U.S. state and for the United States as a whole, the relationship between the unemployment rate and the job openings rate, using government data from August. You would expect states where unemployment is high to have few job openings, and states where unemployment is low to have lots of openings. In other words, you’d expect the dots in this graph to cluster along a line stretching from the top left to the bottom right. Clearly, they don’t.

I’ve called out some of the states that are the most anomalous: Nevada and Alaska, where both rates are high, and South Dakota and Nebraska, where both rates are low. I’d love to hear from readers about why they think that is.
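If you want to poke at the data yourself, here’s a rough sketch of how to redraw the chart in Python. The file name and column names are my assumptions; you’d substitute the actual Bureau of Labor Statistics figures for August:

# Sketch: scatter of unemployment rate vs. job openings rate by state.
import pandas as pd
import matplotlib.pyplot as plt

# Assumed columns: state, unemployment_rate, openings_rate
df = pd.read_csv("state_rates_august.csv")

fig, ax = plt.subplots()
ax.scatter(df["unemployment_rate"], df["openings_rate"])
for _, row in df.iterrows():
    ax.annotate(row["state"], (row["unemployment_rate"], row["openings_rate"]))

ax.set_xlabel("Unemployment rate (%)")
ax.set_ylabel("Job openings rate (%)")

# If the expected trade-off held, the points would slope from the
# top left down to the bottom right.
plt.show()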


Quote of the day

“All I ask of our brethren is that they will take their feet from off our necks and permit us to stand upright on that ground which God designed us to occupy.”

— Sarah Moore Grimké, an abolitionist, in a letter to her sister in 1837


Have feedback? Send a note to [email protected].
