After Boeing Crashes, New Attention On The Potential Flaws Of Software

The cockpit of a grounded Lion Air Boeing 737 Max 8 aircraft is seen on March 15. (Dimas Ardian / Bloomberg/Getty Images)

Investigations into the causes of the two Boeing 737 Max crashes, in Indonesia and Ethiopia, have focused on software — and the possibility that it was autonomously pointing the planes' noses downward, acting without the pilots' consent.

It's a nightmare scenario. It's also a reminder that software is everywhere, sometimes doing things we don't expect.

This sank in for a lot of people four years ago, during the Volkswagen diesel emissions scandal. It turned out that software inside the cars had been quietly running the engines in such a way as to cheat on emissions tests.

While it's always possible for manufacturers to use software dishonestly, the more common problem is software that's used to enable sloppy designs.

"A lot of times, you see systems that would be much easier to control if somebody had been thoughtful about the mechanical design," says Chris Gerdes, a professor of mechanical engineering at Stanford University.

He says sometimes he's brought in as a consultant on a project and he'll find that a moving part was poorly conceived — perhaps it generates too much friction. Designers will leave the problem, assuming the control software will make up for it.

Still, he says software has mostly helped improve cars and other complex systems. And it would be impossible to go back to purely mechanical designs, such as pre-digital automobiles.

"If you open these things up, it's crazy!" he says, describing the intricate hydraulic systems inside automatic transmissions back before cars had computers. Complex systems of tubes and liquid would "calculate" when to shift gears, something no carmaker would attempt today.

"I have deep appreciation for this but also no idea about how one would implement logic in fluids," Gerdes says.

Software is quicker, lighter, cheaper and much more flexible than mechanical systems.

For Kara Pernice, senior vice president of the user-experience design consulting firm Nielsen Norman Group, software is a very necessary part of modern manufacturing. But she says it's often added too late in the design process.

"Many times, hardware-software creation is disjointed," she says, calling it a "huge problem."

As an example, she recalls driving her parents' car recently and being flummoxed by the touch screen.

"I could not figure out how to turn down the air conditioning on a touch screen," she says. "Are you kidding me?"

Touch screens may strike customers as up to date, but they can also be a shortcut for manufacturers. By leaving all the controls to the programmers of the screen, the mechanical designers can skip the more careful — and time-consuming — process of "considering the human that's going to use that technology in the end," as Pernice puts it. Touch screens often preclude consideration of mechanical controls — such as a knob for the air conditioning — in places where a physical control would make more design sense.

Sometimes, the change to software controls can be deadly. Among the most notorious cases is the Therac-25, a radiation therapy machine built in the 1980s. It dispensed with the mechanical safety interlocks of earlier models and replaced them with software. The software turned out to have bugs, and patients were over-radiated — a few were even killed. It became a case study in how not to design safety-critical systems.

But even now, software is a potential risk in medical devices — and software problems were the most common cause of medical device recalls last year.

In aviation, software is indispensable. And generally speaking, designers say it has made airplanes much safer and more versatile.

It's not just about autopilot, says Kristi Morgansen, professor of aeronautics and astronautics at the University of Washington. The next time you fly, she says, just look out the window at the wing.

"You sometimes see the control surfaces doing things, and a bunch of that is automatic," she says. "Like gust load alleviation — ride quality, how it feels when you fly — a lot of that is handled automatically."

On the 737 Max, Boeing used software to compensate for a compromise in the physical design. New, larger engines were added to an older airframe, changing its center of gravity. The software suspected of causing the crashes was there to correct for that and push the nose down when it rose too high.
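To give a rough sense of what "correcting with software" can mean, here is a minimal, hypothetical sketch in Python. The threshold, trim step and function name are invented for illustration; nothing here describes Boeing's actual system, whose logic and parameters the article does not detail.

    # Hypothetical illustration only -- not Boeing's software.
    # The general idea: if the measured angle of attack climbs past a
    # threshold, the software commands a small nose-down trim adjustment.

    AOA_THRESHOLD_DEG = 12.0   # assumed trigger angle, invented for illustration
    TRIM_STEP_DEG = 0.5        # assumed nose-down trim increment, invented

    def nose_down_compensation(angle_of_attack_deg: float) -> float:
        """Return a trim command in degrees for one control cycle."""
        if angle_of_attack_deg > AOA_THRESHOLD_DEG:
            return -TRIM_STEP_DEG   # negative means trim the nose down
        return 0.0

    print(nose_down_compensation(20.0))   # -> -0.5 (nose-down command)
    print(nose_down_compensation(5.0))    # -> 0.0 (no intervention)

The sketch shows only the shape of such a compensation rule; real flight-control software layers sensor checks, pilot overrides and certification requirements on top of any logic like this.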

That may strike the layperson as a "kluge" — using software to cover up a problem. But aviation designers say compromises and compensations are a necessary part of design.

Morgansen won't comment on the 737 Max crashes, which are still under investigation, but she says that generally speaking, using new software on top of older systems is safe — and necessary.

"It would be so cost prohibitive to start from scratch," she says. "It would take so long that the business model wouldn't work."

Commercial pilots have generally come to accept autonomous software as part of flying.

"I don't think there's anything inherently wrong about doing it through software, provided you do it correctly," says Alex Fisher, a retired pilot of Boeing 767s. He says that long before computers, pilots have depended on mechanical autonomous systems to smooth out the controls.

"There are other features of the airplane that we were unaware of," Fisher says, "[but] whether they really applied to the control systems is another matter. If you aren't taught how the controls work, then you really don't stand much of a chance."

Columbia Law School Professor Eben Moglen, who has long championed transparency in software, says the real lesson to take from the 737 Max is the necessity for autonomous software systems to "explain themselves" to the people using them.

He says software has allowed manufacturers to cut corners and costs on things like camera phones — say, using image-enhancement software to compensate for inferior lenses. "Every smartphone manufacturer I've ever dealt with regards the color-enhancement part of its camera software as among its most valuable trade secrets," Moglen says.

But cheap physical designs are a minor consideration, he says, compared with what the 737 Max situation represents.

"What we're looking at in the case of some aerodynamic software taking over from pilots without telling them," he says, "is an example of why, even if you didn't think any of this had anything to do with politics, it is still true that systems that don't explain themselves to the human beings that interact with them are dangerous."

Moglen says opaque autonomous software is becoming a hallmark of authoritarian societies, such as China, and it's up to democratic societies "to build these technologies to support, rather than threaten, human freedom."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Martin Kaste is a correspondent on NPR's National Desk. He covers law enforcement and privacy. He has focused on police and the use of force since before the 2014 protests in Ferguson, and that coverage led to the creation of NPR's Criminal Justice Collaborative.