By Gregory Travis – a software architect, aircraft owner and writer. His first article identifying the issues with the 737 Max appeared in the May 2019 issue of IEEE Spectrum magazine.

Boeing’s most recent attempt to demonstrate a fix for its troubled MCAS system is another
illustration of just how deep the problem runs. Most important, it shows how desperate
Boeing is to “keep alive” a software solution to the 737 MAX’s longitudinal stability issues.
Most chillingly, it shows just how inadequate such a solution is to the underlying problem.

Most sadly, it is a symbol of the collapse of institutions in the United States. We were once
considered the world’s gold standard in everything from education to manufacturing to
effective and productive public-sector regulation. That is all going down the drain, flushed by a
belief in things that just are not true.

Redundancy

In aviation, redundancy is everything. One reason is to guard against failure, such as the second
engine on a twin-engine airplane. If one fails, the other is there to bring the plane down to an
uneventful landing.

Less obvious than outright failure is the utility of redundancy in conflict resolution. A favorite
expression of mine is: “A person with one watch always knows what time it is. A person with
two watches is never sure.” Meaning if there’s only one source of truth, the truth is known. If
there are two sources of truth and they disagree about that truth there is only uncertainty and
chaos.

The straightforward solution to that is triple or more redundancy. With three watches it is easy
to vote the wrong watch out. With five, even more so. This engineering principle derives from
larger social truths and is embedded in institutions from jury pools to straw polls.
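
To make the voting principle concrete, here is a minimal sketch, in Python and with entirely hypothetical names and values, of how a system might vote among three redundant sensors and discard the one that disagrees:

# A minimal, hypothetical sketch of triple-redundant voting:
# with three independent readings, the outlier can be voted out.

def vote(readings, tolerance):
    """Return the median reading and any readings that disagree with it."""
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    outliers = [r for r in readings if abs(r - median) > tolerance]
    return median, outliers

# Three airspeed sensors: two agree, one is lying.
airspeeds = [251.0, 250.0, 135.0]            # knots
best_guess, suspects = vote(airspeeds, tolerance=10.0)
print(best_guess, suspects)                  # 250.0 [135.0]

# With only two sources there is no majority: when they disagree,
# the system cannot know which one to believe.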

Physiologically, human beings cannot tell which way is up and which way is down unless they
can see the horizon. The human inner ear, our first source of such information, cannot
differentiate gravity from acceleration. The ear fails in its duty whenever the human to which it
is attached is inside a moving vehicle, such as an airplane.

The only reliable indication of where up and down reside is the horizon. Pilots flying planes
can easily keep the plane level so long as they can see the ground outside. Once they cannot,
such as when the plane is in a cloud, they must resort to using technology to, as we pilots say,
“keep the greasy side down” (the greasy side being the underside of any airplane).

That technology is known as an “artificial horizon.” In the early days, pilots synthesized the
information from multiple instruments into a mental artificial horizon. Later a device was
developed that presented the artificial horizon in a single instrument, greatly reducing a pilot’s
mental workload.

But that device was prone to failure. Pilots were taught to continue to use other instruments to
cross-check the validity of the artificial horizon. Or, if the pocketbook allowed, to install
multiple artificial horizons in the aircraft.

What is important is that the artificial horizon information was so critical to safety that there
was never a single point of reference nor even two. There were always multiples so that there
was always sufficient information for the pilot to discern the truth from multiple sources —
some of which could be lying.

Information Takers, Information Givers and Information Processors

The machinery in an aircraft can be roughly divided into three classes: Information takers,
information givers and information processors. The first class is generally that machinery that
manages the aircraft’s energy, such as the engines or the control surfaces.
They are the aircraft’s machine “working class.”

The second class of machinery are the information givers. The information givers are
responsible for reporting everything from the benign (are the bathrooms in use?) to the critical
(what is our altitude? where is the horizon?).

They are the aircraft’s machine “man on the street.”

The third class of machinery are the information processors. The information processors
process information from the information givers, make decisions, and then command the
information takers to do something.

They are the aircraft’s machine bureaucracy.
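
For the software-minded, the three classes can be sketched in a few lines. Everything below is my own illustration in Python, with hypothetical names; it is not drawn from any actual avionics code:

# A hypothetical sketch of the three classes of aircraft machinery.

class InformationGiver:          # sensors: airspeed, altitude, angle of attack...
    def read(self) -> float: ...

class InformationTaker:          # actuators: engines, control surfaces...
    def command(self, setting: float) -> None: ...

class InformationProcessor:      # the "bureaucracy": autopilots, flight computers
    def __init__(self, givers, takers):
        self.givers, self.takers = givers, takers

    def step(self):
        readings = [g.read() for g in self.givers]   # gather information
        decision = self.decide(readings)             # make a decision
        for t in self.takers:
            t.command(decision)                      # command the takers

    def decide(self, readings):
        raise NotImplementedError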

Emergence of the machine bureaucracy

In aviation’s early days, there was no machine bureaucracy. Pilots were responsible for
processing the information from the givers and turning that into commands for the takers. Stall
warning (information giver) activated? Push the airplane’s nose down and increase power
(commands to the information takers).

Soon, however, the utility of allowing machines to perform some of the pilot’s tasks became
obvious. This was originally sold as a way to ease the pilot’s tactical workload, to free the
pilots’ hands and minds so that they could better concentrate on strategic issues – such as the
weather ahead – and be not so much pilots as captains-in-command.

 

Thus was born the “auto pilot.” Not to share the table with the captain, but to serve it.

Redundancy done right

All of the ideas and technology embodied in the Boeing 737 were laid down in the 1960s. This
ran from what kind of engines, to pressurization, to the approach to the needs of redundancy.
And the redundancy approach was simple: more than one of everything.

 

Laying that redundancy out in the cockpit became straightforward. One set of information-givers, such as airspeed, altitude and horizon, on the pilot’s side. And another set of identical
information-givers on the co-pilot’s side. That way any failure on one side could be resolved by
the pilots, together, agreeing that the other side was the side to watch.

Origin of consciousness in the bicameral mind

 

Visualize, if you will, the cockpit of a Boeing 737 as a human brain. There is a left (pilot) side,
full of instrumentation (information givers, sensors such as airspeed and angle of attack), a
couple pilots and an autopilot.

[Figure: the 737 instrument panel, pilot’s side and co-pilot’s side, with duplicated instruments encircled in matching colors.]

And there is a right (co-pilot) side, with the exact same things. In the picture above, items
encircled by same-colored ovals are duplicates of one another. For instance, airspeed and
vertical speed are denoted by purple ovals. The purple ovals on the left (pilot’s) side get their
information from sensors mounted on the outside of the plane, on the left side. The ones on
the right, well, the right side.

And, like a human brain, there is a corpus callosum connecting those two sides. That
connection, however, is limited to the verbal and other communication that the human pilots
make between themselves.

In the human brain, the right brain can process the information coming from the left eyeball
and left ear. The left brain can process the information coming from the right eyeball and right
ear.

In the 737, however, the machinery on the co-pilot’s side is not privy to the information coming
from the pilot’s side, and vice versa. The machines on one side are alienated from the
machines on the other side. It is up to the humans to intermediate.

The 737 autopilot origin story

The 737 needed an autopilot, of course, and its development was straightforward.
Autopilots in those days were crude and simple electromechanical devices, full of hydraulic
lines, electric relays and rudimentary analog integration engines. They did little more than keep
the wings level, hold altitude and track a particular course.
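
At heart, those functions were simple feedback loops. Here is a minimal sketch of a wing-leveler, written in Python purely as an illustration of the idea; the gain and the numbers are made up:

# Hypothetical sketch of the sort of feedback loop an early autopilot
# implemented electromechanically: deflect the ailerons against any
# measured bank angle to keep the wings level.

def wing_leveler(bank_angle_deg, gain=0.05):
    """Return an aileron deflection opposing the current bank."""
    return -gain * bank_angle_deg

print(wing_leveler(10.0))   # banked 10 degrees right -> command -0.5 (roll left)
print(wing_leveler(0.0))    # wings level -> no command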

Obtaining the necessary redundancy in the autopilot system was as simple as having two of
them. One on the pilot’s side, one on the co-pilot’s side. The autopilot on the pilot’s side
would get its information from the same information-givers giving the pilot herself information.
The autopilot on the co-pilot’s side would get its information from the same information-givers
giving the co-pilot his information.

 

And only one autopilot would function at a time. When flying on autopilot, it was either the
pilot’s or the co-pilot’s autopilot that was enabled. Never both.

And in those simple, straightforward days of Camelot, it worked remarkably well.

[Figure: the 737 autopilot control panel, showing the “A/P Engage” column with its A and B switches.]

In the picture above, you can see the column labelled “A/P Engage.” This selects which of the
two autopilots (N.B. Boeing calls them “Flight Control Computers,” or FCCs) is in use, A or B. The
A autopilot gets its information from the pilot’s side; the B, from the co-pilot’s. If you select the
B autopilot when the A is engaged, the A autopilot will disengage (and vice versa).

What this means is that the pilot’s autopilot does not see the co-pilot’s airspeed. And the co-pilot’s autopilot does not see the pilot’s airspeed. Or any of the other information-givers, such
as angle of attack.
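
In software terms, the arrangement looks something like the sketch below. Everything here is hypothetical and illustrative, but the essential point is real: each flight control computer is wired to its own side’s sensors and only its own side’s, and only one computer is ever engaged at a time.

# Hypothetical sketch of the 737's divided arrangement: FCC "A" is wired
# only to the captain's-side sensors, FCC "B" only to the first officer's.

class SensorSet:
    def __init__(self, airspeed, altitude, angle_of_attack):
        self.airspeed = airspeed
        self.altitude = altitude
        self.angle_of_attack = angle_of_attack

class FlightControlComputer:
    def __init__(self, name, sensors):
        self.name = name
        self.sensors = sensors       # one side's sensors, and only that side's

left_sensors  = SensorSet(airspeed=250.0, altitude=10_000, angle_of_attack=4.0)
right_sensors = SensorSet(airspeed=251.0, altitude=10_010, angle_of_attack=75.0)

fcc_a = FlightControlComputer("A", left_sensors)    # captain's side
fcc_b = FlightControlComputer("B", right_sensors)   # first officer's side

engaged = fcc_a   # selecting the other one disengages this one; never both

# Whichever computer is engaged sees exactly one angle-of-attack value.
# It has no way to notice that the other side disagrees with it.
print(engaged.sensors.angle_of_attack)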

The fossil record

JFK was president when the 737’s DNA, its mechanical, electronic and physical architecture,
was cast in amber. And that casting locked into the airplane’s fossil record two immutable
objects. One, the airplane sat close, really close, to the ground. And, two, the divided and
alienated nature of its bicameral automation bureaucracy.

These were things that no amount of evolutionary development could change.

Rise of the machines

Of all the -wares (hardware, software, humanware) in a modern airplane, the least reliable and
thus the most liability-attracting is the humanware. Boeing, ironically, estimates that eighty
percent of all commercial airline accidents are due to so-called “pilot error.” I am sure that if
Boeing’s communication department could go back in time, they’d like to revise that to 100%.

With that in mind, it’s not hard to understand why virtually every economic force at work in the
aviation industry has on its agenda at least one bullet-point addressed to getting rid of the
human element. Airplane manufacturers, airlines, everyone would like as much as possible to
get rid of the pilots up front. Not because pilots themselves cost much (their salaries are a
minuscule portion of operating an airline) but because they attract so much liability.

The best way to get rid of the liability of pilot error is to simply get rid of the pilots.

The Airbus consortium, long a leader in advancing the technological sophistication of aviation
(they succeeded with Concorde where Boeing had utterly failed with their SST, for example),
realized this. And in the late 1970s embarked on a program to create the first “fly by wire”
aircraft, the A320.

In a “fly by wire” aircraft, software stands between man and machine. Specifically, the flight
controls that pilots hold in their hands are no longer connected directly to the airplane’s
information takers, such as the control surfaces and engines. Instead the flight controls
become yet another set of information-givers.

The fly-by-wire computers, the information processors, become the airplane’s bureaucrats.
Taking information from the information-givers: airspeed, angle of attack, altitude, and the
pilots. And they evaluate that information, its quality, its reliability and its probity, on an equal
level. Which means with an equal amount of skepticism.

The humans had suddenly been demoted out of the bureaucracy and into the role of
information-givers. Alongside things like the airspeed sensors, the angle of attack sensors, and
all the other sensors the humans were mere people on the street.

Next step, bathroom monitor.
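
To make the demotion concrete: in a fly-by-wire control law, the pilot’s stick input arrives as just one more reading to be weighed against the others, rather than a command to be obeyed. A hypothetical sketch, with made-up names and limits:

# Hypothetical sketch: in fly-by-wire, the pilot's stick is just another input,
# weighed against the sensors rather than wired straight to the elevator.

def elevator_command(stick_input, angle_of_attack, max_aoa=15.0):
    """Blend the pilot's request with what the sensors say the airplane can do."""
    if angle_of_attack >= max_aoa and stick_input > 0.0:
        # The pilot is asking for more nose-up than the computers will permit.
        return 0.0
    return stick_input

print(elevator_command(stick_input=0.8, angle_of_attack=5.0))    # pilot obeyed: 0.8
print(elevator_command(stick_input=0.8, angle_of_attack=16.0))   # pilot overruled: 0.0

If the sensors say the airplane is near its limits, the bureaucracy overrules the human. That is the essence of the demotion.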

The airlines saw the writing on the wall and were delighted. Boeing, caught with its pants
around its ankles, embarked on a huge anti-automation campaign – even as it struggled to
adapt other aspects of the A320 technology, such as its huge CFM56 engines, to Boeing’s
already-old 737 airframe.

Boeing’s strategy worked well for quite some time. By retrofitting the A320’s engines to the
737, they were able to match the A320’s fuel economics. And a vigorous anti-automation
campaign, aided by pilots’ unions and a public fearful of machine control, kept 737 sales
rolling.

 

It all comes down to money

As a forty-year veteran of the software development industry and a person responsible for
directing teams that generated millions of lines of computer code, I will tell you something
wonderful about the industry.

Anything you can do by building hardware – by casting metal, sawing wood, tightening
fasteners or running hoses – you can do faster, cheaper, and with less organizational heartache
with software. And you can do it with far fewer prying eyes, scrutiny or oversight.
And the icing on the cake: if you screw it up, you can pass along (“externalize,” as the
economists say) the costs of your mistakes to your customers. Or in this case, the flying public.

This is why Wall Street loves technology that involves little or no capital investment. Think
Uber. And why it hates old-line manufacturing, with its expensive factories, machinery, and
people. Software developers can be sourced from anywhere in the world at dirt-cheap
prices and with zero experience in the industries for which they are developing software.

People who can bend metal, visualize airflows or anticipate manufacturing issues are as
expensive as they are skilled.

As any fund manager would say: “Ugh, why would I want that?”

Nobody would, of course. Which is exactly what Boeing’s managers – its board of directors –
understood when they embarked on an ambitious program to re-make the company. Re-make
the company away from its old-line industrial roots, which Wall Street abhorred, and more like
something along the lines of an Apple Computer.

Ideally all software, no hardware. And the invested capital-to-profit ratios that go along with it.

The dissolution of empathy

There were a lot of components to that transformation. Including firing all the old-line (and
old, but experienced) engineers in unionized Washington State, replacing them with a cadre of
unskilled workers, such as those putting together the mess that is the 787 “Dreamliner” in
antebellum South Carolina.

Putting together, not making, because another relentless part of the transformation was
liquidation of Boeing’s capital plant. Like another once-giant aviation company, Curtiss-Wright,
Boeing’s managers had drunk the Wall Street Kool-Aid and believed, with no empirical evidence,
that the best way of making money by making things was not to make things at all.

Think of it as aviation’s version of Mel Brooks’s “The Producers.”

A key Wall Street shibboleth along those lines is something known as “Return on Net
Assets,” or RONA. RONA says: “Making something with something is expensive. So make
something, but make it with nothing.”

Think of it as “Springtime for Hitler.”

Springtime for Hitler, in the Wall Street world, means that US manufacturing companies stop
making anything themselves. Instead everything they make, they get other people to make for
them. Using exploited labor, producing inferior product greased on the wheels of distrust, fear
and an utter lack of shared mission or shared sacrifice.

That means, for example, that the 787s being assembled in South Carolina are being put
together by people whose last job was working the fry pit at the local burger joint using parts
made by equally, and intentionally, marginalized people from half a world away.

What could go wrong?

The Rockwell EDFCS-730

 

For our purposes, “patient zero” in the 737 MAX tragedy is something called the Rockwell Collins EDFCS-730. It is an autopilot (again, called a flight control computer by Boeing), made
specifically for the 737, starting with the 737 NG (not MAX).

The EDFCS-730 was intended as a “digital” replacement for the existing autopilots in the 737.
As mentioned earlier in this article, the original autopilots were a collection of
electromechanical controls made out of metal, hydraulic fluid, relays, etc.


Over time, more and more of the electromechanical function of the autopilot was replaced
and/or supplanted by digital components. In the EDFCS-730, that supplanting was total.

 

The EDFCS-730 offered Boeing an enormous set of opportunities. First, it was far cheaper on a
lifecycle basis than the old units it replaced. Second, it was trivial to re-configure the autopilot
when new functionality was needed – such as a new model of 737.

Third, its operating laws were embedded in software – not hardware. That meant that changes
could be made quickly, cheaply and with little or no oversight or scrutiny. One of the aspects of
software development in aviation is that there are far fewer standards, practices, or
requirements for making software in aviation than there are for making hardware.

A function that would draw an army of auditors, regulators and overseers in hardware gets by
with virtually no oversight if done in software. Which makes software unbelievably attractive
as a “manufacturing” option.

Longitudinal stability

Late in the 737 MAX’s development, after actual test flying began, it became apparent that
there was a problem with the airframe’s longitudinal stability. We do not know how bad that
problem is nor do we know its exact nature. But we know it exists because if it didn’t, Boeing
would not have felt the need to implement something called the Maneuvering Characteristics
Augmentation System, or MCAS.

MCAS is a system, implemented entirely in the EDFCS-730 and implemented entirely in
software, that pushes the 737 MAX’s nose down when the system believes that the airplane’s
angle of attack is too high. For more on that process, see

https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer
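
Reduced to its essentials, and with every name and number below being hypothetical rather than drawn from Boeing’s actual code, logic of that general shape looks like this:

# A deliberately simplified, hypothetical sketch of MCAS-shaped logic:
# if the (single) angle-of-attack reading says the nose is too high,
# command nose-down trim. No cross-check against the other sensor.

AOA_THRESHOLD = 15.0      # degrees; illustrative value only

def mcas_step(angle_of_attack, trim):
    if angle_of_attack > AOA_THRESHOLD:
        trim -= 1.0       # push the nose down
    return trim

trim = 0.0
for reading in [4.0, 5.0, 74.5, 74.5, 74.5]:   # a sensor that suddenly "fails high"
    trim = mcas_step(reading, trim)
print(trim)   # -3.0: the airplane keeps trimming nose-down on bad data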

 

I believe that Boeing anticipated the longitudinal stability issue arising from the MAX’s larger
engines and their placement. In the early 1980s Boeing had encountered something similar
when fitting the CFM56 engines to the 737 “Classic” series. Then it countered the issue with a
set of aerodynamic tweaks to the airframe, including large strakes affixed to the engine cowls
which are readily visible to any passenger sitting in a window seat over or just in front of the
wing.

When the issue of longitudinal stability arising from engine size and placement arose again with
the 737 MAX, Boeing had a tool at its disposal that it had not had with previous generations of
737.

And that tool was the EDFCS-730 autopilot.

 

Boeing had a choice: correct the stability problems in the traditional manner (meaning
expensive changes to the airframe) or utilize software in the EDFCS-730 to make the problem
go away in a much more expeditious, meaning cheaper and faster, way.

Quick and dirty

The result is, as they say, history. Wall Street had stripped Boeing of any leadership cadre with
intrinsic business acumen. And the leadership that remained had no skills beyond an
extraordinary talent for intimidation, through a mechanism of implied and explicit threats. See
“Welcome to the Machine” at the end of this article.

Empathy has no purchase in such an environment. The collapse of trust relationships between
individuals within the company and, more important, between the company and its suppliers
fertilized the catastrophe that now engulfs the enterprise.

 

From high in the company came a diktat: ship the airplane. Without empathy, there was no
ability to hear cautions about the method chosen by which to ship (a low-quality software
hack).

I have spoken to individuals at all of the companies involved and have yet to find anyone at
Rockwell Collins who can direct me to the individuals tasked with implementing the MCAS
software. Rockwell Collins is, predictably, extremely reluctant to take ownership of either the
EDFCS-730 or its software. I have been assured repeatedly that the internal controls within
Collins would never have allowed software of such low quality to go out the door and that none
of their other autopilot products share much, if any, commonality with the EDFCS-730.

That, together with off the record communications, leads me to believe that Boeing itself is
responsible for the EDFCS-730 software. Most important, for the MCAS component. The
responsibility for creating MCAS appears to have been farmed out to a low-level developer with
little or no knowledge of larger issues regarding aviation software development, redundancy,
information takers, information givers, or information processors.

And I believe this is deliberate. Because a more experienced developer, of the kind shown the
door by the thousands in the early oughts, would have immediately raised concerns about the
appropriateness of using the 737’s autopilot for the MCAS function.

Raising an autopilot, based on an architecture laid down before man walked on the Moon, from
an information-taker to an information-processor.


They would have immediately understood that the lack of an electronic corpus callosum
between the left and right flight control computers made it uneconomical to use both angle of
attack indicators when making the decision to relentlessly point the aircraft’s nose at the
ground.

(For reference, there are five (seven, depending on how you count) flight control computers in
the A320, and every computer sees every sensor. There is no bicameral division.)

They would have pointed out that the software needed to realize that an angle of attack that
goes from the low teens to over seventy degrees, in an instant, is structurally and
aerodynamically impossible. And not to point the nose at the ground when it does. Because
the data, not the airplane, is wrong.
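
The sanity check such a developer would have demanded is not exotic. A minimal sketch, hypothetical in every detail:

# Hypothetical sketch of the sanity check an experienced developer would demand:
# a real airframe cannot swing from the low teens to seventy degrees of angle
# of attack in one instant, so a reading that does so is wrong data, not a
# wrong airplane.

MAX_PLAUSIBLE_RATE = 10.0   # degrees per second; illustrative value only

def aoa_is_credible(previous_aoa, current_aoa, dt):
    rate = abs(current_aoa - previous_aoa) / dt
    return rate <= MAX_PLAUSIBLE_RATE

# Last reading 12.5 degrees, new reading 74.5 degrees, 0.1 seconds later.
if not aoa_is_credible(12.5, 74.5, dt=0.1):
    print("Discard the reading and leave the trim alone: the data is lying.")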

And if such a developer had been listened to, the families and friends of nearly four hundred
dead would have been spared their bottomless grief.

The pathology of Boeing’s demise

Much has already been written about the effect of McDonnell Douglas’ takeover of Boeing.
John Newhouse’s Boeing vs. Airbus is the definitive text on the matter, with L.J. Hart-Smith’s
“Out-sourced profits - The cornerstone of successful subcontracting” being the devastating
academic adjunct.

 

Recently Marshall Auerback and Maureen Tkacik have covered the subject comprehensively,
leaving no doubt about our society’s predilection for rewarding elite incompetence
handsomely.

Alec MacGillis’ “The Case Against Boeing” (
https://www.newyorker.com/magazine/2019/11/18/the-case-against-boeing ) lays out the
human cost of Wall Street’s murderous rampage in a manner that should leave claw marks on
the chair of anyone reading it.

 

Charles Pezeshki’s “More Boeing Blues” ( https://empathy.guru/2016/05/22/more-boeing-blues-or-whats-the-long-game-of-moving-the-bosses-away-from-the-people/ ) is arresting in its
prescience. I am indebted also to Charles for his theories of the role of empathy in
organizations. It is a forensic tool of unbelievable power.

Boeing’s PR machine has repeatedly lied about the origin and nature of MCAS. It has tried to
imply that 737 MCAS is just a derivation of the MCAS system in the KC-46. It is not.


More nauseatingly, it promotes what I will call the “brown pilot theory.” Namely, that it is pilot
skill, not Wall Street malevolence, that is responsible for the dead. In service of that theory it
has enlisted aviation luminary (and a personal hero-no-more of mine) William Langewiesche.
For the best response to that, please see Elan Head’s “The limits of William Langewiesche’s
‘airmanship’” ( https://medium.com/@elanhead/the-limits-of-william-langewiesches-airmanship-52546f20ec9a ).

Those individuals “get it.” Missing is anything comparably accurate from much of the aviation
press, the aviation consultancies or the financial advisory firms, all of whom have presented to
the public a collective face of “this is interesting, and newsworthy, but soon the status quo will
be restored.”

A well, poisoned

Boeing’s oft-issued, eager and anticipatory restatements of 737 MAX recertification timelines,
together with its utter failure to actually recertify the aircraft, invite questions as to what is
actually going on. It is now over a year since the first 737 MAX crash and coming up on the
anniversary of the second.

Yet time stands still.

What was obvious, months ago, was that the software comprising MCAS was developed in a
state of corporate panic and hurry. More important, it was developed with no oversight and no
direction other than to produce it, get it out the door, and make the longitudinal problem go
away as quickly, cheaply, and silently as a software solution would allow.

What became clear to me, subsequently, was that all of the software in the EDFCS-730 was
similarly developed. And that, when the disinfectant of sunlight was shone on the entire
EDFCS-730 software, going back decades, the entertainment value would be, as my late wife’s
father would say, “zero.”

The FAA was caught both with its pants down and its hand in the cookie jar. The FAA’s
loathsome Ali Bahrami, nominally in charge of aviation safety, looked the other way as Boeing
fielded change after deadly change to the 737 with nary a twitter from the agency whose one
job was to protect the public. In the hope that a revolving door picks all for its bounty.

Collapse

Recent headlines speak in vague terms about Boeing’s inability to get the two autopilots
communicating on “boot up.” Forensically, what that means is that Boeing has made an
attempt to create a corpus callosum between the two, so that the one in charge can access the
sensors of the one not in charge.
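
In outline, that amounts to something like the following cross-channel comparison. This is a hypothetical sketch, not Boeing’s design; the names and the disagreement limit are mine:

# Hypothetical sketch of a cross-channel check: the engaged flight control
# computer compares its own angle-of-attack reading against the other
# side's before trusting either one.

AOA_DISAGREE_LIMIT = 5.5   # degrees; illustrative value only

def aoa_trustworthy(own_aoa, other_side_aoa):
    return abs(own_aoa - other_side_aoa) <= AOA_DISAGREE_LIMIT

print(aoa_trustworthy(4.0, 4.5))    # True: the two sides agree
print(aoa_trustworthy(74.5, 4.5))   # False: one side is lying; do not trim nose-down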

And it has failed in that attempt.

Which, if you understand where Boeing the company is now, is not at all surprising. Not
surprising, either, is Boeing’s recent revelation that re-certification of the 737 MAX is pushed
back to “mid-year” 2020. Applying a healthy discount to Boeing’s public relations
prognostications, that is accurately translated as “never.”

 

For it was never realistic to believe that a blindered, incompetent, empathy-desert like Boeing,
which had killed nearly four hundred already, would be able to learn from, much less fix, its
mistakes.

 

This was driven infuriatingly home with today’s quotes from new CEO David Calhoun. As the
Seattle Times reported, Calhoun’s position is:

“I don’t think culture contributed to that miss,” he said. Calhoun said he has spoken directly to
the engineers who designed MCAS and that “they thought they were doing exactly the right
thing, based on the experience they’ve had.”

This is as impossible as it is Orwellian. It shows that Boeing’s leadership is unwilling to
acknowledge (and probably unaware of) what the root issues are. MCAS was not just bad
engineering.

It was the inevitable result of the cutting of the sinews of empathy, sinews necessary for any
corporation to stand on its own two feet. Boeing is not capable of standing any more, and
Calhoun’s statements are the proof.


Gregory Travis is a software architect, aircraft owner and writer. His first article identifying the issues with the 737 Max appeared in the May 2019 issue of IEEE Spectrum magazine.