Air travel is generally safe, but not 100% safe. Sometimes equipment fails, sometimes the weather interferes, and sometimes humans err.
In his 2008 book, Outliers, Malcolm Gladwell examined a particular type of human error in a chapter called “The Ethnic Theory of Plane Crashes.”
Although he would have been more accurate to write “cultural” rather than “ethnic,” his observations are quite pertinent, given the recent crash of the Korean Asiana Airlines flight in San Francisco.
Gladwell examined the dreadful safety record of Korean Air in the 1990s, a ten-year period when its planes crashed seventeen times more often than those of United Airlines.
Why would that be?
Not for lack of skill. Korean pilots are held to the same training standards as pilots everywhere. And they meet them.
But Korean culture also comes with something else, something known as a high ‘Power Distance Index,’ or PDI.
This basically means that there is a feeling of great distance between subordinates and their superiors. As in the military, orders are given and obeyed, not questioned. And if they are questioned, it is done in a very indirect way.
This deference is effective at creating an organized and polite society, but it is a recipe for disaster in the cockpit.
Among flight crews originating in cultures with a high PDI, such as Korea and Brazil, Gladwell found that crews were slow or completely unable to communicate critical information to the pilot – if it contradicted the pilot’s authority.
Let’s examine the fate of a Korean Air flight that crashed in Guam in 1997.
On the approach to the island, it was raining, and the pilot complained of being tired. Normally a glide slope would guide planes down to the runway, but that night it was out of service, so the crew would have to perform a “visual” landing.
They would follow a radio beacon to Guam and then, once they had spotted the runway, land manually. Visual landings are not uncommon, and the Korean Air pilot knew this in advance. He even mentioned it in the pre-flight briefing.
What he did not know is that at Guam, the radio beacon transmitter is not located at the runway, but on Nimitz Hill.
As they approach Guam, the cockpit voice recorder is mostly silent, until the First Officer says, “Don’t you think it rains more? In this area, here?”
The First Officer is not taking a survey on Guam weather patterns.
He means to communicate, “Captain, the weather is terrible and we are committed to visual landing. You think we will break out of the clouds in time, but what if we don’t?”
But he can’t say that. It would be rude, perhaps harmful to his career.
So he hints.
But does the Captain get the hint?
Moments later, the plane breaks through the clouds and lights appear on the horizon. The Flight Engineer asks, “Is it Guam?” Then says, “It’s Guam, Guam.”
The Captain says, “Good!”
But it isn’t good.
They are still twenty miles away and there is worse weather ahead. The Flight Engineer knows this, because it is his job to track the weather, so he says, “Captain, the weather radar has helped us a lot.”
The Flight Engineer is perfectly aware that the Captain already knows how useful weather radar is. That’s not what he is trying to say.
What he is trying to say is, “There is trouble ahead. You can’t just rely on your eyes to land tonight.”
But he can’t say that. So he hints, hoping the Captain will seek more information.
The Captain’s response: “Yes, they are very useful.”
Another hint wasted.
The flight crew then remains silent until the Captain puts the landing gear down. The First Officer asks, “Not in sight?”
One second later, the automatic flight warning system engages and says, “Five Hundred Feet.”
This confuses the Captain and crew, because at five hundred feet they ought to be able to see the runway, but they can’t. They also cannot see that they are headed right towards Nimitz Hill.
The First Officer says, “Let’s make a missed approach.”
Finally, he has escalated from a hint to a suggestion. It was later determined that if he had taken control of the plane at that point, they would have averted the crash.
That’s what First Officers are trained to do if they feel the Captain is in error. But doing it in a classroom and doing it in reality are quite different things.
The Flight Engineer then chimes in, “Go round.”
The Captain replies, “Go around.”
Then the flight warning system completes the night’s dialogue.
Deference to one’s superiors and indirect speech are highly effective for cultivating a polite and efficient society, but horrendous for correcting errors.
On Air Florida Flight 90, out of Washington, D.C., the First Officer tried to warn the Captain three times about ice on the wings.
But they were all hints. He said:
“Look how that ice is just hanging on back there, see that?”
“See all those icicles on the back there?”
“Boy this is a losing battle here on trying to de-ice those things…”
Finally as they are cleared for takeoff, the First Officer makes a suggestion, “Let’s check those wing tops again…”
The Captain’s response: “I think we get to go here in a minute.”
He didn’t get the hints.
Just before the plane plunges into the Potomac River, the First Officer says one more thing, “Larry, we’re going down.”
This time the Captain agrees, “I know it.”
Since the 1990s, Korean Air has retrained its flight crews to communicate more directly. Whether cultural distance played a part in the recent San Francisco crash, we don’t know.
We may never know. More likely a press release will say “pilot error,” and that will be the end of it.
Politeness can be deadly.
Western society in particular is moving toward a culture in which politeness is valued more highly than frankness. Civility more so than honesty.
I am no advocate for ill manners or crude behavior, but a culture that cannot speak openly and honestly denies itself the means to correct errors before they lead to disaster.
And I’m afraid to say that the radio beacon guiding a large portion of society is not leading to a smooth landing.
This is just a hint of course.