Wednesday, December 24, 2008

Failure of Imagination

Pilots are trained to trust their instruments, follow accepted procedures, and maintain situational awareness, but these elements don't guarantee safe results. One approach to risk analysis and management is the "Swiss cheese" model, originally proposed by British psychologist James Reason in the 1990s.

The idea behind the "Swiss cheese" model is that the safety barriers we use are not perfect. Holes exist in the equipment, procedures, and techniques that we use. Rather than thinking of these holes as static, it's useful to imagine the holes in each safety barrier as constantly shifting. Should a sequence of holes in our safety barriers align, our risk has just increased and, here's the important part, we may not even be aware of it. When we pilots rely on multiple systems and procedures to mitigate risk, we are assuming that multiple barriers will reduce the likelihood that a hazard will permeate all of the barriers and cause an accident.
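To see why layered barriers help, here's a minimal back-of-the-envelope sketch in Python. The per-barrier failure probabilities are made-up illustrative numbers, and the independence assumption is exactly what the Swiss cheese model warns about: when holes line up (failures become correlated), the real risk is far higher than this simple product suggests.

```python
# Sketch: probability that a hazard slips through every barrier.
# The "hole" probabilities below are hypothetical, chosen only
# to illustrate the arithmetic of layered defenses.
barriers = {
    "preflight briefing": 0.10,    # chance this barrier fails
    "go/no-go decision": 0.05,
    "preflight inspection": 0.08,
    "pilot proficiency": 0.10,
}

# If the barriers fail independently, the chance that all of
# them fail at once is the product of the individual chances.
p_accident = 1.0
for name, p_fail in barriers.items():
    p_accident *= p_fail

print(f"Chance every barrier fails: {p_accident:.6f}")  # 0.000040
```

Four leaky barriers, each failing fairly often on its own, combine into a very small joint failure probability - but only if the failures are unrelated. A rushed pilot who skips the briefing, rushes the walkaround, and is out of practice has punched correlated holes in several layers at once.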

With all this in mind, here is the third and final part of the video series on how disaster was averted by an alert Air New Zealand flight crew.



For GA pilots in single-pilot operations, our first safety defense is a preflight briefing that includes weather information and NOTAMs (Notices to Airmen). The holes in this first barrier are many and may include a lack of surface weather observations at our intended destination or the simple fact that forecasts are imperfect. Another big hole is reading and digesting NOTAMs, which can be a complicated and error-prone process. Deciphering the abbreviations used in NOTAMs is a skill that can require time and practice. Yet if you skip the preflight briefing altogether or just "get the weather," you have effectively taken this barrier out of the picture and significantly increased the risks associated with your flight.

The second safety barrier is the go/no-go decision. Remember that light GA aircraft are much more vulnerable to hazardous weather due to their inherently low performance, slow speed, and lack of weather detection capability. If the preflight briefing suggests seriously nasty weather, stay on the ground. If you gloss over the go/no-go decision and commit yourself to flight, your life could become very complicated, very quickly. If you are loath to stay on the ground for fear that it will make you less of an aviator, get some serious couch time with a mental health professional.

The third safety barrier for GA pilots is a careful preflight inspection of the aircraft. The older the aircraft, the more holes you're likely to find in this safety barrier. I wouldn't be the first person to point out that there is a strong tension between keeping aviation affordable and maintaining safety. Maintenance is expensive and it's no secret that some owners skimp on maintenance. If you have doubts about the aircraft or a piece of installed equipment, especially if that equipment is critical to your flight, the choice is simple: find a mechanic, find another plane, or stay on the ground.

The last safety barrier is the GA pilot himself or herself and all the intangible elements that each of us embodies. Are you current and proficient? When did you last fly the aircraft type you are about to fly? Are you familiar with the route and destination? Have you been there before? Did you get enough sleep last night? If the flight conditions will be challenging, are you up to the task? Are you prepared for moderate turbulence, heavy rain, a possible icing encounter, or a strong crosswind? Is another pilot going to be flying with you and can they take on some of the load? How about your passengers? If they get scared or airsick, can you handle the distraction? Do you have a plan B and a plan C?

Lack of familiarity with an aircraft's electronic navigation system can contribute to tragic results, as evidenced by the mid-air collision of a Boeing 737 and an Embraer Legacy jet over the Amazon in 2006. If you haven't already done so, I recommend reading William Langewiesche's Vanity Fair article, which vividly describes the breach of several safety barriers - appropriate altitude for direction of flight, ATC conflict resolution, the Traffic Collision Avoidance System, and crew situational awareness.

An increasingly popular way to handle the risk of complex avionics systems is standardized, recurrent training. This approach emphasizes eliminating errors, procedural violations and variations, and dangerous behavior on the part of pilots. Professor Reason refers to this as the person approach, and its main goal is to eliminate variations in human behavior. At its best, standardized training creates another safety barrier. At its worst, this sort of training can emphasize heavy-handed, orthodox principles - "Never use the Direct-to button!" or "Always set the heading bug with your right hand!" are just two examples. I respect and appreciate standardized training, but standardized procedures can't cover every eventuality. That's where imagination comes in.

Imagination allows pilots to conceive of where they are in space, what they need to do next, and how certain choices or changes might affect the flight. At its worst, imagination can lead pilots to make incorrect assumptions in an effort to explain something that doesn't seem quite right, as seen in the video above.

Flying in moderate or greater turbulence can be distracting, especially during an approach to landing. Assuming a pilot understands the effects and dangers of windshear, I encourage them to imagine how they'd fly the plane in smooth air and then do their best to make what is happening in real-time match up with their idealized flight. This use of imagination gives the pilot a specific outcome to strive for, rather than being at the mercy of the elements. Imagination can backfire, too. A few years back, I flew with a very intelligent pilot whose quick thinking and imagination sometimes led him to jump to incorrect conclusions and talk himself into all sorts of situations.

The G1000 provides an excellent example of a system intended to improve pilot safety through increased situational awareness and navigational accuracy. Used properly, the G1000 can provide exactly what it advertises. Yet the G1000 is not particularly easy to use. It requires thorough initial training to gain proficiency and more than just occasional use to maintain that proficiency. In my experience, GA pilots who fly infrequently or on an irregular schedule are more apt to become confused or task saturated when using the G1000.

Hard-to-use systems with numerous options and several ways to accomplish the same function may seem flexible, but just as often the multitude of options simply increases a pilot's workload, effectively increasing the number of holes in what is supposed to be a safety barrier. At critical phases of flight, poorly designed systems can lead to task saturation as the lone pilot tries to figure out "Why is it doing that?" or "How do I get it to do what I want?" Standardized training alone will not solve this problem.

This brings us back to imagination. A good pilot can imagine all sorts of possible problems that could scrub a flight or make a flight more difficult to complete. Imagination helps us consider a situation where we're behind the plane or where the GPS doesn't function properly. Imagination helps us realize that we might not be proficient, that we need some recurrent training. I'm all for optimism, but a pilot who always assumes everything will be just fine suffers from the worst possible lack of imagination - the kind of imagination that can keep all those holes in all those barriers from aligning.

2 comments:

Dave Starr said...

John, this came to me on Christmas Day and a nice present it was. Hope your holiday is as good as mine was and best wishes for 2009. I'm glad I watched all 3 segments of this incident report as even though I worked with the installation of ILS equipment for many years I learned a couple things about the failure modes.

All pilots should think through a couple facts.

The monitor system is not always taken as seriously as it should be; it's not just an add-on safety feature, it is the only assurance that the system is not transmitting false information. I'd give serious thought to simply refusing an unmonitored ILS ... there's no good way to assess the risk.

Secondly, engineering types may note the age of the equipment shown in the video ... these are almost all analog electronics, often older than the pilots who fly the system.

Many of us would drive a 50 year-old car for fun, but we certainly wouldn't expect traction control, air bags, anti-lock brakes and such technology that makes today's driving safer.

The ILS is a fine invention and it will likely be with us a long time still, but as one of the investigators in the video mentioned, tying all the aircraft systems to a single 50 year-old analog box with a known false-signal failure state that will not be displayed to the pilot might be a recipe for disaster.

Cross check, guys and gals, cross check and may each of us always have at least one piece of solid cheese beneath us for all of 2009.

flyaway said...

Thanks for this posting. Both the Vanity Fair article and the New Zealand story/video cause me to think ahead, plan and review more. Both are sobering.