FLYING LESSONS for January 22, 2025

Topics this week include: >> Be pessimistic about flight >> Selective acceptance >> Flying through denial

Download this report as a PDF

FLYING LESSONS uses recent mishap reports to consider what might have contributed to accidents, so you can make better decisions if you face similar circumstances.  In most cases design characteristics of a specific airplane have little direct bearing on the possible causes of aircraft accidents—but knowing how your airplane’s systems respond can make the difference in your success as the scenario unfolds. So apply these FLYING LESSONS to the specific airplane you fly.  Verify all technical information before applying it to your aircraft or operation, with manufacturers’ data and recommendations taking precedence.  You are pilot in command and are ultimately responsible for the decisions you make.     

FLYING LESSONS is an independent product of MASTERY FLIGHT TRAINING, INC.

Pursue Mastery of Flight®

This week’s LESSONS

Train like you fly, fly like you train. That’s the mantra: master the basics, learn from realistic scenarios that test your systems knowledge and decision-making, and then fly every flight ready to exercise the same level of skill and judgment. Unfortunately, our training often is not fully reflected in our day-to-day flying. This article, which I co-wrote with Bonanza pilot and FLYING LESSONS reader Dr. Lorne Sheren, was published nearly 15 years ago in the November 2011 AOPA Pilot. I see the need for this LESSON made clear in accident reports to this day.

Early in training instructors typically—and without warning—pull the throttle back, inform students that they have “lost the engine,” and ask the students what they plan to do next. After that, those who go on to obtain pilot certificates constantly have that “what if” question in mind. Pilots are always questioning and preparing for the worst.

True emergencies, however, do not always announce themselves as plainly as the instructor’s hand on the throttle. Frequently they are subtle; they whisper before they shout. They lull us into complacency before they become obvious and life threatening. The real trick to handling an emergency is realizing you have one in the first place. Then you can apply all your experience and training into handling the emergency. Until the emergency is recognized you are only along for the ride.

Take this scenario: You are flying along in a turbocharged airplane. It’s a beautiful day, all is routine. You look down at your manifold pressure gauge and notice that the manifold pressure has dropped a couple of inches. Figuring the throttle has slipped, you advance the lever a bit and the manifold pressure is restored. A few minutes later you again notice that the manifold pressure has dropped. Once again you advance the throttle. Manifold pressure is restored. You tighten the friction lock on the quadrant. Then it happens again, but this time the throttle is fully forward. This is puzzling. The other gauges look fine; the airplane is performing normally. A glance at the GPS shows an airport five miles to the east. What to do? Is there cause for concern? Don’t think so.

If you are thinking that this decision doesn’t sound like one made by a pessimist, you are correct. What happened to that healthy sense of doubt ingrained in us since our initial training?

Human nature is what happened. As pilots we are spring-loaded to see situations as we want them to be rather than as they are. Psychologists call this effect “confirmation bias.” As Simon and Garfunkel sing in The Boxer, “A man hears what he wants to hear and disregards the rest.” It’s a common phenomenon, yet not understanding it can lull us into a dangerous state of complacency.

In order to obtain a pilot certificate we have to complete a rigorous course of instruction, pass periodic checkrides, and operate a complex piece of machinery in what is, after all, a dynamic environment. People who accomplish these tasks tend to be goal-oriented and are used to completing what they set out to do. Yet the same mindset that allows us to fly in the first place can also work against us. Frequently when we read an accident report we wonder how the pilot could have made so many obviously incorrect decisions that led to catastrophe. Yet in the cockpit we are inclined to make those same decisions because of both external and internal pressures to complete a mission. Continue on, everything is fine; I’ll take a look when I reach my destination. Sure, I have enough gas. How many times have you rationalized an in-flight abnormality?

The true talent in handling an emergency is recognizing that an emergency exists. That’s a decision point we rarely train for. When emergencies in training occur they are sudden and obvious. Your instructor pulls the throttle back, announcing an engine failure. He puts a suction cup over the attitude indicator. The situation is obvious and the pilot does what he is trained to do. But, outside of a simulator, it is difficult to develop a training scenario that teaches not only how to cope with an emergency but how to recognize that an emergency exists. The slow tumble of the dying gyro is hard to simulate, but that is the way it happens in real life. How long does it take to recognize that things are not the way we expect them to be?

The FAA and the [AOPA] Air Safety Foundation [now Air Safety Institute] investigated this very question in a landmark study, the results of which were published in 2002. DOT/FAA/AM-02/19 exposed 41 instrument-rated pilots to an unannounced failure of attitude and heading instrumentation during flight in single-engine general aviation aircraft: 25 in a Piper Archer PA–28 and 16 in a Beechcraft Bonanza A36.

The PA–28 flights consisted of three groups:

(1) Group A—a failure of the attitude indicator (AI) and directional gyro (DG),

(2) Group B—same as group A but received 30 minutes of partial-panel instruction in a personal-computer-based aviation training device (PCATD) prior to the flight, and

(3) Group C—same as group A but had a failure-annunciator light (vacuum warning) on the panel.

The A36 flights consisted of two groups:

(1) Group A—a failure of the AI only (a common scenario in this type of aircraft), and

(2) Group B—a failure of the AI and the horizontal situation indicator (HSI).

All of the PA–28 pilots maintained control of the aircraft. Sixty-eight percent flew successful partial-panel approaches, and likely would have survived if it had been an actual emergency. However, 25 percent of the Bonanza pilots could not maintain control, and the evaluator had to assume control of the aircraft.

The average time from the onset of failure until the pilot recognized the problem was an alarming 7.6 minutes. Pilots who completed 30 minutes of PCATD training prior to the flight reduced their recognition time, but it was still an average of 4.9 minutes, even in airplanes with a system failure annunciator light.

A close relative to confirmation bias is what we might call “selective acceptance” of indications. Consider the chain of events, and the decision-making process, in this NTSB account of a fatal American Legend Cub flight:

The airplane ditched in Lake Michigan following a total loss of engine power because of fuel exhaustion. The private pilot on board drowned following the ditching. The pilot had just purchased the airplane and was relocating it to an airport near his home. The pilot did not have a tailwheel signoff or a current flight review. Arrangements were made through the aircraft manufacturer for an ATP/CFI who was a test and demonstration pilot for the company to fly the airplane to Ohio with the new owner.

Prior to takeoff there was an aircraft accident on the airport involving another airplane, and the airport was temporarily closed. During this delay, the ATP/CFI stepped up on the right wing step and looked over the wing trying to see the accident. He stated that when he got down off the wing the pilot got up on the right wing step to look. While the pilot was up on the wing step they were informed that the first four airplanes to get to the engine start line would be able to depart. He stated that with the assistance of other pilots in the area, they were second in line at the start line and were able to take off.

The ATP/CFI stated that when up on the wing step the only handhold available was the fuel cap, which both he and the pilot held onto. The pilot flew the airplane to the shoreline and out over Lake Michigan. The ATP/CFI commented to the pilot about how far out over the water they were and the pilot “reluctantly turned back closer to the shoreline.”

The ATP/CFI stated that after flying for approximately two hours, he noticed the left fuel tank quantity indicator was indicating that the fuel tank was empty and the right fuel tank quantity indicator was showing about one inch of fuel. The private pilot stated that the “fuel gauges must be malfunctioning as the EIS [electronic information system] indicated that the rate of fuel burn was 5.8 gph with 8.3 gallons of fuel remaining.” He stated the private pilot also told him the EIS was indicating that there was one hour and 20 minutes of flying time remaining. The pilot then diverted to a closer airport. The ATP/CFI suggested landing at the Gary-Chicago International Airport (GYY), Gary, Indiana. Approximately 20 minutes later, while en route to GYY, he noticed that both fuel gauges were indicating empty. The private pilot again reported the EIS was showing a fuel flow of 5.8 gph with 6.1 gallons of fuel left. Approximately four minutes later the engine lost power. The pilot landed the airplane on the water. Both pilots exited the airplane unharmed; however, the private pilot did not know how to swim and drowned.

The right fuel cap was missing from the airplane when it was recovered from the bottom of the lake. Blue fuel stains on the right wing and right horizontal stabilizer suggested fuel loss through the filler port. No mechanical failures/malfunctions were noted during examination of the airplane and engine. 

The NTSB probable cause(s): The failure of both pilots to assure that the fuel cap was securely in place prior to takeoff, which resulted in fuel siphoning and, ultimately, fuel exhaustion. An additional cause was the decision to fly over the lake outside of gliding distance to shore along with the delay in diverting.

Although both pilots noted the actual fuel level in the visual fuel level tubes in the airplane’s cabin, the airplane’s new owner was mesmerized by the electronic fuel monitor on the instrument panel, and neither saw the true significance of the discrepancy nor acted decisively to get over land as soon as possible and on the ground at the very earliest opportunity. The fuel monitor measures only fuel passing through the lines to the engine, and cannot detect fuel lost through fuel caps, vents, or other leaks. The private pilot, especially, saw what he wanted to see.
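The arithmetic behind the EIS readouts shows why they were so seductive. A fuel totalizer divides the fuel it believes remains by the metered burn rate; the numbers in the NTSB account are internally consistent, which is exactly what made them easy to trust. The sketch below uses only the figures from the report; the overboard siphon rate at the end is a purely hypothetical illustration, not a measured value.

```python
# A fuel totalizer computes endurance from fuel METERED to the engine.
# Fuel siphoning overboard through a missing cap never passes the
# flow transducer, so the totalizer's math stays "right" while its
# assumed fuel on board drifts further from reality.

def endurance_min(fuel_gal: float, burn_gph: float) -> float:
    """Endurance in minutes, as a totalizer would compute it."""
    return fuel_gal / burn_gph * 60.0

# First check: EIS showed 8.3 gallons remaining at 5.8 gph.
print(round(endurance_min(8.3, 5.8)))  # ~86 min, close to the "1 hr 20 min" reported

# Twenty minutes later: EIS showed 6.1 gallons remaining.
print(round(endurance_min(6.1, 5.8)))  # ~63 min indicated, yet power was lost 4 min later

# The real rate of fuel leaving the tanks was burn PLUS siphoning.
siphon_gph = 4.0                      # hypothetical overboard loss rate, for illustration only
actual_depletion_gph = 5.8 + siphon_gph
```

The totalizer's division was correct both times; its premise (fuel actually on board) was not. Only the sight gauges, reading the tanks directly, reflected the siphoning loss.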

In order to fly safely we need to overcome the biases that lull us into a sense of complacency. “No, that can’t be happening to me” is a common voice we hear inside our heads as a potential disaster unfolds. The pilot flying the turbocharged airplane knows that his loss of manifold pressure could be something as simple as a balky wastegate, or as dangerous as an unrestrained exhaust leak impinging on a fuel line. He knows that the correct course of action is to land at the nearest suitable location. But he soldiers on, recognizing the emergency from an intellectual standpoint but not from an operational standpoint. He knows what to do; he just doesn’t know when to do it.

Another way to look at it is that pilots in flight instruction are frequently pessimists, looking for failures and alternatives because they expect them in training scenarios. Once out of the instructional environment, however, pilots tend to rationalize even obvious indications and discrepancies to support their generally optimistic goal orientation. In practice, pilots are optimists. And that can sometimes get us into trouble.

The lesson to be learned is to look at any given situation through objective eyes. The chief pilot at our home FBO, a veteran airline captain, always tells his students, “Think of how it will look at the hearing.” Good advice. It takes concentration, training, and willpower to recognize that this situation is not what you want it to be and to act upon that knowledge. But remember, it is your duty, as the pilot in command of your aircraft, to bring your airplane, passengers, and crew to a safe landing somewhere. This requires concentration and objective thinking. Do you owe your passengers any less?

Questions? Comments? Supportable opinions? Let us know at [email protected]

Please help cover the ongoing costs of providing FLYING LESSONS through this secure PayPal donations link. Or send a check made out to Mastery Flight Training, Inc. at 247 Tiffany Street, Rose Hill, Kansas USA 67133. Thank you, generous supporters.

Thank you to our regular monthly financial contributors:

Thank you to these 2026 donors:

Thanks also to these donors in 2025:

Thomas P. Turner, M.S. Aviation Safety 

Flight Instructor Hall of Fame Inductee

2021 Jack Eggspuehler Service Award winner

2010 National FAA Safety Team Representative of the Year 

2008 FAA Central Region CFI of the Year

FLYING LESSONS is ©2026 Mastery Flight Training, Inc.  For more information see www.thomaspturner.com. For reprint permission or other questions contact [email protected].  
