I tell pilot candidates I train that if they score 100% on their knowledge test, I will buy them a margarita, assuming they are of legal drinking age, or a Peet's cappuccino if they aren't. You see, the FAA's current knowledge testing procedures have all the characteristics of a game. I'm not being flippant, I'm being honest. The tests are expensive to take ($150 on average), a bank of representative questions for each test is published in advance, you're only offered a quick review of any questions you missed, you won't be told the correct answers to those questions, and the tests have historically contained trick questions and misleading graphics. And there's a whole test preparation industry that produces books, weekend study courses, and computer-based training, all designed to help applicants get a good score. The only thing missing is popcorn and liquid refreshment.
All was generally well with the FAA knowledge tests until, without any notice, the FAA decided to change the rules of the game by changing the question banks for the Airline Transport Pilot, Flight Engineer, and Fundamentals of Instructing knowledge tests without modifying their own publications. The predictable result was a sharp increase in the failure rate, and that has led many to observe with shock and horror that what had heretofore been occurring was not learning at all! To better understand the situation, you'll need to set aside some popularly held misconceptions about the FAA's knowledge test procedures.
Learning is ...
The instructional theory and concepts that flight and ground instructors are taught can be found in the FAA's Aviation Instructor's Handbook. I'll be blunt and say that the handbook contains a hodgepodge of educational theory, some of it germane and useful, some of it ... not so much. It has been edited and changed over the years and in the process has been somewhat improved. One of the core concepts presented is the four levels of learning: Rote, Understanding, Application, and Correlation. Rote knowledge is the memorization of facts while correlative knowledge is the ability to combine new experiences and information with what you've learned in the past. Most FAA practical test standards contain this boilerplate (emphasis added):
Examiners shall test to the greatest extent practicable the applicant’s correlative abilities rather than mere rote enumeration of facts throughout the practical test.
While a practical test is a dynamic and interactive affair, the knowledge test is anything but. An examiner can ask questions selectively, and in a sequence designed to uncover the level of knowledge the applicant possesses. The knowledge test is static (okay, the groups of questions are randomly selected at the time your test is generated) and a much more difficult affair, since the test must provide valid results across a diverse population. It's good to have goals with regard to testing and examining pilots, but the truth seems to be that both the knowledge test and the oral portion of the practical tend to rely heavily on rote knowledge for the simple reason that it is easiest to test. I know one flight instructor candidate who was pink-slipped because he could not recite, verbatim, the definition of an instructional level of knowledge.
A knowledge test should measure what the candidate has learned in an objective and effective manner. The easiest type of assessment to administer and grade is a selection style (aka multiple choice) test. Selection questions must be designed so that there is only one correct answer or one best answer. The Aviation Instructor's Handbook provides guidance to instructors on how to avoid using puzzle, trick, or bewilderment questions. Apparently in an effort to discriminate between levels of learning, the FAA often resorts to the very types of questions they caution against.
Some of the more egregious examples that come to mind include the graphics used in the Instrument Rating knowledge test, where an RMI needle was depicted as slightly bent or the arrowhead of the needle was drawn so subtly that one might mistake it for the tail. Then there were the flight planning questions where the en route time or fuel burn or magnetic heading that you calculated didn't exactly match any of the answers, but was somewhat close to one of the supplied answers. What was actually being tested here? No one seems to know, but a whole industry grew up around these tests.
In the past, when you missed a question on a knowledge test, the only feedback your test results provided was a set of knowledge codes indicating the general subject areas of the incorrectly answered questions. The FAA changed this system somewhat and now publishes the Learning Statement Reference Guide for Airman Knowledge Testing. How they arrived at the term "learning statement" is beyond me, but I'm sure it involved hours, if not weeks, of meetings.
Preparation, Memorization, Commercialization
When suggesting to a student how to best prepare for the knowledge test, I ask them a bit about their individual learning style. Some people like doing computer-based training, others prefer a printed study guide, while others prefer the group learning environment provided by a weekend seminar. One size does not fit all. The advantage of a computer-based study system is that you can take sample tests in a format that closely resembles what you'll see at the testing center and that helps to reduce test anxiety for most people. I know of no students who receive their knowledge test preparation from their individual instructor one-on-one because the cost would be prohibitive.
Ideally, test preparation should ensure that the candidate is adequately prepared to pass the test and should uncover areas they need to work on. What often happened in the past was that some candidates tried to memorize the answers, and the tests ended up measuring rote knowledge instead of correlative knowledge. To combat this, the FAA made some changes. First, the actual test questions are no longer published. Instead, a bank of representative questions is available for each knowledge test. Many people mistakenly believe that all the actual test questions are published in advance, but this has not been the case for several years.
The National Association of Flight Instructors has raised some important questions about the FAA's unannounced test changes, pointing out that some questions may have been added without first being validated. It also seems that the FAA reference materials no longer adequately represent the knowledge tests that applicants are taking and some subject matter areas may now carry more weight than they did in the past. As an aside, NAFI seems to be doing a commendable job of representing the interests of flight instructors and their students.
Cost of Learning
For quite some time now, the FAA knowledge tests have been computer-based and administered through testing centers affiliated with LaserGrade (now PSI) and CATS. The testing centers must adhere to proctoring rules, including having a closed-circuit surveillance system installed in the testing area. While there is a certain overhead to providing these facilities, it seems hard to justify the 100% increase in testing fees that has occurred over the last few years. No one seems to have talked about this increase, probably because test-takers are a captive audience and have no recourse but to pay the piper. It baffles me why some keep saying, over and over, that cost is not a significant impediment to learning to fly. With avgas pushing $6/gallon in many areas and knowledge tests costing $150, cost most certainly is a factor.
The FAA has tried to create a valid and reliable knowledge test for each rating or certificate, and though the whole thing may seem to be a bit of a mess, historically the FAA's testing set-up appears to be adequate. For one thing, applicants must expend some amount of effort studying, and even if they are just memorizing answers, some learning is bound to occur along the way. For another, the knowledge test is just one part of the learning equation. The last line of defense in ensuring that adequate learning has occurred is the oral portion of the practical test, where the applicant must stand and deliver. While problems are occasionally reported with the manner in which some examiners administer practical tests, by and large the system, as a whole, works. So while aviation testing may appear to be a mess, it has been a mess that everyone understood and accepted. At least we thought we understood.