Now before you get your torches and pitchforks, hear me out.

In many industries, but aviation in particular, a stellar safety record is a competitive advantage.  In High Reliability Organizations (HROs), such as airlines, safety is paramount to the success of the organization.  After all, a single accident, however rare an occurrence, could yield a catastrophic result, whether measured in lives lost or in liability.  Indeed, the primary, secondary, and tertiary victims are often not the ones who made the unsafe decision, yet they are left to bear the brunt of the result.

Airlines, manufacturers, and their labor representation groups spend millions of dollars each year researching and developing safety protocols and procedures.  This work is frequently augmented by public sources such as military research, information released from regulatory investigations, and publicly funded research projects.  With effective safety protocols so high on the list of things that affect an organization’s health, it would follow that they would be closely guarded industrial secrets.

Except they are not.  

As the aviation industry becomes more technologically complex, the risks and consequences of safety system failures increase.  Over the past century, safety science has evolved from linear, blame-the-operator thinking into complex sociotechnical modeling that seeks to identify both active and latent failures within an event.  To reflect this growth, and to better meet its goal of ensuring the most efficient and safe airspace in the world, the Federal Aviation Administration (FAA) in 2015 shifted its compliance philosophy from a strictly enforcement-based role to one that encourages collaboration on safety among its sponsors.  In pursuit of that goal, the FAA has made several safety programs available to the aviation community, some mandated (such as internal Safety Management Systems), others voluntary, to improve safety culture, the sharing of best practices, and data gathering.

Industry participants, whether they are large commercial operators, general aviation providers such as private corporate flight departments or tour operators, or anywhere in between, have been quick to recognize the advantages of internal safety programs.  Aviation safety thinking was born out of the need for the civilian aviation industry to meet public demands for service in the years following WWI and WWII.  Advances in technology, human factors understanding, and procedures were a direct result of accident investigation and spurred increases in infrastructure and research.  Early safety processes focused on the outcome of an accident, but this reactive mode of thinking often arrived at recommendations addressing the specific active failure (what specifically went wrong) rather than the latent or underlying factors (what contributed to or aggravated the failure).  Internal safety programs allowed organizations to capture safety-related information, standardize risk management processes, and proactively share resources regarding hazard identification and mitigation.

To assist the aviation community in developing safety programs, the FAA has issued Advisory Circulars (ACs) that offer guidance on the development, implementation, and administration of such programs.  Trade organizations and unions have also put forth standards for safety programs that, while voluntary, often become industry standards, sharing best practices and procedures as well as providing a clearinghouse for organizations that face similar hazards.

One of the more commonly utilized safety programs available is the FAA’s Aviation Safety Reporting System (ASRS).  The crash of TWA Flight 514 on December 1, 1974 drew widespread attention to the need for gathering safety information in aggregate and making it available to all parties.  TWA Flight 514 struck a small mountain 25 miles from Dulles International Airport, killing all on board.  The subsequent investigation found, among other causes, unclear charting and terminology in the issuing of the approach clearance.  United Airlines had a similar mishap six weeks prior, narrowly avoiding the same mountain; its internal safety program flagged the mistake and sent corrective action to its crews, but TWA did not have access to that information.

Even prior to the official National Transportation Safety Board (NTSB) probable cause determination, the FAA recognized the need for a national safety information program intended to gather, analyze, and distribute information to users of the National Airspace System (NAS), as well as to identify and mitigate hazards in the NAS.  Today, the ASRS allows any user of the NAS to submit safety-related information voluntarily and anonymously.  The program is administered by NASA, a non-regulatory agency; reports are de-identified and analyzed, and NASA’s role as facilitator ensures the non-punitive nature of submitting safety information in good faith, making the information more complete because submitters have no incentive to hide details.

Similar in practice to the ASRS is the Aviation Safety Action Program (ASAP), made available to employees of certificate holders (airlines, manufacturers, air traffic control organizations, etc.) that voluntarily participate in the program.  More limited in scope than the ASRS, ASAP gives employees of eligible entities a way to bring safety-related information directly to an Event Review Committee composed of FAA, company, and union representatives in order to identify precursors to accidents, unclear procedures, and ineffective policies.  Reporters are given protections similar to those of the ASRS, and the information is aggregated to identify potential hazards in the NAS.

Beginning in 2008, the FAA and its industry sponsors developed a twice-yearly convention known as InfoShare, where airlines, labor groups, and other industry participants openly share safety-related information and discuss high-risk events.  The primary purpose of these events is for participants to share what they have learned about the hazards they face, as well as best practices for mitigating those hazards.

Information sharing on this scale had to happen organically.  No amount of regulation could have forced it into existence, but the collaboration facilitated by the regulator (once they realized blunt force wasn’t working…go figure) allowed both the industry and the FAA to become more agile and proactive, rather than punitive and reactive.  Perhaps other industries and agencies could learn a lesson (looking at you, Health Care, with your notoriously high accident rates and draconian investigative methods).

Dennis Murphy is a professional airline pilot with a background in aviation safety, accident investigation, and causality. When he’s not flying 737s, he enjoys the company of his wife, their dogs, cats, and bees.