In my last post I wrote about a recent presentation on human error that I attended. The speaker was Dan McCune, VP of Safety at Embry-Riddle University and Director of Training for HFACS, Inc. Not only is Dan a knowledgeable and entertaining speaker, but he is passionate about learning as a key element of safety. His background is in military aviation, where he was exposed to extreme risk, and many of the examples he used to illustrate human error and its sources came from his personal experience or that of others. A key takeaway from his talk was the importance of an organizational culture that embraces learning from mistakes, hopefully before they lead to serious consequences, and then shares what has been learned so that everyone benefits.
In his book Managing the Risks of Organizational Accidents, James Reason writes about how to “engineer a safety culture.” (First, I digress: not only do I highly recommend this book, but when you read it please pay particular attention to Chapter 9.) In that chapter he discusses organizational culture, observing that “A safety culture is not something that springs up ready-made from the organizational equivalent of a near-death experience, rather it emerges gradually from the persistent and successful application of practical and down-to-earth measures. Acquiring a Safety Culture is a process of collective learning.” Reason then goes on to discuss other business books about culture that support the notion that a strong organizational culture begets a strong safety culture. Tom Peters, in his book In Search of Excellence, writes that a strong culture is one in which all levels of the organization share the same fundamental goals and values: “In these companies with strong cultures, people that are way down the line know what they are supposed to do in most situations. This is because the handful of values that guide behavior are crystal clear.”
Reason’s focus on organizational culture, and on the underlying factors that allow a strong safety culture, stresses the importance not only of removing barriers to learning, but of examining how we react to situations. In safety, how do we react to a near-miss or, worse, an incident with injury? All too often we focus on what happened or on the individual behaviors that preceded the incident. By instinct we want to find the cause, place the blame, retrain, fix the person, and so on. However, this type of reaction throws cold water on learning and, as Reason argues, results in short-term fixes such as “write another procedure” and the “blame and train” response cycle. Though many safety professionals know this all too well, understanding what they can do to change management systems or reframe long-held beliefs is less clear and can be very difficult.
Several authors on human error, including Reason, Dekker, and Marx, discuss the importance of institutional learning through creating a positive and just culture. (I’ll save this for another post, but you all should read the book by Edward L. Deci, “Why We Do What We Do.” It may change your thinking about behavior-based safety.) Reason’s book details several subcultures that work together to create one positive and productive culture: a Reporting Culture, a Just Culture, a Flexible Culture, and a Learning Culture. Here is where I will take you back to the presentation by Dan McCune.
In aviation safety it is critically important that pilots feel free to share and discuss everything related to flying and their performance. Though being a pilot requires knowledge, skill, physical ability, and mental capability, each and every pilot will tell you that they do not truly have command of the plane and situational control until they have a great deal of experience, which translates to thousands of flight hours. What happens while they are gaining this experience? They are making mistakes; many, many mistakes. It is the job of the flight instructor to help each pilot recognize and understand those mistakes. From this they learn about their perceptions, their misapplication of knowledge, what could be done differently, and so on. But how does learning occur after they graduate and are on their own? Pilots and the FAA have a system of near-miss reporting. It is a no-fault system that recognizes that if there are barriers to sharing and reporting, knowledge will be stifled. This reporting philosophy extends beyond the pilots to the entire crew, controllers, ground service personnel, and anyone else who could impact safe flight operations.
McCune shared that at Embry-Riddle, whether in the school of aviation or any other part of the University, they have embraced an important and unique philosophy: “No disciplinary action will be taken for reporting a safety hazard, concern or near-miss situation.” They realize that for the University to have a true safety culture it must be open to learning and sharing experiences. The safety culture must include just and open reporting, without blame or consequences, so the organization can learn what it has missed or didn’t know. (Of course, situations of willful misconduct are dealt with differently.) From the new employee and student orientation, where the President of the University makes it perfectly clear, to the day-to-day actions that support it, everyone learns that “Event reporting is very important. Everything, without risk to you or anyone else, should be reported.” To support this, Embry-Riddle has a simple and respectful system for reporting that provides feedback and thanks to the reporting person. They also have a database for tracking and sharing information about near-misses, and they always investigate each situation using the HFACS system. With over 700 reports this past year from their two campuses and no serious incidents, it must be working.
In closing, here is a fundamental belief that I have come to hold about near-miss reporting. If you don’t have it in your organization, and it’s not resulting in many hundreds of reports each year, you do not have a culture that values learning from mistakes. Further, you don’t have an organization that really understands the nature of human error or designs work to expect it.
- Think about the barriers that exist and how you may get over or around them.
- Think about how your organization reacts to incidents or mistakes.
- Think about how your organization conducts incident investigations and what may be called root cause analysis.
If you see a culture that focuses on individual behavior, is quick to blame, stops short of looking for organizational solutions, adopts quick fixes, and doesn’t encourage open discussion about problems and solutions… well, you get it: you probably don’t have an organizational culture that can accept what it takes to have a true safety culture.
I apologize if this post has been a bit academic, or has had the feel that I am preaching. Love it or leave it, I am passionate about this. Developing an organizational culture that values information and knowledge sharing leads to success in safety. Having a robust system for near-miss reporting is an important element, and one I hope you decide to implement if you haven’t already.
Thanks for reading and feel free to comment or share this with others.