When Work Procedures Deviate – Human Error Can Be Disastrous

About 3 years ago I began an in-depth study of Human Error Management, Human Performance, Cognitive Science and Prevention Through Design.  In the process it has become clear that we have not effectively learned how to create work processes and procedures that are error tolerant.  People will make mistakes; in fact, we are wired to work by trial and error as we go through the day.  Until we understand this fallibility and learn to create an environment and systems that allow our employees to have a bad day and still finish without injury, we will never take safety to the next level.  Recently, I had the opportunity to learn more about Human Error Management, and I want to share it with you.

Last week I attended a blockbuster PDC put on by the Northwest Chapter of ASSE.  The organizers did a great job of assembling top-notch speakers, including Joel Tiegnes, Adele Abrahms and Dan McCune, the first keynote speaker.  Donna Heidel, NIOSH’s lead authority on Prevention Through Design, delivered the afternoon keynote.  Along with a number of other excellent local speakers, the NW Chapter proved that raising speaker quality brings greater attendance: they set an all-time attendance record, topping the previous mark by more than 100 participants!

The morning keynote and follow-up session by Dan McCune, VP of Safety for Embry-Riddle University and Director of Training for HFACS, Inc., was outstanding.  First, he provided an excellent discussion of human error and then led the audience on a deep dive into the Human Factors Analysis and Classification System – HFACS.  Created by Drs. Wiegmann and Shappell for the DOT/FAA, HFACS provides a taxonomy for analyzing incidents to find the key contributing factors beyond the actual human error.  The system goes beyond the root cause analysis that most safety professionals know and use, allowing users to consider a number of additional factors, one being the “Normalization of Deviation.”  I had heard of this before from Todd Conklin, another noted speaker on human error and the design of work.  Conklin stresses this two-part phenomenon.  “Deviation” refers to the day-to-day variation from established work standards and procedures, better known as the way work is actually performed.  “Normalization” is just that: the deviation from the established standard that has become habitual and is now the new standard for performing the given task – the new normal.

In McCune’s discussion of Normalization of Deviation he was referring to violation errors, where accidents are easily blamed on the person who didn’t follow set procedures.  However, to count as a violation it must pass the “3 Question Rule”:

1) Was there a standard or established procedure, and was the employee aware of it and trained to perform to that standard?
2) Did the organization know that the deviation was occurring?
3) Did the organization intervene and enforce the rule or redirect the deviation from the procedure?

If you cannot answer “yes” to all three, there is not an enforceable violation.
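To make the logic of that test concrete, here is a minimal sketch in Python; the function and field names are my own illustration, not part of HFACS or McCune’s material:

```python
from dataclasses import dataclass


@dataclass
class DeviationCase:
    """Facts gathered about a suspected procedure violation (illustrative names only)."""
    standard_existed_and_trained: bool   # Q1: standard existed and employee was trained to it
    organization_knew_of_deviation: bool  # Q2: organization knew the deviation was occurring
    organization_intervened: bool         # Q3: organization enforced the rule or redirected the deviation


def is_enforceable_violation(case: DeviationCase) -> bool:
    """Apply the 3 Question Rule: only if all three answers are 'yes' is the deviation an enforceable violation."""
    return (
        case.standard_existed_and_trained
        and case.organization_knew_of_deviation
        and case.organization_intervened
    )


# Example: the organization knew about the shortcut but never intervened,
# so blaming the worker for a "violation" does not hold up under the rule.
print(is_enforceable_violation(DeviationCase(True, True, False)))  # False
```

The point of the sketch is simply that the test is conjunctive: a single “no” shifts the focus from individual blame back to the organization’s own systems.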

As we learn more about organizational control and process management as they relate to human error, we can clearly see that performance expectations and methods must be checked regularly to ensure that normal deviation remains acceptable and is not setting up significantly increased risk.

Conklin uses a graph to illustrate the problem of deviation, especially in work where the level of risk fluctuates.  During a work activity, exposure to hazards can vary, so when a worker takes shortcuts for the sake of increased productivity, the risk of injury often increases.  Put simply, the margin for human error, and for the resulting negative consequences, is reduced.  Among his many suggestions, McCune believes that regular worker and supervisor job reviews, together with an active near-miss reporting system, can effectively limit deviation and decrease the risks involved.

That’s enough for now, but in my next post I will cover more of McCune’s thoughts on Near-Miss Reporting.