A Disruptive Paradigm: Systems Thinking About Human Error

Disruptive Paradigms

Recently, I was reading an article by William Corcoran, PhD, published in The Firebird Forum.  The topic was “disruptive paradigms,” and the article advocated for more transparency throughout organizations.  It was a great piece that I highly recommend you add to your reading list.

There were a number of thoughts conveyed in this article by Dr. Corcoran, not the least of which had to do with the term “disruption”:  “Disruption can be adverse or beneficial.  The value judgment of a disruption often depends on whether one has been helped or hurt by it and usually comes later than the disruption itself.  When we use the word ‘paradigm’ we are referring to a way of thinking, a way of perceiving, and/or a way of performing.”

Corcoran goes on to write that “A disruptive paradigm is a paradigm that is life-threatening to the old ways of looking at things.”  (He provides an interesting list of examples.)  “Often when a critical mass of people in an organization or in a society adopt the disruptive paradigm the days of the old paradigms are numbered.  This is often a tipping point after which the old paradigm recedes and the disruptive paradigm gains momentum.  But more often the disruptive paradigm exists side-by-side with the previously conventional paradigm for years.  And this co-existence is antagonistic.”

This article goes on to make its point about the need for greater transparency, but before it does it stresses that there is often great resistance to the disruptive paradigm.  Advocates are labeled as heretics and seen as threats to stability, which they most certainly are.

Instead of continuing here to discuss disruptive paradigms and the need for greater transparency (which you can read for yourself in Dr. Corcoran’s article), I want to put forward some thinking about another disruptive paradigm:  Systems Thinking About Human Error.

Working In Context

Systems thinking isn’t new to the safety world.  In fact, it has been around for a long time in aerospace and nuclear safety.  But for most safety professionals and practitioners, focusing on risk factors that arise from workers and their behavior has been more in vogue than assessing risks from work-related systems.

According to Nancy Leveson, PhD, professor of aeronautics and engineering systems at MIT, systems thinking is an approach to problem solving which holds that the behavior of a system’s components can only be understood by examining the context in which the behavior occurs.  Viewing operator behavior (and human error) in isolation from the surrounding systems prevents a full understanding of why an accident occurred, and with it the opportunity to learn.

Last month I spoke at the ASSE Professional Development Conference in Dallas on the topic of human error and how incident investigations often lead to erroneous conclusions.  Titled “Human Error: There is NO Root Cause”, the presentation helped attendees see how most incident investigations are biased from the start.

It is easy to see in hindsight what should have been done, but in the moment it is far more difficult for the employee to see what is about to happen.  Workers are engaged in their work, trying to be efficient and get the job done.  They probably have done the job successfully many times, and may have been praised for how quickly and effectively they performed the work.  But then one time things didn’t turn out as planned and an incident occurred.

Almost instinctively, the supervisor and others look to see what the person did wrong as they begin to investigate.  Often biased in their beliefs about the person’s role in the mishap, they embark down a road to find the root cause.  Assuming that accidents have a root cause, these investigations focus on the mistakes or behaviors involved (operator error) or on technical failures, and ignore the plethora of organizational issues that likely influenced the behavior (the context).  Thus, the behavior of the worker is most often blamed for the mishap.

Focusing on the failure of the worker and their attendant behavior is easy to understand when you think about who gathers the initial information during investigations.  Then consider how difficult it is for them to point out flawed management decision-making, safety culture problems, regulatory deficiencies, inadequate resources, and time pressures, to name a few.

In most organizations, supervisors are poorly trained in the investigative process and lack the knowledge, time, perspective, and motivation to dig deeper into any of the system-related issues.  As a result, most organizations find root causes that are really only symptoms, without fixing the processes that led to those symptoms.  As Dr. Leveson sees it, “if we don’t begin to look at the big picture and understand the context in which the behavior occurred we will continue to have process flaws that will fail again.”

So here is the disruptive paradigm shift:  Human error is not the root cause of most incidents.  All human behavior (and error) is affected by the context in which it occurs, and the context is the sum of all processes or organizational systems that influenced the situation.

The “Systems Paradigm”

We’ve all heard speeches at conferences or read books that reference the work of James Reason.  A researcher and writer on the cultural and organizational influences that affect humans and their actions, Dr. Reason concluded that “human error is only a symptom, not a cause — a symptom of issues deeper inside the organization resulting from its systems (how the product is built, parts are sourced, contract deadlines are established, compensation is determined, performance is judged, etc.).”

From recent articles about the VPPPA conference last fall, we are learning about a new safety initiative at General Electric called Human and Organizational Performance (HOP).  Based on the work of James Reason, and the desire to move beyond the results achieved from other initiatives, such as Six Sigma and Lean, GE has embraced HOP in an effort to wring out systems issues within the organization.

GE is shifting its paradigm and embracing the new Systems Paradigm.  The company has turned away from focusing on employees’ behaviors and accepted the fact that humans will make errors and mistakes.  Though it is far easier to focus on people and their behaviors, HOP shifts the discussion to process design, methods, tooling, procedures, schedules, and the like.  GE knows this shift will take time and will likely face strong resistance (as Dr. Corcoran suggests), but in the end it will lead to a much stronger culture of performance.

James Leemann, PhD, wrote about HOP in one of his ISHN articles.  To him, systems thinking is long overdue.  Fixing the system, not trying to fix the people, is the basic principle of the HOP philosophy.  He wrote, “Think about if you were a safety professional working in a manufacturing plant with 2,000 employees producing more than 400 products from 100 different processes, and the safety performance is less than stellar.  Would you rather fix the systems (i.e., 100 processes) or fix the behaviors of 2,000 employees?  Of course, keep in mind that the systems tend to stay the same day in and day out; whereas the behaviors of the employees change constantly.”

Systems thinking is a disruptive paradigm.  Clearly it forces safety professionals and management to think differently about accident causation, but it also opens the door to many other possibilities.  Initiatives like HOP hold great promise in making this shift away from behavioral causation.  Leemann says it well: “consider the willingness and enthusiasm HOP-trained employees will have locating risks and impediments in their work area with the objective of reducing, eliminating, or employing defenses to prevent injury or loss, versus being watched and critiqued by someone else in the name of behavior based safety.”  Obviously Dr. Leemann is “a disruptor” and not a fan of BBS.  Regardless of your position on BBS, I hope you see the value in looking beyond what the worker did or didn’t do, the choices they made, or other identifiable failures, and in examining the context they faced at the time.

It is an exciting time within the safety profession.  Presentation after presentation, and article after article, now address human error and the systems approach to safety.  This is part of our evolutionary journey as we move from a focus on behavior, through the influence of culture, to initiatives directed at the organization and its systems.  Whether we call this new paradigm “Safety 2.0” or “Safety Differently” doesn’t matter.  What does matter is that we all embrace this change of thinking and do our best to spread the word.

As organizations mature, I see us embracing both approaches, systems and behavioral alike.  In upcoming blog posts I will write more about HOP methods and what other companies are doing to shift to systems thinking.  I’d love to hear your thoughts on this topic and any ideas you might have that could help others.  Thank you for reading, and please share this post with others.