Why Do We Settle for Poor Design?

Whether conducting research, reading blog and LinkedIn posts, listening to podcasts, or working with clients, I am continually amazed at the capacity for exceptional human performance. In fact, I have stated many times that I believe humans are the most important part of any system. It is amazing what workers, teams, and crews can and will accomplish when faced with the challenges of production work, and in most cases they will accomplish the work safely and successfully even if they have to create their own workarounds. I often wonder, though, whether workers are being set up with situations that may eventually exceed their ability to adapt. If they are, what is the role of leaders and managers in identifying deficiencies and corrective actions? When failure occurs, do organizations place too much responsibility on workers for the control of their actions and circumstances? And why do workers feel they have to create their own workarounds in the first place? How do the operational environment, the context of their work, and the systems and equipment they are supplied with shape their decision-making?

In many cases safety professionals use a risk-management approach: they attempt to reduce risks to a level As Low As Reasonably Practicable (ALARP), and the organizations, teams, crews, and workers accept the residual risk. What cannot be designed out is mitigated through administrative controls or Personal Protective Equipment (PPE), the latter often serving as a last line of defense. Unfortunately, I think many organizations over-rely on administrative controls in the form of rules, policies, and procedures and assume they will be enough to keep employees safe. Then, when failure occurs or employees are caught developing workarounds in the field, it is the employee who may be reprimanded for not using the rules and tools appropriately. In some cases the worker may indeed be at fault, but I believe that in many cases investigations stop short of understanding the causal factors that shaped the operational environment and conditions of work. The worker or team is blamed, training (or retraining) occurs, the teams are told to pay more attention next time, and things appear fixed (until they break again).

The problem with this approach is that it fails to take into consideration the limitations of the human and even the limitations of the team. The consequences can impact both production and safety performance. I truly believe that well-trained and competent workers can accomplish great feats, and do so safely most of the time. I also believe that a strong team or crew can accomplish even more and serve as a supportive measure to help workers minimize errors and detect, correct, and/or manage errors when they do occur. In fact, I teach a Crew Resource Management (CRM) workshop to help provide teams with a set of tools to improve safety and operational performance.
However, anyone who has been through my CRM training will understand that a good team/crew performance program also includes an emphasis on learning from errors and on designing systems that minimize error and failure potential. Unfortunately, I believe workers, teams, crews, and organizations are conditioned to accept poor designs. Here is an example:

Years ago I flew an older version of the KC-130 Hercules. It had a system designed to alert the crew if they got too close to terrain and were not in a position to land. For example, if the crew got too close to mountains, the system was supposed to sound an alert so the crew could take corrective action. The system included a Built-In-Test (BIT) feature, and in many cases when we ran the test, the system would fail it. That left us a couple of choices: use the system and second-guess likely false positives, or use it and react to every alert even when an alert was inappropriate for the specific circumstances. Either approach could degrade operational performance. So, what did we do? We typically pulled the circuit breaker for the system so we would not receive any alerts at all. Is that good design? I think teams are often provided with deficient systems, tools, rules, and procedures that do not accurately capture the context of the operational environment, and they have to adapt to perform their work and actively create their own safety. Many crew performance tools place the burden on the employee rather than learning from errors and trying to figure out what can be done to improve the design. To paraphrase Todd Conklin in one of his recent Pre-Accident Investigation Podcasts, “We should make it easy to safely perform work and hard to do the job wrong.” From another perspective, when we investigate failure, we should try to make it easier for workers to do the job right. To me, this is where good design comes into play.

There may, however, be some understandable reasons why organizations settle for poor design. It may not even be intentional; in many cases, planners and designers are doing their best and have the best interests of the workers in mind. Here are a few reasons why organizations may accept deficient design (there are certainly more, but these are a few to consider):

1. It may seem easier to settle on administrative controls (such as training) because they are often readily available and quick to develop.

2. The magnitude of change may seem too large. Investigations sometimes reveal opportunities for system improvement, but the required change may seem too hard to push past organizational inertia and the prevailing emphasis on workers' control over their own work conditions.

3. Fear of change. Teams may have done a job the same way for so long that shifting to a more system-oriented approach feels threatening. I remember some of the concerns we had years ago when we transitioned to a more automated version of the KC-130 Hercules. Such concerns and fears can often be reduced by involving workers and teams in the design and/or redesign process; educating workers on the change and getting their feedback is important.

4. Fear of unintended consequences. This is an important one. Every decision has consequences, some intended and others unintended. If there are too many unintended consequences, or if their severity is too high, they may negate the effectiveness of the design change.

I believe most (if not all) of these issues may be addressed using a system approach and by involving workers in the investigation and corrective action process. When errors occur, perhaps the following guiding questions may help:

1. When workers make mistakes, do investigations really dig deep enough to find out how and why the mistake was made (and do they go beyond human error as a cause)?

2. Was there a mismatch between policies, rules, and the actual operational conditions?

3. Did the equipment the workers were working on look different from the plan?

4. Was the environment error-provocative? Did the tools impose excessive demands that essentially walked the employees into an error?

After this line of questioning, the next question is how these situations may be rectified, or at least improved, to minimize the potential for future error. What can be done to improve the components of the system, or the system as a whole, to maximize the potential for excellent human performance while minimizing the potential for error? I am a huge proponent of human performance and crew performance training and systems, and I think they can be made even better when the environment, systems, administrative tools (such as checklists), and equipment are optimized to place the human in the best position to do their best work. If a system approach is used to investigate error and make the job easier to do, workers may be happier and the organization more productive.

We must also remember that we may achieve a period of stability and error reduction after these actions are taken, yet this does not mean that errors are eliminated. Organizations need an open and honest dialogue between workers and managers to help narrow the gap between “Work-As-Designed” and “Work-As-Performed.” As I like to say in many of my workshops, “We can never be perfect, but we can learn and improve.”

If you liked this post, you may also enjoy my earlier post titled “The Illusion of Control.” I would also appreciate it if you would subscribe to my newsletter; when able, I provide subscribers with bonus content that is not available on my public blog. In fact, if you subscribe using the button below, you'll receive V-Speed's FREE Checklist Development Guidance document.

Thanks for reading, and I wish you a safe and productive day!