Friday, April 12, 2013

Why do we resist new thinking about safety and systems?

Something I have been thinking about for a while is the way that we look at safety and systems - the unstated assumptions and core beliefs. The paradigm and the related shared ideas about safety are little different now from what they were 20 or 30 years ago. New thinking struggles to take root. We continue to explain adverse events in complex systems as 'human error'. We continue to blame people for making 'errors', even when the person is balancing conflicting goals under production pressure (such as this case). We continue to try to understand safety by studying very small numbers of adverse events (tokens of unsafety), without trying to understand how we manage to succeed under varying conditions (safety). It is a bit like trying to understand happiness by focusing only on rare episodes of misery. It doesn't really make sense. We are left with what Erik Hollnagel calls 'Safety-I' thinking and what Sidney Dekker calls 'old view' thinking. The paradigm has a firm hold on our mindsets - our self-reinforcing beliefs. The paradigm is our mindset.

Why are we so resistant to change? Over at Safety Differently, Sidney Dekker recently posted a blog called 'Can safety renew itself?', which resonates with my recent thinking. Dekker asks "Is the safety profession uniquely incapable of renewing itself?" He makes the case that the safety profession is inherently conservative and risk averse. But these qualities stifle innovation, which naturally requires questioning the assumptions that underlie our practices, and taking risks.

It is something that I can't help but notice. When it comes to safety and systems, we seem much more comfortable dismissing new thinking than challenging old thinking. Our skepticism is reserved for the new, while the old is accepted as 'time-served'. So we are left with old ideas and old models, in a state of safety stagnancy. Our most widely accepted models of accident causation are still simple linear models. Non-linear safety models are dismissed as 'too complicated' or 'unproven', but there is not the same determination to question whether the assumed linear cause-effect relationships really exist in complex systems. The 'new view', which sees human error as a symptom, not a cause, is often dismissed as making excuses, but we are less willing to ask whether human error is a viable 'cause'. We are even less willing to question whether 'human error' is even a useful concept in a complex, underspecified system where people have to make constant adjustments and trade-offs - and failures are emergent. The concept of 'performance variability' is seen by some as wishy-washy, but the good outcomes that arise from it are not considered further. Proposals to reconsider attempts to quantify human reliability in complex systems are dismissed, but there is not the same urge to critique the realism of the source data and the sense behind the formulae that underlie 'current' (i.e. 1980s) human reliability assessment (HRA) techniques. There are plenty of reviews of HRA, but they rarely seem to question the basic assumptions of the approach or its techniques.
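
To make the point concrete, here is a minimal sketch of the kind of arithmetic that sits beneath first-generation, THERP-style HRA. The numbers and shaping factors below are hypothetical illustrations (not real table values), and the structure is deliberately simplified; the point is that the formula only makes sense if its linearity and independence assumptions hold.

```python
# A minimal sketch of the arithmetic behind first-generation,
# THERP-style human reliability assessment (HRA).
# All numbers below are hypothetical illustrations, not real table values.

# Nominal human error probabilities (HEPs) for each step of a task,
# traditionally looked up from 1980s-era data tables.
nominal_heps = [0.003, 0.001, 0.01]

# Performance shaping factors (PSFs): multipliers meant to capture
# context, e.g. time pressure or poor interface design.
psfs = [2.0, 1.0, 5.0]

# Each step is treated as an independent, fixed-probability event:
#   P(task failure) = 1 - product over steps of (1 - HEP_i * PSF_i)
p_success = 1.0
for hep, psf in zip(nominal_heps, psfs):
    p_success *= 1.0 - min(hep * psf, 1.0)

p_failure = 1.0 - p_success
print(f"Estimated task failure probability: {p_failure:.4f}")

# The calculation is only as good as its assumptions: that errors are
# independent, context-free events with stable probabilities - exactly
# the linear cause-effect assumption questioned above.
```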

Why is this? Dekker draws a parallel with the Enlightenment thinker Immanuel Kant's notion of self-incurred tutelage - a kind of mental self-defence against new thinking.
"Tutelage is the incapacity to use our own understanding without the guidance of someone else or some institution... Tutelage means relinquishing your own brainpower and conform so as to keep the peace, keep a job. But you also help keep bad ideas in place, keep dying strategies alive." 
[Image caption: Little Johnny came to regret asking awkward questions about Heinrich's pyramid.]
The reason, according to Kant, is not a lack of intellect, but rather a lack of determination or courage. This rings true. But I think we need to be a bit more specific. Why do we resist new thinking in safety? A few things spring to mind, falling into two categories: personal barriers and system barriers.

Personal Barriers

I have almost certainly fallen prey to nearly all of these at some point, and so I speak from experience as well as observation. If new thinking strikes any of these nerves, I try to listen - hard as it may be.
  1. LACK OF KNOWLEDGE. This is the most basic personal barrier, and seems to be all too common. Lack of knowledge is not usually down to a lack of ability or intellect, but a lack of time or inclination. For whatever reason, many safety practitioners do not seem to read much about safety theory. The word 'theory' even seems to have a bad name, and yet it should be the basis for practice; otherwise our science and practice is populist, or even puerile, rather than pragmatic (this drift is evident also in psychology and other disciplines). Some reading and listening are needed to challenge one's own assumptions. Hearing something new can be challenging simply because it is new. For me personally, several system safety thinkers and systems thinkers have kept me challenging my own assumptions over the years.
  2. FEAR. Fear is the reason most closely related to Kant's, as cited by Dekker. Erik Hollnagel has cited the fear of uncertainty referred to by Nietzsche in 'Twilight of the Idols, or, How to Philosophize with a Hammer': "First principle: any explanation is better than none. Because it is fundamentally just our desire to be rid of an unpleasant uncertainty, we are not very particular about how we get rid of it: the first interpretation that explains the unknown in familiar terms feels so good that one "accepts it as true."" What old thinking in safety does is provide a quick explanation that fits our mental model. A look at the reporting of accidents in the media nearly always turns up a very simple explanation: human error. More specifically for safety practitioners, when you have invested decades in a profession, there can be little more threatening than to consider that your mindset or (at least some of) your fundamental assumptions or beliefs may be faulty. If you are an 'expert' in something that you think is particularly important (such as root cause analysis or behavioural safety), it is threatening to be demoted to an expert in something that may not be so valid after all. Rethinking one's assumptions can create cognitive dissonance, and may have financial consequences.
  3. PRIDE. If you are an 'expert', then there is not much room left to be a learner, or to innovate. Being a learner means not 'knowing', and instead being curious and challenging one's own assumptions. It means experimenting, taking risks and making mistakes. Notice how children learn? When left to their own devices, they do all of these things. Let's quit being experts (we never were anyway). Only by learning to be a learner can we ever hope to understand systems.
  4. HABIT. It seems to me that our mindsets about safety and systems are self-reinforced not only by beliefs, but by habits of thought, language and method. We habitually think in terms of bad apples or root causes, and linear cause-effect relationships. We habitually talk about "human error", "violation", "fault", "failure", etc. - it is ingrained in our safety vocabulary. We habitually use the old methods that we know so well. It is a routine, and a kind of mental laziness. To make new roads, we need to step off these well-trodden mental paths.
  5. CONFORMITY. Most people naturally want to conform. We learn this from a young age, and it is imprinted via schooling. As Dekker mentions, you want to fit in with your colleagues, boss, and clients to keep the peace and keep a job. But fitting in and avoiding conflict is not how ideas evolve. Ideas evolve by standing out.
  6. OBEDIENCE. In some environments, you have to think a certain way to get by. It is not just fitting in; it is being told to, and having to - if you are to survive there. This is especially the case in command-and-control cultures and highly regulated industries where uniformity is enforced. If you work for a company that specialises in safety via the old paradigm, you have little choice but to obey or leave.

System Barriers

As pointed out by Donella Meadows in 'Thinking in Systems', "Paradigms are the sources of systems". Barriers to new thinking about safety are built into the very structures of the organisations and systems that we work within, with and for, and so are the most powerful. These barriers breed and interact with the personal barriers, setting up multiple interconnected feedback loops that reinforce the paradigm itself.
  1. GOALS. Goals are one of the most important parts of any system because they represent the purpose of the system and set its direction. They also reinforce the paradigm out of which the system arises. Safety goals are typically expressed in terms of unsafety, and so is their quantification - targets relating to accidents and injuries, such as target levels of safety or safety target values. Some organisations even have targets regarding errors. Such goals stifle thinking and reinforce the existing mindset.
  2. DEMAND. Market forces and regulation can be powerful suppressors of new safety thinking. Not knowing any different, and forced by regulation, internal and external clients demand work that rests on old thinking. The paradigm, and often the approach, is specified in calls for tender. Old thinking is a steady cash cow.
  3. RULES & INCENTIVES. Rules regarding safety emerge from, and reinforce, the existing paradigm of safety. Rules limit and constrain safety thinking, and the products of that thinking. Examples include regulations, standards and management systems with old thinking designed in. Within the systems in which we work are various incentives - contracts, funding, prizes, bonuses, and publications - as well as punishments, all of which reinforce the paradigm.
  4. MEASURES. What is measured in safety has a great influence on the mindset about safety. A reading of Deming reveals that if you use a different measure, you get a different result. We typically treat adverse events and other tokens of unsafety as the only measures of safety. What we need are measures of safety itself - of the system's ability to adapt, reorganise and succeed under varying conditions.
  5. METHODS. Changing the paradigm inevitably means changing some methods. Most existing methods (especially analytical techniques), as well as databases, are based on old-paradigm thinking. A common challenge to new thinking is that there is a lack of techniques; there is a general expectation that the techniques should just be there. For corporations, changing methods costs money, especially methods that are computerised or used for comparisons over time.
  6. EDUCATION. Training and education at post-graduate level (which is typically where safety concepts are encountered) are influenced heavily by demand. And demand is still rooted in old paradigms and models. Reflecting new thinking in safety courses means either not meeting demand or creating a conflict of paradigms within the course. Neither is fun for an educator.

So how to change the paradigm? This is the point at which I find myself caught up in a web of circular arguments. Since systems emerge from paradigms, the paradigm is probably the hardest thing about a system to change. Yet it seems that changing the system is necessary to change the paradigm - and you can't change the system without changing the paradigm. Maybe this is why "there is, perhaps, something indelibly conservative about the safety profession", as Sidney Dekker mentions.

Can the paradigm be changed directly? Donella Meadows seems to think it can.
"Thomas Kuhn, who wrote the seminal book about the great paradigm shifts of science, has a lot to say about that. You keep pointing at the anomalies and failures in the old paradigm. You keep speaking and acting, loudly and with assurance, from the new one. You insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather, you work with active change agents and with the vast middle ground of people who are open-minded." (p. 164).
This 'pointing out' is what protagonists such as Sidney Dekker, Erik Hollnagel and David Woods have been doing (along with those on the broader Systems Thinking side, such as John Seddon). For an individual, an awakening can occur in an instant or over a period of acceptance, but it takes time for new thinking to spread to many people, especially when demand is lacking. The vast middle ground is important for gaining a critical mass of new thinking, but perhaps more critical is the mindset of those who set system goals. These, in turn, trigger demand, rules and incentives, measures, methods and education.

Changing the safety paradigm is a slow process, but one that is possible, and has happened many times over in most disciplines. But it will take a bold step to pull the system levers, and this step - more than anything - needs courage.