Rethinking patient safety | Association of Anaesthetists

Rethinking patient safety

How many of us would survive the microscopic scrutiny of our actions? There is almost no human action or decision that cannot be made to look more flawed and less sensible in the misleading light of hindsight. When something has gone wrong, it has probably gone right many times before, and will go right many times in the future, yet people are judged by one error or incident for the rest of their careers. This is at the heart of a poor safety culture, and we urgently need to address it.

The first thing we need to do is recognise that healthcare is a complex adaptive system, best described as a dynamic network of ‘agents’ acting in parallel, constantly reacting to what other ‘agents’ are doing, which in turn influences behaviour and the network as a whole. Complexity is a way of thinking about, and analysing, situations by recognising patterns and interrelationships. However, safety science has often viewed these in linear terms, with simple rules of cause and effect, frequently relying on the Swiss cheese model and root cause analysis to assess what has happened. This approach is doomed to fail in a system that works in parallel and is always changing, such that even in the days after an incident the system’s functioning may have changed beyond recognition [1].

The second thing we need to do is find out what people’s lives are really like, not what we envision or expect. Human factors terminology distinguishes ‘work as imagined’ or ‘work as prescribed’ on the one hand from ‘work as done’ on the other. To learn about ‘work as done’, it is crucial that there is a culture of disclosure, that is, the ability for people to describe what they actually do, not what policy states. This requires a psychologically safe environment where people feel accepted and respected, able to use their judgement, and able to challenge. It is achieved when team members feel safe to be vulnerable in front of each other, to ask questions, and to ask for help. A study by the company Google showed psychological safety to be one of the most important factors in successful high-performing teams [2].

The third thing we need to do is build a just culture: a fair, proportionate and consistent response when things do not go as planned or expected. This balances learning, support for staff, and accountability for actions taken and decisions made. It provides a framework that shifts the focus from blaming individuals to examining the wider system, and to understanding why things went wrong on this particular occasion when they normally go fine. Ultimately, it helps us understand why it made sense for people to do what they did at the time [3].

In healthcare, numerous studies have tried to quantify the scale of the safety problem, with a recurring figure of 10%: one in ten patients is affected by a patient safety incident, where care did not go as planned or expected. Like all statistics, this has an inverse: 90% of care goes right; yet we do not notice or study this. We rarely ask ourselves, “How many patients were not harmed today, or how many patients’ lives were saved by our actions?” [4].

We therefore need to rethink our approach to safety, moving from a relentless focus on the negative (‘Safety I’) to one that also embraces the positive (‘Safety II’). The true picture is the combination of Safety I data, i.e. the 10% (incidents, serious incidents, never events, learning from deaths and so on), with Safety II data from the 90% [5].

There are a number of ways in which we can do this, including ethnography and video reflexivity, in which we study our existing practices and pay attention to the mundane: the implicit routines and habits. We have to pay attention to the invisible day-to-day work that keeps our patients safe, and ask appreciative questions such as: what do we like about what we see; how often do we think it goes as planned like this; and how can we keep replicating what works [6]?

In summary we need to: 

  • Combine Safety I and Safety II thinking and methods.
  • Build psychologically safe teams.
  • Learn to be non-judgemental, neutral in our inquiries, and seek to minimise our natural biases. 
  • Study how people adapt and adjust every day to the conditions they face, and learn how things normally proceed in order to understand why things failed. 
  • Use the learning to replicate good practice and strengthen the system, but be cautious about making changes based on small numbers.

Professor Suzette Woodward
Independent Patient Safety Consultant
Visiting Professor Imperial College, London

Twitter: @suzettewoodward

References 

  1. Plsek PE, Greenhalgh T. The challenge of complexity in healthcare. British Medical Journal 2001; 323: 625–8. 
  2. Edmondson AC, Higgins M, Singer S, Weiner J. Understanding psychological safety in health care and education organizations: a comparative perspective. Research in Human Development 2016; 13: 65–83. 
  3. Dekker S. Just culture: restoring trust and accountability in your organization. 3rd edn. New York: CRC Press, 2016. 
  4. Woodward S. Implementing Patient Safety. New York: Routledge, 2019. 
  5. Wears RL, Hollnagel E, Braithwaite J, eds. Resilient Health Care, Volume 2: The Resilience of Everyday Clinical Work. Farnham: Ashgate Publishing, 2015. 
  6. Iedema R, Carroll K, Collier A, Hor S-Y, Mesman J, Wyer M. Video-reflexive ethnography in health research and healthcare improvement: theory and application. Boca Raton: CRC Press, 2019.
