Effective learning from serious incidents

The previous article by Suzette Woodward sets out very eloquently the need to examine how things go right in healthcare. Of course, things go right far more commonly than they go wrong but, when the latter happens, we have a duty (both contractual and moral) to patients and their families to investigate properly, and to design robust and sustainable interventions to prevent similar events in the future.

How we investigate incidents in healthcare

I remember very well the first serious incident that I investigated. It took approximately 60 hours, including the research into interview techniques and human factors methods (about which I knew little at the time), and caused me to lose sleep. Many of my colleagues have described similar experiences and, while things are better now, there remains much room for improvement. A review of existing methods of investigation in healthcare, commissioned by the Health Technology Assessment (HTA) programme in 2005, revealed that there was [1]:

  • little standardisation in methods used to analyse incidents in healthcare
  • limited information on training provided for investigators
  • a noticeable absence of human factors techniques, and
  • little evidence of techniques used to design, implement and monitor interventions.

Over a decade later, a House of Commons Select Committee report reinforced this viewpoint, stating that “…processes for investigating and learning from incidents are complicated, take far too long and are preoccupied with blame or avoiding financial liability” [2]. As a direct consequence, the Healthcare Safety Investigation Branch (HSIB) was established in 2017 with the stated aim of improving safety through “effective and independent investigations that don’t apportion blame or liability”.

How human factors approaches improve incident analysis

Too often, the questions asked about an incident focus on “Who did that?” rather than “How did that happen?”, with the result that individuals rather than systems are targeted and blamed. High reliability organisations have recognised the need to move away from a culture of blame that leads to reluctance to report incidents, and have developed a ‘just culture’ where learning from incidents, including near misses, is encouraged and expected. The paradigm shift in these organisations is outlined in Table 1 but, unfortunately, is not yet well developed in healthcare.

Table 1: Critical incident paradigms [3]

Old view: Human error is seen as a cause of failure.
New view: Human error is seen as the effect of systemic vulnerabilities deeper inside the organisation.

Old view: Saying what people should have done is a satisfying way to describe failure.
New view: Saying what people should have done does not explain why it made sense for them to do what they did.

Old view: Telling people to be more careful will make the problem go away.
New view: Only by constantly seeking out vulnerabilities can organisations enhance safety.

Recently, in the Thames Valley, the Patient Safety Academy was funded by Health Education England to undertake a project to improve training in incident analysis. This was an eye-opening experience and revealed, not surprisingly, findings very similar to those of the HTA report. During the project we compared internal investigations with external investigations of the same cases conducted using human factors methods. Without exception, we found that the internal reports focused heavily on the staff involved, often junior members of the team, with very little consideration of the contribution of systems, environmental and cultural issues.

Recommendations after serious incidents

The same focus on systems should apply to the design of recommendations after serious incidents. Too often they include ‘having a meeting’ or ‘giving a lecture’, which does nothing for the flawed work system. The hierarchy of recommendations in Figure 1 highlights the importance of using physical rather than procedural interventions after serious incidents, i.e. putting barriers in place to make it difficult to do the wrong thing. This, of course, is far more straightforward in a factory setting, where physical barriers can be designed to prevent harm from heavy machinery; in healthcare, we rely more on procedural interventions such as standard operating procedures (SOPs) and checklists. The hierarchy would also suggest that training interventions are weak, but this is often because they are not designed properly. All the evidence supports the use of low-dose, high-frequency training (e.g. regular simulations of emergencies in theatre), yet we persist in using less effective, didactic forms of training (e.g. lectures).

Figure 1: Examples of potential interventions graded according to effectiveness in preventing recurrence of a similar incident (adapted from the Canadian Incident Analysis Framework).

The use of cognitive aids such as checklists is categorised as a more effective intervention than training. However, it is important to acknowledge that the use of checklists is not intuitive, and design, implementation and training must be a collaborative undertaking involving the team that will be using them. As anaesthetists, we regularly observe variability of engagement in the use of the WHO checklist in different theatres, but we know it only works properly with buy-in at all levels. The Association’s Quick Reference Handbook [4] is an example of good checklist design that we are currently emulating in primary care, where there are few cognitive aids [5].

The importance of compassion

Recently there has been an increased focus on the benefits of compassion in healthcare [6]. Whilst it may seem counterintuitive to require evidence that compassion is important in healthcare, the data are compelling. The feelings of guilt and self-blame that are so evident when someone has been involved in an incident are very difficult to counteract without compassion. It is a vital component of a successful investigation; without it you are likely to discourage honesty, reduce learning, and amplify a culture of blame.

While there is much work to be done on improving learning from serious incidents and near misses, there is cause for optimism. HSIB’s work has just begun and, by drawing on existing expertise in the NHS and embedding a culture of compassion when things do not go well, we will move closer to the widely shared ambition of learning from the past to improve the future.

Helen Higham
Associate Professor of Anaesthetics, University of Oxford
Director, OxSTaR
Co-director, Patient Safety Academy

Twitter: @HelenEHigham

References 

  1. Woloshynowych M, Rogers S, Taylor-Adams S, Vincent C. The investigation and analysis of critical incidents and adverse events in healthcare. Health Technology Assessment 2005; 9(19).
  2. House of Commons Public Administration Select Committee. Investigating clinical incidents in the NHS, 2015. https://publications.parliament.uk/pa/cm201415/cmselect/cmpubadm/886/886.pdf (accessed 3/12/2020).
  3. Woods DD, Dekker S, Cook R, Johannesen L, Sarter N. Behind human error, 2nd edn. Farnham: Ashgate, 2010.
  4. Association of Anaesthetists. Quick Reference Handbook, 2019. https://www.aagbi.org/safety/qrh (accessed 3/12/2020). 
  5. Greig P, Maloney A, Higham H. Emergencies in general practice: could checklists support teams in stressful situations? British Journal of General Practice 2020; 70: 304–5. 
  6. Trzeciak S, Mazzarelli A. Compassionomics: the revolutionary scientific evidence that caring makes a difference. Pensacola: Studer Group, 2019.
