Sunday, February 5, 2017

Making Mistakes

It has been said that "to err is human", yet there are endeavours in which error cannot be accepted. The airline industry is an obvious example, and the nuclear industry is another. Does this mean that to be safe we need to remove humans from the equation? The short answer is 'Yes', and that is exactly what has been done in the above examples. Humans are still involved, but systems have been created that monitor for errors and correct or prevent them.
Now to health care. Here too, errors must be avoided because they cause illness or injury, increase length of stay in hospital, or lead to death. Everyone accepts that errors are bad, but for some reason it is expected, by regulators, managers, the public, and health care practitioners themselves, that health care will be error free. In effect, we expect that health care workers are not human! This is patently absurd, because it is their humanity that makes them effective. For all our faults and propensity for making mistakes, humans are still better at caring for other humans than machines are. Granted, there may be some individuals who feel they have a relationship with 'Siri' on their iPhone or iPad, but very few people would actually choose to have nursing or medical care delivered by a machine. The therapeutic relationship is a crucial component of caring.
The expectation of perfection is therefore unrealistic and unreasonable, and in fact is not supported by case law. The law accepts the concept of 'reasonableness', which allows for mistakes. So why does everyone else, including health care workers themselves, expect perfection? I think it is because mistakes are taboo, and that is dangerous. Very few health care workers go to work intending to harm someone that day, and if someone is harmed they feel tremendous guilt and self-doubt. We are typically not trained to deal with mistakes, as the unspoken rule is that if you are 'good enough' you won't make any. So we hide our errors, possibly even from ourselves. If you do report an error and the report is then mishandled by managers who take the easy approach of blaming you for it, you will be far less likely to report any future errors. What about near misses? If no harm was caused, forget about anybody reporting it. This robs the organisation of any chance to prevent future events that might not be near misses. In many cases, managers only have themselves to blame for this culture of avoidance. You can't encourage a person to be honest by beating them every time they tell the truth!
That's the problem, and it's a big one. What can we do about it? Firstly, we must adopt a no-blame approach to incident reporting. Mistakes happen all the time in health care, yet very few are reported, especially if the patient was not obviously harmed or was not aware of the mistake. Then we must investigate the incident reports to work out what system faults caused the error or allowed it to progress. In some cases the person is the problem, but that should never be the first or primary conclusion, because no-one works in isolation from the systems around them. Disciplinary matters have their place, but a long way down the priority list.
We need to educate health care workers that mistakes are inevitable and that our only hope of preventing them is to work together. 'Two minds are better than one' rings true here: cross-checking is a valuable tool for detecting and preventing mistakes. To make this work properly, all participants have to have a voice, and be heard. Part of investigating an incident report should be examining how cross-checking failed, and whether the error could have been prevented if someone had spoken up. If so, what prevented them from doing so? Shifting blame to the person who didn't speak up is unhelpful, so the focus should be on addressing the culture of the workplace so that everyone has the right to speak.
In summary, errors in health care are inevitable, and will remain so as long as humans are involved. Systems need to be created that help fallible humans detect and prevent errors before harm is caused, and these systems cannot be based on blaming the person who made the mistake. System change is driven by incident reporting, and the focus of investigation must be heavily biased towards finding the system flaws that facilitated the mistakes. Finally, we all need to stop expecting the impossible, as that just perpetuates the problem.
