Excellent insight into how we “make mistakes” and how to rectify them.
Remember the NYT exposé on radiation oncology, and how daily quality-assurance checks were skipped? It was damning, but screw-ups do happen. I’ll delve into it sometime later; for now, the point is that we need to institutionalise a culture of safety (and rigorous assessment).
Such a culture sends a message that accuracy really matters. It draws on valuable in-house know-how that organisations ignore at their peril. And it yields useful tips for dealing with disaster when it happens, such as the need to fess up fast: the earlier a mistake is detected, the sooner it can be fixed, a move that generally cements customer loyalty. But that means people have to feel it is safe to report a problem. One need not be a student of the Chernobyl nuclear disaster to know the dangers of a culture of fear, yet it is striking how often organisations muck this up.
Finally, it pays to be mildly paranoid at all times.