I stumbled on an excellent write-up on safety. When a “disaster” happens, it is usually the culmination of several events, especially when there is a chain of hierarchy and multiple people involved. It has a parallel in radiation oncology, and I believe that understanding the risks of failure is equally important. We have an unparalleled safety record, partly due to oversight by the regulatory agencies. However, I have always felt that a “real-time” monitoring system needs to be incorporated: one that works at the time of delivery, with a mechanism to measure the delivered dose. Put another way: we should be pushing down the global minimum of uncertainties. Because of inherent uncertainties, we can never be sure the possibility of error has been brought down to “zero”. Yet it can be minimised.
The excerpts have a lot to do with “organisational behaviour”: how people respond to external “stimuli” within the “framework of internal cultures/attitudes” determines the sum of the “deliverables”. In radiation therapy, that means safety first, from simulation and target delineation through delivery and management of the side effects.
These deviations may be required by circumstances that were not anticipated by the procedure’s authors, requiring frontline workers to develop workarounds. Other deviations are due to workers developing shortcuts and local optimizations which reduce their workload or improve productivity [Dekker 2011].
Over time, this phenomenon leads to “the slow steady uncoupling of practice from written procedure” [Snook 2000]. Behaviour that is acquired in practice and is seen to “work” becomes “legitimized through unremarkable repetition”, as Snook writes (“it didn’t lead to an accident, so it must be OK”).
Do we need “written procedures”? I think constant reminders about “safety” only invite diminishing marginal utility, beyond which they cease to make sense. A better way is to have “refresher” courses and a reward system that identifies and reinforces safety culture. That would keep individuals motivated.
The “migration model” is attractive:
Rasmussen represented the competing priorities and constraints that affect sociotechnical systems in his “migration model”. Any large system is subject to multiple pressures: operations must be profitable, they must be safe, and workers’ workload must be feasible. Actors experiment within the space of possibilities formed by these constraints:
- if the system reduces output too much, it will fail economically and be shut down
- if the system workload increases too far, the burden on workers and equipment will be too great
- if the system moves in the direction of increasing risk, accidents will occur
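The drift described by these constraints can be sketched as a toy random walk. This is purely illustrative, not from the original write-up: the operating point sits between an accident boundary and an economic-failure boundary, efficiency pressure pulls it toward risk, and safety interventions push it back. All parameter values are made up.

```python
import random

def simulate_drift(steps=1000, efficiency_pull=0.004, safety_push=0.002,
                   noise=0.01, seed=42):
    """Toy sketch of Rasmussen's migration model (illustrative only).

    The operating point lives on [0, 1]:
      0.0 -> the accident boundary (unacceptable risk)
      1.0 -> the economic-failure boundary (too slow, too costly)
    Cost and workload gradients pull the point toward 0 (more risk);
    safety campaigns push it back toward 1. Returns the step at which
    the accident boundary is crossed, or None if the run survives.
    """
    rng = random.Random(seed)
    x = 0.7  # start comfortably inside the safe envelope
    for step in range(steps):
        x += -efficiency_pull + safety_push + rng.gauss(0, noise)
        x = min(x, 1.0)  # treat the economic boundary as a hard wall
        if x <= 0.0:
            return step  # accident boundary crossed
    return None

crossed_at = simulate_drift()
```

The point of the sketch is the asymmetry: each individual step is small and locally rational, yet when the efficiency pull outweighs the safety push, crossing the boundary is only a matter of time.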
This drift towards danger is natural; the authors describe it as a byproduct of workers’ adaptive behaviour: the innate tendency to find “shortcuts” around routine procedures, combined with a lack of oversight from peers.
The following paragraph is also very instructive:
The drift into failure model highlights the importance of a system’s history in explaining why people work as they do, what they believe is important for safety, and which pressures can progressively erode safety. The model helps see safety as a control problem, where the underlying dynamics are very slow but also very powerful, and difficult to manage.
One important takeaway is that we need a system that does not “punish” lapses; instead, a clear line of communication between the manager and subordinates, with a way to report errors anonymously, would help. Radiation is delivered in incremental fractions, but setups remain a significant concern because of daily variability. I therefore emphasise rigorous setup and better image-guidance protocols, though the real success came from giving the technologists operational freedom. I also prefer to do a random offline verification of ongoing patients as an audit; it helps to close the feedback loop.
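An offline verification pass like the one above could be automated as a simple screening step. This is a hypothetical sketch, not a clinical tool: the function name, the 5 mm single-fraction tolerance, and the 3 mm mean-shift action level are all assumptions chosen for illustration, and real action levels would come from departmental protocol.

```python
import statistics

# Hypothetical action levels, for illustration only.
MAX_SINGLE_SHIFT_MM = 5.0  # flag any one fraction with a large correction
MAX_MEAN_SHIFT_MM = 3.0    # flag a drifting mean (possible systematic error)

def audit_setup_shifts(shifts_mm):
    """Screen one patient's recorded daily setup corrections.

    shifts_mm: magnitudes of the 3D couch correction per fraction, in mm.
    Returns a list of human-readable findings (empty list = nothing flagged).
    """
    findings = []
    for day, shift in enumerate(shifts_mm, start=1):
        if shift > MAX_SINGLE_SHIFT_MM:
            findings.append(f"day {day}: shift {shift:.1f} mm exceeds tolerance")
    mean_shift = statistics.mean(shifts_mm)
    if mean_shift > MAX_MEAN_SHIFT_MM:
        findings.append(
            f"mean shift {mean_shift:.1f} mm suggests a systematic setup error")
    return findings

# Random offline review of one ongoing patient (made-up numbers):
findings = audit_setup_shifts([2.1, 3.4, 6.2, 2.8, 3.9])
```

A screen like this only closes the feedback loop if the findings feed a non-punitive review, in keeping with the anonymous-reporting point above.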
I hope this helps!