By Sidney W. A. Dekker, currently a professor at Griffith University in Brisbane, Australia, where he founded the Safety Science Innovation Lab. He is also Honorary Professor of Psychology at the University of Queensland. He flew as First Officer on Boeing 737s for Sterling and later Cimber Airlines out of Copenhagen. In 2004, he wrote the following article in the aftermath of the Linate accident.
We like to believe that accidents happen because a few people do stupid things. That some people just do not pay attention. Or that some people have become complacent and do not want to do a better job.
On the surface, there is often a lot of support for these ideas. A quick look at what happened at Linate, for example, shows that controllers did not follow up on position reports, that airport managers did not fix a broken radar system in time, that nobody had bothered to maintain markings and signs out on the airport, and that controllers did not even know about some of the stop marks and positions out on the taxiway system. And of course, that a Cessna pilot landed in conditions that were below his minima. He should never have been there in the first place.
When we dig through the rubble of an accident, these shortcomings strike us as egregious, as shocking, as deviant, or even as criminal.
The problem with this logic is that it does not get us anywhere: it does not work the way we hope. What we believe is not what really happens. The reason the logic does not work is twofold. First, accidents don’t just happen because a few people do stupid things or don’t pay attention. Second, firing or punishing people does not create progress on safety: it does not prevent such accidents from happening again. The only thing this logic of individual errors and punishment sustains is our illusions. Systems don’t get safer by punishing people. Systems don’t get safer by treating humans as the greatest risk.
Let’s look at the first problem. Accidents don’t just happen because a few people do stupid things or don’t pay attention.
An additional problem is that the potential for having an accident can grow over time. Systems slowly, and unnoticeably, move towards the edge of their safety envelopes. In their daily work people — operators, managers, administrators — make numerous decisions and trade-offs. They solve numerous large and small problems. This is part and parcel of their everyday work, their everyday lives. With each solved problem comes the confidence that they must be doing the right thing; a decision was made without obvious safety consequences. But other ramifications of those decisions may be hard to foresee, even impossible to predict. The cumulative effect is called drift: the drift into failure. Drifting into failure is possible because people in organizations make thousands of small and larger decisions that to them seem unconnected. But together, eventually, all these little, normal decisions and actions can push a system over the edge. Research shows that recognizing drift is incredibly difficult, if not impossible — whether from the inside or the outside of the organization.