Tragic Decisions: Preventable Accidents and Human Error

I recently had the opportunity to attend a short training session on hoist and crane safety. Rigging and lifting are not my area of expertise, but like most industrial workers, I have had some introductory training, just enough to know to stay out of the cone of exposure and to recognize unsafe conditions. This training was a good refresher and taught me a few things to look for when observing a lifting evolution.

Part of the training covered recent crane accidents. One in particular caught my attention; here is a very high-level overview.

An outdoor gantry crane was being operated in high wind conditions, though no one knew exactly how high because the crane’s anemometer was broken. Three of the crane’s four storm and parking brakes, the brakes that would normally lock the crane in position, were inoperable. A particularly strong gust of wind started the crane rolling down its rails. It picked up speed so quickly that the normal operating brake could not stop its momentum. The crane slammed into the blocks at the end of the track. The crane operator, who was not wearing a seatbelt, was ejected from the cab and suffered fatal injuries.

The discussion about the event centered on the decisions of the crane operator. Why was he attempting a lift in high wind conditions? Why was he not wearing his seatbelt? This is the typical response: we all think about how we could avoid a similar outcome by changing the things within our control, the risky decisions.

We do this all the time. We will watch a news report about a random homicide that could easily have been us, until we hear the piece of information that allays our fears. “Oh, it happened downtown at 3 am? That couldn’t be me; I’d never be downtown at 3 am.” Someone walked into traffic while staring at their phone? “I always look where I’m walking; no way I’d do that.” All those people on the Titanic who drowned or froze to death? “Well, I would have gotten on one of those half-empty lifeboats, fashioned a raft out of life jackets and debris, or insisted Rose share that door. I could have made it.” It helps us sleep at night to think OUR decisions are better and we would avoid danger. It’s not an uncommon reaction.

What caught my attention, though, was the lack of discussion around the system breakdowns that helped set this event up. The first principle of Human Performance is “People are fallible; even the best make mistakes.” Is your work system resilient to these potential mistakes? For example, if the anemometer had worked, the crane operator might have had solid information instead of relying on faulty assumptions. Even if he had decided to proceed in the face of risky weather conditions, a functional set of storm and parking brakes would have prevented the crane from rolling uncontrolled. Why was there no interlock preventing crane operation with the seatbelt disengaged? If any of these error defenses had been in place, this tragedy could likely have been avoided.

The next time you hear about an industry event or lessons learned, consider not only how you could have made better choices, but also the organizational breakdowns and how the system was not resilient to human error.
