Industrial accidents occur in most sectors where humans are exposed to heights, speed, or heavy machinery. While each accident is tragic, the consequences are usually limited to the casualties and their relatives.


While no one goes to work intending to do a bad job or to expose themselves or their colleagues to danger, the majority of accident investigations find that human error is a significant contributor. This can take the form of mistakes (e.g., operating a device incorrectly), risk-prone behavior (e.g., improper use of personal protective equipment), distraction (e.g., failing to see a hazard), or poor judgement (e.g., failing to assess the potential consequences of an event).

 

Generally speaking, industrial accidents caused by human error are more frequent among staff who have insufficient training, perform monotonous and repetitive tasks, are overworked, or must work in an environment with poor ergonomics. In other words, managers can, with the very best intentions for their organization, make ill-advised decisions that increase the risk of an accident at the coal face. While this type of human error bears similarities to poor judgement, it is directly affected by the tacit expectations between management layers: a functional manager may think it a good career move to save money on the department budget by buying sub-specification personal protective equipment, but top management remains accountable for making safety the highest priority across the entire organization.

Figure: Sources of accidents at work (Eurostat)

In process engineering, economies of scale have resulted in very large assets with a significant concentration of production capacity. As a result, industrial accidents in process plants typically expose more staff to injury, the surrounding society and environment to consequential damage, and stakeholders to larger direct (repair) and indirect (lost production) losses.

 

The following infamous list of accidents in the process industry is not exhaustive, but it represents hundreds of casualties, billions in direct damages, and orders of magnitude more in lost revenue and reputation:

  • Seveso in 1976
  • Alexander Kielland in 1980
  • Enchova Central Platform in 1984
  • Piper Alpha accident in 1988
  • Longford in 1998
  • Humber oil refinery in 2001
  • Toulouse in 2001
  • Petrobras P-36 platform in 2001
  • Snorre A in 2004
  • Texas City in 2005
  • Buncefield in 2005
  • Mumbai High North in 2005
  • Usumacinta Jack-up in 2007
  • Ngujima-Yin FPSO in 2009
  • Deepwater Horizon in 2010
  • Gullfaks in 2010
  • Oklahoma Rig in 2018

All of these major accidents were found to be caused, in whole or in part, by human error. The common denominator, however, is that the complexity of the process plants made it difficult for humans to make the right decisions at the right time to prevent the initiation and escalation of the accident.

 

In highly complex situations, our formidable ability to recognize patterns, reduce the world to a manageable size, and quickly reason about possible causes and outcomes can sometimes lead us astray. Under the stress of a developing incident, we often exhibit cognitive bias, emphasizing information that supports our initial diagnosis and discrediting information that contradicts it. In such cases, we remain on a course of action that, in hindsight, the available evidence shows was sub-optimal for dealing with the event.

 

Improved awareness of the circumstances that lead to human error, together with systematic work on human factors to improve ergonomics, is critically important for process safety. It also significantly improves process performance. In addition, methods are needed to reduce the impact of cognitive bias on human decision making in the face of extreme complexity.

White paper: Improving efficiency of control room operators in process facilities

Written by Ian Hamilton of ERM and Claus Myllerup of Kairos Technology