Safety barriers: Organizational potential and forces of psychology

This paper examined the run-up to the Macondo blowout from a barrier element perspective, drawing on Andrew Hopkins’ account.

NB. I found this a challenging and a tad confusing paper to summarise; I just couldn’t always follow the logic from start to finish. So, if you’re confused by what I’ve written – I probably was too.

Providing context:

·       “Safety barriers are often described as a safety function realized in terms of technical, operational and organizational barrier elements. These elements, in some shape or configuration are established to ensure that the barrier works as intended”

·       Technical and operational barrier elements “appear fairly definable”. In contrast, organisational barrier elements “often remains elusive”

·       They ask the question whether “it is possible or desirable to confine the organizational influences to categorical classifications?”

·       It’s said that “Organizational issues may be a forceful contributor to maintain safety; but also in the development of major accidents”

·       Barriers are often described in reference to their function, expressed as a verb plus a noun, e.g. “close flow”

·       The barrier function describes what a barrier does, and may be realised by barrier elements

·       Barrier elements are typically classified as technical, operational or organisational

·       Elements, individually or collectively, reduce the possibility of a specific failure mode, event or exchange of energy

Hopkins’ Account of Macondo

The paper spends a decent amount of time covering a few elements from Hopkins’ account of the disaster. I’ve skipped most of this for brevity, so hopefully the authors’ subsequent arguments still make sense.

Cement casing

The cement job in the well demanded high pressure, increasing the possibility of losing cement into the oil and gas sands. This loss was one of at least four plausible failure modes, alongside 1) unstable cement, 2) channelling, and 3) contamination.

Hopkins maintains that a fallacy was encountered here: by “concluding that the cement job was successful (due to signs of full returns) the job was affirmed as completed”.

This “declared success” meant that the cement bond log was deemed unnecessary, and the test team flew home. Hence, “By declaring the job a success, corners could be cut by omitting the cement evaluation test, and thereby save money”.

Tunnel vision engineering

Hopkins believes that the Macondo engineers succumbed to tunnel vision engineering. Here, their “eyes were fixated on one objective: a well design that was cheaper and would enable easier production when that time came”. At the same time, it’s said that “It was as if peripheral risk awareness was virtually eliminated”.

Hopkins traces this tunnel vision back to a management of change process that gave formal authorisation for the well design. The hazard of a potential loss of mud into the surrounding sands was called out specifically in the design, and hence, “This hazard was in other words primed in the engineers’ minds”. From the beginning, the design approval stage, “only one of at least four possible failure modes was addressed”.

Decisions in consensus-mode

Consensus-mode decisions are those “effectively made in settings where no one could be held accountable later on”.

Hopkins cites examples where 1) decisions were made in meetings intended to gather information, with the effect of making “no one actually responsible”, and 2) management of change documents were approved by a long string of signatures.

The signatures often came from the same people who developed the plan, and therefore “there was no independence, and the system of assurance served only to undermine the process”.

It’s argued that the MOC process was, in effect, “a consensus decision-making process; with the disturbing effect that responsibility was diluted”.

Well integrity test

Here the failed well integrity test is discussed. In short, the team were able to explain away the conflicting/unexpected pressure test results.

Hopkins argues that this process followed a social psychological process, including confirmation bias, normalisation of deviance, inadequate situational awareness and groupthink.

Confirmation bias is the “unconscious tendency of preferring information that confirms one’s beliefs; a tendency of selective use of information”.

Because well integrity tests rarely fail, expectations were biased towards the test succeeding; hence, the crew set out to confirm that the well was OK, rather than to test whether it was OK.

Moreover, prior to the day, the engineers had developed a decision tree. The integrity test was defined as a point in a sequence and not as a decision process; said differently, “the diagram presupposed that the test would be fine”.

Finally, the cement job had been declared a “textbook operation”.

Next was the normalisation of warnings, drawing on Vaughan’s normalisation of deviance in the Challenger accident, i.e. “a reconceptualization and normalization of a partial malfunction, until it at some point became assessed as an acceptable risk”.

Hopkins believes this happened at Macondo, where inconsistent findings were explained via reference to a ‘bladder effect’ theory. The bladder effect had ‘no credibility in comfortable hindsight’, but “provided a needed explanation of the pressure readings”.

Next was groupthink, being a process “that deters questioning the wisdom of the dominant view”. This may also be influenced by a ‘risky shift’, where groups are more inclined to make riskier decisions than any of their members would individually.

Another area to evaluate is the power imbalances within the group. Moreover, drilling is a complex area, one “typically reflected in an esoteric language with extensive use of slang expressions and acronyms” and also influenced by extensive peer pressure. It’s said that via “widespread use of teasing and humor”, questions seen as “unintelligent … are heavily sanctioned”. This is the prevailing industry culture that the BP decision-makers had to resist.

Hence, at first, one of the BP stakeholders was sceptical of the bladder effect theory, but on later reflection noted how his reluctance to accept it was found humorous by the drillers. Therefore, “The dominant view triumphed in the end, the test was declared as passed”, and these social effects made it “virtually impossible for them to act independently”.

State of mind

Hopkins explored why the drillers had so little apparent concern over the criticality of the emerging situation. Here they acted “in a state of mind where the job was defined as over”. Hence, in their minds, drilling was finished, and the well “had been declared safe twice”.

Their operating model was now just to finish up, and they were short on time as the tank cleaning personnel were arriving.

Re-analysis of Hopkins’ arguments

The authors then re-analyse Hopkins’ core arguments. They note, first, that the Deepwater Horizon drilling rig wasn’t operating under a barrier management regime, with specified barrier elements.

They highlight some core arguments, for instance how the engineers’ tunnel vision was traced back to the MoC document, or how accountability was “pulverized by consensus”. These facets are said to have an “organizational flavor”, but “are these features [of] organizational barrier elements?”

They argue that if we accept these as involving barrier elements, then this “suggests a long distance link between operational and organizational barrier elements”. That is, while operational barriers are in close vicinity to the issues, an organisational barrier contribution “travels a considerable distance, from managerial echelons straight into the heart of the barrier”.

It’s suggested that although a barrier’s proximity to a hazard may be a “critical criterion” when considering barriers, it shouldn’t be the only criterion.

Moreover, risk transfer is another aspect that travels distances in organisations, from upstream organisational triggers down to the integrity of the barrier element.

Next they discuss the well integrity test, and the triggers for declaring it a success. Was this an operational failure, or was it something else? They argue that it was “transforming in character”. That is, it “morphs from operational to organizational; it becomes an organizational premise that plays a key role in the subsequent well-integrity test”.

Hence, organisational triggers impact barriers despite being located far away, temporo-spatially. In their words, the key implication of their research is that “the organizational contribution may come from ‘somewhere else’”. This means that barrier functionality is implicated via a coupling to factors that transfer from other parts of the organisation.

Thus, these organisational influences are “possibilities that must be actively sought [and] prevented in defense in depth strategies”. In contrast, the current barrier focus is on front-end personnel and technical systems.

Moreover, a common thread across the factors Hopkins discussed is that they are social; there are “strong social psychological forces at play here” that influence downstream barrier performance. This may mean that “paying attention to what happens between people” is necessary to understand technical/operational barriers, as with facets of persuasion, pressure and power.

The bladder effect dilemma is said to highlight the triad of persuasion, pressure and power; these factors “cut across and be part of all the defenses considered here (cement job, well integrity test, and kick monitoring)”.

Therefore, along this logic, “the dynamics of social interaction have an additional role in terms of barrier elements”.

To summarise, they provide the following points for acknowledging organisational elements:

1. Widen the scope by loosening the categorical approach of barriers

2. Include organisational elements as part of risk influencing decisions that could influence barrier elements

3. However this is achieved, it needs to be flexible and sensitive to context

4. The impact of psychology “must be incorporated into safety barrier approaches”

5. Here, a focus should be placed on the “social forces and mechanisms that may well permeate” the intended controls

6. They suggest reviewing barriers against the ‘forces of psychology’, and how risks may spread across barrier functions via these ‘dynamics of social interaction’

Ref: Størseth, F., Hauge, S., & Tinmannsvik, R. K. (2014). Safety barriers: Organizational potential and forces of psychology. Journal of Loss Prevention in the Process Industries, 31, 50-55.

Study link: https://www.sintef.no/globalassets/project/pds/reports/safety-barriers-organizational-potential-and-forces-of-psychology_2014_journal-of-loss-prevention-in-the-process-industries.pdf

My site with more reviews: https://safety177496371.wordpress.com

LinkedIn post: https://www.linkedin.com/pulse/safety-barriers-organizational-potential-forces-ben-hutchinson-tinhc
