Developing a Robust Safety Culture


The importance of a “safety culture” gained prominence after the 1986 Chernobyl disaster. The investigation into the accident identified the lack of a “safety culture” as one of the key causes leading up to the catastrophe: “It was a direct consequence of Cold War isolation and the resulting lack of any safety culture.” The INSAG-7 report concluded that “The accident . . . flowed from a deficient safety culture at all levels of the power station.”i Studies by Flin and colleagues confirmed that there is a direct correlation between the quality of an organisation’s safety culture and its safety performance.ii

Similar outcomes surfaced in other calamities. After the Esso Longford explosion in Melbourne, Australia, a Royal Commission was established. It determined that “the company’s ‘safety culture’ was more oriented towards preventing lost time due to accidents or injuries, rather than protection of workers and their health.”iii The Cullen Inquiry into the Piper Alpha accident found that “It is clear that a combined approach is needed to ensure the importance of safety is recognised. A combination of strong leadership, vigorous process safety and a curiosity-invoking safety culture could hold the key to ensuring that society never has to encounter the tragedy that the industry witnessed 25 years ago in the North Sea off the coast of Scotland.”iv

Unfortunately, both of BP’s debacles reached the same verdict. The Baker Panel cited a “weak safety culture” as playing a role at BP Texas City, as the company “did not adequately follow the Department of Energy’s published safety recommendations”.v Amongst other issues, the Chemical Safety and Hazard Investigation Board reported “a lack of corporate oversight on both safety culture and major accident prevention programs [and] a focus on occupational safety and not process safety”vi as contributing factors. Bill Reilly, co-chair of the Oil Spill Commission for the Deepwater Horizon blowout in the Gulf of Mexico, said that “there was not a culture of safety on that rig”.vii The oil spill was not an isolated incident; it was symptomatic of a larger issue. Reports described a history in which taking short cuts and violating safety procedures had become accepted practice. This, in turn, developed into a poor safety culture.

It has become apparent that a safety culture, or lack thereof, can have a significant impact on a company’s safety performance.viii The OECD Nuclear Energy Agency defined a safety culture as “an organisational atmosphere where safety and health are understood to be, and is accepted as, the number one priority”.ix It referred to a safety culture as one where safety is “an overriding priority”. A culture that embraces safety as a value doesn’t just happen; it needs to be forged. Safety must be made a priority and receive the same attention, energy, and resources as the other critical success factors in the organisation. James Reason argued that an organisation’s safety culture is ultimately reflected in the way safety is managed in the workplace.x I want to offer four basic and practical ways to help improve or reinforce your safety culture:

1. What Gets Talked About

The stories we share and what we emphasise in our regular conversations send a strong message about what is important to us. They reflect our values, our focus areas, where we want to spend our time and money, and what we want to pay attention to. Typically, the main focus areas will be the drivers of the business: targets, deadlines, budgets, performance criteria and profitability. None of these will ever go away. The question is: if safety is a value to us, do we integrate its importance into these key conversations? How often do we express our concern for people? Do we talk about our people as mere “tools” to get the job done or as real assets to the life and success of the company? What signals are we sending by what we do and don’t say? Without exception, all of the companies we consult with have some form of safety meeting before the start of a shift. Too often these meetings are rushed through so workers can quickly get to their work stations.

People quickly pick up on what is important. The company may “sing” safety, but the workers hear production. If the company is not willing to give the necessary time (I am not talking about wasting time) to make sure its employees approach their responsibilities safely, it becomes evident to everyone that safety is a mere afterthought to those in charge. What I have also noticed is that a tool box talk will spend most of its time detailing the critical tasks for the day. Just before everyone disperses, a casual comment like “work safely” is thrown in so that “safety” can be said to have been covered! But what does “be safe” mean? It is too vague, too general, and has no connection to the work at hand. Then we wonder why workers don’t take safety seriously. The message is explicit: “production is king”! In reality, production does need to be “king”; without profitability, no one will have a job. Having said that, only when leaders take the time to discuss and explain how each task needs to be approached in order to execute it safely will they see the change in behaviour towards safety they want. Safety cannot be an addendum. It only becomes a value when we give it prominence by integrating it into our discussions on daily planning, production targets, strategy, HR, budgets, and so on.

2. What Gets Measured Gets Done

It is simply insufficient to talk about safety while only measuring process performance and production. When we do that, the message is loud and clear: safety is just lip service around here. If workers are measured against production targets only, then that is where they will focus. It is understandable to measure tonnes moved, structures erected, and minerals extracted, because these are tangible. It is more complex to truly measure whether someone is working safely. That is why many companies monitor lagging indicators such as their lost time injury frequency rate (LTIFR) or incident severity. However, measuring lagging indicators has two downfalls. Firstly, it is reactive: the incident has already occurred. Secondly, it creates only an illusion of safety. It is possible for someone to drive at 220 km/h all the time and never be in an incident. If we measured that driver’s LTIFR, it would indicate that he is a very safe driver even though the very opposite is true. On the other hand, you can drive under the speed limit all your life and, for a variety of unfortunate reasons, be involved in several accidents. This doesn’t necessarily mean that you are a bad or unsafe driver. This anomaly was shockingly demonstrated when the management of the Deepwater Horizon was celebrating a significant LTI-free period on the very day of the Macondo well blowout.
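The lagging nature of a measure like LTIFR is visible in the arithmetic itself. As a rough sketch, using the commonly cited convention of lost time injuries per million hours worked (the multiplier and injury definitions vary between organisations and jurisdictions):

```python
def ltifr(lost_time_injuries: int, hours_worked: float) -> float:
    """Lost Time Injury Frequency Rate: lost time injuries per million hours worked."""
    return lost_time_injuries * 1_000_000 / hours_worked

# A crew can work unsafely for years and still post a perfect score:
print(ltifr(0, 2_500_000))  # 0.0 -- looks "safe" right up until something goes wrong
print(ltifr(1, 2_500_000))  # 0.4 -- the metric only moves after the harm has occurred
```

The rate stays at zero no matter how risky the behaviour becomes, and it only moves once an injury has already happened, which is exactly the illusion the speeding-driver example illustrates.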

Another problem with measuring safety is that it can quickly turn into a paper exercise. As soon as workers are instructed to hand in ten non-conformances, or a foreman must complete seven PTOs for the week, the reason behind the activity quickly goes out the window and completing the required number becomes the goal. Connected with this are safety rewards. Again, this is tricky, because you want to recognise and reward people for their dedication and commitment to safety. However, reaching a target such as a million LTI-free hours doesn’t mean you didn’t have an LTI; it may just mean none were reported. Who would report one if it means losing their bonus?

Companies also need to understand that some of their goals (what they measure) are in conflict with each other. Airlines are a classic example of this: they push both on-time arrivals and fuel efficiency.

Sometimes, in the effort to meet these conflicting expectations, safety is compromised! One of the companies we have the privilege of consulting with has taken two daring steps in this regard. Historically, safety was on everyone’s scorecard. If there was an incident, the site was penalised. It didn’t take long to notice that people were hiding incidents. Being a learning organisation is a core philosophy of how they wanted to do business, and having safety on the scorecards became counterproductive to what they wanted to achieve: learning from their mistakes so as not to repeat them. A bold move was taken: if an incident occurred, the site would be penalised as usual. But if the site could prove to the executives that it had taken the time to truly get to the underlying causes of the incident, put everything in place to ensure it would not be repeated, and shared this learning with other similar sites, the lost points would be redeemed. This sent a decisive and positive message: safety is an absolute value. If something happens, there are still consequences. Yet it opened the door for people to start sharing their concerns and even admit to their mistakes, because it wasn’t about punishing them. The focus was on learning and saving lives. In a short period, this courageous move changed the culture of “fear of failure” into one of open and honest reporting, sharing and learning across the Group.

It wasn’t long after that they took LTIs out of the scorecard altogether because, as I have already mentioned, they are not a true indication of your safety culture. More emphasis is now being placed on identifying leading indicators when it comes to safety. Sidney Dekker argues that “safety is not the absence of something, but it is the presence of something”.xi This calls for leaders to apply their minds when it comes to measuring safety. We need to take the time to ascertain what behaviours and processes are needed to create the safety culture we want, and monitor them proactively.

3. What Gets Rewarded

There is a well-known principle in life: what gets rewarded or recognised gets repeated. You cannot get what you want by focusing on what you don’t want. As I highlighted, safety is not about the absence of something, such as accidents or deviations; it is about the presence of something, which you need to quantify! Most companies give some form of bonus or “reward” for meeting production targets. Surprisingly few have a positive safety category in their reward scheme. In fact, most companies still believe in and faithfully apply a “punitive” approach towards safety performance. This is usually because the leaders haven’t taken the time to determine how they are going to measure safety, so it remains elusive and intangible. It almost becomes mystical, so we resort to ridiculous phrases like “zero harm”.

I have seen first-hand how managers will praise workers for meeting production targets, knowing all too well that the only way they reached those targets was by taking short cuts. Again, the message is well received. What is appalling is that if something does go wrong, the same manager will then discipline that worker for negligence and for not following the very procedures he was previously praised for ignoring.

Last year, while consulting at a large mining operation, I was abruptly interrupted while busy with an important conversation on my cell phone. A rather young man issued me a “yellow card”. At first I was surprised, to say the least, my emotions ranging from upset to confused. Because of the amount of moving machinery, one of that site’s rules is that you may not talk on your cell phone except in the office or other designated areas. I simply wasn’t aware of this rule and, to make it worse, I had been with the Risk Manager earlier as he made a call in the exact same spot. After a quick self-evaluation, I so appreciated this young man’s boldness that I immediately moved into the “safe zone” and then, in his presence, called the site manager. I told the manager what this person had done and insisted that we find some way to express our appreciation for his dedication to safety. This is exactly the type of behaviour we want to recognise, sending the message that we desire to see more of it.

4. How You React to Failure

The most powerful message about our safety culture is the way leadership responds when something goes wrong. It is wonderful to have lovely framed vision and value statements hanging in the foyer, but the real values are seen in the way problems are addressed and resolved. Todd Conklin said that “when something goes wrong your first reaction and everything you do from there on establishes the culture of the company. If you want to change your safety culture, change how you react to failure or unexpected events”.xii

Conklin shared a story in which a supervisor shut a roller coaster down during peak season because he was worried about the noise the motor was making. The inspection revealed that the motor was dangerously close to breaking, which would have had a catastrophic outcome. The supervisor received an award from the town’s mayor, and his company gave him a bonus. This sends the right kind of message: listening and responding to warning signs is the right thing to do; it saves lives. Conklin then asks a critical question: what would the company have done if the supervisor had closed down the ride only to find minor damage to the motor that could have been repaired at a later stage? Would they still have praised him for being so brave and decisive under the overwhelming pressure that “the show must go on”, or would he have been disciplined for costing the company a lot of money? More importantly, how would your company have reacted?

Recently a friend of mine seriously hurt himself at home. His friends and work colleagues came to him and offered their support; he was showered with care. It was an appropriate response for someone who had hurt himself. What is really sad is that if he had had the same accident at work (it wasn’t deliberate or negligent), instead of being shown care and concern he would probably have been disciplined and treated like a delinquent. Why is this so?

The phrase “name, blame, shame and retrain” describes common practice in many organisations. This approach fails to deal with the deeper underlying causes of mishaps. When something unexpected happens, do you quickly try to get rid of the problem by finding the guilty party and disciplining them? Or do you make sure everyone is okay? Do you set aside time for a comprehensive investigation? Do you go beyond what happened to why and how it happened? Do you meticulously seek out learnings so you can take the necessary strategic steps to prevent this and similar events from taking place again? If only a disciplinary stance is taken, everyone expects that if they make a mistake, they will be punished. The reality is that people make mistakes and things do go wrong, but because no one wants to get into trouble, they start to hide mistakes from their managers. This means the trust between managers and workers has eroded, which leads to an unhealthy working environment. As a result, managers have an inaccurate assessment and understanding of their safety culture. Critical opportunities to learn and take preventative steps are lost, only increasing the likelihood of an incident.

At the end of the day, creating a robust safety culture will take dedication, and leaders who are willing to put their time, resources and effort into giving safety the priority it deserves. Walking the talk is no longer a nice catchphrase; it is an absolute necessity. We need to reflect and take stock of what we talk about, what and how we measure safety, what we reward, and how we deal with failures.

Dr Brett Solomon is the CEO of The Kinetic Leadership Institute and is a recognised leader in combining neuroscience, change management and leadership theory to drive cultural transformation processes. He also specialises in neuroleadership, especially when it comes to understanding what drives human behaviour and how to influence it. 

 Using this knowledge, Brett has been involved in projects throughout Australia, the petrochemical and power industry in Saudi Arabia, as well as Dominion Diamond in Canada. Locally he has had the privilege of working with BHP Billiton, Impala Platinum, De Beers, PPC Cement, Anglo American, Glencore Alloys, Aveng Moolmans and Assmang.

References:


i. IAEA Report INSAG-7, The Chernobyl Accident: Updating of INSAG-1, Safety Series No. 75-INSAG-7. (1992). Vienna: International Atomic Energy Agency.

ii. Flin, R., Mearns, K., O’Connor, P., & Bryden, R. (2000). Measuring safety climate: Identifying the common features. Safety Science, 34, 177–192.

iii. Dawson, D., & Brooks, B. (1999). The Esso Longford Gas Plant Accident: Report of the Longford Royal Commission. Melbourne: Parliament of Victoria.

iv. Cullen, W. D. (1990). The Public Inquiry into the Piper Alpha Disaster. London: H.M. Stationery Office.

v. Safety culture weakness cited in BP accident, Department of Energy Office of Health, Safety, and Security Safety Advisory No. 2007-02. (2007). Department of Energy. Retrieved 18 November 2013, from http://www.hss.doe.gov/sesa/corporatesafety/advisory/SAd_2007-02.pdf

vi. U.S. Chemical Safety Board concludes “organizational and safety deficiencies at all levels of the BP Corporation” caused March 2005 Texas City disaster that killed 15, injured 180. (2007). US Chemical Safety Board. Retrieved 18 November 2013, from http://www.csb.gov/u-s-chemical-safety-boardconcludes-organizational-and-safety-deficiencies-at-all-levels-of-the-bp-corporation-caused-march-2005texas-city-disaster-that-killed-15-injured-180/

vii. Adams, P. (2010). Gulf oil spill: President’s panel says firms complacent. BBC News. Retrieved 18 November 2013, from http://www.bbc.co.uk/news/world-us-canada-11720907

viii. Shannon, H. S., Mayr, J., & Haines, T. (1997). Overview of the relationship between organizational and workplace factors and injury rates. Safety Science, 26, 201–217.

ix. Traits of a Healthy Nuclear Safety Culture. (2013). Atlanta, GA: Institute of Nuclear Power Operations. Retrieved 18 November 2013, from http://nuclearsafety.info/wp-content/uploads/2010/07/Traits-of-aHealthy-Nuclear-Safety-Culture-INPO-12-012-rev.1-Apr2013.pdf

x. Reason, J. (1998). Achieving a safe culture: Theory and practice. Work and Stress, 12, 293–306.

xi. Dekker, S. (2006). The Field Guide to Understanding Human Error. Hampshire, England: Ashgate.

xii. Conklin, T. (2012). Pre-accident Investigations: An Introduction to Organisational Safety. Surrey, England: Ashgate.