What science has to say about safety — and the role of people in raising standards


Well, first of all, what is science? And what is safety? Is there even such a thing as safety science? Or is understanding safety and creating safe and resilient processes at work perhaps more about understanding the array of interactions going on in the workplace where people, organisations and technology interface?

This can turn into a more philosophical debate, and it has already. What is more interesting to investigate here is the difference between addressing safety directly, i.e., treating it as a concept in its own right (often manifested in compliance and behavioural control), and addressing safety indirectly, i.e., relying on the basic premise that safety (and accidents, for that matter) is a byproduct of all the messy stuff that goes on when people in organisations do work in pursuit of success and growth.

What interests us here at Scoutbase is the latter — we do not care much about whether safety is considered science or not.

But in this blog post, we will highlight some points raised by voices in the academic field, and look at how such points connect to our mission of bringing the seafarers’ knowledge to the forefront in pursuit of safety.


On bureaucracy and safety

So what does looking at safety directly mean? It is pretty much what we try to do when we take an engineering approach of closing known gaps.

Let me explain: when we see a problem with safety, we skip right to writing a new procedure that is supposed to solve the issue. We tell people to do something in a different way that, in hindsight, would have prevented an incident. Or we appeal to people's moral code because we believe that they will not do their best of their own free will. This is really a bureaucratic approach, and it rests on the basic assumption that the environments we perform our work in can be fully understood and managed. An array of tools and approaches has been produced to help us; a well-known one is the Swiss Cheese model, which underlies most root cause analysis applications.

When safety is addressed directly like this, 'Human Error' often becomes a cause of bad things in some way.

A holistic view of work and safety

But what if we say that safety is only something we imagine, that it does not exist as a concept we can address in itself? Rather, what we concern ourselves with are the conditions of people's performance at work: for example, do people have the right tools (literally and figuratively) to carry out effective and safe work, and does the wider organisation support actual work?

Figuring out these things takes an approach very different from the bureaucratic one; what is needed is a naturalistic approach, in which we seek to understand the perspectives of the people doing daily work in their context and incorporate them into workflow designs.

It is crucial to understand where and how people must balance the goals of working thoroughly, economically (in the short run) and under time constraints. Where and how are they expected to work in an idealised way that is just not possible when reality kicks in?

Surely people will make mistakes, but when safety is addressed indirectly like this, it begins to surface that Human Error is often the last manifestation of a lot of (sometimes messy) stuff going on in high-risk workplaces.


Improving safety and work with a holistic and proactive perspective

These two different approaches call for different means of improving safety and making work more effective: focusing on the individual versus zooming out to look at contributions from the whole environment that work is performed within.

But what is it that the smart guys (the scientists) are actually talking about, in concrete terms? At the more popular end of the spectrum, Dekker puts it something like this:

“To understand safety, an organization needs to capture the dynamics in the banality of its organizational life and begin to see how the emergent collective moves toward the boundaries of safe performance”.

The essence here is that we should be interested in all the everyday stuff that together determines if we are heading towards disaster — which in itself may not look dangerous during daily routines.

A concrete maritime example could be how adding additional ports to an already tight schedule asks crew members on board ships to stretch their capacity even further.

Changes like this one are usually not a problem, because people are outstanding at adapting to their circumstances. But when something does go wrong on board, it takes great effort to trace it back to organisational decisions like the one in the example. It is easier to focus on more immediate factors (often boiled down to one or a few root causes and how people should act better).

Another concept often mentioned is that we need to avoid the "Psychologist's Fallacy": the phenomenon where well-intentioned observers think that their distant view of the workplace captures the actual experience of those who perform work in context.

Distant views can fail to see important aspects of the actual work situation and thus miss critical factors that determine human performance in that field of practice. While this is most likely common sense to a majority of professionals, including those in the safety domain, it is seldom practiced very well in the maritime industry, and many will probably recognise the situation where 'the guys behind the desk at the office know what is best'.

So the question is: what can we do to grow our capacity for learning from frontline experience, and how can we do it at scale to make it financially viable?

Scientists provide other clues to this. For example, in discussing safe and reliable software development and design, Baxter and Sommerville point out that:

“Taking software developers into the workplace, even for a short time, can reveal to them the complexity of work and the difficulties faced by system users.”

They go on to ask a very interesting question:

“How can different types of knowledge be captured at low cost and maintained in an accessible way? …We need to discover techniques that capture information from normal work activities with minimal intervention from the people involved in these processes.”

Translated into the context of safety-critical work in the maritime domain, the above question is exactly the one we are working to answer with Scoutbase:

How can we capture the vast knowledge workers on the frontlines have about their work contexts and about the challenges they experience on a daily basis? And how can we make this knowledge easily accessible across the whole workforce?
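
To make this less abstract, here is a minimal sketch in Python of what capturing information from normal work activities with minimal intervention could look like: a one-tap pulse response from a seafarer, aggregated shore-side into a simple per-vessel indicator. All names and structures here are illustrative assumptions for the sake of the example, not Scoutbase's actual design.

```python
from collections import defaultdict
from dataclasses import dataclass
from statistics import mean

@dataclass
class PulseResponse:
    """One low-friction data point from normal work: a single tap plus an optional note."""
    vessel: str
    question: str   # e.g. "Was the port schedule workable today?"
    score: int      # 1 (strong no) to 5 (strong yes)
    note: str = ""  # optional free-text context from the seafarer

def leading_indicators(responses):
    """Average raw frontline responses per vessel and question.

    Low averages flag where work-as-imagined and work-as-done may be
    drifting apart, before an incident forces the conversation.
    """
    buckets = defaultdict(list)
    for r in responses:
        buckets[(r.vessel, r.question)].append(r.score)
    return {key: round(mean(scores), 2) for key, scores in buckets.items()}

# Example: two (hypothetical) vessels answering the same rotating question.
responses = [
    PulseResponse("MV Example", "Was the port schedule workable today?", 2,
                  "Two extra calls this week; rest hours squeezed."),
    PulseResponse("MV Example", "Was the port schedule workable today?", 3),
    PulseResponse("MV Sample", "Was the port schedule workable today?", 5),
]
print(leading_indicators(responses))
```

The point of the sketch is the design choice the quote hints at: the capture step costs the seafarer seconds, while the optional free-text note preserves the context that a distant observer would otherwise miss.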

Stay tuned for more updates on the development of Scoutbase, and feel free to get in touch for more information and any questions you might have.

Mads Ragnvald Nielsen, Co-founder and CEO at www.scoutbase.com, [email protected]

scoutbase.com | Monthly newsletter | Twitter | LinkedIn
