Petr Vaganov
University of St. Petersburg

Towards a Quantitative Approach to Safety Culture in High-Risk Technologies

    The term "safety culture" appeared in 1988, it has been stipulated by the Chernobyl catastrophe. One of the major lessons of such disasters as Chernobyl, Three Mile Island, Bhopal, and the Challenger explosion, was the relevance of human reliability problem. Another major issue that surfaced after these catastrophes was the importance of management and organizational factors, which are within the purview of safety culture and the human factors. Many definitions of safety culture abound in the academic literature, this led to a certain confusion that surrounds this concept today. Different definitions will be collated in the presentation; a necessity to develop a quantitative approach to safety culture will be demonstrated.

How is safety culture related to other cultures?

    According to Schein (1996), three particular cultures exist in every organization: an "operator culture", an "engineering culture", and an "executive culture". Cooper (2000) states that safety culture is a sub-facet of organizational culture. Other scholars argue that national culture matters most. To combine the organizational and national aspects of culture, the author proposes the term "national dimensions of organizational culture".

How to quantify national dimensions of organizational culture?

    Hofstede (1980) put forward a cultural framework in which any national culture is characterized along four dimensions: Power Distance, Uncertainty Avoidance, Individualism-Collectivism, and Masculinity-Femininity. Hofstede's dimensions can be quantified on a scale from 1 to 100. The presentation will give the empirical results of Bollinger and Fey (1994, 1999) concerning the national dimensions of organizational culture in Russia and the USA.
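
    To make the quantification concrete, a national profile can be represented as a point in Hofstede's four-dimensional space, and the gap between two profiles reduced to a single number. The following minimal Python sketch illustrates the idea; the scores are placeholders for illustration only, not the empirical results of Bollinger and Fey:

    from dataclasses import dataclass
    from math import sqrt

    @dataclass
    class HofstedeProfile:
        # A national culture expressed as Hofstede's four dimensions,
        # each scored on a 1-100 scale.
        power_distance: float
        uncertainty_avoidance: float
        individualism: float   # high = individualist, low = collectivist
        masculinity: float     # high = "masculine", low = "feminine"

        def distance(self, other: "HofstedeProfile") -> float:
            # Euclidean distance between two profiles: a crude
            # single-number measure of cultural separation.
            diffs = (
                self.power_distance - other.power_distance,
                self.uncertainty_avoidance - other.uncertainty_avoidance,
                self.individualism - other.individualism,
                self.masculinity - other.masculinity,
            )
            return sqrt(sum(d * d for d in diffs))

    # Placeholder scores for illustration only (not empirical results):
    culture_a = HofstedeProfile(90, 85, 40, 55)
    culture_b = HofstedeProfile(40, 45, 90, 60)
    print(f"cultural distance: {culture_a.distance(culture_b):.1f}")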

What is the role of human errors in safety culture?

    Operators are often the weak link in a complex system and one of the biggest sources of error in high-risk technologies. Reason (1990) presented a conceptual framework within which it is possible to locate the origins of basic human errors. His classification of unsafe acts includes errors (unintended actions) and violations (intended actions); errors are subdivided into slips, lapses, and mistakes, whereas violations are subdivided into routine violations, exceptional violations, and acts of sabotage. The author proposes to add one more category to Reason's taxonomy: delinquency. Its roots lie in such factors as complacency, overconfidence, and arrogance, and its characteristic attitude is that "the safety process is a burden".
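
    Because the extended taxonomy is a strict classification, it can be encoded directly as data. The short Python sketch below captures the two branches and the proposed fourth violation category; the encoding itself is illustrative, not part of Reason's original scheme:

    from enum import Enum

    class Error(Enum):
        # Unintended unsafe acts (Reason, 1990).
        SLIP = "slip"        # attentional failure during execution
        LAPSE = "lapse"      # memory failure
        MISTAKE = "mistake"  # planning failure

    class Violation(Enum):
        # Intended unsafe acts (Reason, 1990), plus the author's
        # proposed fourth category.
        ROUTINE = "routine"
        EXCEPTIONAL = "exceptional"
        SABOTAGE = "sabotage"
        DELINQUENCY = "delinquency"  # proposed addition, rooted in
                                     # complacency and overconfidence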

How to quantify human errors?

    An approach called human reliability assessment (HRA) is a promising tool for quantifying the effects of human error on the performance of high-risk systems. Alongside qualitative techniques such as task analysis and human error analysis (proposed by Rasmussen, Reason, Norman, et al.), HRA can be regarded as a promising means of improving safety culture. According to Kirwan, the three basic goals of HRA are to (1) identify human error ("what can go wrong?"); (2) quantify human error ("how often will a human error occur?"); and (3) reduce human error ("how can the impact of human error be prevented, suppressed, or reduced?").
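
    As an illustration of the quantification step, several HRA methods scale a nominal human error probability (HEP) by multiplicative performance shaping factors. The Python sketch below follows that generic multiplicative scheme; it is not Kirwan's specific method, and all numbers are illustrative:

    def adjusted_hep(nominal_hep: float, multipliers: list[float]) -> float:
        # Scale a nominal human error probability (HEP) by performance
        # shaping factor (PSF) multipliers, capping the result at 1.0.
        hep = nominal_hep
        for m in multipliers:
            hep *= m
        return min(hep, 1.0)

    # Illustrative numbers only: a routine task (nominal HEP 0.001)
    # performed under time pressure (x10) by a well-trained crew (x0.5).
    print(adjusted_hep(1e-3, [10.0, 0.5]))  # -> 0.005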