All systems are composed of three basic elements: people, processes, and technology. When managing safety, all three elements must work cohesively to produce the desired results.
The cost of workplace injuries is staggering: employers pay as much as $1 billion per week in direct and indirect workers' compensation costs. Indirect costs, such as training replacement employees, accident investigation, and lost productivity, can be up to 20 times higher than direct medical and wage-replacement costs.
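To make the multiplier concrete, the total cost of an injury can be sketched as a simple calculation. This is only an illustration: the 20x figure is the upper bound cited above, and the $40,000 direct claim is a hypothetical number, not a statistic from any source.

```python
def total_injury_cost(direct_cost, indirect_multiplier=20):
    """Estimate the total cost of a workplace injury.

    direct_cost: direct medical and wage-replacement costs.
    indirect_multiplier: ratio of indirect costs (training
        replacement employees, accident investigation, lost
        productivity) to direct costs; 20 is the worst-case
        multiple cited above.
    """
    indirect_cost = direct_cost * indirect_multiplier
    return direct_cost + indirect_cost

# Hypothetical example: a $40,000 direct claim at the upper bound
print(total_injury_cost(40_000))  # 840000
```

Even at a far smaller multiplier, the indirect portion dominates, which is why injury prevention usually costs less than injury response.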
The industries with the largest percentage of workers' compensation claims include air travel (7.3 percent of workers injured on the job), beverage and tobacco manufacturing (6.9 percent), and couriers and messengers (6.6 percent).
Integrated risk management focuses on the overall risk reduction of the organization. This is achieved through quantitative and qualitative analysis of both the inherent risks and the effectiveness and impact of sector-specific risk management processes.
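A minimal sketch of what "integrated" means in practice: each sector scores its hazards quantitatively (here, likelihood times severity on simple 1–5 scales) and the scores are rolled up into one organization-wide profile rather than being judged sector by sector. The sectors, hazards, and scales below are hypothetical, chosen only to illustrate the aggregation.

```python
RISKS = [
    # (sector, hazard, likelihood 1-5, severity 1-5) -- illustrative values
    ("flight ops",  "runway excursion",    2, 5),
    ("maintenance", "tooling left in bay", 3, 3),
    ("ground ops",  "vehicle collision",   4, 2),
]

def risk_score(likelihood, severity):
    """Quantitative score for a single hazard."""
    return likelihood * severity

def organizational_risk(risks):
    """Aggregate per-hazard scores into an organization-wide profile."""
    profile = {}
    for sector, hazard, likelihood, severity in risks:
        profile[sector] = profile.get(sector, 0) + risk_score(likelihood, severity)
    return profile

print(organizational_risk(RISKS))
```

The qualitative half of the analysis, judging how well each sector's own risk management process actually works, has no neat formula; in practice it adjusts or overrides scores like these.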
New safety hazards and risks continuously emerge and must be mitigated. As long as safety risks are kept under an appropriate level of control, a system as open and dynamic as aviation can still be kept safe. It is important to note that acceptable safety performance is often defined and influenced by domestic and international norms and culture.
The B-737 MAX accidents were inevitable, as were accidents like the Fukushima Daiichi nuclear plant meltdown and the Deepwater Horizon blowout. These accidents are prime examples of failed regulation, ignored warnings, and workplace errors.
The first response from regulators and operators after a major disaster is denial of the accident itself. After the Union Carbide accident in Bhopal, plant officials initially denied any chemical release and then claimed it was not dangerous, even as they themselves were fleeing upwind of the toxic fumes. The Soviet Union refused to admit there had been an accident at Chernobyl, even after the Swedish nuclear agency had concluded that the radioactive materials it was detecting had to have come from Chernobyl. Worse yet, the USSR waited two days before evacuating the town next to the plant. BP officials and American officials consistently minimized the damage of the oil spill in the Gulf of Mexico and kept reporters and scientists away from the scene.
Boeing, too, has denied being aware of design defects that led to accidents on its B-737 fleet.
At Fukushima, the regulatory authorities required a seawall slightly taller than the largest tsunami the locale had experienced in the previous 1,000 years, so the danger was indeed recognized. But the seawall design was based on probabilistic thinking rather than on what was possible, and it proved hopelessly inadequate against the 2011 tsunami.
Before Fukushima, 14 lawsuits charging that risks had been ignored or hidden were filed in Japan, revealing a disturbing pattern in which operators underestimated or hid seismic dangers to avoid costly upgrades and keep operating. All of the lawsuits were unsuccessful. A member of the Japanese parliament warned in 2003 that the nuclear plants were not sufficiently protected, and a seismology professor at Kobe University resigned in protest from a nuclear safety board in 2006 over its lack of attention to earthquake and tsunami risks.
At Boeing, a series of B-737 accidents was attributed to a defective rudder design. Boeing refused to admit the defect until the SilkAir Flight 185 accident in 1997. Following the accidents, Boeing redesigned the 737's rudder system and paid for the redesign to be retrofitted globally. While it was a costly remedy for Boeing, it no doubt prevented further accidents. Earlier FAA intervention could have saved many lives.
Organizational learning is a key component of both good safety cultures and high-reliability organizations. But learning can be thwarted by well-known difficulties in handling information: too much information, inappropriate communication channels, incomplete or inappropriate information sources, or failure to connect available data. These difficulties can pose acute challenges for safety. For example, an incomplete or inaccurate problem representation might develop at the level of the organization as a whole and thus influence the interpretations and decisions of the organization's individual members. Such a representation may arise through organizational rigidity of beliefs about what is and is not to be considered a "hazard."
The Fukushima disaster is an instructive example of such organizational thinking. The plant owners developed a group mindset about the risks of tsunami, minimizing the significance of the knowledge that flooding across the site could lead to a total loss of power (and hence the cooling function). They also failed to take account of the risk of a tsunami larger than the projections made by the Japanese Society of Civil Engineers, even though it was clear that such an event could disable the plant and seriously damage the reactors, with catastrophic consequences.
According to Charles Perrow, people, like organizations and their leaders, seek wealth, prestige, and a reputation for integrity. In the process, they occasionally find it necessary to be deceitful, engaging in denials and coverups, cheating and fabrication. Everyone has violated regulations, failed to plan ahead, and bungled in crises. But people are not, as individuals, repositories of radioactive materials, toxic substances, and explosives, nor do they sit astride critical infrastructures. Organizations are, and do. The consequences of an individual's failures can only be catastrophic if they are magnified by organizations. The larger the organization, the greater the concentration of destructive power; and the larger the organization, the greater the potential for political power that can shape regulations and ignore warnings.
Charles Perrow and Normal Accident Theory
Everything is subject to failure: designs, procedures, supplies and equipment, operators, and the environment. Governments and businesses know this and design safety devices with multiple redundancies and all kinds of bells and whistles. But nonlinear, unexpected interactions of even small failures can defeat these safety systems. If the system is also tightly coupled, no intervention can prevent a cascade of failures that brings it down. Far more commonly, however, systems with catastrophic potential fail because of poor regulation, ignored warnings, production pressures, cost cutting, poor training, and so on.
There has been a long-running debate over self-regulation by high-risk industries. In most countries, high-risk industries such as the nuclear industry have been regulated by the government, since profit-driven companies have been found to have a tendency to cut corners.
An effective safety management system, implemented by the operator with oversight by the regulator, will keep track of risk mitigation and emerging risks. Safety is a matter of perception, and if it is left to a few individuals, their biases can creep in and ultimately distort the real picture. There must therefore be adequate oversight by the regulator to ensure that vested interests do not hijack the safety agenda.
In the case of Boeing:
- The B-737's known rudder design defect
- The B-747's known center fuel tank wiring fault
- The B-787's known APU battery design flaw
- The B-737 MAX 8's known MCAS software issue
The regulator was aware of these design flaws but, for reasons under investigation by the state, did not take timely action to prevent accidents. The industry has ended up paying billions of dollars in compensation to individuals, but it has been reluctant to incur losses up front by enforcing safety. It all comes down to safety culture and the ability of the leadership to resist political and lobbyist pressures.
The above is an extract from the FAA Risk Management Handbook. Risk perception is an important factor when assessing risks.
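The handbook's central tool is a risk assessment matrix that crosses likelihood against severity. The sketch below is a simplified illustration, not the FAA's actual matrix: the category names follow common FAA usage, but the numeric scoring and the banding into high/medium/low are assumptions made for this example.

```python
# Ordered from lowest to highest.
LIKELIHOOD = ["improbable", "remote", "occasional", "probable"]
SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]

def assess(likelihood, severity):
    """Map a likelihood/severity pair to a coarse risk band."""
    score = (LIKELIHOOD.index(likelihood) + 1) * (SEVERITY.index(severity) + 1)
    if score >= 9:
        return "high"    # unacceptable: mitigate before operating
    if score >= 4:
        return "medium"  # acceptable only with mitigation
    return "low"         # acceptable

print(assess("probable", "catastrophic"))  # high
print(assess("remote", "marginal"))        # medium
```

The value of a matrix like this is less in the arithmetic than in forcing two separate judgments, how likely and how bad, so that a single individual's risk perception cannot quietly collapse both into one optimistic guess.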
Safety is not a revenue centre, but it can prevent losses.