
Paul Naysmith

Quality Insider

What the Fukushima Is a Risk Assessment?

Or, how the Fukushima disaster could have been prevented

Published: Monday, August 6, 2012 - 10:24

On Friday afternoon of March 11, 2011, an earthquake of 9.0 magnitude was detected about 45 miles off the coast of Japan. One of the most powerful ever recorded, it moved the 88,000-square-mile island of Honshu 8 feet to the east. It also set off a 130-ft tidal wave (the same height, ironically, as the world’s tallest water slide, in Brazil).

Travelling at 70 miles an hour, the wave surged four miles inland, destroying or washing away everything in its path. To this day, substantial debris, like a Harley-Davidson motorcycle, continues to wash up on the western shores of Canada and the United States.

The World Bank called it one of the most expensive natural disasters of all time. Certainly it was costly to the estimated 16,000 people who lost their lives.

Already some of these facts are slipping from our collective memory, but most people will continue to associate this earthquake with the subsequent disaster at Japan’s Fukushima Daiichi nuclear power station.

After a year of hard work containing the various issues at the plant, a report was released to the world. “How could such an accident occur in Japan, a nation that takes such great pride in its global reputation for excellence in engineering and technology?” asked the chairman of the investigating committee. This is a very powerful question and prompts me to ask myself why I’m now reading the report. How can my company benefit from me reading it?

I don’t know if you’re involved in reviewing your company’s internal reports of failure investigations. I’d like to think you’re familiar with an investigation process that requires your company to get to the root cause of a problem or issue. I’d even go so far as to presume that you use techniques, such as the 5 Whys, to help you get there.

But let me jump to the conclusion of the six-month investigation of the Fukushima disaster, where the root cause lies: “Therefore, we conclude that the accident was clearly manmade,” states the report. I have to say I struggled with this. An earthquake that, in an instant, erased buildings and access to the plant? A mega-wave that overcame the 13-ft sea defense at the site? These were somehow “manmade”?

“What the Fukushima?” I’m thinking. I’m compelled to read on to discover the reasoning behind this statement.

“The operator, the regulatory bodies, and the government body promoting the nuclear power industry all failed to correctly develop the most basic safety requirements—such as assessing the probability of damage, preparing for containing collateral damage from such a disaster, and developing evacuation plans for the public in the case of a serious radiation release,” the report explains. Now I see it: The manmade element is coming from the assessment of the disaster.

“In addition, although the Nuclear Safety Agency and the operator were aware of the risk of core damage from tsunami, no regulations were created, nor did the operator take any protective steps against such an occurrence,” the report continues. “Since 2006, the regulators and operator were aware of the risk that a total outage of electricity at the Fukushima Daiichi plant might occur if a tsunami were to reach the level of the site. They were also aware of the risk of reactor core damage from the loss of seawater pumps in the case of a tsunami larger than assumed in the Japan Society of Civil Engineers estimation. The regulatory bodies knew that the operator had not prepared any measures to lessen or eliminate the risk, but failed to provide specific instructions to remedy the situation.”

In summary, this “manmade” disaster was a failure in the manmade risk assessment process. It’s all down to this thing called a “risk assessment.” If you’re unfamiliar with this quality improvement technique, I recommend you learn about it; it’s a powerful prevention tool. In the meantime I’d like to advise you on how to tell a good risk assessment from a poor one.

If you’re familiar with safety risk assessment, you’ll probably know it’s also called a “quantitative risk assessment.” I’ll use this to help explain risk assessments. In a quantitative risk assessment, the risk (R) is calculated from two elements: the impact of the loss (I) and the probability that it will happen (P). Most government health and safety agencies will have their own defined process, or matrix; the Health and Safety Executive in the United Kingdom will even give you templates for free.

The risk assessment process takes participants through a set of assumptions and uncertainties, which are all considered through a brainstorming-type exercise. The risk will be calculated from the impact value, multiplied by the probability. Or in mathematical terms: R = I × P. If we apply this equation to, for example, the risk of being run over when crossing the road in New York during rush hour, the impact (i.e., being run over by a car) could be very high. However, the probability could be very low (cars don’t move very fast during rush hour). But knowing there is a terrible outcome to this scenario, we come to the most important part of the risk assessment: mitigating the risks. To reduce the risk of being run over in New York, the city provides safe crossing zones, or installs walkways and overpasses to physically separate pedestrians from the traffic.
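The scoring and ranking steps described above can be sketched in a few lines of Python. The risks, the 1–5 scales, and the scores below are illustrative inventions, not figures from the article; a real assessment would use your organization’s own matrix and definitions.

```python
# Minimal sketch of the R = I x P calculation, assuming a simple
# 1-5 ordinal scale for both impact (I) and probability (P).
# All entries below are hypothetical examples.

risks = [
    # (description, impact 1-5, probability 1-5)
    ("Pedestrian struck crossing road at rush hour", 5, 1),
    ("Pedestrian trips on broken pavement",          2, 4),
    ("Pedestrian delayed by traffic",                1, 5),
]

def risk_score(impact, probability):
    """R = I x P, as in the article's equation."""
    return impact * probability

# Rank risks so mitigation effort goes to the highest scores first.
ranked = sorted(risks, key=lambda r: risk_score(r[1], r[2]), reverse=True)

for description, impact, probability in ranked:
    print(f"{risk_score(impact, probability):>2}  {description}")
```

The sort puts the highest-scoring risk at the top of the list, which is where the article says the real work begins: deciding and implementing the mitigating actions for each entry.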

In the case of Fukushima, Chairman Kurokawa is very critical of the mindset that failed to address the mitigating actions to the known risks. Interestingly, his report looks beyond the regulators and the operator to Japan as a society: “The consequences of negligence at Fukushima stand out as catastrophic, but the mindset that supported it can be found across Japan,” he writes. “In recognizing that fact, each of us should reflect on our responsibility as individuals in a democratic society.” Kurokawa implies that Japan as a nation failed to address the actions required to mitigate the risk. I find this aspect of risk assessment—addressing the mitigating actions—is often forgotten. But it is always the most crucial part of the assessment process.

Please don’t be lulled into thinking that a risk assessment is complete when all sections of the form are filled in; it’s certainly not yet finished. That happens only when you have put in the preventive measures and tested them for efficacy. This is where the Fukushima disaster really began, not at the time of the earthquake or tidal wave, but at the exact moment of inaction about preventive measures.

And this is the difference between a good risk assessment and a poor one: following through with the identified actions.

When considering the terrible event at Fukushima, or any other major disaster, it’s not the known failings or awful headlines that matter, but what we as a society must do to prevent the disaster from happening again. It’s clear to me the “manmade” error that led to the failure of the nuclear plant’s safety systems was a failure to address the known risks and implement appropriate prevention measures.

So the next time you run up against little management support in implementing your mitigating actions from a risk assessment, remember this was the same mindset that created the disaster at Fukushima. And keep in mind that preventing a disaster from happening is a much more comfortable feeling than having to explain why you did nothing to stop it in the first place.


About The Author

Paul Naysmith

Paul Naysmith is the author of Business Management Tips From an Improvement Ninja and Business Management Tips From a Quality Punk. He’s also a Fellow and Chartered Quality Professional with the UK’s Chartered Quality Institute (CQI), and an honorary member of the South African Quality Institute (SAQI). Connect with him at www.paulnaysmith.com, or follow him on Twitter @PNaysmith.

Those who have read Paul’s columns might be wondering why they haven’t heard from him in a while. After his stint working in the United States, he moved back to his homeland of Scotland, where he quickly found a new career in the medical-device industry; became a dad to his first child, Florence; and decided to restore a classic car back to its roadworthy glory. With the help of his current employer, he’s also started the first-of-its-kind quality apprenticeship scheme, which he hopes will become a pipeline for future improvement ninjas and quality punks.


Risk calculus

The use of risk matrices, or thinking that risk can be reduced to the calculation of “probability of occurrence” × “severity of outcome,” is dangerous.

Risk matrices, particularly when created by uncalibrated users, can mislead, and one simply cannot multiply a probability distribution function by an ordinal and get anything meaningful.
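A toy example illustrates the distortion this comment describes. All figures here are invented for illustration, not taken from the comment or the article: a 5×5 matrix can rank a frequent, cheap hazard above a rare, catastrophic one whose expected loss is actually larger.

```python
# Hypothetical illustration: two hazards whose ordinal matrix scores
# rank them in the opposite order from their expected (dollar) losses.
# All probabilities, losses, and scores are invented.

hazards = {
    # name: (annual probability, loss in dollars, ordinal P 1-5, ordinal I 1-5)
    "minor spill":  (0.30, 10_000,     4, 2),
    "tank rupture": (0.001, 5_000_000, 1, 5),
}

for name, (p, loss, op, oi) in hazards.items():
    matrix_score = op * oi      # what the ordinal matrix reports
    expected_loss = p * loss    # what a probabilistic model reports
    print(f"{name}: matrix score {matrix_score}, expected loss ${expected_loss:,.0f}")
```

Here the matrix scores the minor spill (4 × 2 = 8) above the tank rupture (1 × 5 = 5), even though the rupture’s expected loss ($5,000/year) exceeds the spill’s ($3,000/year), which is the kind of inversion the comment warns about.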

I refer to:




As a concise comment on the problems. For a more detailed analysis, refer to Hubbard’s The Failure of Risk Management.


Getting management support for a risk assessment

Great article Paul. Is your mother going to wash your mouth out with soap? Of all your points, it was the last paragraph that carries the big question: If you don't take any risk assessment actions, and something does go wrong, how will you explain your lack of action? I was working with a German airline (guess?) some years ago, and the project head looked at me and said, "I want this analysis included in the project because I do not want to find myself in the uncomfortable position of having to explain to my executives why I did not take four more hours to investigate this issue." Jeff

It is really a matter of safety culture.

Fukushima is really another example of the degradation of an organization's safety culture rather than the lack of risk assessment. The U.S. Nuclear Industry is considering additional defense strategies as a result of Fukushima; going beyond design basis accidents. It is one thing to continually assess risk. It is the culture of the organization that provides the motivation to do something constructive with the results of that assessment.

That being said, I agree with Mr Naysmith’s assertion that many organizations do not benefit from the insights that a robust risk assessment will provide. Here in the States we react rather than proact. Not across the board, but far too often.

Choice of words

Sophomoric play on words is disappointing for a “professional” publication.