The Futurist Interviews Crisis Communications Expert Peter Sandman on the Fukushima Daiichi Nuclear Meltdown in Japan


In the two weeks since the nuclear power plant in Fukushima, Japan, entered a state of partial meltdown, the country has seen an exodus of foreign executives, a nearly 16% drop in its stock market (which has since partially rebounded), and shortages of food linked to hoarding. All this is occurring despite the fact that nuclear experts say even the effects of a full-scale meltdown would be limited to the area around the plant.

FUTURIST magazine deputy editor Patrick Tucker followed the ongoing news coverage of Fukushima Daiichi in both Japan and the United States. The Kan administration has been roundly criticized for its PR response to the situation. Here, Tucker interviews Peter Sandman about the administration’s communications response to the event.

The Futurist: What were the specific crisis management missteps of the Kan administration?

Sandman: I will confine myself to crisis communication, the only aspect of crisis management where I have some expertise. And I will focus on one specific Fukushima crisis communication failure, the one I consider most serious: the government’s failure to speculate publicly about what-if scenarios it was certainly considering privately.

Every crisis raises three key questions:

1. What happened – and what are you doing to respond to it, and what should we (the public) do?
2. What’s likely to happen next – and what are you doing to prepare for it, and what should we do?
3. What’s not so likely but possible and scary, your credible worst case scenario – and what are you doing to prevent it (and prepare for it in case prevention fails), and what should we do?
There are other questions as well, of course – notably questions about how the crisis will affect “me” (my health, my family, my home, my income, my community). And there will be still other questions once the crisis ebbs or stabilizes – especially questions about blame.

But regarding external events – regarding the crisis itself – the three key questions the public asks are always: what happened, what do you expect to happen, and what are you worried might happen.

With regard to Japan’s March 11 earthquake and tsunami, the first question was far more important than the second and third. March 11 was a tragedy but not a cliffhanger.

The resulting nuclear crisis at the Fukushima power plants, on the other hand, was a cliffhanger but not a tragedy (at least not early on) – so the second and third questions were the crucial ones. As I write this on April 1, they still are. “What happened” remains a largely unanswered question; the answers keep changing as the situation evolves. But as long as the Fukushima crisis threatens to deteriorate, the most important questions will be about the future, not the past.

By far the biggest crisis communication error of the Japanese government has been its failure to answer the second and third questions satisfactorily: its failure to forewarn people about tomorrow’s and next week’s probable headlines, and its failure to guide people’s fears about worst case scenarios.

The second and third questions – what are you expecting (“likeliest case”) and what are you most worried about (“worst case”) – are the inevitable forward-looking questions in any crisis, personal as well as societal. They are what we ask our plumber or our doctor. They are what we were asking about Fukushima from Day Two. (Day One was mostly about the first question, what happened.) And the government of Japan rarely gave us adequate answers.

Talking about what’s likely and what’s possible is necessarily speculative. Some commentators and even some crisis communication professionals have argued that authorities shouldn’t speculate in a crisis. This is incredibly bad advice.

Imagine a weather forecaster who refused to say where the hurricane was likely to be headed: “Here’s where it is now. We can’t be sure where it’s going, and we certainly wouldn’t want to speculate. Tune in again tomorrow and we’ll tell you where it went.” Or imagine a weather forecaster who refused to say how bad the hurricane might get: “It’s only Category 3 now. Any mention of the possibility of it strengthening to Category 4 or 5 and what that could mean to the cities in its path would be strictly speculative.”

In fact the Japanese government didn’t shy away from reassuring speculation, only from alarming speculation. Officials were happy to predict that they would probably get power to the pumps soon and would then be able to cool the plants properly, for example.

But they failed to predict that there would probably be increasing radiation levels in local milk, vegetables, and seawater; that Tokyo’s drinking water would probably see a radiation spike as well; that plutonium would probably be found in the soil near the damaged plants; that the evidence of core melt would probably keep getting stronger; that all that water they were using to cool the plants would probably become radioactive, probably make repair work more difficult and more dangerous, and probably begin to leak; etc. After each of these events occurred, the government told us they were predictable and not all that alarming. But it failed to predict them.

My guess is that officials did in fact predict most of these events – privately. But they failed to predict them publicly.

It would have been a mistake – the opposite mistake from the one they made – for officials to predict all these events to the public confidently. They may have been confident about some of them but highly uncertain about others. Whatever their actual level of confidence/uncertainty was, whatever level of confidence/uncertainty they were expressing in their private planning meetings, that’s the level of confidence/uncertainty they should have expressed to the public. This is what responsible speculation requires: telling the public that X is probably going to happen, Y is a 50-50 shot, and Z wouldn’t be a huge surprise but it’s not really likely either.

My 2004 column on “Acknowledging Uncertainty” offers specific advice on how to communicate your level of confidence/uncertainty. It’s not terribly hard to do, once you decide to do it. But especially when you’re talking about more bad news in what is already a devastating crisis, it’s very hard to persuade yourself and your organization to do it. The nearly universal temptation is to keep quiet about your gloomy but still tentative expectations, hoping against hope that they won’t happen … until they do happen and you have no choice but to say so.

Officials not only failed to speculate responsibly about their gloomy but still tentative expectations. They also failed to address still more alarming (and still less likely) worst case scenarios: what if there’s an explosive breach of a containment, propelling nuclear fuel fragments high into the atmosphere; what if fuel elements melt together and achieve recriticality; what if we have to think seriously about evacuating Tokyo; what if we can never reoccupy a significant chunk of Japan; etc.

Because the government avoided alarming speculation, the people of Japan (and the world) kept learning that the situation at Fukushima was “worse than we thought.”

This violates a core principle of crisis communication. In order to get ahead of the evolving crisis instead of chasing it, crisis communicators should make sure their early statements are sufficiently alarming that they won’t have to come back later and say things are worse than we thought. Far better to be able to say instead: “It’s not as bad as we feared.”

Among the reasons why officials have been reluctant to speculate alarmingly is, undoubtedly, a fear of panicking the public. But despite some condescending newspaper columns about “radiophobia,” there is little evidence of nuclear panic in Japan or elsewhere. Nuclear skepticism, nuclear distrust, nuclear dread … but not nuclear panic.

Crisis communication experts know that panic in emergency situations is rare. People may very well feel panicky, but panic isn’t a feeling. Panic is a behavior. Panic is doing something dangerous and stupid – something you know is dangerous and stupid – because you are in the grip of intolerable emotion and can’t stop yourself.

It isn’t panic to try to get your hands on some potassium iodide, even if you’re thousands of miles from Fukushima and vanishingly unlikely to need it. Panic is when you knock down your grandmother in your haste to get to the drugstore before it runs out of potassium iodide.

And it certainly isn’t panic to stockpile food and water in case these necessities become contaminated or their supply lines are disrupted. That’s simply prudence. It is what disaster experts often recommend – right up until the disaster happens. Then suddenly their tone turns disapproving, and they call the stockpilers “hoarders” and accuse them of panicking. The government has been appropriately empathic about the suffering of victims – the families of the dead and missing, the evacuees sheltering for weeks in school gymnasiums. But there has been precious little empathy for the millions who were rationally worried that they might be among the next victims.

Moreover, the Japanese government’s failure to speculate alarmingly didn’t “protect” the public from alarming speculation. It simply left people speculating on their own, and listening to the speculations of outside experts (and outside non-experts). Knowing that public and media speculation are inevitable, a wise crisis manager guides the speculation, rather than boycotting it or condemning it.

The bulk of the criticism of the government’s crisis communication has assailed it for failing to provide information promptly and honestly. (The same charge has been leveled against TEPCO, by the government among others.) There is doubtless some truth to this charge … and a few months or years from now we may know that withholding information was the most serious sin of Fukushima crisis communication.

But I doubt it. For the most part, I suspect, the government has told us what it knew for certain. Its biggest sin has been failing to tell us enough about what it guessed, what it predicted, and what it feared.

These failures felt dishonest. And in a sense they were dishonest. We kept hearing alarming speculations from outside academics and anti-nuclear activists and even the U.S. government and the International Atomic Energy Agency that we weren’t hearing from the government of Japan. We kept waking up to bad news that the Japanese government hadn’t told us might be coming. We rightly judged that the government was failing to keep us on top of the situation – but not, I think, because it wasn’t telling us what it knew; rather, because it wasn’t telling us what it guessed, predicted, and feared.

When a bad thing happens without warning midway through an evolving crisis, there are only three possible explanations:

1. The authorities had reason to think it was going to happen, and decided not to forewarn people – not to give the public time to prepare (emotionally as well as logistically).
2. The authorities knew they didn’t know what was going to happen, and decided not to tell the public that – not to tell us that the situation is unpredictable and warn us to expect scary surprises.
3. The authorities thought they had a better handle on the crisis than they actually had – and the new development is as shocking to them as it is to the public.
The truth is usually some mix of the three.

To the extent that the Japanese government had reason to expect particular bits of bad news, it should have said so. It is absurd, for example, that the 12 million people in Tokyo were not warned to stockpile at least a modest supply of a readily available resource – tap water – in advance of potential contamination.

And to the extent that the Japanese government knew it didn’t know what to expect, it should have said that. Acknowledging uncertainty, ignorance, and the resulting inevitability of scary surprises is itself a kind of forewarning. Even if you can’t prepare logistically for what you don’t know is coming, you can at least prepare emotionally to be surprised.

The Futurist: What harm has resulted from the Japanese government’s unwillingness to speculate?

Sandman: There has been damage, obviously, to the credibility of the Japanese government, and therefore to its ability to lead its people through the hard times ahead. There has been damage to the future of nuclear power, exacerbating the damage done by the crisis itself.

The worst damage may be the public’s growing sense that the Fukushima crisis is out of control and uncontrollable, that it cannot be predicted and is therefore greatly to be feared. Perhaps that very frightening assessment will turn out to be an accurate one. But if the crisis does stabilize and begin to ebb, if we stop waking up every morning to further bad news from Fukushima, if worst case scenarios start coming off the table in the minds of experts, will the public notice and believe? If it doesn’t, that will be largely a legacy of the Japanese government’s unwillingness to speculate.

The Futurist: What should they do now?

Sandman: Obviously, be willing to speculate – and learn how to speculate responsibly. Jody Lanard and I entitled our 2003 essay on crisis speculation “It Is Never Too Soon to Speculate.” It is never too late to speculate either.

In recent days, I think, the Japanese government has been more willing to address my second and third questions – to tell people what bad news it considers likely in the days ahead and what worse scenarios it is taking seriously, preparing to cope with, and trying to avert.

The short-term effect of this increased candor about likely and possible futures may well be increased concern. Journalists and the public are picking up on the change in tone, and some are interpreting it as evidence that the situation at Fukushima is worsening. When even the government says things look bad, some people figure, things must look very bad indeed. This is inevitable when officials switch from stonewalling and over-reassuring to responsible speculation.

I hope the government stays the course. In fact, I hope it focuses even more on becoming Fukushima’s Cassandra, not its Pollyanna. If predictable bad things happen (as they surely will), the government’s having predicted them will help keep people from overreacting to them. If the crisis worsens (as it may), the government’s pessimism will at least have alerted us to this real possibility. And if the crisis eases (as we all hope it will), I look forward to the day when the Japanese government will have earned the right to say to the public, “it’s not as bad as we feared.”

Then it will be time to address the much smaller problem of being accused of having “fear-mongered.” That accusation is almost inevitable when crisis communication has been well handled.

That’s the crisis communicator’s choice. Either you over-reassure people, fail to forewarn them about likely bad news to come and possible worst case scenarios, and leave them alone with their fears. Or you treat them like grownups, tell them what you expect and what you’re most worried about, and help them bear their fears. In the former case, they are forced to endure scary surprises, lose their trust in you, and have trouble noticing when the crisis is over. In the latter case, they prepare for the worst, manage their fears (and the situation itself) better … and end up a little irritated at you for having been so alarmist.

The Futurist: What is the best parallel to the present situation? Does another incident come to mind that could be instructive?

Sandman: The most compelling precedent for Fukushima is of course Three Mile Island. Like Fukushima (and unlike Chernobyl), it was a cliffhanger too.

Seven years ago, on the 25th anniversary of the 1979 Three Mile Island nuclear accident, I wrote an article entitled “Three Mile Island – 25 Years Later.” In it I listed what I saw as the most enduring crisis communication lessons of the Three Mile Island Accident.

Several of these lessons strike me as relevant to Fukushima, and the rest of this section is adapted from that article.

Pay attention to communication.

Three Mile Island was technically much less serious than Fukushima; it was a near miss, but very little radiation was actually released. No local crops were contaminated. Pregnant women and young children were evacuated, but that turned out to have been unnecessary. What went wrong at TMI – really, really wrong – was the communication.

Communication professionals were minor players at TMI. I was at Three Mile Island, first for the Columbia Journalism Review (covering the coverage) and later for the U.S. government commission that investigated the accident. In the latter capacity, I asked Jack Herbein, the Metropolitan Edison engineering vice president who managed the accident, why he so consistently ignored the advice of his PR specialist, Blaine Fabian. (Risk communication hadn’t been invented yet.) He told me, “PR isn’t a real field. It’s not like engineering. Anyone can do it.”

That attitude, I think, cost MetEd and the nuclear power industry dearly. And that attitude continues to dominate the nuclear industry, contributing to one communication gaffe after another. Nuclear power proponents keep shooting themselves in the foot for lack of risk communication expertise.

I don’t know if TEPCO or the Japanese government has any in-house risk communication or crisis communication professionals, and I don’t know if either brought in outside risk communication or crisis communication advisors. I’m guessing the answers were no and no, at least in the first couple of weeks. There have been some signs of improved “uncertainty communication” and “worst case communication” in the last few days.

Don’t lie – and don’t tell half-truths.

Companies and government agencies try hard not to lie outright, but they usually feel entitled to say things that are technically accurate but misleading – especially in a crisis when they are trying to keep people calm. Ethics aside, the strategy usually backfires. People learn the other half of the truth, or just sense that they aren’t being leveled with, and that in itself exacerbates their anxiety as it undermines their trust in officialdom.

Here is one spectacular example of a not-quite-lie from Three Mile Island. (We don’t know yet if there are comparable examples from Fukushima.)

The nuclear power plant in central Pennsylvania was in deep trouble. The emergency core cooling system had been mistakenly turned off; a hydrogen bubble in the reactor vessel was considered capable of exploding, which might breach the vessel and cause a meltdown.

In the midst of the crisis, when any number of things were going wrong, MetEd put out a news release claiming that the plant was “cooling according to design.” Months later I asked the PR director how he could justify such a statement. Nuclear plants are designed to survive a serious accident, he explained. They are designed to protect the public even though many things are going wrong. So even though many things were going wrong at TMI, the plant was, nonetheless, “cooling according to design.”

Needless to say, his technically correct argument that he hadn’t actually lied did not keep his misleading statement from irreparably damaging the company’s credibility.

Get the word out.

Most government agencies and corporations respond to crisis situations by constricting the flow of information. Terrified that the wrong people may say the wrong things, they identify one or two spokespeople and decree that nobody else is to do any communicating. In an effort to implement this centralized communication strategy, they do little or nothing to keep the rest of the organization informed.

There is certainly a downside to authorizing lots of spokespeople; the mantra of most crisis communication experts is to “speak with one voice.” But I think the disadvantages of the one-voice approach outweigh the advantages. This approach almost always fails.

It failed at Three Mile Island. Reporters took down the license plate numbers of MetEd employees, got their addresses, and called them at home after shift. Inevitably, many talked – though what they knew was patchy and often mistaken. The designated information people for the U.S. Nuclear Regulatory Commission and the utility, meanwhile, had trouble getting their own information updates; those in the know were too busy coping with the accident to brief them. (The lesson here: There need to be technical experts at the scene whose designated job is to shuttle between the people who are managing the crisis and the people who are explaining it. As far as I can tell, nobody was assigned that role at Fukushima.) The state government felt its own information was so incomplete that Press Secretary Paul Critchlow asked one of his staff to play de facto reporter – trying to find out what was going on so Critchlow could tell the media … and the Governor.

In today’s world of 24/7 news coverage and the Internet, the information genie is out of the bottle. If official sources withhold information, we get it from unofficial sources; if official sources speak with one voice, we smell a rat and seek out other voices all the harder … and find them.

But crisis information wasn’t controllable three decades ago in central Pennsylvania either. As my wife and colleague Jody Lanard likes to point out, even in the pre-Gutenberg era, everyone in medieval villages knew when troubles were brewing. The information genie never was in the bottle. Keeping people informed and letting them talk is a wiser strategy than keeping them ignorant and hoping they won’t.

Err on the alarming side.

This is the Three Mile Island crisis communication lesson of greatest relevance to Fukushima.

In the early hours and days of the Three Mile Island accident, nobody knew for sure what was happening. That encouraged Metropolitan Edison to put the best face on things, to make the most reassuring statements it could make given what was known at the time. So as the news got worse, MetEd had to keep going back to the public and the authorities to say, in effect, “it’s worse than we thought.”

This violated the cardinal rule of crisis communication I discussed in my first answer: Always err on the alarming side, until you are absolutely 100% certain the situation cannot get any worse.

In the three decades since TMI, I have seen countless corporations and government agencies make the same mistake. Its cost: The source loses all credibility. And since the source is obviously underreacting, everybody else tends to get on the other side of the risk communication seesaw and overreact.

That’s why Pennsylvania Governor Dick Thornburgh ordered an evacuation of pregnant women and preschool children. MetEd was saying the amount of radiation escaping the site didn’t justify any evacuation – and MetEd, it turns out, was right. But MetEd had been understating the seriousness of the accident from the outset. When the head of the Pennsylvania Emergency Management Agency misinterpreted a radiation reading from a helicopter flying through the plume, thinking it was probably an offsite reading of exposures reaching populated areas, Thornburgh didn’t even check with the no-longer-credible utility (which could have told him PEMA had misunderstood the situation). He decided better safe than sorry and ordered the evacuation.

In contrast to Metropolitan Edison, the Pennsylvania Department of Health adopted an appropriately cautious approach. The Health Department was worried that radioactive iodine-131 (I-131) might escape from the nuclear plant, be deposited on the grass, get eaten by dairy cattle, and end up in local milk. Over a two-week period health officials issued several warnings urging people not to drink the milk. Meanwhile, they kept doing assays of the milk without finding any I-131. Their announcements moved slowly from “there will probably be I-131 in the milk” to “there may be I-131 in the milk” to “there doesn’t seem to be I-131 in the milk, but let us do one more round of testing just to be sure.”

By the time the Health Department declared the milk safe to drink, virtually everyone believed it. While the Health Department’s caution hurt the dairy industry briefly, the rebound was quick because health officials were credibly seen as looking out for people’s health more than for the dairy industry’s short-term profits.

By contrast, the Japanese government said nothing in advance about even the possibility of radioactive milk, and then it suddenly announced that it had tested the milk from around Fukushima (apparently secretly), found more radioactivity than it considered acceptable, and decided to ban its sale. If and when the milk is deemed safe again, I wonder how soon anyone will believe it.

The Futurist: How would you advise a public official dealing with a potential nuclear meltdown to communicate the risks to the public without alarming them?

Sandman: I wouldn’t! Why on earth wouldn’t you want to alarm people about a potential nuclear meltdown?

There is a purpose to alarming people, after all. You want to motivate them to put aside more ordinary concerns and focus on the crisis. You want them to start thinking about what they should do to protect themselves, their loved ones, and their community – what they should do now, and what they may need to do soon if the situation gets worse. You want them to get through their adjustment reaction (a brief over-reaction to a new risk), gird up their loins, and prepare themselves not just logistically but also emotionally.

My crisis communication clients often want the public to take precautions … but don’t want the public to get alarmed. But the main reason people take precautions is that they are alarmed.

One crucial goal in risk communication, therefore, should always be to achieve a level of public concern commensurate with the actual risk – or at least commensurate with the experts’ level of concern, since the “actual risk” may be unknown. When the actual risk (or the experts’ concern) is low, you want people to stay calm (or calm down); you don’t want them focusing undue attention on a tiny risk. But when the actual risk (or the experts’ concern) is high, the level of public concern should be high too – perhaps too high for the word “concern” to capture. (You don’t install “fire concerns” in buildings; you install “fire ALARMS.”) Even “alarm” may not capture it. Sometimes, in really bad times, you should be aiming for fear.

That’s true even if the current situation isn’t very serious. Don’t forget the “pre” in “precaution.” Ideally, precautions are things you do (or at least prepare to do) before the risk is imminent. Since a key goal of alarming people is to motivate precaution-taking, you need to alarm them about what might happen, not just what’s already happening. Japan’s earthquake and tsunami were so deadly mostly because there was no time for precautions, no time to alarm people before their risk was imminent.

The Fukushima crisis has allowed plenty of time to ramp up people’s alarm … and preparedness. One of the most frequent non-sequiturs in Fukushima crisis communication has been to assure the public that there’s no reason to be alarmed because the current level of radiation (except right near the plants) isn’t dangerously high. But what’s most frightening about Fukushima (except right near the plants) isn’t the level of radiation so far; it’s what might happen that could send the radiation literally through the roof.

In crisis communication, the goal isn’t to keep people from being fearful. The goal is to help them bear their fear (and the situation that provokes it), and to help them make wise rather than unwise decisions about precautions.

Arguably the cardinal sin in crisis communication is to tell people not to be afraid. If your false reassurance succeeds, they are denied the time they need to prepare. If your false reassurance fails, all you’ve accomplished is to leave people alone with their fear – prompting them, justifiably, to take guidance from sources other than you, and frittering away your own credibility and thus your capacity to lead them through the worsening crisis that may be coming.

My clients hate this advice. Their fear of fear – their reluctance to frighten the public even when the situation is legitimately frightening – results partly from what I call “panic panic”: the mistaken tendency of officials to imagine that the public is apt to panic or already panicking.

The public rarely panics in emergencies. People are especially unlikely to panic when they feel they can trust their leaders to tell them scary truths … that is, when they feel their leaders are trusting them to bear scary truths.

There is a downside to frightening the public, but it isn’t panic. The downside is that the crisis may ease instead of worsening, and with 20-20 hindsight people will blame you for frightening them unnecessarily. In the winter of 2009–2010, the U.K. went through an unexpectedly severe winter and an unexpectedly (and blessedly) mild swine flu pandemic – and the U.K. media reproached officials (sometimes on the very same day) for having bought too little road grit and salt and too much vaccine. But it’s not damned if you do and damned if you don’t. The repercussions (and thus the recriminations) of under-preparedness are a lot more harmful than those of over-preparedness. When it comes to warning – and frightening – the public about a crisis that could get worse, it’s darned if you do and damned if you don’t.

About the Interviewee
Peter M. Sandman is one of the preeminent risk communication speakers and consultants in the United States. Learn more at psandman.com.