The Good and the Bad of False Alerts
Three years ago, my family and I found ourselves in the midst of what may be the worst false alert in modern times. It was just days into the new year, and we'd awoken to a 'typical' Hawaiian winter morning - blue skies, a cool breeze, and the promise of big waves. But life changed in an instant when, at 8:07, all of our phones buzzed in unison. I picked mine up and, before even unlocking it, saw the full warning message displayed on the preview screen: "BALLISTIC MISSILE THREAT INBOUND TO HAWAII. SEEK IMMEDIATE SHELTER. THIS IS NOT A DRILL."
The immediate gravity of this warning was incomparable to any I'd ever received before, and I experienced what many others have also described: a 10-15 second period of shock and denial. That passed quickly, though, and my wife and I found ourselves working through our options for surviving a potential nuclear attack (which I describe in a blog post I drafted just days after the event).
We all know now that there were no actual missiles headed to Hawaii; the warning was the result of a system test gone awry. But for 38 minutes, hundreds of thousands of residents, along with many visitors and tourists, faced the prospect of an impending fiery demise. Many called their families on the mainland and elsewhere to say their last goodbyes, thereby spreading the impact further. Studies later showed that many of these people suffered increased and lingering anxiety long after the event, and for some it manifested as PTSD. Consider that in my own neighborhood, dozens of people climbed into the hills and broke the locks on old WWII bunkers to seek shelter inside. Elsewhere on the island, people lifted manhole covers and climbed into the sewers to seek shelter, knowing that the single-wall construction of Hawaiian homes offers little to no protection from cold air - let alone ballistic missiles.
At least one person suffered a heart attack from the stress, and died.
The employee who made this tragic mistake was fired, and the state emergency management director ultimately resigned. Reports surfaced that this employee had made at least two similar mistakes in the past, and though protocols were in place to reduce the likelihood of a false alert, one happened anyway.
We do all we can to prevent false alerts from happening, and make changes when they do to prevent them from happening in a similar way in the future. But they can and do happen over and over. In fact, just days after the Hawaii event, a similar missile alert was mistakenly issued in Japan. Then, in 2019, Hawaii EMA issued another false alert - this time using the island-wide system of tsunami and air raid sirens. And in 2020, a nuclear-based false alert was issued in Ontario, Canada, through almost the same mechanisms as the 2018 Hawaii incident.
And just last week, the State of Texas Department of Public Safety inadvertently issued a false 'Amber Alert,' asking residents to be on the lookout for a child abducted by none other than horror movie villain 'Chucky'. Apparently the sociopathic doll had abducted his own son, also a doll. The alert was retracted and an apology issued, but not before some people had received the alert three times.
In the emergency management community, as in other professions, there is a deep reliance on drills and exercises to build the tacit knowledge that accompanies what we learn in books and in classes. We need to physically and psychologically feel what it is like to be in crisis situations in order to know more accurately what we should expect, and how we will react. But we know going into these situations that they are drills. We don't fear for others or for ourselves, because we know the event is contrived. It is planned, and precautions have been taken. The risk of failure is limited to ourselves, and there is always another chance to 'get it right'. Technology systems including virtual and augmented reality have improved our ability to simulate crises, but we cannot simulate genuine fear or concern like that which accompanies the real-life threat of harm to oneself, to one's family and friends, and to the people one is tasked to serve.
And of course we can't intentionally issue false alerts, even if those alerts were limited to EM officials (and not the public). The risk of a responder acting in a way that creates harm to themselves or to others is just too great to justify. Such warnings would also threaten to undermine responder trust in their own systems, causing them to question with every warning whether this time it was 'just another drill'.
There is a reason every message in an exercise includes the rider that "THIS IS ONLY A DRILL". It is there so that when those words do not appear, there is no question of legitimacy.
A false alarm, however, gives us the best of both worlds. There is a realism to the situation that allows all responders to act with legitimacy. Every action they take is believed to be 'do or die'. There is a rush of adrenaline. For some, there is a sense of panic or paralyzing fear that may be unexpected. Others find themselves going into an autopilot built through years of training and preparation. Leaders emerge. And ultimately, the hazard is never real and the feared consequences are avoided. Following the 2001 anthrax attacks, hundreds or even thousands of 'hoax' letters and packages with white powder began showing up all over the country. Local fire and police departments with little previous chem-bio experience were called to respond. And while more than 99% of these threats were ultimately harmless 'false alerts', those who responded had feared for their own safety and for the safety of those who had been exposed. This helped to develop more effective response protocols, PPE acquisitions, and future training programs in a way that no planned exercise ever could. There was, in effect, a net benefit.
State and local response agencies need to be prepared to fully capture the lessons and best practices that result from false alerts. It is critical that the obvious negative aspects of these unfortunate events be adequately balanced with a net positive in the form of capacity development. I published a piece in the FBI publication "The Beacon" in 2002 that provided a methodology specific to the discovery of anthrax hoaxes. A similar approach can be taken with false alerts. As much as we seek to avoid them, we need to recognize them for the unwanted gifts that they are.