At 8:07 AM local time, panic swept through Hawaii as residents were warned of an “imminent missile threat” headed for the islands. Emergency sirens rang out across cities as people desperately tried to flee to fallout shelters. Then, about half an hour after the alert went out, a follow-up statewide alert calmed the nerves of residents, notifying everyone that the previous notice was a false alarm. Though it turned out to be a false alarm, the alert could very easily have triggered a counterattack - something which would have inevitably led to war. When investigators set out to uncover what caused the incident, what they found was more than a little concerning. According to initial reports, the alert was simply a matter of a “single employee” pressing the wrong button.
As it turns out, there’s a bit more to the story than that. First, the interface for Hawaii’s emergency notification system was downright horrible. It consisted of a dropdown menu in which a single word separated a test alert from a real one.
Although there was a confirmation page, it’s easy to see how an employee could mistakenly send a mass notification. And that’s only the tip of the iceberg as far as points of failure are concerned. According to reports, the governor knew within minutes that the missile scare was false - but he couldn’t tell anyone because he had forgotten his Twitter password.
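To make the UI failure concrete, here is a minimal sketch of a safer pattern: gating a high-impact action behind an explicitly typed confirmation phrase rather than a near-identical dropdown choice. The names (`AlertKind`, `send_alert`, the confirmation phrase) are hypothetical and not taken from Hawaii’s actual system.

```python
# Hypothetical sketch: a live alert cannot be sent by a mouse slip alone;
# the operator must retype an unambiguous phrase to confirm it.
from enum import Enum

class AlertKind(Enum):
    DRILL = "drill"
    LIVE = "live"

CONFIRM_PHRASE = "SEND LIVE ALERT"

def send_alert(kind: AlertKind, typed_confirmation: str = "") -> str:
    """Return the message dispatched; raise if a live alert is unconfirmed."""
    if kind is AlertKind.DRILL:
        # Drills dispatch immediately and are clearly labeled as drills.
        return "THIS IS A DRILL. No action required."
    # A live alert requires the exact confirmation phrase, so the two
    # actions can never be confused by a single misclick.
    if typed_confirmation != CONFIRM_PHRASE:
        raise ValueError("Live alert not confirmed; nothing was sent.")
    return "EMERGENCY ALERT: seek shelter immediately."
```

The design choice here is friction proportional to consequence: the routine action (a drill) stays one step, while the irreversible one demands a deliberate, unmistakable act.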
The governor assured news agencies that steps are being taken to ensure something like this never happens again.
Now, the golden question. What does any of this have to do with cybersecurity? A great deal, as a matter of fact.
See, much of what cybersecurity professionals do involves responding to and mitigating security incidents. What happened in Hawaii paints a pretty decent picture of how not to do so. First, the interface that employees were expected to use was clumsy and cumbersome to navigate.
This very much echoes organizations in which applications and tools aren’t designed with usability in mind - such that employees either make mistakes or circumvent them, creating a security risk. Second, upper management - in this case, the governor - did not have the necessary tools to address an incident or a threat. Situations in which upper managers are disengaged or unaware of cybersecurity tend to create real problems within an enterprise, and C-suite execs are frequently the source of data breaches.
What happened in Hawaii was easily avoidable - and hopefully, the state finds a way to prevent an incident like this from happening again. Your business should pay attention to this missile scare too, and not just because it made international headlines. There are lessons to be learned from it from both a cybersecurity and a crisis management standpoint.
Simply put, good UI and proper communication trump everything else, no matter your industry.
What the Hawaii Missile Scare Can Teach Us About Cybersecurity
Jul 02, 2018