Hurricane Harvey AAR – Lessons for Us All

Harris County, Texas, recently released its After Action Report (AAR) for Hurricane Harvey, which devastated the area last year.  I applaud any AAR released, especially one for an incident of this magnitude.  It requires opening your doors to the world, showing incredible transparency, and being willing to discuss your mistakes.  Not only can stakeholders in Harris County learn from this AAR, but I think there are lessons for everyone in reviewing this document.

First, about making the sausage… The AAR includes an early section on the means and methods used to build the AAR, including some tools provided in the appendix.  Why is this important?  First, it helps build a better context for the AAR and lets you know what was studied, who was included, and how it was pulled together.  Second, it offers a great example for you to use for future incidents.  Developing an AAR for an incident has some significant differences from developing an AAR for an exercise.  Fundamentally, development of an AAR for an exercise begins with design of the exercise and is based upon the objectives identified for that exercise.  For an incident, the areas of evaluation are generally identified after the fact.  These areas of evaluation will focus the evaluation effort and help you cull through the volumes of documentation and stories people will want to tell.  The three focus areas covered in the AAR are Command and Control, Operations, and Mass Care and Sheltering.

Getting into the Harvey AAR itself… My own criticism of the formatting is that while the areas for improvement follow an Issue/Analysis/Recommendation format, the identified strengths get only a sentence or two.  Many AAR writers (for incidents, events, or exercises) think this is adequate, but I do not.  Some measure of written analysis should be provided for each strength, giving it context and describing what worked and why.  I’m also in favor of providing recommendations for identified strengths.  I’m of the opinion that most things, even if done well and within acceptable standards, can be improved upon.  If you adopt this philosophy, however, don’t fall into the trap of simply recommending that practices should continue (i.e., ‘keep doing this’).  That’s not a meaningful recommendation.  Instead, consider how the practice can be improved upon or sustained.  Remember to always reflect upon the practices of planning, organizing, equipping, training, and exercising (POETE).

As for the identified areas for improvement in the AAR, the following needs were outlined:

  • Developing a countywide Continuity of Operations Plan
  • Training non-traditional support personnel who may be involved in disaster response operations
  • Transitioning from response to recovery operations in the Emergency Operations Center
  • Working with the City of Houston to address the current Donations Management strategy

For these reasons alone, the AAR and the attached improvement planning matrix should be reviewed by every jurisdiction.  Many jurisdictions that I encounter simply don’t have the POETE elements in place to be successful in addressing these areas.

What is your biggest take away from this AAR?

© 2018 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC™

The Hawaii Saga

I simply don’t think I can refrain from some extended commentary on the Hawaii missile notification incident any longer.  I’ve tossed a few Tweets on this topic in the past couple of weeks, but as the layers of this onion are peeled back, more and more is being revealed.  I’m not a conspiracy theorist, but the number of half-truths that have been reported on this incident leads me to believe we still don’t know everything that transpired that morning.  Now that the FCC has leaned into this investigation, more and more information is being revealed, despite reports that the employee at the center of it gave limited cooperation in the investigation (likely on the advice of an attorney).  Most of my commentary is based upon information reported by Business Insider and the Washington Post, which includes information from the ongoing FCC investigation.


First, why was public notification of a false missile strike such a big deal?  The effective practice of notification and warning in emergency management relies on the transmission of accurate, timely, and relevant information.  Since emergency management is already challenged by a percentage of citizens who willfully don’t pay attention to warnings, don’t care enough to take action, or otherwise refuse to take action, the erosion of any of these pillars will degrade public trust in an already less than ideal environment.  We sometimes struggle to get accurate weather-related warnings issued, but when a warning is sent for a ballistic missile strike that isn’t occurring, that’s a significant error.  We certainly saw across social media the stories of people on the Hawaiian Islands as well as those in the continental US with friends and family in Hawaii.  The notification of an impending ballistic missile strike is terrifying to a population.  Imagine saying goodbye to your family and loved ones for what you think is the last time.  What truly made this erroneous notification unforgivable was the 38 minutes it took to rectify it.

While there is a lot of obvious focus on the employee who actually activated the alert, I see this person as only one piece in the chain of failures that occurred that morning.  It was first reported that the employee accidentally selected the wrong option in a drop-down menu, choosing an actual alert instead of a test.  While mistakes can and do happen in any industry, the processes we use should undergo reviews to minimize mistakes.  Those processes include the tools and technology we use to execute them.  Certainly, any system that issues a mass notification should have a pop-up that says ‘ARE YOU REALLY SURE YOU WANT TO DO THIS???’ or a requirement for verification by another individual.  I’ll note that the Business Insider article says there is a verification pop-up in the system they used, so clearly that wasn’t enough.
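To illustrate the kind of safeguard I’m describing, here is a minimal sketch in Python, assuming a hypothetical alerting interface (not the actual software Hawaii EMA uses): test messages never touch the live broadcast path, and a live alert requires the operator to retype an explicit confirmation phrase and obtain a countersignature from a second operator.

from dataclasses import dataclass

@dataclass
class AlertRequest:
    message: str
    is_test: bool
    requested_by: str

def send_alert(request: AlertRequest, confirm_phrase: str = "", second_operator: str = "") -> str:
    """Hypothetical guard around alert transmission (illustrative only)."""
    if request.is_test:
        # Test messages never touch the live broadcast path.
        return f"TEST channel only: {request.message}"

    # Explicit confirmation: the operator must retype the intent, not just click through a pop-up.
    if confirm_phrase != "SEND LIVE ALERT":
        raise PermissionError("Live alert not confirmed; nothing was sent.")

    # Two-person integrity: a different operator must countersign.
    if not second_operator or second_operator == request.requested_by:
        raise PermissionError("Live alert requires countersignature by a second operator.")

    return f"LIVE broadcast: {request.message}"

# A test request stays on the test channel regardless of what else is supplied.
print(send_alert(AlertRequest("Ballistic missile threat. Seek shelter.", is_test=True, requested_by="op1")))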

Findings from the initial FCC investigation indicate that the employee apparently thought this was a real incident instead of an exercise; therefore, their action was intentional.  So, we have another mistake.  As mentioned before, the processes and systems we have in place should strive to minimize mistakes.  A standard in exercise management is to use a phrase similar to ‘THIS IS AN EXERCISE’ in all exercise communications.  By doing so, everyone who receives these communications, intentionally or otherwise, is aware that what is being discussed is not real.  I would hope that if the warning point employee had heard that phrase with the order to issue an emergency alert, the outcome would have been different.  According to the FCC report, the phrase ‘Exercise, exercise, exercise’ was used, but so was the phrase ‘this is not a drill’.  While reports indicate some issues with this employee’s past performance, I would caution that messages such as this are confusing and should never be issued in this manner.  Hawaii EMA needs to take a serious look at its exercise program and how it is managed and implemented.
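As a rough sketch of how that standard could be enforced by the system itself, the hypothetical check below refuses to classify a message that mixes exercise language with live-incident language, forcing clarification before anyone acts on it.  The tags and phrases are illustrative, not an official standard.

EXERCISE_TAGS = ("exercise, exercise, exercise", "this is an exercise")
LIVE_PHRASES = ("this is not a drill",)

def classify_message(text: str) -> str:
    """Classify a warning-point message as EXERCISE or LIVE; refuse to guess on mixed signals."""
    lowered = text.lower()
    has_exercise = any(tag in lowered for tag in EXERCISE_TAGS)
    has_live = any(phrase in lowered for phrase in LIVE_PHRASES)

    if has_exercise and has_live:
        # Ambiguity forces a call-back for clarification instead of an irreversible public alert.
        raise ValueError("Ambiguous message: contains both exercise and live-incident language.")
    return "EXERCISE" if has_exercise else "LIVE"

# The reported Hawaii wording would be rejected as ambiguous rather than acted on.
try:
    classify_message("Exercise, exercise, exercise ... this is not a drill.")
except ValueError as err:
    print(err)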

Next, 38 minutes passed before a retraction was issued.  Forgive me here, but what the hell happens in 38 minutes that you can’t issue a retraction?  There are timelines posted in the Business Insider and Washington Post articles on this matter.  I believe that what I’m reading is factual, but I shake my head at the ineptitude of the leadership involved, ranging from the employee’s supervisor, to the agency director, all the way up to the Governor.  There is no reason a retraction could not have been issued within minutes of this false alarm.  We see things in this timeline such as ‘drafting a retraction’ and ‘lost Twitter password’.  Simply bullshit.  There isn’t much to draft for an initial retraction other than ‘False Alarm – No missile threat’.  We know from later in the timeline that this could have been sent through the same system that sent the initial message.

It’s noted that Hawaii EMA didn’t have a plan in place for issuing retractions of messages.  An easy enough oversight, I suppose, but given the reports that this same employee had issued false messages on two previous occasions, a plan should have been developed for something that was an obvious concern.
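A retraction plan can be as simple as pre-staging approved retraction text alongside every live alert template, so nothing has to be drafted under stress.  A minimal sketch, using hypothetical templates rather than Hawaii EMA’s actual message catalog:

# Hypothetical pre-approved message catalog: every live alert template is stored
# with its matching retraction, so nothing needs to be drafted under stress.
ALERT_CATALOG = {
    "ballistic_missile": {
        "alert": "BALLISTIC MISSILE THREAT INBOUND. SEEK IMMEDIATE SHELTER.",
        "retraction": "False Alarm - No missile threat.",
    },
}

def issue_retraction(alert_key: str, send) -> None:
    """Push the pre-approved retraction through the same channel that carried the original alert."""
    send(ALERT_CATALOG[alert_key]["retraction"])

# 'send' stands in for whatever transmit function issued the original alert.
issue_retraction("ballistic_missile", send=print)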

A possible path to correction is a bill that may be introduced by Sen. Brian Schatz, which would give the US Departments of Defense and Homeland Security the responsibility to notify the public of an incoming missile attack.  Is this a perfect fix? No.  Consider that weather alerts can be issued by the National Weather Service, or by state or local emergency management agencies based upon NWS information or what they are actually observing on the ground.  I’m a big believer in states’ rights as well as their ultimate responsibility to care for their populations, so I believe the states should retain the ability to issue such alerts; however, they should generally defer to DoD, as DoD has the technology to detect an incoming attack.

There are numerous layers of failure in this situation which need to be examined and addressed through rigorous preparedness measures.  It was obviously an embarrassing occurrence for Hawaii EMA, and I’m sure they are working to address it.  The intent of my article isn’t to harp on them, but to identify the potential points of failure found in many of our systems.  Unfortunately, this situation makes for a case study that we can all learn from.

Current technology provides every state, county, city, town, and village the ability to access an emergency alert system of some type.  Some are municipal systems, some are regional, some are state, and some are national (IPAWS).  We access these systems through custom-developed programs or commercially available interfaces.  These systems will instantly issue alerts to cell phones, email accounts, social media, radio, and TV; and some will still activate sirens in certain localities.  The technology we have enables us to reach a high percentage of our populations and issue critical communications to them.  While the technology is great and the message we send is important, it’s only one element of a good public information and warning program.  Clearly, we see from the occurrence in Hawaii that we need to have solid plans, policies, procedures, systems, training, and exercises to ensure that we can effectively and efficiently issue (and retract) those messages.  So crack open your own plans and start making a list of what needs to be improved.
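Coming back to the technology described above: the basic shape of these systems is one validated message fanned out to many delivery channels.  The sketch below is a generic illustration of that pattern, with placeholder channel names and send functions rather than IPAWS or any vendor’s actual API.

from typing import Callable, Dict

# Generic fan-out: one validated message pushed to every configured delivery channel.
# Channel names and senders are illustrative placeholders, not any vendor's actual API.
def broadcast(message: str, channels: Dict[str, Callable[[str], None]]) -> None:
    for name, send in channels.items():
        send(f"[{name}] {message}")

channels = {
    "wireless_emergency_alert": print,  # cell phones
    "email": print,
    "social_media": print,
    "broadcast_eas": print,             # radio and TV
    "sirens": print,                    # still used in some localities
}

broadcast("False Alarm - No missile threat.", channels)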

© 2018 – Timothy M. Riecker, CEDP

Emergency Preparedness Solutions, LLC℠

Learning From a 10-Year-Old Report

I’ll admit that I’m often dismissive of information, especially in the field of emergency management and homeland security, if it’s over 10 years old.  There is a lot that’s changed in the past 10 years, after all.  But, realistically, for as much as we’ve changed, things have stayed the same.  Arguably, the first decade of this millennium saw much more change in EM/HS than the second decade has, at least so far.  The first decade saw events like 9/11 and Hurricane Katrina.  Yes, there certainly have been major events in this second decade, but none, it seems, were as influential to our field of practice as those in the first decade.

It’s important to reflect upon lessons observed and to examine what lessons we’ve actually learned.  How far have we come in implementing improvements from the 9/11 Report?  What still needs to be accomplished to meet the intent of the Post-Katrina Emergency Management Reform Act (PKEMRA)?  Perhaps when I have some time to devote, I’ll review those documents again, reflect on them, and provide my thoughts here.

Yesterday I received the latest email from DomesticPreparedness.com.  I refer often to their work in my articles.  This weekly brief included an article from one of my favorite authors in this field, John Morton.  I’ve referenced his work in a few of my past articles.  This article, titled The What If Possibility: A Chilling Report, talks about planning for a rogue nuclear attack, the likely lead role the federal government would have to take in response to such an attack (versus a locally-led response), and what the situation would be the day after.  With the threat of North Korean nuclear weapons capability looming, this article was an interesting read and spot-on.  I noticed a problem, though… It referenced Ash Carter as an assistant secretary of defense in the Clinton administration.  While this was true, Carter’s highest office was SecDef under President Obama.  Surely John Morton, with the incredible attention to detail I’ve come to recognize, couldn’t have made this error.

Nope.  No error on his part.  I looked at the date of the article.  June 27, 2007 – over a decade old.  Incredibly, this article is still highly relevant today.  The article does reference the drafting of certain federal plans for nuclear attack.  Plans which I am not privy to, but that must assuredly exist today.  I’m curious as to the model these plans follow, what has been learned from exercising them, and how we might be able to apply elements of these plans to other catastrophic occurrences.

Despite change, so much seems to stay the same.  Of course, a decade isn’t that long.  Given that emergency management and homeland security are primarily government roles, we have to acknowledge that the (usually necessary) bureaucracy simply doesn’t move that quickly.  Unfortunately, there are things we are far too slow to adopt, not just from a government perspective, but socially.  As a lover of history and sociology, I see lessons observed from the 1900 Galveston hurricane as well as the eruption of Mt. Vesuvius in 79 CE.  There is much that history can teach us, if we are willing to listen.  Lessons observed, but not learned.

© 2017 – Timothy Riecker

Emergency Preparedness Solutions, LLC

Seek First to Understand

‘Seek first to understand.’  It’s one of Stephen Covey’s 7 Habits of Highly Effective People.

This past weekend I came across a blog post in a prominent industry magazine’s online edition which was highly critical of a recent response and the state of preparedness of a major metropolitan area.  I was quite taken aback by how openly critical the post was, particularly since the author is rather experienced in emergency management.

No matter what field we are in, we have a tendency to examine, critique, analyze, and criticize.  This is generally healthy and important, especially when there is something that can be learned and applied from the experience.  Things can easily turn ugly, though.

The nitty-gritty of this is that if you weren’t involved and aren’t providing a critique through something more or less official and reasonably objective, such as an after action report, you generally shouldn’t be commenting (at least publicly).  Why?  Primarily, you very likely don’t have all the information.  Second, what does the criticism gain you aside from making you look like an ass?

Seek first to understand.  That’s the main reason why we, particularly in emergency management, should be looking at other people’s incidents.  Yes, we can examine media reports and other sources of information, but be holistic and comprehensive.  If the people involved in managing the incident made mistakes, then learn from their mistakes.  Don’t criticize them for it – they very likely are already receiving that criticism internally.  They certainly don’t need you to Monday morning quarterback.  It does no one any good.

Pointing fingers at other people only makes them point fingers back and creates a culture of negativity.  In emergency management, we are fortunate enough to have a culture of collaboration, where we are generally willing to share our successes and failures with others so that they may learn from them as well.  When we become critical, people become bitter, defensive, and isolationist.

It’s not to say that it’s inappropriate to use an incident as an example.  In December I wrote a post about how People Should Not Die in Exercises, in response to an article about an active shooter exercise in Kenya gone wrong. Was I harsh?  You bet your ass I was – and rightfully so.  The occurrence I wrote about was a great example of what not to do in exercises and an important lesson learned that a lot of people should know about to prevent further loss of life.

While I have as much a history of putting my foot in my mouth as the next person, all I’m saying is be careful how you spend your criticism credits.  When you start to criticize you are no longer seeking to understand.  If you aren’t seeking to understand, then no one learns.

-TR

An Academic Study in Ferguson Civil Disorder

From an academic and emergency management lessons learned perspective, there will be a great deal to learn from the events in Ferguson, Missouri.  In this brief article on Western Illinois University’s Emergency Management program, faculty comment on how a few of the courses within their degree program will analyze this social disaster.

I anticipate a lot of post-incident analysis once we have the facts of this event.  While respecting the loss of life as we do in any disaster, the practice of emergency management, within the greater professions of public safety and even government administration, stands to learn a great deal from an after action analysis of this incident, helping us better prepare for and prevent the impacts of future incidents.

Hazard Analysis – Looking Beyond Your Borders

In the radiological emergency preparedness niche of emergency management, we conduct a lot of preparedness activities for a hazard which may not even be within our jurisdiction.  The emergency planning zone (EPZ) for a nuclear power plant oftentimes crosses multiple towns, cities, villages, counties, and even state lines.  While I have some issues with the effectiveness and implementation of radiological emergency planning, it at least addresses the reality of the hazard crossing the artificial borders we humans have established.  For other hazards, this premise usually does not hold true.

In January of this year, a chemical used in coal processing leaked from a storage tank at a facility in Charleston, West Virginia.  This chemical leaked into the Elk River and both directly and indirectly impacted hundreds of thousands of citizens, businesses, and governments, requiring evacuations and preventing water use for several weeks.  The DHS Lessons Learned Information Sharing (LLIS) website has posted a brief by The Joint Commission on this incident with specific citations of the impacts on area hospitals, mostly through contracted laundry services.

In the private sector, we often encourage businesses to examine the vulnerabilities of their suppliers and distributors as part of their hazard vulnerability analysis (HVA) and business impact assessment (BIA).  This is not something often considered by governments.  For example, in my town there is only one very small gas station, so due to its limited hours (fuel is not available 24/7), government services and the town’s contracted fire company must leave the town for fuel.  That is a significant dependency on a supplier outside the jurisdiction.  I’m sure there are many other suppliers used by the town that lie outside its borders.  Additionally, what are the potential impacts of an incident that occurs in a neighboring jurisdiction?  Such an incident could either directly impact you, such as a chemical plume entering your jurisdiction, or require your jurisdiction to address sheltering, traffic, or mutual aid needs.

I would suggest, as part of the hazard analysis phase of your planning process, that you obtain copies of the hazard analyses of neighboring jurisdictions.  The hazards they identify may be quite eye-opening and may require you to better prepare for a hazard beyond your borders.
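As a rough illustration of that suggestion, the sketch below (with entirely hypothetical hazard and supplier data) compares your own hazard list against a neighbor’s and flags the cross-border gaps and external dependencies worth assessing:

# Hypothetical hazard and dependency data; real entries would come from each
# jurisdiction's hazard vulnerability analysis (HVA).
our_hazards = {"flooding", "winter storm"}
neighbor_hazards = {"chemical storage facility", "flooding", "rail corridor"}

# Suppliers located outside the jurisdiction (e.g., the sole fuel source for town services).
external_dependencies = {"fuel": "gas station in a neighboring town"}

# Hazards we have not planned for but that sit just across the border.
cross_border_gaps = neighbor_hazards - our_hazards

print("Neighboring hazards to review for cross-border impact:", sorted(cross_border_gaps))
print("External supplier dependencies to assess:", external_dependencies)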

©2014 Timothy Riecker