Culture of Preparedness – a Lofty Goal

September is National Preparedness Month here in the US. As we head into October, it’s a good opportunity to reflect on what we’ve accomplished during the month, or even elsewhere in the year. While National Preparedness Month is an important marker and reminder of the need to be prepared, over the past several years I’ve come to question our approaches to community preparedness. What are we doing that’s actually moving the needle of community preparedness in a positive direction? Flyers and presentations and preparedness kits aren’t doing it. While I can’t throw any particular numbers into the mix, I think most will agree that our return on investment is extremely low. Am I ready to throw all our efforts away and say it’s not making any difference at all? Of course not. Even one person walking away from a presentation and making changes within their household to become better prepared is important. But what impact are we having overall?

Culture of preparedness is a buzz phrase used quite a bit over the past several years. What is a culture of preparedness? An AI-assisted Google search tells me that a culture of preparedness is ‘a system that emphasizes the importance of preparing for and responding to disasters, and that everyone has a role to play in doing so.’ Most agree that we don’t have a great culture of preparedness across much of the US (and many other nations) and that we need to improve it. But how?

People love to throw that phrase into the mix of a discussion, claiming that improving the culture of preparedness will solve a lot of issues. They may very well be correct, but it’s about as helpful as a doctor telling you the tumor they found will be fine once a cure for cancer is discovered. Sure, the intent is good, but the statement isn’t helpful right now. We need to actually figure out HOW to improve our culture of preparedness. We also need to recognize that in all likelihood it will take more than one generation to actually realize the impacts of deliberate work toward improvement.

The time has come for us to stop talking about how our culture of preparedness needs improvement and to actually do something about it. There isn’t one particular answer or approach that will do this. Culture of preparedness is a whole community concept. We rightfully put a lot of time, effort, and money into ensuring that our responders (broad definition applied) are prepared, because they are the ones we rely on most. I’d say their culture of preparedness is decent (maybe a B-), but we can do a lot better. (If you think my assessment is off, please check out my annual reviews of the National Preparedness Report and let me know if you come to a different conclusion). There is much more to our community, however, than responders. Government administration, businesses, non-government organizations, and people themselves compose the majority of it, and unfortunately among these groups is where our culture of preparedness has the largest gaps.

As with most of my posts, I don’t actually have a solution. But I know what we are doing isn’t getting us to where we want to be. I think the solution, though, lies in studying people, communities, and organizations and determining why they behave and feel the way they do, and identifying methodologies, sticks, and carrots that can help attain an improved culture of preparedness over time. We must also ensure that we consider all facets of our communities, inclusive of gender identity, race, culture, income, citizenship status, and more. We need people who know and study such things to help guide us. The followers of Thomas Drabek. The Kathleen Tierneys* of the world. Sociologists. Anthropologists. Psychologists. Organizational psychologists.  

A real, viable culture of preparedness, in the present time, is little more than a concept. We need to change our approach from using this as a buzz phrase in which everyone in the room nods their heads, to a goal which we make a deliberate effort toward attaining. A problem such as this is one where we can have a true union of academia and practice, with academics and researchers figuring out how to solve the problem and practitioners applying the solutions, with a feedback loop of continued study to identify and track the impacts made, showing not only the successes we (hopefully) attain, but also how we can continue to improve.

*Note: I don’t know Dr. Tierney personally and it is not my intent to throw her under the proverbial bus for such a project. I cite her because her writing on related topics is extremely insightful. I highly recommend Disasters: A Sociological Approach.

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

ICS Training Sucks – Progress Inhibited by Bias

It’s been a while since I’ve written directly toward my years-long rally against our current approach to Incident Command System (ICS) training. Some of these themes I’ve touched on in the past, but recent discussions on this and other topics have gotten the concept of our biases interfering with progress stuck in my head.

It is difficult for us, as humans, to move forward, to be truly progressive and innovative, when we are in a way contaminated by what we know about the current system we wish to improve. This knowledge brings with it an inherent bias – good, bad, or otherwise – which influences our vision, reasoning, and decisions. On the other hand, though, knowledge of the existing system gives us a foundation from which we can work, often with an awareness of what does and does not work.

I’m sure there have been psychological studies done on such things. I’ve certainly thought about, in my continued rally against our current approach to ICS training, what that training could look like if we asked individuals to develop something new who had never seen the current training. Sure, the current training has a lot of valuable components, but overall, it’s poorly designed, with decades of changes and updates still based upon curriculum that was poorly developed, though with good intentions, so long ago.

In recent months I’ve had discussions with people about various things across emergency management that require improvement, from how we assess preparedness, to how we develop plans, to how we respond, and even the entire US emergency management enterprise itself. In every one of these discussions, trying to imagine what a new system or methodology could look like, every one of us (myself included) was infected by an inherent bias that stemmed from what is. Again, I’m left wondering: what would someone build if they had no prior knowledge of what currently exists?

Of course, what would be built wouldn’t be flawless. At some solutions, those of us in the know may even shake our heads, saying that certain things have already been tried and proven to fail (though perhaps under very different circumstances which may no longer be relevant). Some solutions, however, could be truly innovative.

The notion, perhaps, is a bit silly, as I’m not sure we could expect anyone to build, for example, a new ICS curriculum, without having subject matter expertise in ICS (either their own or through SMEs who would guide and advise on the curriculum). These SMEs, inevitably, would have taken ICS training somewhere along their journey.

All that said, I’m not sure it’s possible for us to eliminate our bias in many of these situations. Even the most visionary of people can’t shed that baggage. But we can certainly improve how we approach it. I think a significant strategy would be having a facilitator who is a champion of the goal and who understands the challenges, who can lead a group through the process. I’d also suggest having a real-time ‘red team’ (Contrarian?) element as part of the group, who can signal when the group is exercising too much bias brought forth from what they know of the current implementation.

In the example of reimagining ICS training, I’d suggest that the group not be permitted to even access the current curriculum during this effort. They should also start from the beginning of the instructional design process, identifying needs and developing training objectives from scratch, rather than recycling or even referencing the current curriculum. The objectives really need to answer the question: ‘What do we want participants to know or do at the completion of the course?’ Levels of training are certainly a given, but perhaps we need to reframe toward what is used elsewhere in public safety, such as the OSHA 1910.120 standard, which uses the levels of Awareness, Operations, Technician, and Command; or the DHS model, which uses Awareness, Performance, and Management & Planning. We need to further eliminate other bias we bring with us, such as the concept of each level of training consisting of only one course. Perhaps multiple courses are required to accomplish what is needed at each level? I don’t have the answers to any of these questions, but all of these things, and more, should be considered in any real discussion about a new and improved curriculum.

Of course, any discussions on new and improved ICS curriculum need to begin at the policy level, approving the funding and the effort and reinforcing the goal of having a curriculum that better serves our response efforts.

How would you limit the influence of bias in innovation?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Mixing Exercise Types

As with many things, we are taught exercises in a rather siloed fashion. First by category: discussion-based and operations-based. Then by type. That kind of compartmentalization is generally a necessity in adult education methodology. Individually, each exercise type has its own pros and cons. Rarely, however, do we see or hear of combining exercise types within one initiative.

The first time I did this was several years ago. My company was designing a series of functional exercises to be used at locations around the country. While the exercises were focused on response, one goal of our client was to include some aspects of recovery in the exercise. At about six hours, the exercises weren’t long. Time jumps can be awkward, and for the small amount of time dedicated to recovery in the exercise, the disruption of a time jump may not have netted a positive result. Add to that the time it would take to provide the quantity of new information needed to make a recovery-oriented functional exercise component viable.

Instead of trying to shoe-horn this in, we opted to stop the functional component of the exercise at an established time and introduce a discussion on disaster recovery. With the proper introduction and just a bit of information to provide context in addition to what they had already been working on, the discussion went smoothly and accomplished everything with which we were charged. The participants were also able to draw on information and actions from the response-focused functional component of the exercise.

We’ve recently developed another exercise that begins with a tabletop exercise to establish context and premise, then splits the participants into two groups which are each challenged with some operations-based activity: one deploying to a COOP location to test functionality (a drill), the other charged with developing plans to address the evolving implications of the initial incident (a functional exercise). Following the operations-based exercises, the two groups will reconvene to debrief on their activities and lessons learned before going into a hotwash.

Making this happen is easy enough. Obviously we need to ensure that objectives align with the expected activities. You also want to make sure that the dual exercise modalities are appropriate for the same participants. While I try not to get hung up on the nuances of documentation, documentation is important, especially when it comes to grant compliance and ensuring that everyone understands the structure and expectations of the exercise. If we are mixing a discussion-based exercise and an operations-based exercise, one of the biggest questions is likely what foundational document to use – a SitMan or an ExPlan. Generally, since the operations-based exercises carry greater consequences regarding safety and miscommunication, I’d suggest defaulting to an ExPlan, though be sure to include information that addresses the needs of the discussion-based component in your ExPlan as well as the player briefing.

In running the exercise, be sure to have a clear transition from one exercise type to the other, especially if there are multiple locations and/or players are spread out. Players should be given information in the player briefing that prepares them for the transition. Having exercise staff (controllers/facilitators and evaluators) properly prepared, through clear communication of expectations at the C/E briefing and in C/E documentation, is obviously important, as is ensuring they are ready for the transition.

I’d love to hear other success stories from those who may have done something similar.

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Preparing for Community Lifelines Implementation

In all great ideas, the devil, as they say, is in the details. Implementing new concepts often requires preparations to ensure that the implementation goes smoothly. We often rush to implementation, perhaps excited for the results, perhaps not thinking through the details. Without proper preparation, that implementation can fail miserably. Integrating and implementing the Community Lifelines is no exception.

Just like everything else we do in preparedness, we should turn to the capability elements of planning, organizing, equipping, training, and exercises (POETE) to guide our preparedness for Community Lifeline implementation.

Planning and Organizing

I’m coupling these two capability elements together as they so strongly go hand-in-hand. Determining how you want to use Community Lifelines is an important early step. I’d suggest developing a Community Lifeline Implementation Plan for your jurisdiction that not only identifies how you will use them in response and recovery operations, but also details how their use fits within your response and recovery management structure, how information will flow, who is responsible for what, how information is reported, and to whom it is reported. The Implementation Plan should also outline the preparedness steps needed and how and where information will be catalogued.

I’ve seen several Community Lifeline integrations across local, county, and state jurisdictions, these mostly being visual status displays, but there can be some complexity in how we even get to that display.

We all know from CPG-101 that forming a planning team is the first step of emergency planning. While forming the team is not itself really the capability element of Organizing, the stakeholders assembled for it will extend across all capability elements and into response and recovery operations.

Before identifying stakeholders, we need to examine each Community Lifeline down to the sub-component levels, which first necessitates determining which components and sub-components are applicable to your jurisdiction. For example, within the Transportation Community Lifeline, if your jurisdiction has no Aviation resources or infrastructure, you may choose to not include that component.

Once you have made the determination as to which components and sub-components of each Community Lifeline will be included, it’s now time to form your planning teams for each. Depending on the size of your jurisdiction, you could form teams at the Community Lifeline level, the component level, or the sub-component level. You could even use different approaches for each (e.g., the Water Systems Community Lifeline may only involve a few stakeholders to address all components and sub-components, whereas Health and Medical may require distinct teams for each component). Since much of the Community Lifelines construct is centered on or strongly relates to critical infrastructure, many of our stakeholders will be from the private sector. Hopefully these are partners you have engaged with before, but if not, this is a great opportunity to do so.

In meeting with each of these stakeholders/stakeholder groups, providing them with an orientation to the Community Lifelines concept will be important. Be sure to talk about this within the contexts of whole-community preparedness, public-private partnerships, critical infrastructure, and the five mission areas. This should include the expectation for these to be long-term working groups that will provide information updates before, during, and after a disaster. It will be important to obtain from each the following information (at minimum) for each function and/or facility:

  • Legal owners and operators
  • Primary and alternate points of contact (and contact info for each) (Note that these should be emergency/24 hour contacts)
  • Existing emergency plans
  • Protection activities
  • Prevention activities
  • Mitigation activities
  • Preparedness activities
  • Response and recovery priorities
  • Critical continuity and supply chain issues
  • Sensitive information concerns
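To make the cataloguing concrete, the minimum information listed above could be captured as a simple structured record. This is purely an illustrative sketch – the class and field names are hypothetical, and your jurisdiction’s actual data model (or a purpose-built system) would dictate the real structure:

```python
from dataclasses import dataclass, field

# Hypothetical record for cataloguing the minimum information listed
# above for each Community Lifeline function or facility.
@dataclass
class LifelineFacilityRecord:
    lifeline: str                  # e.g., "Transportation"
    component: str                 # e.g., "Mass Transit"
    facility_or_function: str
    legal_owner_operator: str
    primary_contact: str           # 24-hour emergency contact
    alternate_contact: str         # 24-hour emergency contact
    existing_emergency_plans: list[str] = field(default_factory=list)
    protection_activities: list[str] = field(default_factory=list)
    prevention_activities: list[str] = field(default_factory=list)
    mitigation_activities: list[str] = field(default_factory=list)
    preparedness_activities: list[str] = field(default_factory=list)
    response_recovery_priorities: list[str] = field(default_factory=list)
    continuity_supply_chain_issues: list[str] = field(default_factory=list)
    sensitive_info_concerns: list[str] = field(default_factory=list)
```

Even a spreadsheet organized along these lines beats scattered notes; the point is that every facility or function gets the same fields, so gaps in the collected information are immediately visible.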

Processes will need to be mapped to identify how information will be obtained in an incident from the owners/operators of each facility or function, what information will be expected, in what format, and how often. Internal (EOC) procedures should identify how this information will be received, organized, and reported, and how it will influence operational priorities for response and recovery. Since the visual representation of the Community Lifelines is the face of the system, you should also determine the benchmarks within each Community Lifeline, component, and sub-component for differentiating between statuses (i.e., what failures will move a status from green to yellow, and from yellow to red) and how the status of one may influence the status of others.
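As a concrete illustration of status benchmarks, one simple convention is to roll a component’s status up from its sub-components by taking the worst status present. The worst-status-wins rule here is an assumption for illustration only – the actual transition rules should come from the benchmarks your jurisdiction defines:

```python
# Illustrative sketch: rolling sub-component statuses up to a component
# or Community Lifeline status. The worst-status-wins rule is an
# assumption, not a FEMA requirement; real benchmarks may differ.
STATUS_ORDER = {"green": 0, "yellow": 1, "red": 2}

def rollup_status(sub_statuses):
    """Return the worst status among sub-components ('green' if none)."""
    statuses = list(sub_statuses)
    if not statuses:
        return "green"
    return max(statuses, key=lambda s: STATUS_ORDER[s])

# Example: one red sub-component drives the whole component to red.
transportation = {
    "Highway/Roadway": "green",
    "Mass Transit": "yellow",
    "Railway": "red",
}
print(rollup_status(transportation.values()))  # prints "red"
```

Whatever rule you adopt, writing it down as explicitly as this forces the planning team to agree, before an incident, on exactly which failures change a color on the display.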

Equipment (and Systems)

It is important to catalogue the information you obtain from preparedness activities as well as in implementation. Consider GIS integrations, as there is an abundance of information that involves geolocation. I’ll make a special shout out here to the Community Lifeline Status System (CLSS) project, which is funded by the DHS Science and Technology (S&T) Directorate and is being developed under contract by G&H International. When rolled out, the CLSS will be available at no cost to every jurisdiction in the US to support Community Lifeline integration. Having been fortunate enough to get a private in-depth tour of the system, I’m thoroughly impressed. The CLSS is based on ArcGIS and provides a lot of customizable space to store all this preparedness information.

Using a system such as CLSS to display and share Community Lifelines information is also a benefit. While most displays I’ve seen simply show the icon and status color for each Community Lifeline, an interactive dashboard type of system can help provide additional context and important information. This is something CLSS also provides.

Training and Exercises

As with any new plans or processes, training is an important part of supporting implementation. Training audiences will include:

  • EOC personnel
  • Owners/Operators of Community Lifelines infrastructure
  • Officials who will receive Community Lifelines information

Each audience should receive training tailored to address its specific needs.

Similarly, exercises should purposely test these processes, and the use of Community Lifelines should be incorporated into exercises often. Community Lifelines status and information should be used in exercises to inform and support decision making.

///

If you already know the benefits of the Community Lifelines, hopefully you see the advantages of adequate preparedness to get the most out of them. The stakeholder groups you assemble to support planning should be everlasting, as information on their vulnerabilities, capabilities, and activities is likely to change over time. Beyond direct Community Lifeline applications, these are all great partners for a variety of emergency management activities to support the whole community. The preparedness efforts, and the maintenance thereof (sorry, but it’s not just a one-time thing), are a significant investment (and could likely be a full-time job for even a moderately sized jurisdiction), but they should pay incredible dividends over and over again.

Are you using Community Lifelines? What have you learned about the need to prepare for their use?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

CDC Forgot About Planning

In late February, CDC released the highly anticipated notice of funding opportunity (NOFO) for the 2024-2028 Public Health Emergency Preparedness (PHEP) grant. The general concept of the grant wasn’t a big surprise, as they had been promoting a move to their Response Readiness Framework (RRF). The timing of the new five-year grant cycle seems ideal to implement lessons learned from COVID-19, yet they are falling short.

I’ve reflected in the past on the preparedness capability elements of Planning, Organizing, Equipment/Systems, Training, and Exercises (POETE). I also often add Assessing to the front of that (APOETE). These preparedness elements are essentially the buckets of activity through which we categorize our preparedness activities.

In reviewing the ten program priorities of the RRF, I’m initially encouraged by the first priority: Prioritize a risk-based approach to all-hazards planning. Activity-wise, what this translates to in the NOFO is conducting a risk assessment. Solid start. Yet nowhere else is planning overtly mentioned. Within the NOFO some of the other priorities reflect on ensuring certain things are addressed in plans, such as health equity, but there is otherwise no direct push for planning. Buried within the NOFO (page 62) is a list of plans that must be shared with project officers upon request (under the larger heading of Administrative and Federal Requirements) but the development of any of these plans does not correlate to any priorities, strategies, or activities within the document.

As for the rest of APOETE, there is good direction on Organizing, Equipment and Systems, Training, and Exercises. While that’s all great, planning is the true foundation of preparedness, and it is so obviously left out of this NOFO. Along with my general opinion that most emergency plans (across all sectors) are garbage, the vast majority of findings from the numerous COVID-19 after-action reports I’ve written (which included two states and several county and local governments) noted the significant need for improved emergency plans. Further, the other preparedness elements (OETE) should all relate back to our plans. If we aren’t developing, improving, and updating plans, then the other activities will generally lack focus, direction, and relevance.

Understanding that this is the first year of a five-year grant cycle, some changes and clarification will occur as the cycle progresses, but as planning is a foundational activity, it should be immediately and directly tied to the results of the assessment this year’s grant calls for. The findings of the assessments are generally meaningless if we aren’t taking action and developing plans to address them. This leaves us with a significant gap in preparedness. Someone at CDC didn’t think this through, and it leaves me with a great deal of concern, especially in the aftermath of the COVID-19 response.

What are your thoughts on this?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Properly Applying ICS in Function-Specific Plans

As with many of my posts, I begin with an observation of something that frustrates me. Through much of my career, as I review function-specific plans (e.g., shelter plans, point of distribution plans, debris management plans, mass fatality incident management plans), I see a lot of organization charts inserted into those plans. Almost always, the org chart is an application of a ‘full’ incident command system (ICS) org chart (Command, Command Staff, General Staff, and many subordinate positions). This is obviously suitable for a foundational emergency operations plan (EOP), an emergency operations center (EOC) plan, or something else that is very comprehensive in nature, where this size and scope of organization would be used, but function-specific plans are not that. This, to me, is yet another example of a misinterpretation, misunderstanding, and/or misuse of the principles of the National Incident Management System (NIMS) and ICS.

Yes, we fundamentally have a mandate to use ICS, which is also an effective practice, but not every function and facility we activate within our response and recovery operations requires a full organization or an incident management team to run. The majority of applications of a function-specific plan are within a greater response (such as activating a commodity POD during a storm response). As such, the EOP should have already been activated and there should already be an ‘umbrella’ incident management organization (e.g., ICS) in place – which means you are (hopefully) using ICS. Duplicating the organization within every function isn’t necessary. If we truly built out organizations according to every well-intentioned (but misguided) plan, we would need several incident management teams just to run a Type 3 incident. This isn’t realistic, practical, or appropriate.

Most function-specific plans, when activated, would be organized within the Operations Section of an ICS organization. There is a person in charge of that function – depending on the level of the organization in which they are placed and what the function is, there is plenty of room for discussion on what their title would be, but I do know that it absolutely is NOT Incident Commander. There is already one of those, and the person running a POD doesn’t get to be it.

As for ‘command staff’ positions, if there is really a need for safety or public information activity (I’m not even going to talk about liaison) at these levels, these would be assistants, as there is (or should be) already a Safety Officer or PIO as a member of the actual Command Staff. Those working within these capacities at the functional level should be coordinating with the principal Command Staff personnel.

As for the ‘general staff’ positions within these functions, there is no need for an Operations Section, as what’s being done (again, most of the time that’s where these functions are organized) IS operations. Planning and Logistics are centralized within the ICS structure for several reasons, the most significant being the avoidance of duplicated effort. Yes, for all you ICS nerds (like me), there is an application of branch-level planning (done that) and/or branch-level logistics that can certainly be necessary for VERY complex functional operations, but this is really an exception and not the rule – and these MUST interface with the principal General Staff personnel. As for Finance, there are similarly many reasons for this to be centralized within the primary ICS organization, which is where it should be.

We need to have flexibility balanced with practicality in our organizations. We also need to understand that personnel (especially those trained to serve in certain positions) are finite, so it is not feasible to duplicate an ICS structure for every operational function, nor is it appropriate. The focus should be on what the actual function does and how it should organize to best facilitate that. My suggestion is that if you are writing a plan, unless you REALLY understand ICS (and I don’t mean that you’ve just taken some courses), find someone who (hopefully) does and have a conversation with them. Talk through what you are trying to accomplish with your plan and your organization; everything must have a purpose so ask ‘why?’ and question duplication of effort. This is another reason why planning is a team sport and it’s important to bring the right people onto the team.

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

First Responders as Emergency Managers

I continue to see concern with first responders entering the field of emergency management, and with good reason. Of course, this does not apply to everyone who has made this transition. I’ve seen some incredible emergency managers who have first responder roots, and obviously we are all part of the public safety family with some tangible connections, but they really are two very different fields with very different skillsets. Throughout my career, and as I continue to work with emergency managers across the country, I come across emergency managers who were, or in some cases still are, first responders. Those roots are very apparent in many of them, and while some traits can be beneficial, others can very much be detrimental.

Several weeks ago, someone posted a question on LinkedIn asking why there is still such a gap between public health and emergency management. While any relationship requires work by both parties, I think the strain in this relationship in most places lies strongly with emergency managers, with much of it due to emergency managers poorly suited for the position.

The knowledge, skills, abilities, and attitudes of first responders and emergency managers don’t organically have as much overlap as many people seem to assume. One of the most common words in job descriptions for emergency management positions is ‘coordination’. Coordination is a soft skill. A people skill. It takes knowledge and awareness of who the other parties are, what they do, and what their priorities are. It requires skills in communication and negotiation, and the ability to connect. Perhaps most importantly, though, it requires a proper attitude; one that is open and not standoffish, indifferent, or otherwise off-putting to others.

There is an assumption within government administrations, one that continues to be perpetuated within the field of emergency management, that there is direct portability between being a first responder and being an emergency manager, which is why so many first responders continue to be hired into emergency management jobs. I recently ran a tabletop exercise with two emergency managers in the room, one who was still active in the fire service, the other who was still active in law enforcement. When they made statements, the body language of others in the room immediately changed. They were unnecessarily aggressive in their demeanor and shut down conversation rather than encouraging the exchange of ideas, which was the purpose of the exercise. Unfortunately, this is a common personality trait among first responders. Along with the people skills needed, emergency management is a very administrative field. I also recently met with a first responder turned emergency manager in a debrief about incident management. He directly expressed his disdain for the bureaucracy of incident management practices and the necessity for any measure of documentation. He’s a doer. I don’t knock his perspective, but his attitude isn’t aligned with the needs of emergency management.

Certainly, some first responders do have the acumen of emergency managers. They can see the big picture. They are able to step away from tactics yet benefit from their knowledge of tactics. They have the personality and people skills necessary to facilitate coordination, not just with first responder agencies, but with others. They seek knowledge and training beyond their first responder backgrounds, recognizing that they need to know more beyond response and beyond the discipline from which they come. These are the ones who will be more prone to success, for themselves, their agencies, and their jurisdictions.

I was a first responder for ten years, including time as a chief officer; much of this time prior to entering emergency management. While I’d like to think I was reasonably positioned to become an emergency manager when I did, I’m aware now that I certainly had some of these flaws early in my emergency management career that kept me from being the best emergency manager I could be at the time. Fortunately, I had great colleagues and mentors who helped guide me. I also recognized some bad examples early on and saw how their interactions with others, especially those who weren’t first responders, were not as engaging or positive as they should have been.

Can first responders become successful emergency managers? Absolutely! But being a first responder, in most cases, should not be a prerequisite for emergency management positions. Also, hiring someone is not just about what’s on their resume. If a job requires people skills and interagency coordination, that should be a big part of the interview process. First responder or not, arms crossed with short responses or an aggressive attitude is not a good indicator of someone being a people person. As emergency managers, we are responsible for our own profession. We need to make the change from within and work with those on the outside (administrations, human resources departments, etc.) to ensure that the field continues to grow in a positive direction, ensuring success for the field of emergency management as well as the people who are brought into it.

What are your thoughts on this topic?

©2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Developing Incident After-Action Reports

Incident and event after action reports (AARs) are extremely important for identifying the successes and challenges we faced in our efforts. Just like our evaluation efforts in exercises, many valuable lessons can be learned and effective practices identified from incidents and events. Yet for as much as incident and event AARs are encouraged, there are often problems with how these are developed.

While the quality of exercise after action reports is often not up to par, a defined process of exercise evaluation along with a suggested AAR format has been available to us and ingrained in emergency management practice for a long time via the Homeland Security Exercise and Evaluation Program (HSEEP). While some concepts of exercise evaluation can be utilized for incident and event evaluation, we need a very different approach to be most effective.

FEMA has been promoting a Continuous Improvement concept for emergency management for several years. Incident and event evaluation is part of continuous improvement, though continuous improvement is intended to permeate much more of our daily and incident operations. While FEMA’s program has some good information that applies to incident and event evaluation, there are some important things I feel are missing.

Perhaps the most significant difference in our approach to incident and event evaluation vs exercise evaluation is the evaluation focus. Exercises, right from our very first steps of design, are designed explicitly for evaluation. The identification of capabilities and exercise objectives gives direction to our design and directly informs our evaluation of the exercise. Essentially, the intent and focus of evaluation is baked in from the start. For incidents and events, however, it is not.

Because evaluation is not a primary intent of incidents and events, we generally need to determine our evaluation strategy afterwards. The development of our evaluation strategy absolutely must begin with the identification of what we want to evaluate. This is a critical element not included in FEMA’s Continuous Improvement guidance. Without determining the focus of the evaluation, the discovery process lacks direction and will likely explore areas of incident/event operations that are lower priority to stakeholders. Determining what the evaluation effort will focus on can be considered similar to developing objectives, and as such should be specific enough to give proper direction to the evaluation effort. For example, having done numerous COVID-19 AARs, it’s not enough to say that we will evaluate ‘vaccination’. Vaccination is a very broad activity, so we should determine specific aspects of vaccination to focus on, such as equity of distribution or vaccine point of dispensing (POD) operations. Obviously, multiple focus areas can be identified based upon what is most important to stakeholders. And no, incident objectives should not serve as your focal points. These are operational objectives that have nothing to do with evaluation, though your evaluation itself may well take the incident objectives (and associated actions) into consideration.

FEMA’s Continuous Improvement guidance provides a lot of great insight for the discovery process. The most common tools I use are focus groups, interviews, document reviews, and surveys. Focus groups and interviews allow people to tell their experiences from their perspectives. These offer a lot of insight and include facts as well as opinions, both of which are valid in the AAR process as long as they are handled properly, since discerning between the two is important.

Document reviews are also important. Typically I look at documents developed before the incident (mostly plans) and those developed during the incident (such as press releases, incident action plans, situation reports, and operational plans). While documents developed during the incident typically tell me what was done or what was intended to be done, the documents developed prior to the incident typically provide me with a standard from which to work.

There are a couple of important caveats with this:

1) Many plans are operationally inadequate, so they may not have been able to be followed.

2) Many organizations don’t reference their plans, regardless of quality.

As such, a big part of my document review is also determining the quality of the documents and if they were referenced during the incident or event. It may very well be that the actions taken were better than what was in the plans.

Surveys… there is so much to say about surveys that it probably deserves its own blog post. Surveys can be great tools, but most people design poor surveys. They should be succinct and to the point. You will want to ask a lot of questions, but resist the urge to do so. The more questions you ask, the lower your response rate will be. So focus on a few questions that will give you great data.

We then go to writing, which involves the organization of our information, formation of key observations (by focus area), a narrative analysis for each observation, and development of one or more recommendations for each observation. The analysis is an aspect on which many AARs, including those for exercises, miss the mark. The analysis needs to contextualize the observation and justify the recommendations. It should provide sufficient detail for someone not knowledgeable in that observation (or of the incident) to have a reasonable understanding of the associated issues. Remember that an AAR may be referenced for years to come and can also be used to support budgets, grant applications, and obviously the corrective actions that are identified. A good analysis is necessary and should certainly be more than a couple of sentences. Be sure to identify strengths and effective practices, not just lessons learned and challenges.

I do not advocate using the HSEEP AAR template for incident and event evaluations. Beyond an awkward fit for some of the ‘fill-in-the-box’ information, the overall structure is not supportive of what an incident or event AAR needs to include. I suggest writing the AAR like a professional report. I’d include an executive summary, table of contents, research methodology, observations/analysis/recommendations, an incident or event timeline, and summary of recommendations (I do still like to use the traditional HSEEP improvement plan matrix for this). I tend to have a lot of citations throughout the document (typically I footnote these). Citations can include standards, such as NIMS, references (plans), media articles, and more.

A couple of notes: 1 – When planning our management of an event, we can be more proactive in evaluation by including it as a deliberate component of our efforts. 2 – Incident evaluation can begin during the incident by tasking an incident evaluator.

Incident and event evaluation can be daunting to approach. It requires endorsement from the highest levels to ensure cooperation and access to information. Honesty is what is needed, not sugar coating. Far too many AARs I’ve seen for exercises, incidents, and events are very soft and empty. Remember that we aren’t evaluating people; rather, we are evaluating plans, processes, systems, and decisions. The final AAR should be shared with stakeholders so they can learn and apply corrective actions that may be relevant to them. Given most state public information laws, the AAR may need to be made available to the public, which is more reason to ensure that it is professionally written and that observations have quality analysis, as members of the public may require context. I’ve also seen many elected and appointed officials (and legal counsels) be reluctant to have written reports, or written reports with much detail, because of freedom of information laws. While I understand that accountability and transparency can create challenges, we must remember that government works on behalf of the people, and the acknowledgement of mistakes and shortcomings (as well as successes) is important to continuous improvement of the services we provide.

What is your approach with incident and event AARs? Where do you see that we need to improve this important process?

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Stop Exercising Bad Plans

We know that the purpose of most exercises in emergency management (ref HSEEP) and related fields is to validate plans. That concept, though, is built on a fragile premise: that the plans are good.

Over the years, the more plans I see from various jurisdictions, the more disappointed I am; practically to the point of losing nearly all faith in our profession’s ability to develop quality plans. Most emergency plans out there are crap. Garbage. Not worth the effort that has been put into them. Typically, they don’t have enough detail. Not that they need to have procedure-level detail (though those procedures should be found somewhere), but they are often written at such a high level that they are merely conceptual or policy-esque.

The premise that exercises are intended to validate plans would indicate a belief that the plans themselves serve as quality standards of practice for the organization(s) they are built for. The sad truth is that they are not. So, what are our exercises proving?

Gaps in exercise evaluation are a significant hurdle, often rooted in poor evaluation practices, poor AAR writing, and/or the assumption of quality plans. I find many AARs to be very superficial. They provide observations and recommendations, but no analysis. Without analysis we have no context for the observation and no examination of root cause or other contributing factors. Absent this analysis, the AARs aren’t able to truly identify what needs to be addressed. So, with the superficial come the obvious statements and recommendations that communication needs to be improved, more ICS training is needed, etc.

What I don’t see enough of are observations, ANALYSIS, and recommendations that indicate:

  1. Plans need to be drastically improved (updated and/or developed)
  2. Responders need to actually be trained in their roles to support implementation of the plans (ICS does NOT teach us how to implement plans… in fact ICS training largely ignores the importance of existing plans)

What of the AARs that are better and actually do recommend improved plans? This leads us to the next potential point of failure: implementation of corrective actions. So many organizations are simply bad at this. They seem content to exercise over and over again (typically at the expense of taxpayer dollars) and come up with the same results. They largely aren’t fixing anything, or perhaps just the proverbial low-hanging fruit (i.e. more ICS training), but they aren’t tackling the harder-to-do, yet more impactful, development of quality plans.

We need to stop assuming our plans are good. Exercising bad plans has little value to us and is typically more wasteful than beneficial.

Just like the potential causes identified above, there are numerous issues to be addressed. First of all, we need to recognize that not every emergency manager has the acumen for writing plans. The development of emergency plans is a hybrid of art and science. It includes hard and soft skillsets such as technical writing, systems thinking, organization, research, collaboration, and creativity. We have standards for developing plans, such as CPG 101, which overall is a good standard (though it could be improved to make it easier to use). We have some training available in how to develop emergency plans, but there are some issues.

  • The G-235 Emergency Planning course (now IS-235) was a great course, but the big push 15-20 years ago to put so many classroom courses online to make them more accessible and to save costs largely resulted in decreased learning outcomes.
  • The classroom training in emergency planning has largely been replaced by the E103 Planning: Emergency Operations course, which is part of the Emergency Management Basic Academy. This is a pretty good course, but being part of the Basic Academy (which is a great concept) also limits access for some people, as the general practice is (understandably) to give registration preference to those who are taking the entire academy. Sure, the entire academy makes for more well-rounded EMs, but if someone wants to focus on emergency planning, some of the other courses, while complementary, constitute a larger investment of time and possibly money.
  • Finally, FEMA has the Planning Practitioner Program, which is a more intensive experience and certainly provides some improved learning outcomes, but with the expectation of a huge percentage of emergency managers (and those in related professions) to be proficient in emergency planning, this program simply isn’t available enough. (Note re training: yes, there are an abundance of other planning-related courses out there… I just highlighted these as examples).

I’ll also say that simply taking some classes does not make you a proficient emergency planner. Because there is art and science to it, it can’t simply be taught. It needs to be learned and experienced. Practice and mentorship are key – which is something else most EMs don’t have access to or even seek out. Training is not the only solution.

So, while this article started out with identifying the fallacy often seen in our exercise practices, I end up, once again, pointing out what I think is the biggest gap in the entirety of emergency management – bad plans. Plans are the foundation of our practice, yet we can’t seem to get it right. We are too dismissive of the necessity and process of plan development and upkeep. We are too accepting of inadequate plans that are not implementation ready. We don’t do enough to build personnel capability in plan development. So many of those who are writing plans, be they civil servants, consultants, or others, are simply bad at it. And while some have potential that is underdeveloped, others simply don’t have the acumen for it.

And the worst part about it all… we, as a practice and professional culture, are accepting it!

Many of my posts through the years have ended with a similar statement… we are treating emergency management like a game of beer league hockey. We aren’t taking it seriously enough. We need to do better and demand better. So what are you doing to support improved emergency planning practices?

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

AHIMTA Incident Management Certification

I was very pleased to see last week’s announcement by the All-Hazards Incident Management Team Association (AHIMTA) about their certification services for incident management personnel. From their website, AHIMTA is utilizing the National Incident Management System (NIMS) and the National Qualification System (NQS) as the baseline for their AHIMTA Incident Management Certification System (AIMCS). Information, including trainee application information, can be found at https://www.ahimta.org/certification. In many ways, the AIMCS is a continuation of the Interstate Incident Management Qualifications System (IIMQS) Guide that AHIMTA developed in 2012.

AHIMTA is providing a much-needed service, filling a vacuum that has always existed in the all-hazards incident management team (IMT) program in the US. While FEMA is responsible for maintaining the NQS, they have not actually provided certification or qualification of IMTs or IMT personnel. Last year it was decided that the US Fire Administration would discontinue their management of the AHIMT program. While the USFA didn’t provide any certification services, the program guidance they provided was valuable. They were also the primary federal agency doing anything with external AHIMTs. While some states have implemented the FEMA NQS standard for IMTs and associated positions, others have not. Even among the states that have, some have only done so, officially, for state-sponsored teams/personnel and not for those affiliated with local governments or other entities. Clearly gaps exist that must be filled. AHIMTA has continued to advocate for quality AHIMTs and personnel across the nation.

AHIMTA’s role as a third-party certification provider presents an interesting use case. While not unique, a third party providing a qualification certification (not a training certificate) based on a federal standard is not necessarily common. AHIMTA doesn’t have any explicit authority from FEMA or others to provide this certification, but as a respected organization in the AHIMT area of practice, I don’t think their qualifications to do so can be denied. Certification demands a certain rigor and even assumes liability. The processes associated with their certification are well documented in their AIMCS Guide. While AHIMTA can’t require their certification, states and other jurisdictions may very well adopt it as the standard by which they will operate, and can make it a requirement for their jurisdiction. Aside from some very specific certifications that have existed, such as those for wildfire incident management personnel, much of AHIMT practice has been self-certification, which can vary in quality and rigor. The AIMCS program can provide consistency as well as relieve the pressure on states and other jurisdictions to form and manage their own qualification systems. There will also be an expected level of consistency and excellence that comes from AHIMTA.

All that said, I continue to have reservations about membership organizations offering professional certifications. While membership organizations arguably have some of the greatest interest in the advancement of their profession and adherence to standards, as well as the pool of knowledge within their practice, the potential for membership influencing the process or injecting bias against non-members can never fully be eliminated. I feel that certifications should be provided by government agencies or fully independent organizations that are not beholden to a membership. Not wishing to stall AHIMTA’s progress or success in this program, I’m hopeful they may be willing to create a separate organization solely for the purposes of certification and credentialing. I’d also love to see, be it offered in conjunction with this program or otherwise, an EOC qualification certification program, ideally centered upon FEMA’s EOC Skillsets, but with qualification endorsements for various EOC organizational models, such as the Incident Support Model.

I’m very interested to see the progress to be made by the AIMCS and how states and other jurisdictions adopt it as their standard. This certification should have significant impact on the continued development of quality all-hazard incident management teams.

What are your thoughts on this certification program?

© 2023 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®