ICS: Problems and Perceptions

Oddly enough, I’ve recently seen a spate of LinkedIn posts espousing the benefits of the Incident Command System (ICS). Those who have been reading my material for a while know that I’m a big proponent of ICS, though I am highly critical of the sub-par curriculum that we have been using for decades to teach ICS. The outcome is an often poorly understood and implemented system resulting in limited effectiveness.

Yes, ICS is a great tool if implemented properly, yet most implementations I see aren’t. To further muddy the waters, I see emergency plans everywhere that commit our responders and officials to using ICS – this is, after all, part of the National Incident Management System (NIMS) requirement that applies to many – yet they don’t use it.

So why isn’t ICS being used properly, or even at all? Let’s start with plans. Plans get written and put up on a proverbial shelf – physical or digital. They are often not shared with the stakeholders who should have access to them. Even less frequently are personnel trained in their actual roles as identified and defined in plans. Some of those roles are within the scope of ICS while some are not. The bottom line is that many personnel, at best, are only vaguely familiar with what they should be doing in accordance with plans. So, when an incident occurs, most people don’t think to reference the plan, and they flop around like a fish out of water trying to figure out what to do. They make things up. Sure, they often try their best, assessing what’s going on and finding gaps to fill, but without a structured system in place, and in the absence of (or failure to reference) the guidance that a quality plan should offer, efficiency and effectiveness are severely decreased, and some gaps aren’t even recognized or anticipated.

Next, let’s talk about ICS training. Again, those who have been reading my work for a while have at least some familiarity with my criticism of ICS training. To be blunt, it sucks. Not only does the content of courses fail to align with course objectives, the curriculum overall doesn’t teach us enough about HOW to actually use ICS. My opinion: we need to burn the current curriculum to the ground and start over. Course updates aren’t enough. Full rewrites – a complete reimagining of the curriculum and what we want to accomplish with it – need to take place.

Bad curriculum aside… For some reason people think that ICS training will solve all their problems. Why? One reason, I believe, is that we’ve oversold it. Part of that is most certainly due to NIMS requirements. Not that I think the requirements, conceptually, are a bad thing, but I think they cause people to believe that if it’s the standard we are all required to learn, it MUST be THE thing we need to successfully manage an incident. I see people proudly boasting that they’ve completed ICS-300 or ICS-400. OK, that’s great… but what can you actually do with that? You’ve learned about the system, but not so much how to actually use it. Further, beyond the truth that ICS training sucks, it’s also not enough to manage an incident. ICS is a tool of incident management – just one component, NOT the entirety of it. Yes, we need to teach people how to use ICS, but we also need to teach the other aspects of incident management.

We also don’t use ICS enough. ICS is a contingency system. It’s not something we generally use every day, at least not to a reasonably full extent. Even our first responders only use elements of ICS on a regular basis. While I don’t expect everyone to be well practiced in the nuances and specific applications of ICS, we still need more practice at using more of the system. It’s not the smaller incidents where our failure to properly implement ICS is the concern – it’s the larger incidents. It’s easy to be given a scenario and to draw out on paper what the ICS org chart should look like to manage it. It’s a completely different thing to have the confidence – and the ego in check – to make the call for additional resources; not the tactical ones, but people to serve across a number of ICS positions. Responders tend to be quite reluctant to do so. Add to that the fact that most jurisdictions simply don’t have personnel even remotely qualified to serve in most of those positions. So not only are we lacking experience in using ICS on larger incidents, we also don’t have experience ‘ramping up’ the organization for a large response. An increase in exercises, of course, is the easy answer, but exercises require time, money, and effort to implement.

One last thing I’ll mention on this topic is about perspective. One of the posts I read recently on LinkedIn espoused all the things that ICS did. While I understand the intent of those statements, the truth is that ICS does nothing. ICS is nothing more than a system on paper. It takes people to implement it. ICS doesn’t do things; PEOPLE do these things. The use of ICS to provide structure and processes to the chaos, if properly done, can reap benefits. But statements claiming all the things that ICS can do for us, without inserting the critical human factor, feed the myth of ICS being our savior. It’s not. It must be implemented – properly – by people to even stand a chance.

Bottom line: we’re not there yet when it comes to incident management, including ICS. I dare say too many people are treating it as a hobby, not a profession. We have a standard, now let’s train people on it PROPERLY and practice it regularly.

©2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

NIMS Intel and Investigations Function – A Dose of Reality

Background

Soon after the initiation of the National Incident Management System (NIMS) as a result of Homeland Security Presidential Directive 5 in 2003, the Intelligence and Investigation (I/I) function was developed and introduced to NIMS, specifically to the Incident Command System (ICS). While we traditionally view I/I as a law enforcement function, there are other activities which guidance indicates may fall within I/I, such as epidemiology (personally, I’d designate epidemiology as a specific function, as we saw done by many during the COVID-19 response), various cause and origin investigations, and others. Integration of these activities into the response structure has clear advantages.

The initial guidance for the I/I function was largely developed by command personnel with the New York City Police Department (NYPD). This guidance offered several possible locations for the I/I function within the ICS structure, based on anticipated level of activity, needed support, and restrictions of I/I related information. These four possible ways of organizing the I/I function per this guidance are depicted here, and include:

  1. Placement as a Command Staff position
  2. Organized within the Operations Section (i.e. at a Branch level)
  3. Developed as its own section
  4. Included as a distinct unit within the Planning Section

These concepts have been included in the NIMS doctrine and have been supported within the NIMS Intelligence/Investigations Function Guidance and Field Operations Guide, though oddly enough, this second document ONLY addresses the organization of an I/I Section and not the other three options.

The Reality

Organization of I/I can and does certainly occur through any one of these four organizational models, though my own experience, and that of others as described to me, has shown that very often this kind of integration of I/I within the ICS structure simply does not occur. Having worked with numerous municipal, county, state, federal, and specially designated law enforcement agencies, I’ve found that the I/I function is often a detached activity which is absolutely not operating under the command and control of the incident commander.

Many of the sources of I/I come from fusion centers, which are off-scene operations, or from agencies with specific authorities for I/I activities that generally have no desire or need to become part of the ICS structure, such as the FBI conducting a preliminary investigation into an incident to determine if it was a criminal act, or the NTSB investigating cause and origin of a transportation incident. These entities certainly should be communicating and coordinating with the ICS structure for scene access and operational deconfliction, but are operating under their own authority and conducting specific operations which are largely separate from the typical life safety and recovery operations on which the ICS structure is focused.

My opinion on this is that operationally it’s completely OK to have the I/I function detached from the ICS structure. There are often coordination meetings and briefings that occur between the I/I function and the ICS structure which address safety issues and acknowledge priorities and authorities, but the I/I function is in no way reporting to the IC. Coordination, however, is essential to safety and mutual operational success.

I find that the relationship of I/I to the ICS structure most often depends on where law enforcement is primarily organized within the ICS structure and who is managing that interest. For example, if the incident commander (IC) is from a law enforcement agency, interactions with I/I activities are more likely to be directly with the IC. Otherwise, interactions with I/I are typically handled within the Operations Section through a law enforcement representative within that structure. Similarly, I’ve also seen I/I activity interact with an emergency operations center (EOC) through the EOC director (often not law enforcement, though having designated jurisdictional authority and/or political clout) or through a law enforcement agency representative. As such, compared to the options depicted on an org chart through the earlier link, we would see this coordination or interaction depicted with a dotted line, indicating that authority is not necessarily inherent.

I think that the I/I function organized within the ICS structure is more likely to happen when a law enforcement agency has significant responsibility and authority on an incident, and even more likely if a law enforcement representative is the IC or is represented in a Unified Command. I also think that the size and capabilities of the law enforcement agency are a factor, as it may be their own organic I/I function that is performing within the incident. As such, it would make sense that a law enforcement agency such as NYPD, another large metropolitan law enforcement agency, or a state police agency leading or heavily influencing an ICS structure would be more likely to bring an integrated I/I function to that structure. Given this, it makes sense that representatives from NYPD would have initially developed these four possible organizational models and seemingly excluded the possibility of a detached I/I function, but we clearly have numerous use cases where these models are not being followed. I’ll also acknowledge that there may very well be occurrences where I/I isn’t integrated into the ICS structure but should be. This is a matter for policy and training to address when those gaps are identified.

I believe that NIMS doctrine needs to acknowledge that a detached I/I function is not just possible, but very likely to occur. Following this, I’d like to see the NIMS Intelligence/Investigation Function Guidance and Field Operations Guide updated to include this reality, along with operational guidance on how best to interact with a detached I/I function. Of course, to support implementation of doctrine, this would then require policies, plans, and procedures to be updated, and training provided to reflect these changes, with exercises to test and reinforce the concepts.

What interactions have you seen between an ICS or EOC structure and the I/I function? What successes and challenges have you seen from it?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Culture of Preparedness – a Lofty Goal

September is National Preparedness Month here in the US. As we soon head into October, it’s a good opportunity to reflect on what we’ve accomplished during the month, or even elsewhere in the year. While National Preparedness Month is an important occasion to mark and a reminder of how important it is to be prepared, over the past several years I’ve come to question our approaches to community preparedness. What are we doing that’s actually moving the needle of community preparedness in a positive direction? Flyers and presentations and preparedness kits aren’t doing it. While I can’t throw any particular numbers into the mix, I think most will agree that our return on investment is extremely low. Am I ready to throw all our efforts away and say it’s not making any difference at all? Of course not. Even one person walking away from a presentation and making changes within their household to become better prepared is important. But what impact are we having overall?

Culture of preparedness is a buzz phrase used quite a bit over the last number of years. What is a culture of preparedness? An AI assisted Google search tells me that a culture of preparedness is ‘a system that emphasizes the importance of preparing for and responding to disasters, and that everyone has a role to play in doing so.’ Most agree that we don’t have a great culture of preparedness across much of the US (and many other nations) and that we need to improve our culture of preparedness. But how?

People love to throw that phrase into the mix of a discussion, claiming that improving the culture of preparedness will solve a lot of issues. They may very well be correct, but it’s about as helpful as a doctor telling you that the tumor they found will be fine once a cure for cancer is discovered. Sure, the intent is good, but the statement isn’t helpful right now. We need to actually figure out HOW to improve our culture of preparedness. We also need to recognize that, in all likelihood, it will take more than one generation to actually realize the impacts of deliberate work toward improvement.

The time has come for us to stop talking about how our culture of preparedness needs improvement and to actually do something about it. There isn’t one particular answer or approach that will do this. Culture of preparedness is a whole community concept. We rightfully put a lot of time, effort, and money into ensuring that our responders (broad definition applied) are prepared, because they are the ones we rely on most. I’d say their culture of preparedness is decent (maybe a B-), but we can do a lot better. (If you think my assessment is off, please check out my annual reviews of the National Preparedness Report and let me know if you come to a different conclusion). There is much more to our community, however, than responders. Government administration, businesses, non-government organizations, and people themselves compose the majority of it, and unfortunately it is among these groups that our culture of preparedness has the largest gaps.

As with most of my posts, I don’t actually have a solution. But I know what we are doing isn’t getting us to where we want to be. I think the solution, though, lies in studying people, communities, and organizations and determining why they behave and feel the way they do, and identifying methodologies, sticks, and carrots that can help attain an improved culture of preparedness over time. We must also ensure that we consider all facets of our communities, inclusive of gender identity, race, culture, income, citizenship status, and more. We need people who know and study such things to help guide us. The followers of Thomas Drabek. The Kathleen Tierneys* of the world. Sociologists. Anthropologists. Psychologists. Organizational psychologists.  

A real, viable culture of preparedness is, at present, little more than a concept. We need to change our approach from using this as a buzz phrase at which everyone in the room nods their heads to a goal toward which we make a deliberate effort. A problem such as this is one where we can have a true union of academia and practice, with academics and researchers figuring out how to solve the problem and practitioners applying the solutions, with a feedback loop of continued study to identify and track the impacts made, showing not only the successes we (hopefully) attain, but also how we can continue to improve.

*Note: I don’t know Dr. Tierney personally and it is not my intent to throw her under the proverbial bus for such a project. I cite her because her writing on related topics is extremely insightful. I highly recommend Disasters: A Sociological Approach.

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

ICS Training Sucks – Progress Inhibited by Bias

It’s been a while since I’ve written directly toward my years-long rally against our current approach to Incident Command System (ICS) training. Some of these themes I’ve touched on in the past, but recent discussions on this and other topics have gotten the concept of our biases interfering with progress stuck in my head.

It is difficult for us, as humans, to move forward, to be truly progressive and innovative, when we are in a way contaminated by what we know about the current system we wish to improve. This knowledge brings with it an inherent bias – good, bad, or otherwise – which influences our vision, reasoning, and decisions. On the other hand, knowledge of the existing system gives us a foundation from which we can work, often with awareness of what does and does not work.

I’m sure there have been psychological studies done on such things. I’ve certainly thought about, in my continued rally against our current approach to ICS training, what that training could look like if we asked individuals who had never seen the current training to develop something new. Sure, the current training has a lot of valuable components, but overall it’s poorly designed, with decades of changes and updates still based upon a curriculum that was poorly developed – though with good intentions – so long ago.

In recent months I’ve had discussions with people about various things across emergency management that require improvement, from how we assess preparedness, to how we develop plans, to how we respond, and even to the entire US emergency management enterprise itself. In every one of these discussions, trying to imagine what a new system or methodology could look like, every one of the people involved (myself included) was infected by an inherent bias that stemmed from what is. Again, I’m left wondering: what would someone build if they had no prior knowledge of what currently exists?

Of course, what would be built wouldn’t be flawless. At some solutions, those of us in the know may even shake our heads, saying that certain things have already been tried and proven to fail (though perhaps under very different circumstances which may no longer be relevant). Some solutions, however, could be truly innovative.

The notion, perhaps, is a bit silly, as I’m not sure we could expect anyone to build, for example, a new ICS curriculum, without having subject matter expertise in ICS (either their own or through SMEs who would guide and advise on the curriculum). These SMEs, inevitably, would have taken ICS training somewhere along their journey.

All that said, I’m not sure it’s possible for us to eliminate our bias in many of these situations. Even the most visionary of people can’t shed that baggage. But we can certainly improve how we approach it. I think a significant strategy would be having a facilitator who is a champion of the goal and who understands the challenges, who can lead a group through the process. I’d also suggest having a real-time ‘red team’ (Contrarian?) element as part of the group, who can signal when the group is exercising too much bias brought forth from what they know of the current implementation.

In the example of reimagining ICS training, I’d suggest that the group not be permitted to even access the current curriculum during this effort. They should also start from the beginning of the instructional design process, identifying needs and developing training objectives from scratch, rather than recycling or even referencing the current curriculum. The objectives really need to answer the question: ‘What do we want participants to know or do at the completion of the course?’ Levels of training are certainly a given, but perhaps we need to reframe to what is used elsewhere in public safety, such as the OSHA 1910.120 standard, which uses the levels of Awareness, Operations, Technician, and Command, or the DHS model, which uses Awareness, Performance, and Management & Planning. We need to further eliminate other bias we bring with us, such as the concept of each level of training consisting of only one course. Perhaps multiple courses are required to accomplish what is needed at each level? I don’t have the answers to any of these questions, but all of these things, and more, should be considered in any real discussion about a new and improved curriculum.

Of course, any discussions on new and improved ICS curriculum need to begin at the policy level, approving the funding and the effort and reinforcing the goal of having a curriculum that better serves our response efforts.

How would you limit the influence of bias in innovation?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Mixing Exercise Types

As with many things, we are taught exercises in a rather siloed fashion. First by category: discussion-based and operations-based. Then by type. That kind of compartmentalization is generally a necessity in adult education methodology. Individually, each exercise type has its own pros and cons. Rarely, however, do we see or hear of combining exercise types within one initiative.

The first time I did this was several years ago. My company was designing a series of functional exercises to be used at locations around the country. While the exercises were focused on response, one goal of our client was to include some aspects of recovery. At about six hours, the exercises weren’t long. Time jumps can be awkward, and for the small amount of time dedicated to recovery, the disruption of a time jump within the exercise may not net a positive result. Add to that the time it would take to provide the quantity of new information needed to make a recovery-oriented functional exercise component viable.

Instead of trying to shoe-horn this in, we opted to stop the functional component of the exercise at an established time and introduce a discussion on disaster recovery. With the proper introduction and just a bit of information to provide context in addition to what they had already been working on, the discussion went smoothly and accomplished everything with which we were charged. The participants were also able to draw on information and actions from the response-focused functional component of the exercise.

We’ve recently developed another exercise that begins with a tabletop exercise to establish context and premise, then splits the participants into two groups which are each challenged with some operations-based activity: one deploying to a COOP location to test functionality (a drill), the other charged with developing plans to address the evolving implications of the initial incident (a functional exercise). Following the operations-based exercises, the two groups will reconvene to debrief on their activities and lessons learned before going into a hotwash.

Making this happen is easy enough. Obviously we need to ensure that objectives align with the expected activities. You also want to make sure that the dual exercise modalities are appropriate for the same participants. While I try not to get hung up on the nuances of documentation, documentation is important, especially when it comes to grant compliance and ensuring that everyone understands the structure and expectations of the exercise. If we are mixing a discussion-based exercise and an operations-based exercise, one of the biggest questions is likely what foundational document to use – a Situation Manual (SitMan) or an Exercise Plan (ExPlan). Generally, since the operations-based exercises can have greater consequences regarding safety and miscommunication, I’d suggest defaulting to an ExPlan, though be sure to include information that addresses the needs of the discussion-based component in your ExPlan as well as the player briefing.

In running the exercise, be sure to have a clear transition from one exercise type to the other, especially if there are multiple locations and/or players are spread out. Players should be given information in the player briefing that prepares them for the transition. It is obviously important to have exercise staff (controllers/facilitators and evaluators) properly prepared as well, clearly communicating expectations at the C/E briefing and in C/E documentation and ensuring they are ready for the transition.

I’d love to hear other success stories from those who may have done something similar.

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Preparing for Community Lifelines Implementation

In all great ideas, the devil, as they say, is in the details. Implementing new concepts often requires preparations to ensure that the implementation goes smoothly. We often rush to implementation, perhaps excited for the results, perhaps not thinking through the details. Without proper preparation, that implementation can fail miserably. Integrating and implementing the Community Lifelines is no exception.

Just like everything else we do in preparedness, we should turn to the capability elements of planning, organizing, equipping, training, and exercises (POETE) to guide our preparedness for Community Lifeline implementation.

Planning and Organizing

I’m coupling these two capability elements together as they so strongly go hand-in-hand. Determining how you want to use Community Lifelines is an important early step. I’d suggest developing a Community Lifeline Implementation Plan for your jurisdiction that not only identifies how you will use them in response and recovery operations, but also details how their use fits within your response and recovery management structure, how information will flow, who is responsible for what, how information is reported, and to whom it is reported. The Implementation Plan should also outline the preparedness steps needed and how and where information will be catalogued.

I’ve seen several Community Lifeline integrations across local, county, and state jurisdictions, these mostly being visual status displays, but there can be some complexity in how we even get to that display.

We all know from CPG-101 that forming a planning team is the first step of emergency planning. While forming a team is not itself really the capability element of Organizing, the stakeholders assembled for this effort will extend across all capability elements and into response and recovery operations.

Before identifying stakeholders, we need to examine each Community Lifeline down to the sub-component levels, which first necessitates determining which components and sub-components are applicable to your jurisdiction. For example, within the Transportation Community Lifeline, if your jurisdiction has no Aviation resources or infrastructure, you may choose to not include that component.

Once you have made the determination as to which components and sub-components of each Community Lifeline will be included, it’s now time to form your planning teams for each. Depending on the size of your jurisdiction, you could form teams at the Community Lifeline level, the component level, or the sub-component level. You could even use different approaches for each (e.g., the Water Systems Community Lifeline may only involve a few stakeholders to address all components and sub-components, whereas Health and Medical may require distinct teams for each component). Since much of the Community Lifelines construct is centered on or strongly relates to critical infrastructure, many of our stakeholders will be from the private sector. Hopefully these are partners you have engaged with before, but if not, this is a great opportunity to do so.

In meeting with each of these stakeholders/stakeholder groups, providing them with an orientation to the Community Lifelines concept will be important. Be sure to talk about this within the contexts of whole-community preparedness, public-private partnerships, critical infrastructure, and the five mission areas. This should include the expectation for these to be long-term working groups that will provide information updates before, during, and after a disaster. It will be important to obtain from each the following information (at minimum) for each function and/or facility:

  • Legal owners and operators
  • Primary and alternate points of contact (and contact info for each) (Note that these should be emergency/24 hour contacts)
  • Existing emergency plans
  • Protection activities
  • Prevention activities
  • Mitigation activities
  • Preparedness activities
  • Response and recovery priorities
  • Critical continuity and supply chain issues
  • Sensitive information concerns

Processes will need to be mapped to identify how information will be obtained in an incident from the owners/operators of each facility or function, what information will be expected, in what format, and how often. Internal (EOC) procedures should identify how this information will be received, organized, and reported, and how it will influence operational priorities for response and recovery. Since the visual representation of the Community Lifelines is the face of the system, you should also determine the benchmarks within each Community Lifeline, component, and sub-component for differentiating between status levels (i.e., what failures will bring status from green to yellow, and from yellow to red) and how the status of one may influence the status of others.
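To make this a bit more concrete, here is a minimal, hypothetical sketch (in Python) of how a jurisdiction might catalogue lifeline components, their emergency contacts, and the benchmarks that drive status colors, then roll the worst component status up to the lifeline level. The class names, fields, and thresholds are illustrative assumptions on my part – they are not drawn from FEMA guidance or from any particular system such as CLSS.

    # Hypothetical sketch only – names, fields, and thresholds are illustrative,
    # not taken from FEMA guidance or any specific tool such as CLSS.
    from dataclasses import dataclass, field

    @dataclass
    class Component:
        name: str
        owner: str                 # legal owner/operator
        emergency_contact: str     # 24-hour point of contact
        benchmarks: dict = field(default_factory=dict)  # e.g. {"yellow": "...", "red": "..."}
        status: str = "green"      # green / yellow / red

    @dataclass
    class Lifeline:
        name: str
        components: list = field(default_factory=list)

        def overall_status(self) -> str:
            """Roll the worst component status up to the Community Lifeline level."""
            order = {"green": 0, "yellow": 1, "red": 2}
            return max((c.status for c in self.components),
                       key=lambda s: order.get(s, 0), default="green")

    # Example: a simplified Transportation lifeline with no Aviation component.
    transportation = Lifeline(
        name="Transportation",
        components=[
            Component(
                name="Highway/Roadway/Motor Vehicle",
                owner="County DOT",
                emergency_contact="Duty officer, 24/7 phone line",
                benchmarks={"yellow": "One arterial route impassable",
                            "red": "No passable route into the affected area"},
            ),
        ],
    )

    transportation.components[0].status = "yellow"
    print(transportation.overall_status())   # -> "yellow"

Whether this information lives in a spreadsheet, a GIS layer, or a purpose-built tool matters far less than agreeing on the fields and the benchmarks ahead of time.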

Equipment (and Systems)

It is important to catalogue the information you obtain from preparedness activities as well as in implementation. Consider GIS integrations, as there is an abundance of information that involves geolocation. I’ll make a special shout out here to the Community Lifeline Status System (CLSS) project, which is funded by the DHS Science and Technology (S&T) Directorate and is being developed under contract by G&H International. When rolled out, the CLSS will be available at no cost to every jurisdiction in the US to support Community Lifeline integration. Having been fortunate enough to get a private in-depth tour of the system, I’m thoroughly impressed. The CLSS is based on ArcGIS and provides a lot of customizable space to store all this preparedness information.

Using a system such as CLSS to display and share Community Lifelines information is also a benefit. While most displays I’ve seen simply show the icon and status color for each Community Lifeline, an interactive dashboard type of system can help provide additional context and important information. This is something CLSS also provides.

Training and Exercises

As with any new plans or processes, training is an important part of supporting implementation. Training audiences will include:

  • EOC personnel
  • Owners/Operators of Community Lifelines infrastructure
  • Officials who will receive Community Lifelines information

Each of these audiences should receive training tailored to address their specific needs.

Similarly, exercises should purposely test these processes, and the use of Community Lifelines should be incorporated into exercises often, with Community Lifelines status and information used to inform and support decision making.

///

If you already know the benefits of the Community Lifelines, hopefully you see the advantages of adequate preparedness to get the most out of them. The stakeholder groups you assemble to support planning should be everlasting, as information on their vulnerabilities, capabilities, and activities is likely to change over time. Beyond direct Community Lifeline applications, these are all great partners for a variety of emergency management activities to support the whole community. The preparedness efforts, and the maintenance thereof (sorry, but it’s not just a one-time thing), are a significant investment (and could likely be a full-time job for even a moderately sized jurisdiction), but they should pay incredible dividends over and over again.

Are you using Community Lifelines? What have you learned about the need to prepare for their use?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

CDC Forgot About Planning

In late February, CDC released the highly anticipated notice of funding opportunity (NOFO) for the 2024-2028 Public Health Emergency Preparedness (PHEP) grant. The general concept of the grant wasn’t a big surprise, as they had been promoting a move to their Response Readiness Framework (RRF). The timing of the new five-year grant cycle seems ideal to implement lessons learned from COVID-19, yet they are falling short.

I’ve reflected in the past on the preparedness capability elements of Planning, Organizing, Equipment/Systems, Training, and Exercises (POETE). I also often add Assessing to the front of that (APOETE). These preparedness elements are essentially the buckets of activity through which we categorize our preparedness activities.

In reviewing the ten program priorities of the RRF, I’m initially encouraged by the first priority: Prioritize a risk-based approach to all-hazards planning. Activity-wise, what this translates to in the NOFO is conducting a risk assessment. Solid start. Yet nowhere else is planning overtly mentioned. Within the NOFO some of the other priorities reflect on ensuring certain things are addressed in plans, such as health equity, but there is otherwise no direct push for planning. Buried within the NOFO (page 62) is a list of plans that must be shared with project officers upon request (under the larger heading of Administrative and Federal Requirements) but the development of any of these plans does not correlate to any priorities, strategies, or activities within the document.

As for the rest of APOETE, there is good direction on Organizing, Equipment and Systems, Training, and Exercises. While that’s all great, planning is the true foundation of preparedness, and it is so obviously left out of this NOFO. Along with my general opinion that most emergency plans (across all sectors) are garbage, the vast majority of findings from the numerous COVID-19 after-action reports I’ve written (which included two states and several county and local governments) noted the significant need for improved emergency plans. Further, the other preparedness elements (OETE) should all relate back to our plans. If we aren’t developing, improving, and updating plans, then the other activities will generally lack focus, direction, and relevance.

Understanding that this is the first year of a five-year grant cycle, some changes and clarification will occur as the cycle progresses, but as planning is a foundational activity, it should be immediately and directly tied to the results of the assessment this year’s grant calls for. Otherwise, the findings of the assessments are generally meaningless if we aren’t taking action and developing plans to address them. This is leaving us with a significant gap in preparedness. Someone at CDC didn’t think this through and it leaves me with a great deal of concern, especially in the aftermath of the COVID-19 response.

What are your thoughts on this?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Properly Applying ICS in Function-Specific Plans

As with many of my posts, I begin with an observation of something that frustrates me. Through much of my career, as I review function-specific plans (e.g., shelter plans, point of distribution plans, debris management plans, mass fatality incident management plans), I see a lot of organization charts inserted into those plans. Almost always, the org chart is an application of a ‘full’ incident command system (ICS) org chart (Command, Command Staff, General Staff, and many subordinate positions). This is obviously suitable for a foundational emergency operations plan (EOP), an emergency operations center (EOC) plan, or something else that is very comprehensive in nature where this size and scope of an organization would be used, but function-specific plans are not that. This, to me, is yet another example of a misinterpretation, misunderstanding, and/or misuse of the principles of the National Incident Management System (NIMS) and ICS.

Yes, we fundamentally have a mandate to use ICS, which is also an effective practice, but not every function and facility we activate within our response and recovery operations requires a full organization or an incident management team to run. The majority of applications of a function-specific plan are within a greater response (such as activating a commodity POD during a storm response). As such, the EOP should have already been activated and there should already be an ‘umbrella’ incident management organization (e.g., ICS) in place – which means you are (hopefully) using ICS. Duplicating the organization within every function isn’t necessary. If we truly built out organizations according to every well-intentioned (but misguided) plan, we would need several incident management teams just to run a Type 3 incident. This isn’t realistic, practical, or appropriate.

Most function-specific plans, when activated, would be organized within the Operations Section of an ICS organization. There is a person in charge of that function – depending on the level of the organization in which they are placed and what the function is, there is plenty of room for discussion on what their title would be, but I do know that it absolutely is NOT Incident Commander. There is already one of those and the person running a POD doesn’t get to be it. As for ‘command staff’ positions, if there is really a need for safety or public information activity (I’m not even going to talk about liaison) at these levels, these would be assistants, as there is (should be) already a Safety Officer or PIO as a member of the actual Command Staff. Those working within these capacities at the functional level should be coordinating with the principal Command Staff personnel. As for the ‘general staff’ positions within these functions, there is no need for an Operations Section as what’s being done (again, most of the time that’s where these functions are organized) IS operations. Planning and Logistics are centralized within the ICS structure for several reasons, the most significant being an avoidance of duplication of effort. Yes, for all you ICS nerds (like me) there is an application of branch level planning (done that) and/or branch level logistics that can certainly be necessary for VERY complex functional operations, but this is really an exception and not the rule – and these MUST interface with the principal General Staff personnel. As for Finance, there are similarly many reasons for this to be centralized within the primary ICS organization, which is where it should be.

We need to have flexibility balanced with practicality in our organizations. We also need to understand that personnel (especially those trained to serve in certain positions) are finite, so it is not feasible to duplicate an ICS structure for every operational function, nor is it appropriate. The focus should be on what the actual function does and how it should organize to best facilitate that. My suggestion is that if you are writing a plan, unless you REALLY understand ICS (and I don’t mean that you’ve just taken some courses), find someone who (hopefully) does and have a conversation with them. Talk through what you are trying to accomplish with your plan and your organization; everything must have a purpose so ask ‘why?’ and question duplication of effort. This is another reason why planning is a team sport and it’s important to bring the right people onto the team.

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

First Responders as Emergency Managers

I continue to see concern with first responders entering the field of emergency management, and with good reason. Of course, this does not apply to everyone who has made this transition. I’ve seen some incredible emergency managers who have first responder roots, and obviously we are all part of the public safety family with some tangible connections, but these really are two very different fields with very different skillsets. Throughout my career, and as I continue to work with emergency managers across the country, I regularly come across emergency managers who were, or in some cases still are, first responders. Those roots are very apparent in many of them, and while some traits can be beneficial, others can very much be detrimental.

Several weeks ago, someone posted a question on LinkedIn asking why there was still such a gap between public health and emergency management. While any relationship requires work by both parties, I think the strain in this relationship in most places lies strongly with emergency managers, with much of it due to emergency managers who are poorly suited for the position.

The knowledge, skills, abilities, and attitudes of first responders and emergency managers don’t organically have as much overlap as many people seem to assume. One of the most common words in job descriptions for emergency management positions is ‘coordination’. Coordination is a soft skill. A people skill. It takes knowledge and awareness of who the other parties are, what they do, and what their priorities are. It requires skills in communication and negotiation, and the ability to connect. Perhaps most importantly, though, it requires a proper attitude: one that is open and not standoffish, indifferent, or otherwise off-putting to others.

There is an assumption within government administrations, perpetuated within the field of emergency management, that there is a direct portability between being a first responder and being an emergency manager, which is why so many first responders continue to be hired into emergency management jobs. I recently ran a tabletop exercise with two emergency managers in the room, one who was still active in the fire service, the other who was still active in law enforcement. When they made statements, the body language of others in the room immediately changed. They were unnecessarily aggressive in their demeanor and shut down conversation rather than encouraging the exchange of ideas, which was the purpose of the exercise. Unfortunately, this is a common personality trait among first responders. Along with the people skills needed, emergency management is a very administrative field. I also recently met with a first responder turned emergency manager in a debrief about incident management. He directly expressed his disdain for the bureaucracy of incident management practices and for any measure of documentation. He’s a doer. I don’t knock his perspective, but his attitude isn’t aligned with the needs of emergency management.

Certainly, some first responders do have the acumen of emergency managers. They can see things big picture. They are able to step away from tactics yet benefit by their knowledge of tactics. They have the personality and people skills necessary to facilitate coordination, not just with first responder agencies, but with others. They seek knowledge and training beyond their first responder backgrounds, recognizing that they need to know more beyond response and beyond the discipline from which they come. These are the ones who will be more prone to success, for themselves, their agencies, and their jurisdictions.

I was a first responder for ten years, including time as a chief officer, much of it prior to entering emergency management. While I’d like to think I was reasonably positioned to become an emergency manager when I did, I’m aware now that I certainly had some of these flaws early in my emergency management career that kept me from being the best emergency manager I could be at the time. Fortunately, I had great colleagues and mentors who helped guide me. I also recognized some bad examples early on and saw how their interactions with others, especially those who weren’t first responders, were not as engaging or positive as they should have been.

Can first responders become successful emergency managers? Absolutely! But being a first responder, in most cases, should not be a prerequisite for emergency management positions. Also, hiring someone is not just about what’s on their resume. If a job requires people skills and interagency coordination, that should be a big part of the interview process. First responder or not, arms crossed with short responses or an aggressive attitude is not a good indicator of someone being a people person. As emergency managers, we are responsible for our own profession. We need to make the change from within and work with those on the outside (administrations, human resources departments, etc.) to ensure that the field continues to grow in a positive direction, ensuring success for the field of emergency management as well as the people who are brought into it.

What are your thoughts on this topic?

©2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Developing Incident After-Action Reports

Incident and event after-action reports (AARs) are extremely important for identifying the successes and challenges we faced in our efforts. Just like our evaluation efforts in exercises, many valuable lessons can be learned and effective practices identified from incidents and events. Yet for as much as incident and event AARs are encouraged, there are often problems with how they are developed.

While the quality of exercise after-action reports is often not up to par, a defined process of exercise evaluation, along with a suggested AAR format, has been available to us and ingrained in emergency management practice for a long time via the Homeland Security Exercise and Evaluation Program (HSEEP). While some concepts of exercise evaluation can be utilized for incident and event evaluation, we need a very different approach to be most effective.

FEMA has been promoting a Continuous Improvement concept for emergency management for several years. Incident and event evaluation is part of continuous improvement, though continuous improvement is intended to permeate much more of our daily and incident operations. While FEMA’s program has some good information that applies to incident and event evaluation, there are some important things I feel are missing.

Perhaps the most significant difference in our approach to incident and event evaluation vs exercise evaluation is the evaluation focus. Exercises, right from our very first steps of design, are designed explicitly for evaluation. The identification of capabilities and exercise objectives gives direction to our design and directly informs our evaluation of the exercise. Essentially, the intent and focus of evaluation is baked in from the start. For incidents and events, however, it is not.

Because evaluation is not a primary intent of incidents and events, we generally need to determine our evaluation strategy afterwards. The development of our evaluation strategy absolutely must begin with the identification of what we want to evaluate. This is a critical element not included in FEMA’s Continuous Improvement guidance. Without determining the focus of the evaluation, the discovery process lacks direction and will likely explore areas of incident/event operations that are of lower priority to stakeholders. Determining what the evaluation effort will focus on can be considered similar to developing objectives, and as such should be specific enough to give proper direction to the evaluation effort. For example, having done numerous COVID-19 AARs, it’s not enough to say that we will evaluate ‘vaccination’. Vaccination is a very broad activity, so we should determine specific aspects of vaccination to focus on, such as equity of distribution or vaccine point of dispensing (POD) operations. Obviously multiple focus areas can be identified based upon what is most important to stakeholders. And no, incident objectives should not serve as your focal points. These are operational objectives that have nothing to do with evaluation, though your evaluation itself will likely take the incident objectives (and associated actions) into consideration.

FEMA’s Continuous Improvement guidance provides a lot of great insight for the discovery process. The most common tools I use are focus groups, interviews, document reviews, and surveys. Focus groups and interviews allow people to tell their experiences from their perspectives. These offer a lot of insight and include facts as well as opinions, both of which are valid in the AAR process as long as they are handled properly; discerning between the two is important.

Document reviews are also important. Typically I look at documents developed before the incident (mostly plans) and those developed during the incident (such as press releases, incident action plans, situation reports, and operational plans). While documents developed during the incident typically tell me what was done or what was intended to be done, the documents developed prior to the incident typically provide me with a standard from which to work.

There are a couple of important caveats with this:

1) Many plans are operationally inadequate, so they may not have been able to be followed.

2) Many organizations don’t reference their plans, regardless of quality.

As such, a big part of my document review is also determining the quality of the documents and if they were referenced during the incident or event. It may very well be that the actions taken were better than what was in the plans.

Surveys… there is so much to say about surveys that it probably deserves its own blog post. Surveys can be great tools, but most people tend to design poor surveys. They should be succinct and to the point. You will want to ask a lot of questions, but resist the urge to do so. The more questions you ask, the lower the rate of return on surveys. So focus on a few questions that will give you great data.

We then go to writing, which involves the organization of our information, formation of key observations (by focus area), a narrative analysis for each observation, and development of one or more recommendations for each observation. The analysis is an aspect where many AARs, including those for exercises, miss the mark. The analysis needs to contextualize the observation and justify the recommendations. It should provide sufficient detail for someone not knowledgeable in that observation (or of the incident) to have a reasonable understanding of the associated issues. Remember that an AAR may be referenced for years to come and can also be used to support budgets, grant applications, and obviously the corrective actions that are identified. A good analysis is necessary and should certainly be more than a couple of sentences. Be sure to identify strengths and effective practices, not just lessons learned and challenges.

I do not advocate using the HSEEP AAR template for incident and event evaluations. Beyond an awkward fit for some of the ‘fill-in-the-box’ information, the overall structure is not supportive of what an incident or event AAR needs to include. I suggest writing the AAR like a professional report. I’d include an executive summary, table of contents, research methodology, observations/analysis/recommendations, an incident or event timeline, and summary of recommendations (I do still like to use the traditional HSEEP improvement plan matrix for this). I tend to have a lot of citations throughout the document (typically I footnote these). Citations can include standards, such as NIMS, references (plans), media articles, and more.

A couple of notes: 1 – When planning our management of an event, we can be more proactive in evaluation by including it as a deliberate component of our efforts. 2 – Incident evaluation can begin during the incident by tasking an incident evaluator.

Incident and event evaluation can be daunting to approach. It requires endorsement from the highest levels to ensure cooperation and access to information. Honesty is what is needed, not sugar coating. Far too many AARs I’ve seen for exercises, incidents, and events are very soft and empty. Remember that we aren’t evaluating people; rather, we are evaluating plans, processes, systems, and decisions. The final AAR should be shared with stakeholders so they can learn and apply corrective actions that may be relevant to them. Given most state public information laws, the AAR may need to be made available to the public, which is more reason to ensure that it is professionally written and that observations have quality analysis, as members of the public may require context. I’ve also seen many elected and appointed officials (and legal counsels) be reluctant to have written reports, or written reports with much detail, because of freedom of information laws. While I understand that accountability and transparency can create challenges, we must remember that government works on behalf of the people, and the acknowledgement of mistakes and shortcomings (as well as successes) is important to continuous improvement of the services we provide.

What is your approach with incident and event AARs? Where do you see that we need to improve this important process?

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®