Revisiting POETE

One of my most popular posts has been my original post on POETE from July 2014. In the 11+ years that have passed since that post, I have continued leveraging the concept in every way I can. If you’ve not heard of the concept, I urge you to click the link above and read my original post. Briefly, POETE stands for Planning, Organizing, Equipping, Training, and Exercises. These are collectively known as the elements of preparedness. POETE is more than a checklist to me. It’s a strategic lens for realistically building and sustaining capabilities. Whether you are building a new emergency operations plan, launching a new public health preparedness initiative, or refining multiagency coordination activities, POETE offers a structured way to think through what it takes to ensure these endeavors are implementation-ready.

While I’ve written on these elements in the past, my continued and diverse application of POETE has broadened my perspective, so here are some fresh thoughts.

Assessing – On occasion, I throw an A in front of the acronym for Assessing. While assessments are an early activity of Planning, there are also plenty of stand-alone assessment activities which should be considered in their own right. Assessments can and should inform everything else we do in preparedness. Good assessments can provide us with justification for certain activities and investments and can often give us a data-driven approach. Along with many of the risk assessments common across emergency management, like the Threat and Hazard Identification and Risk Assessment (THIRA), I’d also suggest that (well-written) after-action reports (AARs) can do the job. A well-developed AAR for an incident, event, or exercise can provide objective analysis of observed activities or discussions. When writing an AAR, we should always keep in mind that part of achieving the goal of improvement may involve requests and justifications for funding.

Planning – I’ve written a lot on the topic of emergency planning through the years. Overall, my take on most emergency plans is that they suck. Horribly. They aren’t worth the time, money, or effort invested in writing them. So many people go about it wrong. A true plan needs to be a blueprint for action. Good plans are operationally focused, guiding decisions and actions. They should not just be theory and policy, as so many are. At best, I’d call something like that a framework, but it’s certainly not a plan.

Organizing – Organizing is largely about structure, roles, and responsibilities, but you can’t even get there without first building relationships and partnerships. Everything we do in emergency management is about relationships. It’s about knowing who has the thing you need – be it a physical resource, specialized knowledge, or specific authority. Last week I wrote a new piece on Community Lifelines. The central activity of doing anything with Community Lifelines is building relationships. Once those relationships are in place, then other activities will follow.

Equipping – I’ve always been very big on tools matching the mission. Equipment in this context means any and all resources available to us. The key aspect of this is alignment. Do the tools we use match up to our threats, our people, and our procedures? While it’s understandable to have to update procedures to match a new resource, we should be very cautious about the resource dictating procedure. Our resources need to work for us, not the other way around.

Training – I feel like we have been gradually moving away from compliance being the center of the training universe. Yes, there is still plenty of training that is required for various purposes – there should be and there will always be. But I’ve been getting more requests from clients to develop custom training because they realize that little to no training exists to meet their needs. More people are realizing, for example, that ICS training is absolutely not the right fit for EOC staff. Similarly, they are realizing that existing EOC training might begin to approach their needs, but the implementation of their specific EOC model really requires customized training. Overall, training needs to be role-based. We need to train people in what we want them to do. We need to give them the knowledge to succeed, not just generalized training for a broad group in the hope that people will be able to ascertain what pertains to them and what does not. We also need to realize that, since most training in emergency management is response-oriented, the things people are being trained to do are things they don’t do often and/or don’t do under pressure. So frequency of training and job aids are essential to their success.

Exercises – The thing I do the highest volume of. Luckily, I love to do them! Exercises are about testing our plans and capabilities before they are tested for real. Pay attention to good exercise design and never forget that the end product is a worthwhile AAR. I still see so many softball AARs out there – AARs that pat people on the back for a job well done while only acknowledging the superficial opportunities to improve, oftentimes because the authors don’t want to hurt anyone’s feelings. I don’t ever write an AAR for the purpose of offending anyone, but if we don’t expose what doesn’t work, the chances of it ever being addressed are so much lower than if we had documented it.

While we have the acronym of (A)POETE, it’s important to keep in mind that it’s not intended to be a linear process. It’s iterative and constantly in need of attention. Each component is informed by the others. While I generally believe that Planning is still the foundation of preparedness and it should heavily influence all other elements, those other elements can still influence Planning. POETE activities should be used to build our capabilities. These activities help us prepare with purpose, focus, and intent.

© 2025 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Replacing ESFs with Community Lifelines

I’ve written previously about my concerns with using Emergency Support Functions in many state, local, territorial, and tribal (SLTT) emergency operations centers. The ESF structure was never intended for SLTT use, and while it may have some successes with the largest of the states and metropolitan areas, it’s generally not a good fit.

I’ve also written previously about Community Lifelines and the benefits thereof. Consider, however, that Community Lifelines, while designed originally for the organization of information by FEMA regional offices when they monitor a disaster, can have much broader applicability. We can and should be using Community Lifelines across every phase / mission area of Emergency Management.

Lately I’ve been having more and more conversations about Community Lifelines with clients, at conferences, and with others who are interested in learning more about them and how to use them. Across emergency management we often find or are provided with approaches to problems that are single-use. We should regularly explore opportunities to expand those single-use applications, increasing the utility of the concept at hand. Given the shortcomings of ESFs for most jurisdictions and the much broader applicability of Community Lifelines for every jurisdiction under the sun, I suggest that Community Lifelines can not only be operationalized as a viable replacement for ESFs, they can do so much more. Here are my arguments in support of replacing ESFs with a Community Lifelines-driven organization in SLTT EOCs as well as emergency management programs as a whole:

  1. Community Lifelines are community-focused and more comprehensively address the needs of a community, whereas ESFs are driven by functions which may have limited capabilities or capacities in any given jurisdiction.
  2. Community Lifelines can be operationalized just like ESFs, with primary and support agencies and organizations.
  3. Community Lifelines are focused on stabilizing critical services with built-in mechanisms for assessing impacts and establishing priorities.
  4. Community Lifelines more directly support the inclusion of the private sector, along with government, NGOs, and quasi-government owners/operators.
  5. Community Lifelines support better preparedness and resilience initiatives.
  6. Community Lifelines provide us with a basis for measuring progress across all phases or mission areas. The only thing we can measure in an ESF is what we might have available to leverage in a response.
  7. Community Lifelines connect resilience, response, and recovery by serving as a common focal point. While the National Response Framework and National Disaster Recovery Framework still have national relevance, the transition from ESFs to Recovery Support Functions (RSFs) is challenging at best.
  8. The inclusion of Community Lifelines in our EOC structure is easy and agnostic to the organizational model used in the EOC. ESFs include functions that are part of the typical overhead management of an EOC, such as ESF 5 (Information and Planning), ESF 7 (Logistics), and ESF 15 (External Affairs), which is an awkward integration.
  9. Community Lifelines lend themselves to better partnerships and preparedness. The ESF plans of most jurisdictions are truly little more than a general scope of the ESF with a list of participating agencies and organizations.

We need to change our mindset of emergency management being centered on response. Yes, response is the big shiny thing. It’s the thing we practice for and anticipate. A more holistic and comprehensive approach is available to us, however, by using Community Lifelines as the foundation of our work. I suggest that jurisdictions develop Community Lifeline Implementation Plans, which are fundamentally strategic plans identifying how Community Lifelines can be used in Prevention/Protection/Mitigation, Preparedness, Response, and Recovery. Consider how the relationships forged with the owners/operators of Community Lifeline partners can support each of those phases and activities: increasing the resiliency of our community as a whole by making each partner more resilient, and understanding and preparing for the response and recovery needs of our community through the collective effort of Community Lifeline partners.

Emergency management is more than response. It is a comprehensive effort to support our communities before, during, and after disaster.


Generative AI in EM

We’ve recently seen a significant increase in discussion about applications of artificial intelligence (AI) (generative AI, to be more specific) in the emergency management (EM) space. In just a few minutes of scrolling through my LinkedIn feed this morning I’ve come across four user posts and one user-posted article expressing caution and concern about the use of AI in a full range of EM-related work, and one post extolling the advantages of using AI in certain EM applications. These posts expressing caution included topics such as being disingenuous in our work, inaccuracies of AI products, and accountability for use. The post in favor indicated ease of use and efficiency as principal advantages.

AI can certainly be a tool with many applications to support aspects of our job. It can shortcut a lot of activities for us, saving huge amounts of time. It can generate ideas or create an outline to work from. But it cannot reliably and completely replace what we do. I see it as a complementary tool – one that still requires human input, intervention, and review to be successful. In examining the pros and cons, we can’t just look at it superficially, though. There are concerns of information security, intellectual property, inaccuracies, environmental impact, and ethical accountability to consider.

There are concerns about where generative AI platforms source their data. In essence, it can be seen as a type of crowdsourcing, pulling data from across the internet, similar to how we might when doing research. However, generative AI does not often cite its sources and has been accused of plagiarism by many writers and artists. I’ve actually run a few tests of my own, asking a generative AI tool to write about certain topics that are very niche, with myself being one of the few people writing on those topics. While it did cite me as a source on a couple of occasions, it typically did not, though there were clearly word-for-word phrases sourced from my own writing. Additionally, generative AI is not skilled in discerning truth from misinformation or disinformation, potentially leading to significant inaccuracies. On the flip side, anything you input into a public generative AI platform, such as an emergency plan, is now part of that AI’s dataset, bringing potential security concerns to the discussion.

What has me even further concerned is the cognitive impact on those who habitually use AI to do much of their work. MIT did a study which concluded that overuse of AI harms critical thinking. Microsoft partnered with Carnegie Mellon for a study that came to similar conclusions. We should also be aware of the environmental impacts of AI data farms (and here is another article for your reference), which are a significantly growing concern around the world.

In regard to the impacts on critical thinking, I have severe concerns. We need to raise the bar of emergency manager knowledge, skills, and abilities (KSAs), not just as a matter of advancing the profession, but because of some serious gaps we’ve recently seen identified in after-action reports (AARs), media statements, social media posts, and other releases that demonstrate a huge lack of understanding of key concepts among emergency managers. While the use of generative AI may help support the work involved in various projects, I would argue that it is not promoting or advancing individual KSAs in the field of emergency management (aside from those needed to interface with AI). If unplugging an emergency manager from AI tools results in us no longer having a knowledgeable, skilled, and able emergency manager, we have a major problem.

I say all this not ignorant of the fact that I have friends and colleagues who use generative AI to help them develop content, such as their LinkedIn posts. Largely, these individuals have been transparent about their use of generative AI, indicating that they use it either up front to help provide structure to an idea, which they then use as a framework to flesh out on their own, or at the end of their own creative process to tighten up their work. Overall, I don’t see much detriment in these approaches and uses, and have even acknowledged that my college students may be using it in these ways, providing them some guidance that supports successful use while helping them ensure accuracy and avoid any appearance of plagiarism. It’s when people habitually use generative AI to pass off work as their own with little to no human input that I have concern. I also have friends and colleagues working with much broader applications of AI, for which I have concerns. While my concerns aren’t necessarily opposition, as I clearly see the benefits of these tools and uses for what we do in EM, I still see a lot of potential for eroding KSAs and critical thinking in our field, which is something we cannot afford. Yet I remain cautiously optimistic of a net gain.

For those who choose to use it to generate content and outputs, be ethical and transparent about it. There is no shame in using AI; just cite it as you would any other source (because it’s not your work), and obviously be aware of the pros and cons of using generative AI. Generative AI is still a developing technology – a toddler, perhaps, in terms of relative growth – and I think even proponents should be skeptical, as skepticism can help address many of these concerns. Consider that toddlers can be fun, but they can cause absolute chaos and can’t be left unattended for even a moment.

I’m reminded of a saying with its roots in project management that goes something like this: You can have it cheap, fast, or good, but you can only pick two. Here’s what the options look like:

  • Fast and good won’t be cheap.
  • Good and cheap won’t be fast.
  • Fast and cheap won’t be good.

It seems to me that most people using generative AI are trying to pick the ‘fast and cheap’ option, which brings about the majority of the quality and integrity concerns in this article. But when we look beyond the superficial, into things like the environmental impacts of AI data farms and the cognitive impacts on high-volume users, the end result certainly isn’t cheap, no matter which options we pick.


Why Aren’t We Asking Questions and Demanding Answers?

Having been on the road practically every week this year has made blogging a bit of a challenge. I’ve had some engagement on LinkedIn, often in discourse on the issues facing disaster and emergency management brought on largely by the current administration.

You’ve probably heard of the concept of mushroom management, right? Keep people in the dark and feed them shit. That’s what we’re getting regarding the future of FEMA, which makes it extremely difficult for practitioners to chart a course ahead for this field. The only information we receive is poorly communicated intent and obfuscation. Those who are supposed to be representing the profession either aren’t asking the right questions or are being ignored.

Let’s take a look at this horrible bit of journalism from earlier in the month (I’ll note that most news outlets have been using the same quotes and generally haven’t brought us anything new either).

I’ll break this down a bit. First, the headline: Trump and Noem detail planned FEMA changes: ‘We’re going to give out less money’. Commentary: There is NO detail in the article or in anything the administration has communicated about this. Simply stating that they are going to give out less money leaves a whole lot of questions. Much of the narrative has been around funds associated with declared emergencies and major disasters, but there is a whole lot of other funding that FEMA manages. More on this later.

A quote early in the article: “We want to wean off FEMA and we want to bring it down to the state level… We’re moving it back to the states.” Commentary: If I’ve not made this abundantly clear before, DISASTER MANAGEMENT HAS ALWAYS BEEN A STATE (and local, tribal, and territorial) ISSUE. What exactly is being moved back to the states? Further, as I’ve stated in a previous article, I’m not opposed to a ‘weaning off’ of federal funds, but there should be a plan in place for this which is implemented over time and communicated with enough lead time to allow state, local, tribal, and territorial governments to begin adjusting budgets. This again raises the question of exactly what funding is being changed. We need information so this can be addressed.

The next quote in the article: “We’re going to give out less money… It’s going to come from the President’s Office” Commentary: How, exactly, is it coming from the President’s office? The Stafford Act assigns FEMA with the responsibility for coordinating federal disaster assistance. So are we changing the law?

The next item I want to poke at is a quote from Secretary Noem in the article, which reads “the administration is building communication and mutual aid agreements among states to respond to each other so that they can stand on their own two feet…”. Commentary: Does she mean the Emergency Management Assistance Compact (EMAC)? The same EMAC that has its roots going back to 1992, was ratified by Congress in 1996 as a nationwide system of state-to-state mutual aid, and has been used very effectively since then? That mutual aid system? Or is the administration building something else? This seems to be a greatly misinformed quote.

With no apology for my language, what in the actual fuck is going on here?

I pointed out the article as being a horrible bit of journalism because it’s simply reporting quotes and NO ONE IS ASKING QUESTIONS. (Or if they are, they aren’t being answered). There are a lot of questions that need to be asked and answered. This is a big issue. A complicated issue. It requires a whole lot of clarity. The problem is, it seems that those making decisions really don’t know anything about the topic or what they are doing.

Over the past couple of months we’ve seen leaked budget proposals that impact emergency and disaster management grants, public health emergency preparedness grants, and the like. Those grants that aren’t being completely cut in these proposals are being reduced by tens of percentage points, equating to tens, and sometimes hundreds, of millions of dollars. Regardless of the long-term impacts of these decreases, this is massively disruptive in the short term. Disruptive to the point that entire programs will be gutted and our collective ability to respond to disasters will be dangerously impacted. Again, I ask: what is the plan, or is the administration simply hacking and slashing away at things? Since no plan, much less a strategy, has even been mentioned, I can only believe this to be arbitrary and uninformed.

So many people are acknowledging there are issues with FEMA and that addressing them can be good, and I won’t dispute that, but that’s as far as they go. It’s a statement, a justification, an excuse, but it’s not a conversation. This needs to be a conversation. Uninformed change is bad. Misguided change is bad. Change for the sake of change is bad. As I stated in a recent LinkedIn post, emergency managers need to get their shit together. While the usual pace of government and bureaucracy can be frustratingly slow at times, we have systems in place for a reason. Change at this level must be well considered, with input garnered from across the practice. The rhetoric and bull-in-a-china-shop approach might excite those who are easily impressed with superficial, performative bullshit, but it shouldn’t excite those of us with some intelligence and background in this. We have a lot of smart people in emergency and disaster management, but I’m not hearing a lot of voices.

Who should ask questions? What percentage of elected officials at the federal level even know what the Emergency Management Performance Grant (EMPG) is? I’m betting it’s pretty low. Do they even realize there are prevention, preparedness, and hazard mitigation funds and programs, or do they only know about assistance after a disaster? It’s easy for us to point a finger at the membership organizations for not doing enough advocacy and outreach (are they?), but that job isn’t theirs alone. It’s on ALL of us to get elected officials to understand what these programs are and why they are necessary, or at least advocate for a better way to enact cuts. If you don’t know how to contact your members of Congress, start here and tell them the current approach is unacceptable. Likewise, if you haven’t contacted your state elected officials, you need to do so as well. The writing seems to be on the wall that these significant cuts in funding are coming, and elected officials need to discuss (with practitioners!!!) how to deal with them.

Regardless of the outcome of FEMA’s status and that of federal program funding, disaster management will continue, but the impacts may very likely be severe, and not just in the short term. There will be lasting impacts which will need to be addressed for years to come and at a much greater cost than delaying this politically driven action in exchange for a more thoughtful approach.

TR

Cutting Grant Funds Cuts National Practices

As some rumors become reality for the current fiscal year and budget memos are leaked for the coming fiscal year, one thing is clear – states, local governments, tribal governments, and territories (SLTTs) will be receiving significantly less federal grant funding for preparedness. While some programs are expected to be outright eliminated, others are being reorganized and refocused with significant budget cuts. While not all change is bad, there is a significant shift in preparedness priorities that is largely politically motivated and lacking foundations in reality. I wrote last month on the Future of the US Emergency Management System, which focuses mostly on FEMA-centric topics, but we are also seeing and expecting major cuts to public health emergency preparedness (PHEP) grant funds, the elimination of certain PHEP programs, and indirect impacts to PHEP from cuts to other public health programs.  Similar cuts are also expected with the Hospital Preparedness Program (HPP). While I don’t think preparedness funds will be completely cut, the impacts will be significant until SLTTs are able to adjust their own budgets to address what priorities they can.  

Grant funding, however, is not only to the direct advantage of the recipients. Compliance with grant rules has long supported national standards (note that I use this term loosely; see this article for more information). FEMA preparedness grants, PHEP grants, and HPP grants, among others, have required the adoption of the National Incident Management System (NIMS), the use of the Homeland Security Exercise and Evaluation Program (HSEEP), national focus on certain threats or hazards, and reasonable consistency in building and sustaining defined capabilities. Grants have been the proverbial carrot that encouraged compliance and participation. While some of the results have been poorly measured (see my annual commentary on the National Preparedness Reports), the benefits of others have been much more tangible. Keeping things real, compliance with many of these requirements by some recipients may have been lackluster at best. Enforcement of these requirements has been practically non-existent (despite rumors of the “NIMS Police” circulating for years), which I think is a shame. That said, I think most recipients worked to meet most requirements in good faith; perhaps partly because someone’s signature attested to it, but I think mostly because many of these requirements were viewed as best practices. As such, even if the requirements go away and there is no longer a carrot for compliance, I think many jurisdictions will continue implementation.

All this, however, looks at past requirements. But what of new practices that would benefit from nation-wide implementation? I fear that without practices being required as part of a grant, adoption will be minimal. We would have to count on several factors for adoption to take place.

1) Emergency managers would need to be informed of the practice and the benefits thereof. Let’s be honest, most emergency managers are not well informed of new practices and concepts. Often, they simply don’t have the time to do any more than what they are doing, but unfortunately some may not care. Agencies like FEMA have also been notoriously bad at circulating information on new programs, practices, and concepts.

2) Emergency managers would have to agree that the practice can be beneficial to them.

3) Emergency managers would need the resources (time, staff, funds, etc.) to actually implement the practice.

4) In a multi-agency environment, partner agencies are more willing to support activities if they are told it’s a grant requirement – even if it’s not their own grant requirement. They may be reluctant to commit resources to something that is simply perceived as a good idea.

There are certainly a number of challenges ahead for emergency management in the broadest of applications. What I discuss here only scratches the surface. Let’s not lose sight of the benefits of best practices and standards, even if no one is telling us we need to adhere to them. That’s a hallmark of professionalism. We need to collectively advocate for our profession and the resources necessary to perform the critical functions we have. We need to take the time to advocate and to be deliberate in our actions. We need to secure multiple funding streams from every level of government possible. We need to identify efficiencies and leverage commonalities among partner agencies. Yes, lend your voice to the national organizations, but know that it’s up to you to advocate in your municipality, county, and state – and those efforts are now more important than ever.


2024 National Preparedness Report – Another Missed Opportunity

The annual National Preparedness Report (NPR) is a requirement of Presidential Policy Directive 8, which states that the NPR is based on the National Preparedness Goal. The National Preparedness Goal, per the FEMA website, is “A secure and resilient nation with the capabilities required across the whole community to prevent, protect against, mitigate, respond to, and recover from the threats and hazards that pose the greatest risk.” The capabilities indicated in the National Preparedness Goal are specifically the 32 Core Capabilities.

The 2024 NPR is developed to reflect data and information from 2023. As with previous NPRs, I have a lot of concern about the ultimate value of the document. While I’m sure a lot of time, effort, and money was spent gathering an abundance of data from across the nation to support this report, this year’s report, following the unfortunate trend of its predecessors, doesn’t seem to be worth the investment. As with the others, this report falls short on adequate scope, information, and recommendations. Certainly, there is a challenge to be acknowledged of not only gathering a massive quantity of information from across the country but also examining and reporting this information in aggregate, as most federal reports are burdened to do. That said, I see little excuse to not provide a meaningful report.

In this year’s report, following the introductory materials, is a section on risks, which is largely a reflection of the high-impact disasters of 2023 seen across the US; the most challenging threats and hazards; and the intersections of risk and vulnerability. All in all, this is an adequate snapshot of these topics in summary, with some solid points and a level of analysis that I would expect through the rest of the document, including trends over time and identification of factors which influence the findings. There are several maps and charts which provide good data visualization, and several mentions of bridging data between agencies such as FEMA, NOAA, and CDC. A good start.

The next section is Capabilities. This section has two areas of narrative – community preparedness and individual and household preparedness. Given the significant efforts to bolster capabilities throughout the federal government and in state, tribal, and territorial governments, these levels are conspicuously missing if ‘communities’ simply means local governments, so I’m not sure why this is specifically titled community preparedness. Does it not include the efforts of states or others? Page 18 of the report provides a chart similar to what we’ve seen in previous reports which shows how much money was spent on each capability (in communities… again, what does this include or exclude?) for 2023. The chart also indicates the percentage of communities achieving their capability targets.

As with the reports from previous years, I ask: So what? This is a snapshot in time, lacking context. A trend analysis accounting for at least the past several years would be quite insightful, as would some description of what the funds within each capability were primarily spent on – broadly planning, organizing, equipping, training, and exercises, but I’d like to see even more specifics. There are a few random examples in the narrative, but a lot is still lacking. I’d also like to see some analysis of the relative success or value of these investments. In regard specifically to the percentage of communities who feel they have achieved their capability target, I have to eye-roll a bit, as this is often the most subjective (and sometimes smoke-and-mirrors) aspect of the Threat and Hazard Identification and Risk Assessment (THIRA). This chart currently has little value other than a ‘gee whiz’ factor of seeing how much money is spent on each capability.

I’ll also include a specific observation of mine here: the Core Capability of Mass Care Services, which in the previous year’s report was indicated as a high-priority capability, continues a trend (I’m only aware of the trend from looking back at previous reports since trend data is not included in the report) of having one of the lowest achievement percentages and investments. I’m hopeful that’s why it’s included in the next section as a focus area.

The other area of narrative in the Capabilities section is individual and household preparedness. All in all, the information presented here is fine and even includes a slight bit of trend analysis, though in 2023 a much more comprehensive reporting of this information was provided under separate cover. I think an improved version of something like the 2023 report should be incorporated into the NPR.

The next section of the NPR is Focus Areas, which includes the Core Capabilities of Mass Care Services, Public Information and Warning, Infrastructure Systems, and Cybersecurity. Each focus area includes narrative on risk, capabilities and gaps, and management opportunities – which all provide great information. There is a brief mention of how these focus areas were selected. While I’m fine with having a deeper analysis of certain focus areas, I think the NPR should still provide a comprehensive review of all Core Capabilities.

While the management opportunities listed for each of the four focus areas are essentially recommendations, the report itself only provides two recommendations which are labeled as such. These recommendations are identified in the document’s introduction with a bit of narrative (and in the conclusion with none), which thankfully provides some suggestions for actionable implementation, but I was left feeling both surprised and disappointed that the National Preparedness Report, which really should provide an analysis of all 32 Core Capabilities that serve as the foundation for the nation’s preparedness goal, has only two recommendations for improving our preparedness. Two. That’s it. There should be an abundance of recommendations. This is the information that emergency managers and decision-makers need within federal agencies and state, local, tribal, and territorial (SLTT) governments. Another missed opportunity to provide value.

The 2024 NPR is extremely similar to those of the past several years in format and general content, and as such I’m not surprised by the lack of value. I continue to stand by my statement across these past several years in regard to this report: the emergency management community should not be accepting this type of reporting. While I recognize that PPD-8 defines the audience for this report as the President and the Secretary of Homeland Security, the utility of such a report can and should reach much more broadly across all of emergency management, and ideally to taxpayers as well, who should be able to access better information on how their tax dollars are spent within preparedness – which impacts everyone. States, UASIs, and other entities that submit information annually for this report should also be disappointed that this is what is published about their hard work, and the emergency management membership organizations should be demanding better. This report has the potential to be meaningful, insightful, and influential, yet FEMA misses the opportunity every single year. The data exists, and the stories of the activities, accomplishments, and gaps can all be told. With some reasonable analysis and recommendations, the document could be much more impactful.

It’s been said by many that emergency managers are notorious for not marketing well, and this document is proof positive of that. Those of us working in this profession know there is so much more to be examined and described that can tell not only of what we have accomplished but also of the work to be done. We find ourselves in a time when the purpose and value of FEMA is being questioned by a number of people; a time when some inefficiencies, missteps, and even failures are being put under a very critical microscope and seemingly being used to fuel suggestions of eliminating FEMA. Greater efficiencies can certainly be identified and gaps addressed, but our reluctance to tell the stories of what we do clearly lends to the misunderstandings and severe lack of awareness that exist about our field of practice – one in which there is no organization of greater prominence and importance than FEMA. While the NPR is not at fault for these shortcomings, it is a contributor. When reports like this miss opportunities to do more and be more year over year, it snowballs and becomes a much greater issue. We need to do better.

© 2015 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

NIMS Intel and Investigations Function – A Dose of Reality

Background

Soon after the initiation of the National Incident Management System (NIMS) as a result of Homeland Security Presidential Directive 5 in 2003, the Intelligence and Investigation (I/I) function was developed and introduced to NIMS, specifically to the Incident Command System (ICS). While we traditionally view I/I as a law enforcement function, there are other activities which guidance indicates may fall within I/I, such as epidemiology (personally, I’d designate epidemiology as a specific function, as we saw done by many during the COVID-19 response), various cause and origin investigations, and others. Integration of these activities into the response structure has clear advantages.

The initial guidance for the I/I function was largely developed by command personnel with the New York City Police Department (NYPD). This guidance offered several possible locations for the I/I function within the ICS structure, based on anticipated level of activity, needed support, and restrictions of I/I related information. These four possible ways of organizing the I/I function per this guidance are depicted here, and include:

  1. Placement as a Command Staff position
  2. Organized within the Operations Section (i.e. at a Branch level)
  3. Developed as its own section
  4. Included as a distinct unit within the Planning Section

These concepts have been included in the NIMS doctrine and have been supported within the NIMS Intelligence/Investigations Function Guidance and Field Operations Guide, though oddly enough, this second document ONLY addresses the organization of an I/I Section and not the other three options.

The Reality

Organization of I/I can and does certainly occur through any one of these four organizational models, though my own experience, and that of others as described to me, has shown that very often this kind of integration of I/I within the ICS structure simply does not occur. Having worked with numerous municipal, county, state, federal, and specially designated law enforcement agencies, I’ve found that the I/I function is often a detached activity which is absolutely not operating under the command and control of the incident commander.

Many of the sources of I/I come from fusion centers, which are off-scene operations, or from agencies with specific authorities for I/I activities that generally have no desire or need to become part of the ICS structure, such as the FBI conducting a preliminary investigation into an incident to determine if it was a criminal act, or the NTSB investigating cause and origin of a transportation incident. These entities certainly should be communicating and coordinating with the ICS structure for scene access and operational deconfliction, but are operating under their own authority and conducting specific operations which are largely separate from the typical life safety and recovery operations on which the ICS structure is focused.

My opinion on this is that operationally it’s completely OK to have the I/I function detached from the ICS structure. There are often coordination meetings and briefings that occur between the I/I function and the ICS structure which address safety issues and acknowledge priorities and authorities, but the I/I function is in no way reporting to the IC. Coordination, however, is essential to safety and mutual operational success.

I find that the relationship of I/I to the ICS structure most often depends on where law enforcement is primarily organized within the ICS structure and who is managing that interest. For example, if the incident commander (IC) is from a law enforcement agency, interactions with I/I activities are more likely to be directly with the IC. Otherwise, interactions with I/I are typically handled within the Operations Section through a law enforcement representative within that structure. Similarly, I’ve also experienced I/I activity to have interactions with an emergency operations center (EOC) through the EOC director (often not law enforcement, though having designated jurisdictional authority and/or political clout) or through a law enforcement agency representative. As such, compared to the options depicted on an org chart through the earlier link, we would see this coordination or interaction depicted with a dotted line, indicating that authority is not necessarily inherent.

I think that the I/I function organized within the ICS structure is more likely to happen when a law enforcement agency has significant responsibility and authority on an incident, and even more likely if a law enforcement representative is the IC or represented in a Unified Command. I also think that the size and capabilities of the law enforcement agency are a factor, as it may be their own organic I/I function that is performing within the incident. As such, it would make sense that a law enforcement agency such as NYPD, another large metropolitan law enforcement agency, or a state police agency leading or heavily influencing an ICS structure would be more likely to bring an integrated I/I function to that structure. Given this, it makes sense that representatives from NYPD would have initially developed these four possible organizational models and seemingly excluded the possibility of a detached I/I function, but we clearly have numerous use cases where these models are not being followed. I’ll also acknowledge that there may very well be occurrences where I/I isn’t integrated into the ICS structure but should be. This is a matter for policy and training to address when those gaps are identified.

I believe that NIMS doctrine needs to acknowledge that a detached I/I function is not just possible, but very likely to occur. Following this, I’d like to see the NIMS Intelligence/Investigation Function Guidance and Field Operations Guide updated to include this reality, along with operational guidance on how best to interact with a detached I/I function. Of course, to support implementation of doctrine, this would then require policies, plans, and procedures to be updated, and training provided to reflect these changes, with exercises to test and reinforce the concepts.

What interactions have you seen between an ICS or EOC structure and the I/I function? What successes and challenges have you seen from it?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Culture of Preparedness – a Lofty Goal

September is National Preparedness Month here in the US. As we soon head into October, it’s a good opportunity to reflect on what we’ve accomplished during the month, or even elsewhere in the year. While National Preparedness Month is worth marking as a reminder of how important it is to be prepared, over the past several years I’ve come to question our approaches to community preparedness. What are we doing that’s actually moving the needle of community preparedness in a positive direction? Flyers and presentations and preparedness kits aren’t doing it. While I can’t throw any particular numbers into the mix, I think most will agree that our return on investment is extremely low. Am I ready to throw all our efforts away and say it’s not making any difference at all? Of course not. Even one person walking away from a presentation and making changes within their household to become better prepared is important. But what impact are we having overall?

Culture of preparedness is a buzz phrase used quite a bit over the last number of years. What is a culture of preparedness? An AI assisted Google search tells me that a culture of preparedness is ‘a system that emphasizes the importance of preparing for and responding to disasters, and that everyone has a role to play in doing so.’ Most agree that we don’t have a great culture of preparedness across much of the US (and many other nations) and that we need to improve our culture of preparedness. But how?

People love to throw that phrase into the mix of a discussion, claiming that improving the culture of preparedness will solve a lot of issues. They may very well be correct, but it’s about as effective as a doctor telling you that you will be fine from the tumor they found once a cure for cancer is discovered. Sure, the intent is good, but the statement isn’t helpful right now. We need to actually figure out HOW to improve our culture of preparedness. We also need to recognize that in all likelihood it will take more than one generation to actually realize the impacts of deliberate work toward improvement.

The time has come for us to stop talking about how our culture of preparedness needs improvement and to actually do something about it. There isn’t one particular answer or approach that will do this. Culture of preparedness is a whole community concept. We rightfully put a lot of time, effort, and money into ensuring that our responders (broad definition applied) are prepared, because they are the ones we rely on most. I’d say their culture of preparedness is decent (maybe a B-), but we can do a lot better. (If you think my assessment is off, please check out my annual reviews of the National Preparedness Report and let me know if you come to a different conclusion). There is much more to our community, however, than responders. Government administration, businesses, non-government organizations, and people themselves compose the majority of it, and unfortunately among these groups is where our culture of preparedness has the largest gaps.

As with most of my posts, I don’t actually have a solution. But I know what we are doing isn’t getting us to where we want to be. I think the solution, though, lies in studying people, communities, and organizations and determining why they behave and feel the way they do, and identifying methodologies, sticks, and carrots that can help attain an improved culture of preparedness over time. We must also ensure that we consider all facets of our communities, inclusive of gender identity, race, culture, income, citizenship status, and more. We need people who know and study such things to help guide us. The followers of Thomas Drabek. The Kathleen Tierneys* of the world. Sociologists. Anthropologists. Psychologists. Organizational psychologists.  

A real, viable culture of preparedness, in the present time, is little more than a concept. We need to change our approach from using this as a buzz phrase in which everyone in the room nods their heads, to a goal which we make a deliberate effort toward attaining. A problem such as this is one where we can have a true union of academia and practice, with academics and researchers figuring out how to solve the problem and practitioners applying the solutions, with a feedback loop of continued study to identify and track the impacts made, showing not only the successes we (hopefully) attain, but also how we can continue to improve.

*Note: I don’t know Dr. Tierney personally and it is not my intent to throw her under the proverbial bus for such a project. I cite her because her writing on related topics is extremely insightful. I highly recommend Disasters: A Sociological Approach.

© 2024 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

ICS Training Sucks – Progress Inhibited by Bias

It’s been a while since I’ve written directly toward my years-long rally against our current approach to Incident Command System (ICS) training. Some of these themes I’ve touched on in the past, but recent discussions on this and other topics have gotten the concept of our biases interfering with progress stuck in my head.

It is difficult for us, as humans, to move forward, to be truly progressive and innovative, when we are in a way contaminated by what we know about the current system we wish to improve. This knowledge brings with it an inherent bias – good, bad, or otherwise – which influences our vision, reasoning, and decisions. On the other hand, knowledge of the existing system gives us a foundation from which to work, often with awareness of what does and does not work.

I’m sure there have been psychological studies done on such things. In my continued rally against our current approach to ICS training, I’ve certainly thought about what that training could look like if we asked individuals who had never seen the current training to develop something new. Sure, the current training has a lot of valuable components, but overall it’s poorly designed, with decades of changes and updates still based upon curriculum that was poorly developed, though with good intentions, so long ago.

In recent months I’ve had discussions with people about various things across emergency management that require improvement, from how we assess preparedness, to how we develop plans, to how we respond, and even the entire US emergency management enterprise itself. Every one of these discussions, trying to imagine what a new system or methodology could look like, was infected by an inherent bias – in every one of these people (myself included) – stemming from what is. Again, I’m left wondering what someone would build if they had no prior knowledge of what currently exists.

Of course, what would be built wouldn’t be flawless. To some solutions, those of us in the know may even shake our heads, saying that certain things have already been tried but were proven to fail (though perhaps under very different circumstances which may no longer be relevant). Some solutions, however, could be truly innovative.

The notion, perhaps, is a bit silly, as I’m not sure we could expect anyone to build, for example, a new ICS curriculum, without having subject matter expertise in ICS (either their own or through SMEs who would guide and advise on the curriculum). These SMEs, inevitably, would have taken ICS training somewhere along their journey.

All that said, I’m not sure it’s possible for us to eliminate our bias in many of these situations. Even the most visionary of people can’t shed that baggage. But we can certainly improve how we approach it. I think a significant strategy would be having a facilitator who is a champion of the goal and who understands the challenges, who can lead a group through the process. I’d also suggest having a real-time ‘red team’ (Contrarian?) element as part of the group, who can signal when the group is exercising too much bias brought forth from what they know of the current implementation.

In the example of reimagining ICS training, I’d suggest that the group not be permitted to even access the current curriculum during this effort. They should also start from the beginning of the instructional design process, identifying needs and developing training objectives from scratch, rather than recycling or even referencing the current curriculum. The objectives really need to answer the question – ‘What do we want participants to know or do at the completion of the course?’. Levels of training are certainly a given, but perhaps we need to reframe to what is used elsewhere in public safety, such as the OSHA 1910.120 standard which uses the levels of Awareness, Operations, Technician, and Command. Or the DHS model which uses Awareness, Performance, and Management & Planning. We need to further eliminate other bias we bring with us, such as the concept of each level of training only consisting of one course. Perhaps multiple courses are required to accomplish what is needed at each level? I don’t have the answers to any of these questions, but all of these things, and more, should be considered in any real discussion about a new and improved curriculum.

Of course, any discussions on new and improved ICS curriculum need to begin at the policy level, approving the funding and the effort and reinforcing the goal of having a curriculum that better serves our response efforts.

How would you limit the influence of bias in innovation?

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Mixing Exercise Types

As with many things, we are taught exercises in a rather siloed fashion: first by category – discussion-based and operations-based – then by type. That kind of compartmentalization is generally a necessity in adult education methodology. Individually, each exercise type has its own pros and cons. Rarely, however, do we see or hear of combining exercise types within one initiative.

The first time I did this was several years ago. My company was designing a series of functional exercises to be used for locations around the country. While the exercises were focused on response, one goal of our client was to include some aspects of recovery in the exercise. At about six hours, the exercises weren’t long. Time jumps can be awkward, and for the small amount of time dedicated to recovery, the disruption of a time jump within the exercise may not net a positive result. Add to that the time it would take to provide the quantity of new information needed to make a recovery-oriented functional exercise component viable.

Instead of trying to shoe-horn this in, we opted to stop the functional component of the exercise at an established time and introduce a discussion on disaster recovery. With the proper introduction and just a bit of information to provide context in addition to what they had already been working on, the discussion went smoothly and accomplished everything with which we were charged. The participants were also able to draw on information and actions from the response-focused functional component of the exercise.

We’ve recently developed another exercise that begins with a tabletop exercise to establish context and premise, then splits the participants into two groups, each challenged with an operations-based activity: one deploying to a COOP location to test functionality (a drill), the other charged with developing plans to address the evolving implications of the initial incident (a functional exercise). Following the operations-based exercises, the two groups will reconvene to debrief on their activities and lessons learned before going into a hotwash.

Making this happen is easy enough. Obviously, we need to ensure that objectives align with the expected activities. You also want to make sure that both exercise modalities are appropriate for the same participants. While I try not to get hung up on the nuances of documentation, it is important, especially when it comes to grant compliance and ensuring that everyone understands the structure and expectations of the exercise. If we are mixing a discussion-based exercise and an operations-based exercise, one of the biggest questions is likely which foundational document to use – a Situation Manual (SitMan) or an Exercise Plan (ExPlan). Generally, since operations-based exercises carry greater consequences regarding safety and miscommunication, I’d suggest defaulting to an ExPlan, though be sure to include information that addresses the needs of the discussion-based component in your ExPlan as well as in the player briefing.

In running the exercise, be sure to have a clear transition from one exercise type to the other, especially if there are multiple locations and/or players are spread out. The player briefing should give players the information that prepares them for the transition. It’s equally important to prepare exercise staff (controllers/facilitators and evaluators) by clearly communicating expectations at the C/E briefing and in C/E documentation, and to ensure they are ready for the transition.

I’d love to hear other success stories from those who may have done something similar.

© 2024 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®