Measuring Return on Investment Through Key Performance Indicators

Return on investment (ROI) is generally defined as a measurement of performance used to evaluate the value of investments of time, money, and effort. Many aspects of preparedness in emergency management pose challenges when trying to gauge return on investment. Sure, it’s easy to identify that m number of classes were conducted and n number of people were trained, that x number of exercises were conducted with y number of participants, that z number of plans were written, or even that certain equipment was purchased. While those tell us about activity, they don’t tell us about performance, results, or outcomes.

More classes were conducted. So what?

We purchased a generator. So what?

The metrics of these activities are easy to obtain, but they are rather superficial and generally less than meaningful. So how can we obtain a meaningful measure of ROI in emergency preparedness?

ROI is determined differently depending on the industry being studied, but fundamentally it comes down to identifying key performance indicators, their value, and how much progress was made toward them. So what are our key performance indicators in preparedness?

FEMA has recently begun linking key performance indicators to the THIRA. The Threat and Hazard Identification and Risk Assessment, when done well, gives us quantitative and qualitative information on the threats and hazards we face and, based upon certain scenarios, the performance measures needed to attain certain goals. This is contextualized and standardized through defined Core Capabilities. When we compare our current capabilities to those needed to meet the identified goals (called capability targets in the THIRA and SPR), we are able to better define the factors that contribute to the gap. The gap is described in terms of capability elements – planning, organizing, equipping, training, and exercising (POETE). In accordance with this, FEMA is now making a more focused effort to collect data on how we are meeting capability targets, which helps us to better identify return on investment.
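
Since ROI keeps coming up, it’s worth pinning down the arithmetic. The classic formula is just net gain divided by cost; the hard part in preparedness is putting a defensible value in the numerator. Here’s a minimal sketch in Python, where the capability-gap valuation is purely my own hypothetical framing, not a FEMA formula:

```python
def roi(value_gained: float, cost: float) -> float:
    """Classic ROI: net gain relative to what was invested."""
    return (value_gained - cost) / cost

# Hypothetical illustration: value a training program by the share of a
# capability gap it closed. All numbers here are invented for the example.
value_of_meeting_target = 500_000  # value placed on fully meeting a capability target
gap_closed = 0.20                  # training moved capability from 50% to 70% of target
program_cost = 60_000

print(f"{roi(gap_closed * value_of_meeting_target, program_cost):.0%}")  # -> 67%
```

Note the dependency on the ‘so what?’: without a measure of how much a capability actually moved, the formula has nothing meaningful to work with.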

For 2021, Emergency Management Performance Grant (EMPG) funding requires the collection of data as part of the grant application and progress reports to support FEMA’s ability to measure program effectiveness and investment impacts. FEMA is collecting this information through the EMPG Work Plan. This spreadsheet goes a long way toward helping us better measure preparedness. The Work Plan leads programs to identify, for every funded activity (a rough sketch of these fields as a data record follows the list):

  • The need addressed
  • What is expected to be accomplished
  • What the expected impact will be
  • Identification of associated mission areas and Core Capabilities
  • Performance goals and milestones
  • Some of the basic quantitative data I mentioned above
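
To make this more concrete, the Work Plan fields map naturally onto a simple record structure that a program could track across reporting periods. A rough sketch; the field names are my own invention, not FEMA’s actual spreadsheet columns:

```python
from dataclasses import dataclass, field

@dataclass
class WorkPlanActivity:
    """One funded activity, loosely mirroring the EMPG Work Plan fields above."""
    need_addressed: str
    expected_accomplishment: str
    expected_impact: str
    mission_areas: list[str] = field(default_factory=list)
    core_capabilities: list[str] = field(default_factory=list)
    performance_goals: list[str] = field(default_factory=list)
    milestones: list[str] = field(default_factory=list)
    people_trained: int = 0  # the basic quantitative data mentioned above
    # The 'so what?' fields I argue for below:
    capability_improved: str = ""
    evidence_of_improvement: str = ""
```

Tracked this way, each activity carries its justification and its result, not just its activity counts.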

This is a good start, but I’d like to see it go further. FEMA should also be prompting EMPG recipients to directly identify what was actually improved and how. What has the development of a new plan accomplished? What capabilities did a certain training program improve? What areas for improvement were identified from an exercise, what is the corresponding improvement plan, and how will capabilities be improved as a result? The way to get to something more meaningful is to continue asking ‘so what?’ until you come to an answer that really identifies meaningful accomplishments.

EMPG aside, I encourage all emergency management programs to identify their key performance indicators. This is a much more results-oriented approach to managing your program, keeping the program focused on accomplishing meaningful outcomes, not just generating activity. It’s more impactful to report on what was accomplished than what was done. It also gives us more meaningful information to analyze across multiple periods. This type of information isn’t just better for grant reports, but also for your local budgets and even routine reports to upper management and elected officials.

What do you think about FEMA’s new approach with EMPG? What key performance indicators do you use for your programs?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Emergency Management Budgets

Last week there were some posts circulating around Twitter expressing some considerable dismay about emergency management budgets. While I obviously agree that emergency management programs should be better funded, there is some important context to consider when looking at (most) emergency management agency budgets in the US.

While jurisdictions with emergency management programs provide some measure of funding, the largest share of funding typically comes from federal grant programs, with the most significant grant for operational expenses being the Emergency Management Performance Grant (EMPG). EMPG sits alongside the Homeland Security Grant Program (HSGP) in FEMA’s suite of preparedness grants and is budgeted each year in the federal budget, with administrative responsibilities in the hands of FEMA. States are the grantees of EMPG. While a considerable amount of the funds are retained by states, there is a requirement for a certain percentage to be applied to local emergency management programs. States have different models for how the funds are allocated – some states award funds directly to county/local governments (subgrantees), while others spend the funds on behalf of the subgrantees through the provision of direct services to county/local governments. Many states also use a hybrid of the two models. Those receiving an allocation of EMPG are ideally accounting for it in their published budgets, but we should be aware that some releases of budget information may not include EMPG numbers.

There are also additional grant funds available to county and local governments to support an array of emergency management and emergency management-related programs. These include hazard mitigation grants, the Urban Area Security Initiative (UASI) grant, Securing the Cities, and others. Yes, a lot of these funds are targeted to more ‘homeland security’ types of activities, but we should also recognize the considerable overlap between EM and HS. I took a small sample of a few mid- to large-sized cities (mostly since they have established and funded emergency management offices) and saw ratios of 1:3 to 1:4 for local share funding compared to grant funding (this did not include COVID-related supplemental funding) – in other words, a city putting up $1 million of its own funds might be drawing $3 to $4 million in grants. Of course, you may see numbers significantly different in your jurisdiction.

I’ll also suggest that activities across many other local government agencies and departments support some measure of emergency management. While a lot of these expenditures may not have the input of an emergency management office, there are a variety of local infrastructure projects (hopefully contributing to hazard mitigation), health and human services investments (mitigation and preparedness), code enforcement (mitigation), and others that do contribute to the greater emergency management picture for the jurisdiction. In fact, some of the funding allocations received by these agencies may be through discipline-specific emergency management grant programs, such as those which may come from US DOT or CDC/HHS.

Overall, emergency management funding tends to be a lot larger than the casual observer may think, though even a budget analyst would require some time to identify how it all comes together, especially for a larger jurisdiction that tends to have larger departments, more complex expenditures, and more grant funding. As mentioned, I’d still love to see more direct funding allocations for emergency management programs, especially as emergency management can hopefully direct efforts where and how they are needed most within their communities. I’m also hopeful that officials leading different programs at the local level are coming together to jointly determine how best to allocate federal funds (obviously within the grant terms and conditions), even if they are coming from different federal and state agencies and being awarded to different local departments, with a goal of addressing local threats, hazards, and capabilities in the best ways possible for communities.

While what I wrote is a broad-brush example of how emergency management funding is allocated across much of the US, different states do administer grants differently. It can be as simple as I’ve outlined, or a lot more complex. We also have a lot of examples of the haves and have-nots, with many smaller jurisdictions being left woefully behind in funding. I’d love to hear what the funding situation looks like for your jurisdiction. Also, for those not in the US, how are your local programs funded?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

A Podcast Invitation

Last week I had the honor of being invited as a guest on the EM Weekly podcast. We had a great discussion about incident management structures and some of the continued challenges of emergency management.

Check it out here:

An Update of Ontario’s Incident Management System

Just yesterday, the Canadian province of Ontario released an update of its Incident Management System (IMS) document. I gave it a read and have some observations, which I’ve provided below. I will say that it is frustrating that there is no Canadian national model for incident management; rather, the provinces determine their own. A number of my friends and colleagues from across Canada have long espoused this frustration as well. That said, this document warrants an examination.

The document cites the Elliot Lake Inquiry from 2014 as a prompt for several of the changes in their system from the previous iteration of their IMS document. One statement from the Inquiry recommended changes to ‘put in place strategies that will increase the acceptance and actual use of the Incident Management System – including simplifying language’. Oddly enough, this document doesn’t seem to overtly identify any strategies to increase acceptance or use; in fact there is scant mention of preparedness activities to support the IMS or incident management as a whole. I think they missed the mark with this, but I will say the recommendation from the Inquiry absolutely falls in line with what we see in the US regarding acceptance and use.

The authors reinforce that ICS is part of their IMS (similar to ICS being a component of NIMS) and that their ICS model is compatible with ICS Canada and the US NIMS. I’ll note that there are some differences (many of which are identified below) that impact that compatibility, though they don’t outright break it. They also indicate that this document isn’t complete and that they have already identified future additions, including site-specific roles and responsibilities, EOC roles and responsibilities, and guidance on resource management. In regard to the roles and responsibilities, there is virtually no content in this document on the organization below the Section Chief level, other than general descriptions of priority activity. I’m not sure why they held off on including this information, especially since the ICS-specific info is reasonably universal.

I greatly appreciate some statements they make on the application of Unified Command, saying that it should only be used when single command cannot be established. They give some clarifying points within the document with some specific considerations, but make the statement that “Single command is generally the preferred form of incident management except in rare circumstances where unified command is more effective” and reinforce that regular assessment of Unified Command should be performed if implemented. It’s quite a refreshing perspective compared to what we so often see in the US, which practically espouses Unified Command as the go-to option. Unified Command is hard, folks. It adds a lot of complexity to incident management. While it can solve some problems, it can also create some.

There are several observations I have on ICS-related organizational matters:

  • They use the term EOC Director. Those who have been reading my stuff for a while know that I’m really averse to this term as facilities have managers. They also suggest that the term EOC Command could be used (this might even be worse than EOC Director!).
  • While they generally stick with the term Incident Commander, they do address a nuance where Incident Manager might be appropriate (they use ‘manager’ here but not for EOCs??). While I’m not sure that I’m sold on the title, they suggest that an incident such as a wide-reaching public health emergency with no fixed site is actually managed, not commanded. So in this example, the person in charge from the Health Department would be the Incident Manager. It’s an interesting nuance that I think warrants more discussion.
  • The document refers several times to the IC developing strategies and tactics. While the IC certainly may have input to these, strategies and tactics are typically reserved for the Operations Section.
  • There is an interesting mention in the document that no organization has tactical command authority over any other organization’s personnel or assets unless such authority is transferred. This is a really nuanced statement. When an organization responds to an incident and acknowledges that the IC is from another organization, that organization’s resources are taking tactical direction from the IC. Perhaps this is the implied transfer of authority? This statement needs a lot of clarification.
  • Their system formally creates the position of Scribe to support the Incident Commander, while the EOC Director may have a Scribe as well as an Executive Assistant. All in all, I’m OK with this. Especially in an EOC, it’s a reflection of reality – particularly the Executive Assistant, which is not granted the authority of a Deputy but is more than a Scribe. I often see this position filled by a Chief of Staff.
  • The EOC Command Staff (? – they don’t make a distinction for what this group is called in an EOC) includes a Legal Advisor. This is another realistic inclusion.
  • They provide an option for an EOC to be managed under Unified Command. While the concept is maybe OK, ‘command’ is the wrong term to use here.
  • The title of Emergency Information Officer is used, which I don’t have any particular issue with. What’s notable here is that while the EIO is (usually) a member of the Command Staff, the document suggests that if the EIO is to have any staff, particularly for a Joint Information Center, they are moved to the General Staff and placed in charge of a new section named the Public Information Management Section (a frustration here being that they call the position the EIO, but name the section Public Information). Regardless of what it’s called, or whether or not there is a JIC, I don’t see a reason to move this function to the General Staff.
  • Aside from the notes above, they offer three organizational models for EOCs, similar to those identified in NIMS.
  • More than once, the document tasks the Operations Section only with managing current operations with no mention of their key role in the planning process to develop tactics for the next operational period.
  • They suggest other functions being included in the organization, such as Social Services, COOP, Intelligence, Investigations, and Scientific/Technical. It’s an interesting call-out, though they don’t specify how these functions would be included. I note this because they refer to Operations, Planning, Logistics, and Finance/Admin as functions (which is fine), but also calling these activities ‘functions’ leads me to think they intend for new sections to be created for them. While NIMS has evolved to make allowances for some flexibility in the organization of Intel and Investigations, something like Social Services (for victims) is clearly a function of Operations. And while I appreciate their mention of COOP, COOP is generally a very department-centric function. While a continuity plan could certainly be activated while the broader impacts of the incident are being managed, COOP is really a separate line of effort, which should certainly be coordinated with the incident management structure, but I’m not sure it should be part of it – though I’m open to discussion on this one.
  • I GREATLY appreciate their suggestion of EOC personnel being involved in planning meetings of incident responders at the ICP. This is a practice that can pay significant dividends. What’s interesting is that this is a level of detail the document goes into here, yet it is very vague or lacking detail in other areas.

The document has considerable content using different terminology in regard to incidents and incident complexity. First off, they introduce a classification of incidents, using the following terminology:

  • Small
  • Large
  • Major
  • Local, Provincial, and National Emergencies

Among these, Major incidents and Local/Provincial/National Emergencies can be classified as ‘Complex Incidents’. What’s a complex incident? They define it as an incident that involves many factors which cannot be easily analyzed or understood; they may be prolonged, large scale, and/or involve multiple jurisdictions. While I understand that perhaps they wanted to simplify the language associated with Incident Types, even with the very brief descriptions the document provides for each classification, these are very vague. Laying the term ‘complex incident’ over the top of this is considerably confusing.

**Edit – I realized that the differentiator between a small incident and a large incident is the number of responding organizations. They define a small incident as a single-organization response, and a large incident as a multi-agency response. So the ‘typical’ two-car motor vehicle accident that occurs in communities everywhere, requiring fire, EMS, law enforcement, and tow, is a LARGE INCIDENT????? Stop!
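
To spell out the rule as I read it, the classification hinges on a single variable. A quick sketch, reflecting my interpretation of their definitions rather than any official guidance:

```python
def classify_incident(responding_organizations: int) -> str:
    """Ontario IMS classification as I read it: a single-organization
    response is 'small'; any multi-agency response is 'large'."""
    return "small" if responding_organizations <= 1 else "large"

# The 'typical' two-car collision: fire, EMS, law enforcement, and a tow
# operator all respond -- four organizations.
print(classify_incident(4))  # -> large
```

Which is exactly the problem: routine calls land in the same bucket as genuinely demanding multi-agency responses.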

Another note on complex incidents… the document states that for complex incidents involving multiple response organizations, common objectives will usually be high-level, such as ‘save lives’ or ‘preserve property’, with each response organization developing its own objectives, strategies, and tactics. I can’t buy into this. Life safety and property preservation are priorities, not objectives. And allowing individual organizations to develop their own objectives, strategies, and tactics pretty much breaks the incident management organization and any unity of effort that could possibly exist. You are either part of the response organization or you are not.

Speaking of objectives, the document provides a list of ‘common response objectives’ such as ‘save lives’ and ‘treat the sick and injured’. These are not good objectives by any measure (in fact they can’t be measured) and should not be included in the document, as they only serve as very poor examples. A sound objective is specific and measurable – for example, ‘complete primary search of all affected structures within 12 hours’.

So in the end there was a lot in this document that is consistent with incident management practices, along with some good additions, some things that warrant further consideration, and some things which I strongly recommend against. There are certainly some things in here that I’d like to see recognized as best practices and adopted into NIMS. I recognize the bias I have coming from the NIMS world, and I tried to be fair in my assessment of Ontario’s model, examining it for what it is and on its own merit. Of course anyone who has been reading my posts for a while knows that I’m just as critical of NIMS and related documents out of the US, so please understand that my (hopefully) constructive comments are not intended to create an international incident. I’m a big fan of hockey and poutine – please don’t take those away from me!

I’m always interested in the perspectives of others. And certainly if you were part of the group that developed this document, I’d love to hear about some of your discussions and how you reached certain conclusions, as well as what you envision for the continued evolution for the Provincial IMS.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

A Few Thoughts on Emergency Planning

A conversation I find myself having fairly often is about people not using plans. It’s amazing that we invest so much time, money, and effort into building plans, only to never see them used, even when the opportunity presents itself. Why is this? I see four primary reasons:

1. People don’t know the plans exist. There is really no excuse for this one. I find it shameful and wasteful, especially if these people are identified as action agents within that plan. There is practically no point in even developing a plan if no one knows about it or their respective roles identified within it. Socialization of plans once they are developed is extremely important. A minimalist effort can be made by simply sending out the plan or a link to it, but I consider this inadequate, as many people will dismiss it, never get around to reviewing it, or not understand what they are reading. Structured briefings are the best way to initially familiarize people with the plans and their roles. It helps to have refresher training, as well as to ensure that new hires are similarly trained. This can even be done as a recorded presentation or webinar, though providing a contact for questions is important. Along with socializing, remember the importance of exercises, not only to validate plans but also to help people become more familiar with plans and their respective roles by taking a scenario-driven dive into the content. Does everyone in your organization or jurisdiction who has a role in a plan know about it?

2. People don’t remember the plans exist. This one is a bit more forgivable, especially for newer plans, rarely implemented plans, or for personnel who are used to “doing things the way they’ve always been done”. Still, I find these excuses to be weak at best. People’s inability to remember the plans, even granting them the distraction of the incident itself, means that the plans haven’t been socialized and reinforced enough (see item 1 above).

3. People don’t care if the plans exist. This one has been underscored considerably over the past year related to pandemic plans, point of distribution (POD) plans, and other related plans. We’ve seen many senior leaders and elected officials be completely dismissive of established plans, choosing instead to “do it their way” in an effort to exert greater control or to ensure that their name is front and center. Since this one involves a lot of ego, particularly of senior leaders and elected officials, it can be difficult to work around. That said, this underscores the importance of ensuring that elected officials and newly appointed senior leaders are adequately briefed on the existing plans when they take office, and given confidence in the plans and the people identified to implement them, as well as the important roles of elected and appointed officials.

4. People think the plans are faulty. This is likely the more well-intentioned version of #3, where people are intentionally not using the plan because they feel (maybe rightly, maybe not) the plan is inadequate and that “winging it” is the better option. Part of this lack of confidence may be unfamiliarity with the plans and/or a lack of validation (see item 1 above re socialization and exercises). This could be a difference of opinion or even something intentionally obstructionist. Along with socialization and exercises, I’ll also add the value of including key people in the planning process. This gives them a voice at the table and allows their input to be heard and considered for development of the plan. While you can’t include everyone in the planning process, consider that the people you do choose to involve can serve as representatives or proxies for others, especially if they are well respected, giving less reason for others to push back.

A separate but somewhat related topic (mostly to #4 above) is people often being dismissive of, or lacking confidence in, plans by invoking the saying “No plan survives first contact with the enemy”. This saying is credited to nineteenth-century Prussian military commander Helmuth von Moltke. We see this saying tossed around quite a bit in various circles, including emergency management. While I understand and respect the intent of the phrase, I don’t think it necessarily holds true. I’ve seen great plans fail and mediocre plans be reasonably successful. Why? Circumstances dictate a lot of it. Implementation as well (this is the human factor). What we need to understand is that plans provide a starting point and hopefully some relevant guidance along the way. If a plan is overly detailed and rigid, it is more likely to fail. So should our plans not be detailed? No, we should put as much detail as possible into our plans, as this will help guide us in the midst of the incident, especially if certain activities are highly technical or process-oriented; but we also need to allow for flexibility. Consider a plan to be a highway. Highways have exits which take us off to different places, but they also have on-ramps to help us return. A deviation from a plan does not mean we throw the plan away, as we can always get back onto the plan if it’s appropriate. It’s also smart to build options into our plans, where possible, to help minimize deviations.

How we develop plans is strongly related to step 2 of CPG 101, which is “Understand the Situation”. Without an understanding of the situation, we can’t account for the various factors involved and may not account for the circumstances for which we must develop contingencies or options. And while this assessment is part of the planning process, as well as training, exercises, and other facets of preparedness, I feel that a holistic assessment also has value. I’ve written a lot about the POETE preparedness elements and have begun advocating for APOETE, with the A standing for Assessment. This assessment is broad-based to help guide our overall preparedness activity but is not a replacement for the element-specific assessments.

My last thought is about pandemic and POD plans. I’m curious about who has used their plans during this pandemic, and if not, why not? Of course many of the assumptions we used for pandemic planning weren’t realized in this pandemic. Does this mean our pandemic plans were faulty? Not entirely. Surely many content areas were still useful, and even though some of the assumptions we had didn’t apply to this pandemic, they may still hold true for future public health emergencies. We’ve also learned a lot about our response that needs to be considered for plan updates, and we need to weigh how much of the reality of political blundering we should account for in our plans. In the end, what I caution against is developing a pandemic plan that centers on the COVID-19 pandemic. Preparing for the last disaster doesn’t necessarily prepare us for the next one.

Those are some of my thoughts for the morning. As always, I welcome your thoughts and feedback.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

What Actually is Emergency Management?

Many people have a concept of what emergency management is, typically shaped by their own experiences or aspirations, but it is so much more. I think part of the seeming identity crisis emergency management suffers, as well as the issue expressed by some that emergency management isn’t a recognized profession, stems from the fact that so much of emergency management is actually a unified effort of an amalgamation of other professions. So let’s consider what emergency management actually is. The list below is not exhaustive and is largely formed around common professions, major activities, and areas of academic study.

  • Grants management
  • Accounting
  • Procurement
  • Logistics
  • Equipment Maintenance
  • GIS
  • Information Technology
  • Planning
  • Document Development and Publishing
  • Marketing
  • Communications
  • Public and Media Relations
  • Community Outreach
  • Volunteer Management
  • Instructional Design and Delivery
  • Data Analysis
  • Engineering
  • Project Management
  • Policy and Political Science
  • Business/Public Administration
  • Organizational Management and Leadership
  • Consulting and SME
  • Academics and Research
  • Physical Sciences (Geology, Meteorology, Earth Science, etc.)
  • Social Sciences (Sociology, Anthropology, etc.)

These are all distinct functions and major activities/professions I’ve seen in emergency managers and emergency management agencies. Many emergency managers do a lot of these, while some focus on a few or even just one. Some of these activities may be outsourced to other agencies or to the private sector. Yet any of the items on the list taken out of the context of emergency management are then no longer (at least directly) emergency management. This may be a permanent state for someone holding one of these positions, or perhaps they are brought into the realm of emergency management on more of an ad-hoc or temporary basis. On the other hand, the application of these activities within emergency management often requires them to have knowledge of the areas of emergency management in which they are being applied.

Defining what emergency management is and does without the context of these other professions/activities is difficult. There is a big part of emergency management that is less defined and tangible, filling in the gaps and connective tissue between and among all of these; harnessing and focusing the collective capabilities toward distinct efforts across preparedness and the five mission areas, by way of a highly complex effort which we encapsulate with one simple word – coordination. So oddly enough, emergency management is all of these, yet it is also something else.

I think the recognition of this will go a long way for us, helping to progress the profession while also making us less rigid in our attempts to pigeonhole what an emergency manager is.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Building Local Incident Management Capability

Just over a year ago I wrote An Alternate Concept of Incident Management Support, identifying the gap that exists in most local communities and regions for improved incident management capability. While I still think that formal Incident Management Teams (IMTs) are the gold standard, not every community or even multi-county region can support a formal IMT, which requires dozens of personnel and rigorous qualification and maintenance. Over the past year, we’ve seen a lot of use of IMTs across the nation, supporting the COVID-19 response and myriad other incidents. Sitting in on the All Hazard Incident Management Team Association (AHIMTA) virtual symposium over the last few days, I’ve seen a lot of exciting evolution happening with IMTs as they continue to enhance their capabilities. And while this is great, I feel we are leaving a lot of areas behind. This isn’t on the AHIMTA, but rather on the emergency management community as a whole. That said, there are certainly some intersections, as a lot of the training available to IMT personnel may need to be made more accessible to those who would be part of the Incident Support Quick Response Teams (ISQRTs), as I came to call them in last year’s article, addressing a fundamental need I’ve been espousing for a long time.

As I’ve soaked in a lot of great information from the AHIMTA symposium about IMTs, the need to build local capability in the absence of IMTs is even more apparent. Some may argue that IMTs are available to deploy to any area if requested. Possibly. Obviously there are a lot of conditions… what are other teams dealing with? What’s the relative priority of the requesting area? EMAC is certainly an option, but States need to approve the local request if they are to put the request into the EMAC system. The state may not agree with the need, may not want to spend the funds for an incoming team for an incident that may not receive a federal declaration, or it may not be practical to wait a couple of days to get an IMT on the ground when the response phase of the incident may be resolved or near resolved by then.   

Fundamentally, every area should have its own organic incident management capability. As mentioned, most areas simply can’t support or sustain the rigors of a formal IMT, but they can support a short roster of people who are interested, able, and capable. This is a situation where a little help can go a long way in making a difference in a local response to a more complex Type 4 incident or the onset of a Type 3 incident – or simply to do what they can for a larger incident where additional help simply isn’t available. I mentioned in last year’s article that the focus should really be on incident planning support, with an Incident Management Advisor to support the IC and local elected officials, an Incident Planning Specialist to help the local response organization harness the Planning Process, a Planning Assistant to support the detailed activities involved in a Planning Section such as situational awareness and resource tracking, and an Operations and Logistics Planner to support local responders who may have great tactical knowledge, but not much experience in operational planning, much less in forecasting the associated logistical needs. Largely these are all advisors who are likely to integrate into the incident management organization, so we aren’t creating new ICS positions, though I still encourage some deeper and deliberate application of incident management advisors.

My big thought today is: how do we make something like this happen? First, I think we need to sell FEMA, State IMT programs, and/or State Training Officers on the concept. That starts with recognizing and agreeing on the gap that exists, and that we must support the organic incident management capability of local jurisdictions with fewer resources through something that is more than the ICS courses but less than what is required for an IMT. Part of this is also the recognition that these ISQRTs are not IMTs and are not intended to be IMTs, but fill an important role in addressing this gap. This will go a long way toward getting the concept past ICS and IMT purists who might feel threatened by it or for some reason disagree with the premise.

Next is establishing standards: first general expectations of activity for each of these roles, then prerequisites for credentialing, then training support. The existing position-specific training is likely not fully appropriate for these positions, but a lot can be drawn from the existing courses, especially those for Incident Commander and the Planning Section positions; there are also some valuable pieces of information that would come from Operations Section and Logistics Section courses. I’d suggest that we work toward a curriculum to address these specific ISQRT roles. There are then some administrative details to be developed in terms of local formation, protocols for notification and activation, etc. State recognition is important, but perhaps approval isn’t necessarily needed, though coordination and support from States may be critical to the success of ISQRTs, again considering that these are most likely to be serving areas with fewer resources. ISQRTs will also need to work with local emergency managers and local responders to gain support, to be included in local preparedness activities, and to be called upon when they should be. A lot of traction can be gained from things such as local/county/regional/state meetings of fire chiefs and police chiefs.

Do you agree with the gap that exists for most communities? What do you think we need to get the ball rolling on such a concept?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

You’ve Been Trained, Now What?

I just sat through a good webinar on incident response and management. The panel consisted of fire and law enforcement personnel. A law enforcement official was rather honest in saying that one of their identified deficiencies from an AAR was poor implementation of ICS. He said that while all police personnel had received ICS training back during the NIMS push of the mid-2000s, most officers had done little with it since. We see so many endless lists of training that people have taken on their CVs, resumes, LinkedIn, etc., but how much of that do they still know? Take an honest look at your own resume of training and I bet you will see some of the same.

In public safety we love to get training. A lot of the training is good. Some less so. Much of the training we take is awareness-level training, providing us with knowledge. It’s fairly easy to flex those muscles after the training by reading about, writing about, teaching, or doing other things with that information. Still, some of that acquired knowledge stagnates. Some of the training we take is more operations-based – it’s hands-on or procedural. Most certainly, without using the knowledge and skills acquired in operations-based training, those skills atrophy.

So what should we do to guard against the loss of this valuable acquired knowledge and these skills? Obviously application is the best means of preserving what we have learned. Even if you are using it, though, it’s good to stay on top of best practices, new practices, and updated training; not only as a means of staying current on the latest and greatest, but also to hedge against bad habits and to reinforce certain nuggets of the original training we might not regularly perform. Apply and practice skills, either on the job or in exercises. For things that are more knowledge-based, talk about it, read about it, write about it, or present on it. This repetition will keep the subject matter familiar, quicken your recall of facts, and increase your ability to analyze it. Writing can be in any form, up to and including developing or updating plans and procedures. A special shout-out goes to presentations and training (if you are qualified), though. Training and presentations often require the instructor/presenter to have a depth of knowledge beyond the learning domain of what they are teaching or presenting on. This is often required to answer questions, support implementation, and address the many what-ifs related to the subject matter.

I’d argue that your organization also has a role (and responsibility) in preserving this gained knowledge and skill. First, sharing the experience is important. Since not everyone in your organization can attend every training opportunity, it’s a best practice for those who receive training to tell others about their experience, what they learned, and the relevance they see to their work. Simpler subject matter can be shared in an email or printed handout, while more complex subject matter might be better conveyed through a presentation. Unless your training was received to help you support an existing plan or procedure, your organization should also support implementation of what you have learned, if appropriate. Keeping knowledge and skills fresh should also be endorsed through opportunities for refresher training and other related training which may expand the knowledge and skills or hone specific applications. Organizations should also identify what knowledge and skills they need and must maintain, ensure they identify staff who need the opportunities for training and development, and determine how to maintain what is learned.

With the personal and organizational costs of training, we reap the greatest benefit by maintaining or advancing the knowledge and proficiency gained. While the quest for knowledge is endless and admirable, and I’d generally never block an opportunity for someone to gain more, we should be assessing what the benefit is to the learner and to the organization. Part of that is determining what commitments the organization and the learner must make to preserve what is gained. I believe that employee development plans can be a big part of this, as they should be informed by what the employee needs to improve upon, what we want them to excel at, and what future roles we may have planned for them. These factors drive the goals and objectives of the employee development plan, which should in turn lead to the training opportunities that are ideal to support those goals and objectives. Even if your organization doesn’t do any formal employee development plans, you can develop one for yourself.

What’s your take on keeping current with what you’ve learned?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Experience Bias

I recently read an interesting piece in Psychology Today by Dr. Christopher Dwyer titled ‘How Experience Can Hinder Critical Thinking’. Do check it out. There is application to pretty much everything, but of course I tend to think of things in the context of emergency management.

The article starts with the age-old argument of education vs experience, but with a particular slant toward critical thinking. My personal take is that the education vs experience argument, in its totality, can’t have a blanket resolution. I think a lot of it is dependent on the topic at hand, and obviously it’s rarely a dichotomy; rather, a blending of education and experience is often best. In regard to education, certainly the actual education received holds value, but there are tasks intrinsic to academia which also hold value, perhaps even more than what was learned in the classroom, the rigors of research in an academic environment often being most valuable among them. With that, in many regards, we often see employment announcements with a range of degree majors, or simply a stated minimum of education regardless of major. This is in recognition of the intrinsic value of education. And while some professions absolutely require a specific degree, those which don’t can and should hold less rigidly to degree requirements.

While I certainly advocate a minimum extent of education for most positions, I’ve also worked with a considerable number of people with a high school diploma or associate’s degree who can intellectually run circles around those with advanced degrees, at least in certain applications of work and life. Experience is often indicative of exposure to certain situations, often with repetition. The comparing and contrasting of those experiences with what is being experienced in the moment is what supports the argument for the value of experience. It’s also why many advanced degree programs actually require some term of actual work experience before they will accept applicants. Consider academic programs such as criminal justice. Sure, there are a lot of philosophical topics that are taught, but any courses that speak to practical application should probably be taught by those with actual experience doing those things. Dr. Dwyer does give wise advice, though, stating that we shouldn’t confuse experience with expertise.

All that said, Dr. Dwyer’s article focuses on the application of critical thinking in this argument. He cites some insightful data and studies, but most interesting to me is his mention of experience being personalized. While several people may have ‘been there, done that, got the t-shirt’, they each may have experienced the event differently or left with different impressions, even if exposed to some of the same situations. We all bring a bias with us, and this bias is the lens through which we view the events of our lives. That bias is then influenced by our perception of each event, fundamentally snowballing and compounding with each additional experience. This shows how our experiences can bias our own critical thinking skills. Dr. Dwyer states that critical thinking stemming from someone with more education than experience is likely to be more objective and based on knowledge, which certainly makes sense. That said, individuals basing their critical thinking solely on education may miss the insight provided by experience, which can lend considerable context to the thought exercise.

I think the conclusion to be drawn in all this is that critical thinking, in most regards, is optimized by those with a blend of education and experience. It’s also extremely important for us to recognize our own limitations and biases when we approach a decision or other relevant situation. Specific to emergency management, we can leverage a lot from our experiences, but we also know that no two incidents are the same. Therefore, while our experiences can support us in a new event, they can also derail us if not applied thoughtfully and in recognition of our own biases.

This all comes around to my advocacy for emergency management broadly, and incident management in particular, being team sports. Even the first step of the CPG 101 planning process is to form a planning team. We each bring different approaches and perspectives. We also need to advocate for diversity in our teams, regardless of what tasks those teams are charged with. This should be diversity in the broadest sense – diversity of experience, diversity of discipline, diversity in education, diversity in gender, diversity in race, creed, culture, etc. The broader the better. We must do better at opening ourselves to the perspectives of others. We all have bias – every one of us. Some bias, obviously depending on the focus, is OK, but it is best for us to balance our individual bias with those of a diverse group. A diverse team approach will bring us better results time and again.

How does experience bias impact you?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

FEMA’s First Lessons Learned From COVID-19

FEMA recently released the Pandemic Response to Coronavirus Disease 2019 (COVID-19): Initial Assessment Report (January – September 2020). The report has many elements of a traditional after-action report. The authors reinforce that the report only evaluates FEMA’s response, not those of other agencies or entities. That said, emergency management is by nature collaborative, and FEMA’s interactions with other agencies and entities are cited as necessary. The report covers five primary areas of evaluation:

  1. Coordinating Structures and Policy
  2. Resources
  3. Supporting State, Local, Tribal, and Territorial (SLTT) Partners
  4. Preparedness and Information Analysis
  5. Organizational Resilience

Also, with similarity to a traditional after-action report, this report provides a table of key findings and recommendations as Appendix A.

Here are some of my primary observations:

Following the executive summary is the COVID-19 Pandemic Overview, a well-constructed piece providing a combined narrative timeline and topical highlights, offering information and context on the pandemic and the response, as well as some of the complexities encountered. While the report does well to acknowledge the myriad disasters that SLTT partners and federal agencies responded to over 2020, I find it shameful that it very obviously ignores the societal impacts of the US political climate (related to the pandemic and otherwise) as well as events surrounding the BLM movement. I firmly believe this report should fully acknowledge these factors, and it could have done so without itself making a political statement. These were important, impactful, and far-reaching, certainly influencing the operating environment, public information, and other very real facets of the response. I feel that the exclusion of these factors leaves this report incomplete.

Relative to the Coordinating Structures and Policy section, FEMA reinforces many, many times that they were unexpectedly put into a leadership position for this disaster, which perhaps led to some coordination problems. I feel FEMA should always be a lead or co-lead agency for the federal response to large disasters, regardless of the hazard. While a pandemic is certainly a public health hazard, FEMA has practiced experience in federal coordination for major disasters, mobilization of resources and logistical support, SLTT coordination, and overall incident management. The Unified Coordination Group is a sound application in situations where other federal agencies share significant authority. The kinks should be worked out of this, with the National Response Framework updated to reflect such.

Also mentioned within this section is the creation of a White House Task Force which was intended to make executive decisions of the highest level. This is not unprecedented and should certainly be expected for other large-scale disasters in the future. I feel, however, that removing the FEMA Administrator from having a direct line of communication with the White House during ‘peace time’ has significant impact on FEMA leadership’s ability to integrate. Positioning FEMA subordinate to the Secretary of Homeland Security is akin to putting a police officer in charge of a pool and keeping the lifeguard in the breakroom. Sure, the police officer can do a lot, but there are specific skills needed which necessitate that the lifeguard has a constant presence at the pool rather than only being called in when something gets bad enough. 

FEMA makes a point about inheriting eight task forces created by HHS which then needed to be integrated into the NRCC organization. These task forces had some overlap with the existing NRCC and ESF structure, resulting in duplications of effort and coordination problems. While FEMA says they were able to overcome this over time, it is obviously something that, given the National Response Framework, should not have happened in the first place. FEMA’s recommendations associated with this matter do not once cite the National Response Framework and instead point the finger at NIMS/ICS use, fully ignoring that the foundation of preparedness is planning. Either HHS made these task forces up on the fly or had a plan in place that accounted for their creation. Either way, it’s the National Response Framework that was ignored. NIMS/ICS helps support plan implementation.

The next section on resource management demonstrates that FEMA learned a lot about some intricacies of resource management they may not have previously encountered. With the full mobilization of resources across the nation for the pandemic, along with targeted mobilizations for other disasters, the system was considerably stressed. FEMA adapted their systems and processes, and in some cases developed new methodologies to address resource management needs. One key finding identified was a need to better integrate private sector partners, which isn’t surprising. I think we often take for granted the resources and systems needed to properly coordinate with the private sector on a large scale during a disaster. One of the largest disasters within this disaster was that of failed supply chains. Granted, the need was unprecedented, but we certainly need to bolster our preparedness in this area.

To help address supply chain issues, novel solutions such as Project Airbridge and specific applications of the Defense Production Act were used. The best practices from these strategies must be memorialized in the form of a national plan for massive resource mobilizations.

SLTT support for the time period of the report was largely successful, which isn’t a surprise since it’s fundamentally what FEMA does as the main coordination point between SLTT partners and federal agencies. Significant mobilizations of direct federal support to SLTT partners took place. The pandemic has provided the best proof of concept of the FEMA Integration Teams (FIT) since their development in 2017. With established relationships with SLTT partners and knowledge of the needs of the federal system, they provided support, liaised, and were key to shared situational awareness. I appreciate that one of the recommendations in this section was development of a better concept of operations to address the roles and responsibilities of FIT and IMATs.

One item not directly addressed in this section was that in emergency management we have a great culture of sharing resources and people. Sharing was pretty limited in the pandemic since everyone was impacted and everyone needed resources. This caused an even greater demand on FEMA’s resources since SLTT partners largely weren’t able to support each other as they often do during disasters.

The section on preparedness and information analysis was interesting, especially on the information analysis side. The preparedness findings weren’t really much of a surprise, including the failure to anticipate supply chain issues or SLTT needs. What this boils down to is a lack of effective plans for nationwide disasters. On the information side, the key findings really boil down to improved definition of data sets and essential elements of information relative to specific needs, audiences, functions, capabilities, and lines of effort. It appears a lot was learned not only about the information needed, but also about how to best utilize that information. Analytics makes data meaningful and supports better situational awareness and a common operating picture.

The last section on FEMA’s organizational resilience is a good look at some of the inner workings and needs of FEMA as an agency and how they endured the pandemic and the varied demands on the agency. FEMA has always had a great culture of most employees having a disaster job which they are prepared to move into upon notice. They learned about some of the implications associated with this disaster, such as issues with engaging such a large portion of their employees in long-term deployments, public health protection, and mental health matters.

Ultimately, despite my disagreement with a couple of recommendations and the omission of some very important factors, the report is honest and, if the corrective actions are implemented, will support a stronger FEMA in the future. I’m hopeful we see a lot of these AAR types of documents across federal agencies, state agencies, local governments, the private sector, etc. EVERYONE learned from this pandemic, and continues to learn. That said, while the efforts of individual entities hold a lot of value, there also needs to be a broader, more collective examination of ‘our’ response to this disaster. This would be a monumental first task for a National Disaster Safety Board, would it not?

© 2021 Timothy Riecker, CEDP

The Contrarian Emergency Manager™

Emergency Preparedness Solutions, LLC®