Preparing for Disaster Deployments

I wrote last year about my trepidation over Community Emergency Response Teams (CERT) being considered as a deployable resource. The problem is that even most professionally trained emergency personnel aren’t prepared for deployment. We need to do better.

One of the key aspects of a disaster is that it overwhelms local resources. This often requires help from outside the impacted jurisdiction(s). Working outward from the center, like the bullseye of a dartboard, we are usually able to get near-immediate assistance from our neighbors (aka mutual aid), with additional assistance from those at greater distances. When I use the word ‘deployment’, I’m referring to the movement of resources from well outside the area and usually for a period of time of several days or longer.

The US and other places around the world have great mutual aid systems, many supported by laws and administrative procedures that identify how requests are made, delineate the liability of the requesting and fulfilling organizations, and more. Most of these are intended for response rather than deployment, but may have the flexibility to be applicable to deployment. Some, such as the Emergency Management Assistance Compact (EMAC), are specifically written for deployments. While all this is certainly important, most organizations haven’t spent the time to prepare their people for deployment, a need that many organizations seem to take for granted. Those resources which are, practically by definition, designed to deploy, such as Type 1 and 2 incident management teams (IMTs), often have at least some preparations in place and can be a good resource from which others can learn.

What goes into preparing for deployment? First, the sponsoring organization needs to recognize that its resources might be requested for deployment and agree to take part. That said, some organizations, such as volunteer fire departments, might have little control over their personnel deploying across the country when a call for help goes out publicly. These types of requests, in my opinion, can be harmful, as large numbers of well-intentioned people may abandon their home organization, leaving it without even basic response resources – but this is really a topic to be explored separately.

Once an organization has made a commitment to consider future requests, leadership needs to develop a policy and procedure on how they will review and approve requests. Will requests only be accepted from certain organizations? What are the acceptable parameters of a request for consideration? What are the thresholds for resources which must be kept at home? 

Supporting much of this decision making is the typing of resources. In the US, this is often done in accordance with defined typing from FEMA. Resource typing, fundamentally, helps us to identify the capabilities, qualifications, and eligibility of our resources. This is good not only for your own internal tracking, but is vitally important to most deployment requests. Organizations should do the work now to type their resources and personnel.

If an organization’s leadership decides they are willing to support a request, there then needs to be a canvass and determination of interest to deploy personnel. This is yet another procedure, and the one that has most of my focus in this article. Personnel must be advised of exactly what they are getting into and what is expected of them (each resource request received should give information specific to the deployment, such as deployment duration, lodging conditions, and duties). The organization may also determine a need to deny someone the ability to deploy based on critical need within the home organization or other reasons, and having a policy already established for this makes the decision easier to communicate and defend.

These organization-level policies and procedures, along with staff-level training and policies, should be developed to support personnel in their decision and their readiness for an effective deployment.

Among the things that should be determined and addressed:

  • Matters of pay, expenses, and insurance
  • Liability of personal actions
  • Code of conduct
  • What personnel are expected to provide vs what the organization will provide (equipment, supplies, uniform, etc.)
  • Physical fitness requirements and inoculations
  • Accountability to the home organization

Personnel also need to be prepared to work in austere conditions. They may not have a hotel room; instead they could be sleeping on a cot, a floor, or in a tent. This alone can break certain people, physically and psychologically. Access to showers and even restrooms might be limited. Days will be long, the times of day they work may not be what they are used to, and they will be away from home. They must be ready, willing, and able to be away from their lives – their families, pets, homes, jobs, routines, and comforts – for the duration of the deployment. Their deployment activity can subject them to physical and psychological stresses they must be prepared for. These are all things that personnel must take into consideration if they choose to be on a deployment roster.

This is stuff not taught in police academies, fire academies, or nursing schools. FEMA, the Red Cross, and other organizations have policy, procedures, training, and other resources available for their personnel because this is part of their mission and they make these deployments regularly. The big problem comes from personnel with organizations which don’t do this as part of their core mission. People who are well intentioned, even highly trained and skilled in what they do, but simply aren’t prepared for the terms and conditions of deployment can become a liability to the response and to themselves.

Of course, organizational policy and procedure continues from here in regard to their methods for actually approving, briefing, and deploying personnel; accounting for them during the deployment; and processing their return home. The conditions of their deployment may necessitate follow up physical and mental health evaluations (and care, as needed) upon their return. They should also be prepared to formally present lessons learned to the organization’s leadership and their peers.

I’ll say that any organization interested in the potential of deploying personnel during a disaster is responsible for making these preparations, but a broader standard can go a long way in this effort. I’d suggest that guidance should be established at the state level, by state emergency management agencies and their peers, such as state fire administrators and state departments of health, transportation, criminal justice, and others. These state agencies often contribute to, and are even signatories of, state-wide mutual aid plans which apply to the constituents of their areas of practice. Guidance developed at the state level should also dovetail into EMAC, as it’s states that are signatories to these agreements, and they often rely on the resources of local organizations when requests are received.

There is clearly a lot to consider for organizations and individuals in regard to disaster deployments. It’s something often taken for granted, with the assumption that any responder can be sent to a location hundreds of miles away and be fully prepared to live and function in that environment. We can do better and we owe our people better.

Has your organization developed policies, procedures, and training for deploying personnel?

© 2021 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

EM Engagement with the Public

Emergency management is notoriously bad at marketing. People have a much better idea of what most other government agencies do, or at least that they exist. Establishing awareness and understanding of emergency management, not only among the people you serve but also among those you work with, can go a long way toward meeting your goals.

As with any message, everything is about the audience. Emergency management has a variety of audiences. While we have some programs and campaigns oriented toward individuals, much of our work is with organizations, including non-profits, other government agencies, and the private sector. All in all, most emergency managers are pretty good at interfacing and coordinating with organizations. It’s the public that we still struggle with. Emergency management inherited the burden of individual and family preparedness from the days of civil defense. Things were different then. Civil defense focused on a single, persistent threat, and the calls to action were tangible and even practiced with the public in many communities.

And yes, I said that our present engagement with the public is a burden. Can it make a difference? Sure. Does it make a difference? Sometimes. While some can argue that any measurable difference we make is good, we all need to acknowledge that campaigns and programs for the public are often a huge source of frustration for emergency managers across the nation and elsewhere. We feel compelled to do it, but so often we can’t make that connection. While I think it is a worthwhile mission and there are successes, the usual rhetoric is stale (i.e., make a plan, build a kit, be informed, get involved) and our return on investment is extremely low.

We need to do more than handing out flyers at the county fair. Some communities have been able to find success through partner agencies or organizations that actually do work with the public on a regular basis, which I think is a better formula for success. These agencies and organizations already have an in with a certain portion of the population. They have an established presence, rapport, and reputation. Given that agencies and organizations have different audiences, it is best to engage more than one to ensure the best coverage throughout the community.

As mentioned, our usual rhetoric also needs to change. With continued flooding here in the northeast US, I saw a message from a local meteorologist on Twitter recently giving some information on the flooding and saying to ‘make a plan’. Fundamentally that’s good. Unfortunately, this message is pretty consistent with what we put out most of the time in emergency management. Yes, it’s a call to action, but incredibly non-specific. Should I plan to stay home? Should I plan to evacuate? Should I plan to get a three-week supply of bread and milk? I’ll grant that Twitter isn’t really the best platform for giving a lot of detail, but I think we can at least tell the public what to make a plan for and provide a reference to additional information.

Should EM disengage with the public at large? No, absolutely not. But we do need to find better ways to engage, and I think that really requires a keen eye toward marketing: analyzing our audiences to determine what kinds of messaging will work best, how to reach them, and what is important to them. Two messages a year about preparedness don’t cut it. Neither does a bunch of messages giving the FEMA hotline after a disaster. It needs to be consistent. It needs to be fun. It needs to be engaging. It should be multimodal – social media, speaking at local meetings, articles in the town newsletter, etc. Don’t be boring, don’t be technical, don’t be doom and gloom. Make it clear, make it interesting (to them… not you), and make it brief. Essentially, don’t be so ‘government’ about it. (The same applies to any corporate emergency management program.)

I’ll also add that having a presence with the public in your community is, in a practical sense, a presence with voters. While emergency managers often talk about the need for emergency management to be politically neutral, there are a lot of interests that align with emergency management that are clearly partisan, giving cause for us to be political. For context (because ‘politics’ has become such a bad word) I’m not talking about campaigning for someone, attending a rally, or spewing political rhetoric; but rather being engaged in political processes, of which a huge part is having a regular and strong presence. Even with partisan issues aside, emergency management requires funding and other resources to be effective, and that often requires an extent of political engagement and support. We need to actively and regularly promote what we do and what we accomplish. No, it’s not usually as sexy as putting out a big fire or building a bridge, but most fire and highway departments don’t miss an opportunity to get that stuff in the news. That’s why people know them.

Given the fairly universal benefits to emergency managers everywhere, I’d love to see FEMA engage with a marketing firm to produce a broad range of reusable content. TV and radio spots; website and social media graphics; customizable newsletter articles and handouts; speaking points for meetings (no PowerPoint necessary, please), interviews, and podcasts; etc. This also can’t be done every 10 or 15 years. It’s something that should be refreshed every two years to stay relevant, fresh, and meaningful, and with the input of actual emergency managers and public information officers. Speaking of PIOs, if you think your only work with emergency management is during a disaster, think again. PIOs, even if not within EM, should absolutely be engaged in these efforts.

FEMA has produced some material in the past, as have some states for use by local governments, but we need more and we can’t hold our breath for this to be done. Emergency management is, however, a great community of practice. If you have a successful practice or message, please share it! Bring it to your networks or even provide information in a comment to this post.


When to AAR

A discussion with colleagues last week, both on and off social media, on the development of after-action reports (AARs) for the COVID-19 pandemic identified some thoughtful perspectives. To contextualize, the pandemic is arguably the longest and largest response the world has ever faced. Certainly, no one disputes the necessity for organizations to develop AARs, as there has been an abundance of lessons learned that transcend all sectors. It’s thankfully not often we are faced with such a long incident, but in these circumstances we need to reconsider our traditional ways of doing things, which has generally been to develop an AAR at the conclusion of the incident.

One central aspect of the discussions was the timing of the AARs. When should we develop an AAR for an incident? I certainly think that with most incidents we can safely AAR when the incident is complete, particularly given that most incidents don’t last as long as the pandemic has. The difficulty with the pandemic, relative to AARs, is time. The more time goes on, the more we focus on recent concerns and the less we remember of the earlier parts of the response. This likely remains within tolerable limits for an incident that lasts several weeks or even up to a few months, but eventually we need to recognize that the longer we go without conducting the after-action process, the more value we lose. Yes, we can recreate a lot through documentation, but human inputs are critical to the AAR process, and time severely erodes those. Given this, I suggest the ideal practice in prolonged incidents is to develop interim AARs to ensure that chunks of time are being captured.

Another aspect related to this is determining what measure we are using for the incident. The vast majority of AARs focus mostly on response, not recovery. This is an unfortunate symptom of the response-centric mentality that persists in emergency management. We obviously should be conducting AARs after the response phase, but we also need to remember to conduct them once the recovery phase is substantially complete. Given that recovery often lasts much longer than response, we certainly shouldn’t wait until recovery is complete to develop a single AAR for the incident. Rather, we should be developing an AAR, at a minimum, at the substantial completion of response and another at the substantial completion of recovery.

Yet another complication in this discussion is that the timing is going to be different for different organizations. I presently have some clients for which the pandemic is much less of a concern operationally than it was a year ago, especially with a vaccinated workforce. So much less of a concern, in fact, that they have largely resumed normal operations, though obviously with the continuation of some precautionary measures. Other organizations, however, are still in a full-blown response, while still others are somewhere in the middle. This means that as we go through time, the pandemic will largely be over for certain organizations and jurisdictions around the world, while others are still consumed by the incident. While the WHO will give the official declaration of the conclusion of the pandemic, it will be over much sooner for a lot of organizations. Organizations should certainly be developing AARs when they feel the incident has substantially ended for them, even though the WHO may not have declared the pandemic to have concluded.

Consider that the main difference between evaluating an exercise and evaluating an incident is that we begin the exercise with the goal of evaluation. As such, evaluation activities are planned and integrated into the exercise, with performance standards identified and staff dedicated to evaluation. While we evaluate our operations for effectiveness during a response and into recovery, we are generally adjusting in real time to this feedback rather than capturing the strengths and opportunities for improvement. Be it during the incident or after, we need to deliberately foster the AAR process to not only capture what was done, but to help chart a path to a more successful future. I’ve been preaching about the value of incident evaluation for several years, and have been thankful to see that FEMA had developed a task book for such.

Given the complexity and duration of the pandemic, I started encouraging organizations to develop interim AARs more than a year ago, and in fact supported a client in developing their initial response AAR just about a year ago. FEMA smartly assembled an ‘Initial Assessment Report’ of their early response activity through September of 2020, though unfortunately I’ve not seen anything since. There was a question about naming that came up in the discussions I had, suggesting that the term ‘AAR’ should be reserved for after the incident and a different term used for any other reports. I partially agree. I think we should still call it what it is: even if it’s done in the midst of an incident, it is still an after-action report – an analysis of actions we’ve taken within a defined period of time. After all, it’s not called an ‘after-incident report’. That said, I do think that any AARs developed during the incident warrant some clarification, which can be the inclusion of a descriptor such as ‘interim’ or ‘phase 1, 2, 3, etc.’, or whatever is most suitable. I don’t think we need anything standardized so long as it’s fairly self-explanatory.

Have you already conducted an AAR for the pandemic? Do you expect to do another?


Metrics and Data Analytics in Emergency Management

I’ve lately seen some bad takes on data analytics in emergency management. For those not completely familiar, data analytics is a broad-based term applied to all manner of data organization, manipulation, and modeling to bring out the most valuable perspectives, insights, and conclusions which can better inform decision-making. Obviously, this can be something quite useful within emergency management.

Before we can even jump into the analysis of data, however, we need to identify the metrics we need. This is driven by decision-making, as stated above, but also by operational need, measurement of progress, and reporting to various audiences – from our own common operating picture, to elected officials, to the public. In identifying what we are measuring, we should regularly assess who the audience is for that information and why the information is needed.

Once we’ve identified the metrics, we need to further explore the intended use and the audience, as that influences what types of analysis must be performed with the metrics and how the resultant information will be displayed and communicated.

I read an article recently from someone who made themselves out to be the savior of a state emergency operations center (EOC) by simply collecting some raw data and putting it into a spreadsheet. While this is the precursor of pretty much all data analysis, I’d argue that the simple identification and listing of raw data is not analytics. It’s what I’ve come to call ‘superficial’ data, or what someone on Twitter recently remarked to me as ‘vanity metrics’. Examples: number of people sheltered, number of customers with utility outages, number of people trained, number of plans developed.

We see a lot of these kinds of data in FEMA’s annual National Preparedness Report and the Emergency Management Performance Grant (EMPG) ‘Return on Investment’ report generated by IAEM and NEMA. These reports provide figures on dollars spent on certain activities, assign numerical values to priorities, and state how much of a certain activity was accomplished within a time period (i.e. x number of exercises were conducted over the past year). While there is a place for this data, I’m always left asking ‘so what?’ after seeing these reports. What does that data actually mean? They simply provide a snapshot in time of mostly raw data, which isn’t very analytical or insightful. It’s certainly not something I’d use for decision-making. Both of these reports are released annually, giving no excuse to not provide some trends and comparative analysis over time, much less geography. Though even in the snapshot-of-time type of report, there can be a lot more analysis conducted that simply isn’t done.

The information we report should provide us with some kind of insight beyond the raw data. Remember the definition I provided in the first paragraph… it should support decision-making. This can be for the public, the operational level, or the executive level. Yes, there are some who simply want ‘information’ and that has its place, especially where political influence is concerned.

There are several types of data analytics, each suitable for examining certain types of data. What we use can also depend on our data being categorical (i.e. we can organize our data into topical ‘buckets’) or quantitative. Some data sets can be both categorical and quantitative. Some analysis examines a single set of data, while other types support comparative analysis between multiple sets of data. Data analytics can be as simple as common statistical analysis, such as range, mean, median, mode, and standard deviation; while more complex data analysis may use multiple steps and various formulas to identify things like patterns and correlation. Data visualization is then how we display and communicate that information, through charts, graphs, geographic information systems (GIS), or even infographics. Data visualization can be as important as the analysis itself, as this is how you are conveying what you have found.
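To make the simple end of that spectrum concrete, here is a minimal sketch of the common statistical measures mentioned above, applied to a hypothetical week of shelter occupancy counts (the numbers are invented purely for illustration):

```python
import statistics

# Hypothetical daily shelter occupancy counts for one week (illustrative only)
occupancy = [112, 98, 134, 121, 98, 87, 140]

data_range = max(occupancy) - min(occupancy)  # spread between busiest and quietest day
mean = statistics.mean(occupancy)             # average occupancy
median = statistics.median(occupancy)         # middle value when sorted
mode = statistics.mode(occupancy)             # most frequent value
stdev = statistics.stdev(occupancy)           # sample standard deviation

print(f"range={data_range}, mean={mean:.1f}, median={median}, "
      f"mode={mode}, stdev={stdev:.1f}")
```

Even this much moves beyond a raw list of numbers: the standard deviation, for instance, tells you whether occupancy is stable or swinging widely from day to day, which is a different operational picture than the mean alone provides.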

Metrics and analytics can and should be used in all phases of emergency management. This is also something best planned in advance, which establishes consistency and enables you to engage in the activity efficiently. Your considerations for metrics to track and analyze, depending on the situation, may include:

  • Changes over time
    • Use of trend lines and moving averages may also be useful here
  • Cost, resources committed, resources expended, status of infrastructure, and measurable progress or effectiveness can all be important considerations
  • Demographics of data, which can be of populations or other distinctive features
  • Inclusion of capacities, such as with shelter data
  • Comparisons of multiple variables in examining influencing factors (i.e. loss of power influences the number of people in shelters)
    • Regression modeling, a more advanced application of analytics, can help identify what factors actually do have a correlation and what the impact of that relationship is.
  • Predictive analytics help us draw conclusions based on trends and/or historical data
    • This is a rabbit you can chase for a while, though you need to ensure your assumptions are correct. An example here: a hazard of a certain intensity occurring in a certain location can be expected to produce certain impacts (which is much of what we do in hazard mitigation planning). But carry that further. Based on those impacts, we can estimate the capabilities and capacities that are needed to respond and protect the population, and the logistics needed to support those capabilities.
  • Consider that practically any data that is location-bound can and should be supported with GIS. It’s an incredible tool for not only visualization but analysis as well.
  • Data analytics in AARs can also be very insightful.
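As a sketch of the comparative and predictive items above, here is a minimal least-squares regression on invented data, relating customers without power to shelter occupancy (both the figures and the relationship are hypothetical, chosen only to illustrate the technique):

```python
# Hypothetical paired observations (invented for illustration): customers
# without power (in thousands) vs. shelter occupancy over six reporting periods
outages = [5, 12, 20, 35, 50, 80]
sheltered = [40, 95, 150, 260, 380, 610]

n = len(outages)
mean_x = sum(outages) / n
mean_y = sum(sheltered) / n

# Least-squares fit: sheltered is approximated by slope * outages + intercept
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(outages, sheltered))
sxx = sum((x - mean_x) ** 2 for x in outages)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Predictive use: estimate shelter demand if a forecast storm leaves
# 60,000 customers without power
predicted = slope * 60 + intercept
print(f"slope={slope:.2f}, intercept={intercept:.1f}, "
      f"predicted occupancy at 60k outages={predicted:.0f}")
```

The slope is the kind of insight a raw spreadsheet doesn’t give you: roughly how many additional shelter spaces each additional thousand outages implies, which can feed directly into logistics planning. With real data you would also check the strength of the correlation before trusting any prediction.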

As I mentioned, preparing for data analysis is important, especially in response. Every plan should identify the critical metrics to be tracked. While many are intuitive, there is a trove of Essential Elements of Information (EEI) provided in FEMA’s Community Lifelines toolkit. How you will analyze the metrics will be driven by what information you ultimately are seeking to report. What should always go along with data analytics is some kind of narrative not only explaining and contextualizing what is being shown, but also making some inference from it (i.e. what does it mean, especially to the intended audience).

I’m not expecting that everyone can do these types of analysis. I completed a college certificate program in data analytics last year, and it’s still challenging to determine the best types of analysis to use for what I want to accomplish, as well as the various formulas associated with things like regression models. Excel has a lot of built-in functionality for data analytics, and there are plenty of templates and tutorials available online. It may be useful for select EOC staff, as well as certain steady-state staff, to get some training in analytics. Overall, think of the variables which can be measured: people, cost, status of infrastructure, resources… And think about what you want to see from that data now, historically, and predicted into the future. What relationships might different variables have that can make the data even more meaningful? What do we need to know to better support decisions?

Analytics can be complex. It will take deliberate effort to identify needs, establish standards, and be prepared to conduct the analytics when needed.

How have you used data analytics in emergency management? What do you report? What decisions do your analytics support? What audiences receive that information and what can they do with it?


An Update of Ontario’s Incident Management System

Just yesterday, the Canadian province of Ontario released an update of its Incident Management System (IMS) document. I gave it a read and have some observations, which I’ve provided below. I will say that it is frustrating that there is no Canadian national model for incident management; rather, the provinces determine their own. A number of my friends and colleagues across Canada have long expressed this frustration as well. That said, this document warrants an examination.

The document cites the Elliot Lake Inquiry from 2014 as a prompt for several of the changes in their system from the previous iteration of their IMS document. One statement from the Inquiry recommended changes to ‘put in place strategies that will increase the acceptance and actual use of the Incident Management System – including simplifying language’. Oddly enough, this document doesn’t seem to overtly identify any strategies to increase acceptance or use; in fact there is scant mention of preparedness activities to support the IMS or incident management as a whole. I think they missed the mark with this, but I will say the recommendation from the Inquiry absolutely falls in line with what we see in the US regarding acceptance and use.

The authors reinforce that ICS is part of their IMS (similar to ICS being a component of NIMS) and that their ICS model is compatible with ICS Canada and the US NIMS. I’ll note that there are some differences (many of which are identified below) that impact that compatibility, though they don’t outright break it. They also indicate that this document isn’t complete and that they have already identified future additions, including site-specific roles and responsibilities, EOC roles and responsibilities, and guidance on resource management. In regard to the roles and responsibilities, there is virtually no content in this document on the organization below the Section Chief level, other than general descriptions of priority activity. I’m not sure why they held off on including this information, especially since the ICS-specific info is reasonably universal.

I greatly appreciate some statements they make on the application of Unified Command, saying that it should only be used when single command cannot be established. They give some clarifying points within the document with some specific considerations, but make the statement that “Single command is generally the preferred form of incident management except in rare circumstances where unified command is more effective” and reinforce that regular assessment of Unified Command should be performed if implemented. It’s quite a refreshing perspective opposed to what we so often see in the US which practically espouses that Unified Command should be the go-to option. Unified Command is hard, folks. It adds a lot of complexity to incident management. While it can solve some problems, it can also create some.

There are several observations I have on ICS-related organizational matters:

  • They use the term EOC Director. Those who have been reading my stuff for a while know that I’m really averse to this term as facilities have managers. They also suggest that the term EOC Command could be used (this might even be worse than EOC Director!).
  • While they generally stick with the term Incident Commander, they do address a nuance where Incident Manager might be appropriate (they use ‘manager’ here but not for EOCs??). While I’m not sure that I’m sold on the title, they suggest that incidents such as a public health emergency that is wide-reaching and with no fixed site is actually managed and not commanded. So in this example, the person in charge from the Health Department would be the Incident Manager. It’s an interesting nuance that I think warrants more discussion.
  • The document refers several times to the IC developing strategies and tactics. While the IC certainly may have input to these, strategies and tactics are typically developed by the Operations Section.
  • There is an interesting mention in the document that no organization has tactical command authority over any other organization’s personnel or assets unless such authority is transferred. This is a really nuanced statement. When an organization responds to an incident and acknowledges that the IC is from another organization, the new organization’s resources are taking tactical direction from the IC. Perhaps this is the implied transfer of authority? This statement needs a lot of clarification.
  • Their system formally creates the position of Scribe to support the Incident Commander, while the EOC Director may have a Scribe as well as an Executive Assistant. All in all, I’m OK with this. Especially in an EOC, it’s a reflection of reality – especially the Executive Assistant, who is not granted the authority of a Deputy but is more than a Scribe. I often see this position filled by a Chief of Staff.
  • The EOC Command Staff (? – they don’t make a distinction for what this group is called in an EOC) includes a Legal Advisor. This is another realistic inclusion.
  • They provide an option for an EOC to be managed under Unified Command. While the concept is maybe OK, ‘command’ is the wrong term to use here.
  • The title of Emergency Information Officer is used, which I don’t have any particular issue with. What’s notable here is that while the EIO is a member of the Command Staff (usually), the document suggests that if the EIO is to have any staff, particularly for a Joint Information Center, they are moved to the General Staff and placed in charge of a new section named the Public Information Management Section. (A frustration here: they call the position the EIO, but the section is named Public Information.) Regardless of what it’s called or if there is or is not a JIC, I don’t see a reason to move this function to the General Staff.
  • Aside from the notes above, they offer three organizational models for EOCs, similar to those identified in NIMS.
  • More than once, the document tasks the Operations Section only with managing current operations with no mention of their key role in the planning process to develop tactics for the next operational period.
  • They suggest other functions being included in the organization, such as Social Services, COOP, Intelligence, Investigations, and Scientific/Technical. It’s an interesting call-out, though they don’t specify how these functions would be included. I note this because they refer to Operations, Planning, Logistics, and Finance/Admin as functions (which is fine), but also calling these activities ‘functions’ leads me to think they intend for new sections to be created for them. Yes, NIMS has evolved to make allowances for some flexibility in the organization of Intel and Investigations, but something like Social Services (for victims) is clearly a function of Operations. While I appreciate their mention of COOP, COOP is generally a very department-centric function. While a continuity plan could certainly be activated while the broader impacts of the incident are being managed, COOP is really a separate line of effort, which should certainly be coordinated with the incident management structure, but I’m not sure it should be part of it – though I’m open to discussion on this one.
  • I GREATLY appreciate their suggestion of EOC personnel being involved in planning meetings of incident responders (ICP). This is a practice that can pay significant dividends. What’s interesting is that this is a measure of detail the document goes into, yet is very vague or lacking detail in other areas.

The document has considerable content using different terminology in regard to incidents and incident complexity. First off, they introduce a classification of incidents, using the following terminology:

  • Small
  • Large
  • Major
  • Local, Provincial, and National Emergencies

Among these, Major incidents and Local/Provincial/National Emergencies can be classified as ‘Complex Incidents’. What’s a complex incident? They define it as an incident that involves many factors which cannot be easily analyzed or understood; it may be prolonged, large scale, and/or involve multiple jurisdictions. I understand that perhaps they wanted to simplify the language associated with Incident Types, but even with the very brief descriptions the document provides for each classification, these are very vague. Then laying the term ‘complex incident’ over the top of this makes it considerably confusing.

**Edit – I realized that the differentiator between small incident and large incident is the number of responding organizations. They define a small incident as a single-organization response, and a large incident as a multi-agency response. So the ‘typical’ two-car motor vehicle accident that occurs in communities everywhere, requiring fire, EMS, law enforcement, and tow is a LARGE INCIDENT????? Stop!

Another note on complex incidents… the document states that for complex incidents involving multiple response organizations, common objectives will usually be high level, such as ‘save lives’ or ‘preserve property’, with each response organization developing its own objectives, strategies, and tactics. I can’t buy into this. Life safety and property preservation are priorities, not objectives. And allowing individual organizations to develop their own objectives, strategies, and tactics pretty much breaks the incident management organization and any unity of effort that could possibly exist. You are either part of the response organization or you are not.

Speaking of objectives, the document provides a list of ‘common response objectives’ such as ‘save lives’ and ‘treat the sick and injured’. These are not good objectives by any measure (in fact they can’t be measured) and should not be included in the document as they only serve as very poor examples.

So in the end there was a lot in this document that is consistent with incident management practices, along with some good additions, some things that warrant further consideration, and some things which I strongly recommend against. There are certainly some things in here that I’d like to see recognized as best practices and adopted into NIMS. I recognize the bias I have coming from the NIMS world, and I tried to be fair in my assessment of Ontario’s model, examining it for what it is and on its own merit. Of course anyone who has been reading my posts for a while knows that I’m just as critical of NIMS and related documents out of the US, so please understand that my (hopefully) constructive comments are not intended to create an international incident. I’m a big fan of hockey and poutine – please don’t take those away from me!

I’m always interested in the perspectives of others. And certainly if you were part of the group that developed this document, I’d love to hear about some of your discussions and how you reached certain conclusions, as well as what you envision for the continued evolution for the Provincial IMS.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

A Few Thoughts on Emergency Planning

A conversation I find myself having fairly often is about people not using plans. It’s amazing that we invest so much time, money, and effort into building plans to never see them used, even if the opportunity presents itself. Why is this? I see four primary reasons:

1. People don’t know the plans exist. There is really no excuse for this one. I find it shameful and wasteful, especially if these people are identified as action agents within that plan. There was practically no point in even developing the plan if no one knows about it and their respective roles identified within. Socialization of plans once they are developed is extremely important. A minimalist effort can be made by simply sending the plan or a link to the plan, but I consider this to be inadequate as many people will dismiss it, never get to reviewing it, or not understand what they are reading. Structured briefings are the best way to initially familiarize people with the plans and their roles. It helps to have refresher training as well as ensuring that new hires are similarly trained. This can even be done as a recorded presentation or webinar, though providing a contact for questions is important. Along with socializing, remember the importance of exercises, not only to validate plans but also to help people become more familiar with plans and their respective roles by taking a scenario-driven dive into the content. Does everyone in your organization or jurisdiction who has a role in a plan know about it?

2. People don’t remember the plans exist. This one is a bit more forgivable, especially for newer plans, rarely implemented plans, or for personnel who are used to “doing things the way they’ve always been done”. Still, I find these excuses to be weak at best. People’s inability to remember the plans, even granting them the distraction of the incident itself, means that the plans haven’t been socialized and reinforced enough (see item 1 above).

3. People don’t care if the plans exist. This one has been underscored considerably over the past year related to pandemic plans, point of distribution (POD) plans, and other related plans. We’ve seen many senior leaders and elected officials be completely dismissive of established plans, choosing instead to “do it their way” in an effort to exert greater control or to ensure that their name is front and center. Since this one involves a lot of ego, particularly of senior leaders and elected officials, it can be difficult to work around. That said, this underscores the importance of ensuring that elected officials and newly appointed senior leaders are adequately briefed on the existing plans when they take office, and given confidence in the plans and the people identified to implement them, as well as the important roles of elected and appointed officials.

4. People think the plans are faulty. This option is the likely more well-intentioned version of #3, where people are intentionally not using the plan because they feel (maybe true, maybe not) the plan is inadequate and feel that “winging it” is the better option. Part of this lack of confidence may stem from unfamiliarity with the plans or a lack of validation of them (see item 1 above re socialization and exercises). This could be a difference of opinion or even something intentionally obstructionist. Along with socialization and exercises, I’ll also add the value of including key people in the planning process. This gives them a voice at the table and allows their input to be heard and considered for development of the plan. While you can’t include everyone in the planning process, consider that the people you do choose to involve can serve as representatives or proxies for others, especially if they are well respected, giving less reason for others to push back.

A separate, but somewhat related topic (mostly to #4 above) is about people being often dismissive of or lacking confidence in plans by expressing the saying of “No plan survives first contact with the enemy”. This saying is credited to nineteenth century Prussian military commander Helmuth von Moltke. We see this saying tossed around quite a bit in various circles, including emergency management. While I understand and respect the intent of the phrase, I don’t think this necessarily holds true. I’ve seen great plans fail and mediocre plans be reasonably successful. Why? Circumstances dictate a lot of it. Implementation as well (this is the human factor). What we need to understand is that plans provide a starting point and hopefully some relevant guidance along the way. If a plan is too detailed and rigid, it is more likely to fail. So should our plans not be detailed? No, we should put as much detail as possible into our plans as these will help guide us in the midst of the incident, especially if certain activities are highly technical or process-oriented; but we also need to allow for flexibility. Consider a plan to be a highway. Highways have exits which take us off to different places, but they also have on-ramps to help us return. A deviation from a plan does not mean we throw the plan away, as we can always get back onto the plan, if it’s appropriate. It’s also smart to build in options, as possible, within our plans to help minimize deviations.

How we develop plans is strongly related to step 2 of CPG-101, and that is “Understand the Situation”. Without an understanding of the situation, we can’t account for the various factors involved and may not account for the circumstances for which we must develop contingencies or options. And while this assessment is part of the planning process, as well as training, exercises, and other facets of preparedness, I feel that a holistic assessment also has value. I’ve written a lot about the POETE preparedness elements and have begun advocating for APOETE, with the A standing for Assessment. This assessment is broad based to help guide our overall preparedness activity but is not a replacement for the element-specific assessments.

My last thought is about pandemic and POD plans. I’m curious about who has used their plans during this pandemic, and if not, why not? Of course many of the assumptions we used for pandemic planning weren’t realized in this pandemic. Does this mean our pandemic plans were faulty? Not entirely. Clearly there should have been many content areas that were still useful, and even though some of the assumptions we had didn’t apply to this pandemic, they may still hold true for future public health emergencies. We’ve also learned a lot about our response that needs to be considered for plan updates, and we need to weigh how much of the reality of political blundering we should account for in our plans. In the end, what I caution against is developing a pandemic plan that centers on the COVID-19 pandemic. Preparing for the last disaster doesn’t necessarily prepare us for the next one.

Those are some of my thoughts for the morning. As always, I welcome your thoughts and feedback.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

What Actually is Emergency Management?

Many people have a concept of what emergency management is, typically shaped by their own experiences or aspirations, but it is so much more. I think a part of the seeming identity crisis emergency management suffers, as well as the issues expressed by some that emergency management isn’t a recognized profession, stems from the fact that so much of emergency management is actually a unified effort of an amalgamation of other professions. So let’s consider what emergency management actually is. The list below is not exhaustive and is largely formed around common professions, major activities, and areas of academic study.

  • Grants management
  • Accounting
  • Procurement
  • Logistics
  • Equipment Maintenance
  • GIS
  • Information Technology
  • Planning
  • Document Development and Publishing
  • Marketing
  • Communications
  • Public and Media Relations
  • Community Outreach
  • Volunteer Management
  • Instructional Design and Delivery
  • Data Analysis
  • Engineering
  • Project Management
  • Policy and Political Science
  • Business/Public Administration
  • Organizational Management and Leadership
  • Consulting and SME
  • Academics and Research
  • Physical Sciences (Geology, Meteorology, Earth Science, etc.)
  • Social Sciences (Sociology, Anthropology, etc.)

These are all distinct functions and major activities/professions I’ve seen in emergency managers and emergency management agencies. Many emergency managers do a lot of these, while some focus on a few or even just one. Some of these activities may be outsourced to other agencies or to the private sector. Yet any of the items on the list taken out of the context of emergency management are then no longer (at least directly) emergency management. This may be a permanent state for someone holding one of these positions, or perhaps they are brought into the realm of emergency management on more of an ad-hoc or temporary basis. On the other hand, the application of these activities within emergency management often requires them to have knowledge of the areas of emergency management in which they are being applied.

Defining what emergency management is and does without the context of these other professions/activities is difficult. There is a big part of emergency management that is less defined and tangible, filling in the gaps and connective tissue between and among all of these; harnessing and focusing the collective capabilities toward distinct efforts across preparedness and the five mission areas, by way of a highly complex effort which we encapsulate with one simple word – coordination. So oddly enough, emergency management is all of these, yet it is also something else.

I think the recognition of this will go a long way for us, helping to progress the profession while also being less rigid in our approach to pigeon-hole what an emergency manager is.

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Building Local Incident Management Capability

Just over a year ago I wrote An Alternate Concept of Incident Management Support, identifying the gap that exists in most local communities and areas for improved incident management capability. While I still think that formal Incident Management Teams (IMTs) are the gold standard, not every community or even multi-county region can support a formal IMT, which requires dozens of personnel and rigorous qualification and maintenance. Over the past year, we’ve seen a lot of use of IMTs across the nation, supporting the COVID-19 response and myriad other incidents. Sitting in over the last few days on the All Hazard Incident Management Team Association (AHIMTA) virtual symposium, there is a lot of exciting evolution happening with IMTs as they continue to enhance their capabilities. And while this is great, I feel we are leaving a lot of areas behind. This isn’t on the AHIMTA, but rather on the emergency management community as a whole. That said, there are certainly some intersections, as a lot of the training available to IMT personnel may need to be made more accessible to those who would be part of the Incident Support Quick Response Teams (ISQRTs), as I came to call them in last year’s article, addressing a fundamental need I’ve been espousing for a long time.

As I’ve soaked in a lot of great information from the AHIMTA symposium about IMTs, the need to build local capability in the absence of IMTs is even more apparent. Some may argue that IMTs are available to deploy to any area if requested. Possibly. Obviously there are a lot of conditions… What are other teams dealing with? What’s the relative priority of the requesting area? EMAC is certainly an option, but states need to approve the local request if they are to put the request into the EMAC system. The state may not agree with the need, may not want to spend the funds for an incoming team for an incident that may not receive a federal declaration, or it may not be practical to wait a couple of days to get an IMT on the ground when the response phase of the incident may be resolved or near resolved by then.

Fundamentally, every area should have its own organic incident management capability. As mentioned, most areas simply can’t support or sustain the rigors of a formal IMT, but they can support a short roster of people who are interested, able, and capable. This is a situation where a little help can go a long way in making a difference in a local response for a more complex Type 4 incident or the onset of a Type 3 incident – or simply to do what they can for a larger incident where additional help simply isn’t available. I mentioned in last year’s article that the focus should really be on incident planning support, with an Incident Management Advisor to support the IC and local elected officials, an Incident Planning Specialist to help the local response organization harness the Planning Process, a Planning Assistant to support the detailed activities involved in a Planning Section such as situational awareness and resource tracking, and an Operations and Logistics Planner to support local responders who may have great tactical knowledge, but not much experience on operational planning much less forecasting logistical needs associated with this. Largely these are all advisors, who are likely to integrate into the incident management organization, so we aren’t creating new ICS positions, though I still encourage some deeper and deliberate application of incident management advisors.

My big thought today is how do we make something like this happen? First, I think we need to sell FEMA, State IMT programs, and/or State Training Officers on the concept. That comes first from recognizing and agreeing on the gap that exists and that we must support the organic incident management capability of local jurisdictions with fewer resources, through something that is more than the ICS courses, but less than what is required for an IMT. Part of this is also the recognition that these ISQRTs are not IMTs and not intended to be IMTs, but fill an important role in addressing this gap. This will go a long way toward getting this past ICS and IMT purists who might feel threatened by this or for some reason disagree with the premise.

Next is establishing standards, starting with general expectations of activity for each of these roles, followed by prerequisites for credentialing, then training support. The existing position-specific training is likely not fully appropriate for these positions, but a lot can be drawn from the existing courses, especially those for Incident Commander and the Planning Section positions, and there are also some valuable pieces of information that would come from Operations Section and Logistics Section courses. I’d suggest that we work toward a curriculum to address these specific ISQRT roles. There are then some administrative details to be developed in terms of local formation, protocols for notification and activation, etc. State recognition is important, but perhaps approval isn’t necessarily needed, though coordination and support from states may be critical to the success of ISQRTs, again considering that these are most likely to be serving areas with fewer resources. ISQRTs will also need to work with local emergency managers and local responders to gain support, to be included in local preparedness activities, and to be called upon when they should be. A lot of success can be gained from things such as local/county/regional/state meetings of fire chiefs and police chiefs.

Do you agree with the gap that exists for most communities? What do you think we need to get the ball rolling on such a concept?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

You’ve Been Trained, Now What?

I just sat through a good webinar on incident response and management. The panel consisted of fire and law enforcement personnel. A law enforcement official was rather honest in saying that one of their identified deficiencies from an AAR was poor implementation of ICS. He said that while all police personnel had received ICS training back during the NIMS push of the mid-2000s, most officers had done little with it since. We see so many endless lists of training that people have taken on their CVs, resumes, LinkedIn, etc., but how much of that do they still know? Take an honest look at your own resume of training and I bet you will see some of the same.

In public safety we love to get training. A lot of the training is good. Some less so. Much of the training we take is awareness-level training, providing us with knowledge. It’s fairly easy to flex those muscles after the training by reading about, writing about, teaching, or doing other things with that information. Still, some of that acquired knowledge stagnates. Some of the training we take is more operations-based – it’s hands-on or procedural. Most certainly, without using the knowledge and skills acquired in operations-based training, those skills atrophy.

So what should we do to protect against the loss of this valuable acquired knowledge and these skills? Obviously application is the best means of preserving what we have learned. Even if you are using it, though, it’s good to stay on top of best practices, new practices, and updated training; not only as a means of staying current on the latest and greatest, but also to hedge against bad habits as well as certain nuggets of that original training we might not regularly perform. Apply and practice skills, either on the job or in exercises. For things that are more knowledge-based, talk about it, read about it, write about it, or present on it. This repetition will keep the subject matter familiar, quicken your recall of facts, and increase your ability to analyze it. Writing can be in any form, up to and including developing or updating plans and procedures. A special shout out goes to presentations and training (if you are qualified), though. Training and presentations often require the instructor/presenter to have a depth of knowledge beyond the learning domain of what they are teaching or presenting on. This is often required to answer questions, support implementation, and address the many what-ifs related to the subject matter.

I’d argue that your organization also has a role (and responsibility) in preserving the knowledge and skills gained as well. First, sharing of the experience is important. Since not everyone in your organization can attend every training opportunity, it’s a best practice for those who receive training to tell others about their experience, what they learned, and the relevance they see to their work. Simpler subject matter can be provided in an email or printed handout, while more complex subject matter might be better conveyed through a presentation. Unless your training was received to help you support an existing plan or procedure, your organization should also support implementation of what you have learned, if appropriate. Keeping knowledge and skills fresh should also be endorsed through opportunities for refresher training and other related training which may expand the knowledge and skills or hone specific application. Organizations should also identify what knowledge and skills they need and must maintain, and ensure that they identify staff that need the opportunities for training and development, as well as how to maintain what is learned.

With the personal and organizational costs of training, we reap the greatest benefit by maintaining or advancing the knowledge and proficiency gained. While the quest for knowledge is endless and admirable, and I’d generally never block an opportunity for someone to gain more, we should be assessing what the benefit is to the learner and to the organization. Part of that is determining what commitments the organization and the learner must make to preserve what is gained. I believe that employee development plans can be a big part of this, as they should be informed by what the employee needs to improve upon, what we want them to excel at, and what future roles we may have planned for them. These factors drive the goals and objectives of the employee development plan, which should also lead to what training opportunities are ideal to support these goals and objectives. Even if your organization doesn’t do any formal employee development plans, you can develop one for yourself.

What’s your take on keeping current with what you’ve learned?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Experience Bias

I recently read an interesting piece in Psychology Today by Dr. Christopher Dwyer titled ‘How Experience Can Hinder Critical Thinking’. Do check it out. There is application to pretty much everything, but of course I tend to think of things in the context of emergency management.

The article starts with the age-old argument of education vs experience, but with a particular slant toward critical thinking. My personal take is that the education vs experience argument, in its totality, can’t have a blanket resolution. I think a lot of it is dependent on the topic at hand, and obviously it’s rarely a dichotomy; rather, a blending of education and experience is often best. In regard to education, certainly the actual education received holds value, but there are tasks intrinsic to academia which also hold value, perhaps even more than what was learned in the classroom; the rigors of research in an academic environment often being most valuable among them. With that, in many regards, we often see employment announcements with a range of degree majors, or just simply a stated minimum of education, regardless of major. This is in recognition of the intrinsic value of education. And while some professions absolutely require a specific degree, those which don’t can and should hold less rigidly to those requirements.

While I certainly advocate a minimum extent of education for most positions, I’ve also worked with a considerable number of people with a high school diploma or associate’s degree who can intellectually run circles around those with advanced degrees, at least in certain applications of work and life. Experience is often indicative of exposure to certain situations, often with repetition. The comparing and contrasting of those experiences with what is being experienced in the moment is what supports the argument for the value of experience. It’s also why many advanced degree programs actually require some term of actual work experience before they will accept applicants into their programs. Consider academic programs such as criminal justice. Sure, there are a lot of philosophical topics that are taught, but any courses that speak to practical application should probably be taught by those with actual experience doing those things. Dr. Dwyer gives wise advice, though, stating that we shouldn’t confuse experience with expertise.

All that said, Dr. Dwyer’s article focuses on the application of critical thinking in this argument. He cites some insightful data and studies, but most interesting to me is his mention of experience being personalized. While several people may have ‘been there, done that, got the t-shirt’, they each may have experienced the event differently or left with different impressions, even if exposed to some of the same situations. We all bring a bias with us, and this bias is the lens through which we view the events of our lives. That bias is then influenced by our perception of each event, fundamentally snowballing and compounding with the more experiences we have. This shows how our experiences can bias our own critical thinking skills. Dr. Dwyer states that critical thinking stemming from someone with more education than experience is likely to be more objective and based on knowledge, which certainly makes sense. That said, individuals basing their critical thinking solely on education may miss the insight provided by experience, which can lend considerable context to the thought exercise.

I think the conclusion to be drawn in all this is that critical thinking, in most regards, is optimized by those with a blend of education and experience. It’s also extremely important for us to recognize our own limitations and biases when we approach a decision or other relevant situation. Specific to emergency management, we can leverage a lot from our experiences, but we also know that no two incidents are the same. Therefore, while our experiences can support us in a new event, they can also derail us if not applied thoughtfully and in recognition of our own biases.

This all comes around to my advocacy for emergency management broadly, and incident management in particular, being team sports. Even the first step of the CPG 101 planning process is to form a planning team. We each bring different approaches and perspectives. We also need to advocate for diversity in our teams, regardless of what tasks those teams are charged with. This should be diversity in the broadest sense – diversity of experience, diversity of discipline, diversity in education, diversity in gender, diversity in race, creed, culture, etc. The broader the better. We must do better at opening ourselves to the perspectives of others. We all have bias – every one of us. Some bias, obviously depending on the focus, is OK, but it is best for us to balance our individual bias with those of a diverse group. A diverse team approach will bring us better results time and again.

How does experience bias impact you?

© 2021 Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC®