Gaps in ICS Doctrine and Documents

Last month I got to spend several days with some great international colleagues discussing problems and identifying solutions that will hopefully have a meaningful and lasting impact across incident management and emergency response. No, this wasn’t at an emergency management conference; this was with an incredible group of ICS subject matter experts convened by ICS Canada, with a goal of addressing some noted gaps in ICS doctrine, training, and other related documents. While the focus was specific to the documents under the purview of ICS Canada, most of these matters directly apply to ICS in the United States as well.

Overall, our doctrine, curriculum, and related materials (collectively, documents) across ICS are a mess. Broadly, the issues include:

  • Poor definitions of key concepts and features of ICS.
  • Lack of proper emphasis or perspective.
  • Lack of inclusion of contemporary practices (management concepts, social expectations, moral obligations, even legal requirements, etc.).
  • Lack of continuity from doctrine into supporting documents and curriculum. Everything needs to point back to doctrine – not every tool needs to be explicitly included in the doctrine, but all should be based upon consistent standards.
  • A need to support updated training to improve understanding and thus implementation.

As we discussed among the group and I continued to think on this, I’ve realized that ICS, as it relates to the US (NIMS), has very little doctrine, spread across just a few NIMS documents (the core NIMS doctrine, National Qualification System documents, and a few guidance/reference documents – which aren’t necessarily doctrine). In the US, via the National Wildfire Coordinating Group (NWCG), we used to have a whole array of documents which could be considered ICS doctrine (in the days of NIIMS <yes, that’s two ‘eyes’>). When the responsibility for the administration of ICS (for lack of better phrasing) shifted to DHS, these documents were ‘archived’ by the NWCG and not carried over or adopted by the NIMS Integration Center (NIC) in DHS, which now has responsibility for NIMS oversight and coordination. The NIC has developed some good documents, but in the 20 years since the signing of HSPD-5 (which created and required the use of NIMS) it seems the greatest progress has been on resource typing and little else.

Looking at current NIMS resources, I note that some are available from the core NIMS site https://www.fema.gov/emergency-managers/nims while others are available from EMI at https://training.fema.gov/emiweb/is/icsresource/. All of these documents really need to be consolidated into one well-organized site, with doctrine identified separately from other resources and documents (e.g., job aids, guidance, etc.).

I thought it might be fun to find some examples, so I decided to open up the ICS 300 instructor guide, flip through some pages, and look at a few concepts identified therein that might not have much doctrinal foundation. Here are a few I came up with:

  • Formal and Informal Communication
    • These concepts aren’t cited anywhere in NIMS documents. While superficially they seem pretty straightforward, we know that communication is something in which we constantly need improvement (see practically any after-action report). As such, I’d suggest that we need inclusion and reinforcement of foundational communications concepts, such as these, in doctrine to ensure that we have a foundation from which to instruct and act.
  • Establishing Command
    • This is mentioned once in the core NIMS doctrine, with the simple statement that it should be done at the beginning of an incident. While often discussed in ICS courses, there are no foundational standards or guidance for what it actually means to establish command or how to do it. That seems a significant oversight for such an important concept.
  • Agency Administrator
    • While this term comes up several times in the core NIMS doctrine, these are simple references, with the general context being that the Agency Administrator will seek out and give direction to the Incident Commander. What seems taken for granted is that, most often, the Incident Commander needs to seek out the Agency Administrator and lead up, ask specific questions, and seek specific permissions and authorities.
  • Control Objectives
    • Referenced in the course but not defined anywhere in any ICS document.
  • Complexity Analysis
    • The course cites factors but doesn’t reference the NIMS Incident Complexity Guide. Granted, the NIMS Complexity Guide was published in June 2021 (after the most recent ICS 300 course material), but the information in the Complexity Guide has existed for some time and is not included in the course materials.
  • Demobilization
    • Another big example of the tail wagging the dog in NIMS. Demobilization is included across many ICS trainings, but there is so little doctrinal foundation for the concept. The core NIMS doctrine has several mentions of demobilization, even with a general statement of importance, but there is no standard or guidance on the process of demobilization beyond what is in curriculum – and training should never be the standard.

Though ICS is our standard, we haven’t established it well as one. A lot of work needs to be done to pull this together, fill the gaps, and ensure that all documents are adequately and accurately cross-referenced. This will require a significant budget investment in the National Integration Center and the formation of stakeholder committees to guide the process. We need to do better.

What doctrine and document gaps do you see as priorities in NIMS?

© 2023 Tim Riecker, CEDP

Emergency Preparedness Solutions, LLC®

Failed Attempts to Measure NIMS Compliance – How can we get it right?

Yesterday the US Government Accountability Office (GAO) released a report titled Federal Emergency Management Agency: Strengthening Regional Coordination Could Enhance Preparedness Efforts.  I’ve been waiting a while for the release of this report, as I am proud to have been interviewed for it as a subject matter expert.  It’s the second GAO report on emergency management I’ve been involved in during my career.

The end result of this report is an emphasis on a stronger role for the FEMA regional offices.  The GAO came to this conclusion through two primary discussions, one on grants management, the other on assessing NIMS implementation efforts.  The discussion of how NIMS implementation has historically been measured shows the failures of that system.

When the National Incident Management System (NIMS) was first created as a nationwide standard in the US via President Bush’s Homeland Security Presidential Directive (HSPD) 5 in 2003, the NIMS Integration Center (NIC) was established to make this happen.  This was a daunting, but not impossible, task, involving development of a standard (luckily, much of this already existed through similar systems), the creation of a training plan and curricula (again, much of this already existed), and encouraging something called ‘NIMS implementation’ by every level of government and other stakeholders across the nation.  This last part was the really difficult one.

As identified in the GAO report: “HSPD-5 calls for FEMA to (1) establish a mechanism for ensuring ongoing management and maintenance of the NIMS, including regular consultation with other federal departments and agencies and with state and local governments, and (2) develop standards and guidelines for determining whether a state or local entity has adopted NIMS.”

While there was generally no funding directly allocated to NIMS compliance activities for state and local governments, FEMA/DHS tied NIMS compliance to eligibility for many of its grant programs.  (So let’s get this straight… if my jurisdiction is struggling to be compliant with NIMS, you will take away the funds which would help me do so?)  (The actual act of denying funds is something I heard a few rumors about, but none were actually confirmed.)

NIMS compliance was (and continues to be) a self-certification, with little to no effort at the federal level to actually assess compliance.  Annually, each jurisdiction would complete an online assessment tool called NIMSCAST (the NIMS Compliance Assistant Support Tool).  NIMSCAST ran until 2013.

NIMSCAST was a mix of survey-type questions… some yes/no, some with qualified answers, and most simply looking for numbers – usually numbers of people trained in each of the ICS courses.  From FEMA’s NIMS website: “The purpose of the NIMS is to provide a common approach for managing incidents.”  How effective do you think the NIMSCAST survey was at gauging progress toward this?  The answer: not very.  People are good at being busy but not actually accomplishing anything.  That’s not to say that many jurisdictions didn’t make good faith efforts in complying with the NIMS requirements (and thus were dedicated to accomplishing better incident management), but many were pressured and intimidated, ‘pencil whipping’ certain answers for fear of a loss of federal funding.  Even for those with good faith efforts, churning a bunch of people through training courses does not necessarily mean they will implement the system they are trained in.  Implementation of such a system requires INTEGRATION through all realms of preparedness and response.  While NIMSCAST certainly provided some measurable results, particularly in terms of the number of people completing ICS courses, that really doesn’t tell us anything about IMPLEMENTATION.  Are jurisdictions actually using NIMS and, if so, how well?  NIMSCAST was as much a show of being busy while not accomplishing anything as some of the activities it measured.  It’s unfortunate that this numbers game lasted almost ten years.

In 2014, the NIC (which now stands for the National Integration Center) incorporated NIMS compliance questions into the Unified Reporting Tool (URT), including about a dozen questions in every state’s THIRA and State Preparedness Report submission.  Jurisdictions below the state level (unless they are Urban Area Security Initiative grant recipients) no longer need to provide any type of certification of their NIMS compliance (unless required by the state).  The questions asked in the URT, which simply check for a NIMS pulse, are even less effective at measuring any type of compliance than NIMSCAST was.

While I am certainly being critical of these efforts, I have acknowledged, and continue to acknowledge, how difficult this particular task is.  But there must be a more effective way.  Falling back to my roots in curriculum development, we must identify how we will evaluate learning early in the design process.  The same principle applies here.  If the goal of NIMS is to “provide a common approach to managing incidents”, then how do we measure that?  The only acceptable methodology for measuring NIMS compliance is one that actually identifies whether NIMS has been integrated and implemented.  How do we do that?

The GAO report recommends the evaluation of after action reports (AARs) from incidents, events, and exercises as the ideal methodology for assessing NIMS compliance.  It’s a good idea.  Really, it is.  Did I mention that they interviewed me?

AARs (at least those that are well written) provide the kinds of information we are looking for.  Does that information easily correlate into numbers and metrics?  No.  That’s one of the biggest challenges with using AARs, which are full of narrative.  Another barrier to consider is how AARs are written.  The HSEEP standard for AARs is to focus on core capabilities.  The issue: there is no NIMS core capability, because NIMS/ICS encompasses a number of key activities that we accomplish during an incident.  The GAO identified the core capabilities of operational coordination, operational communications, and public information and warning as the three with the most association to NIMS activities.

The GAO recommends that the assessment of NIMS compliance is best situated with FEMA’s regional offices.  This same recommendation comes from John Fass Morton, who authored Next-Generation Homeland Security (follow the link for my review of this book).  Given the depth of analysis these assessments would require to review AAR narratives, the people doing these assessments absolutely must have some public safety and/or emergency management experience.  To better enable this measurement (which will help states and local jurisdictions, by the way), there may need to be some modification to the core capabilities and how we write AARs, to help us better draw out some of the specific NIMS-related activities.  This, of course, would require several areas within FEMA/DHS to work together… which is something they are becoming better at, so I have faith.

There is plenty of additional discussion to be had regarding the details of all this, but it’s best we not get ahead of ourselves.  Let’s actually see what will be done to improve how NIMS implementation is assessed.  And don’t forget the crusade to improve ICS training!

What are your thoughts on how best to measure NIMS implementation?  Do you think the evaluation of AARs can assist in this?  At what level do you think this should be done – State, FEMA Regional, or FEMA HQ?

As always, thanks for reading!

© 2016 – Timothy Riecker