NIMS Alert: NQS Qualifications and Task Books for Recovery, Mitigation, and Incident Evaluation

The National Integration Center (NIC) has been busy developing more National Qualification System (NQS) tools for incident management.  Here are the titles in the latest release open to public comment:

  • Damage Assessment Coordinator
  • HM Community Education and Outreach Specialist
  • HM Community Planner Specialist
  • HM Engineering and Architect Specialist
  • HM Floodplain Management Specialist
  • EHP Environmental Specialist
  • EHP Historic Preservation Specialist
  • Incident/Exercise Evaluator
  • Public Assistance
  • State Disaster Recovery Coordinator

There may be some incident management and response purists out there wondering why they should care about these particular titles.  I’ll agree that most of them aren’t used in a life-saving response capacity, but these are the people you want backing you up.  Without them, you may never get away from the incident, and you will find yourself in a very foreign land where complex requirements from FEMA and other federal agencies are the rules of play.

Having worked disaster recovery for some massive incidents, such as Hurricane Sandy, I can personally attest to the value so many of these people bring to the table.  It’s great to see qualification standards being established for them, just as they are for core incident management team personnel and resources.  Since my experience with most of these roles is ancillary, though, I’ll leave specific commentary on them to the functional experts.

There is one role in here that I’m particularly pleased to see and will comment on, and that’s the Incident/Exercise Evaluator.  I wrote on this topic specifically last year and have reflected on its importance in other posts.  I see the inclusion of an Incident Evaluator in the NQS as a huge success and the beginning of a conscious and deliberate shift toward evaluation and improvement in what we do.  Looking at the resource typing definition, I’m pretty pleased with what the NIC has put together.

What I like…  I appreciate that they include a note indicating that personnel may need additional training based upon the nature or specialization of the incident or exercise.  They also include a decent foundation of NIMS/ICS, exercise, and fundamental emergency management training across the various position types (although most of these are FEMA Independent Study courses, which I think are great for introductory and supplemental material but shouldn’t be the only exposure personnel have), including a requirement to complete Homeland Security Exercise and Evaluation Program (HSEEP) training for a Type 1.

What I feel needs to be improved…  Considering that the Type 1 Incident/Exercise Evaluator is expected to lead the evaluation effort, I’d like to see more than HSEEP training as the primary discriminating factor.  Just because someone has completed HSEEP doesn’t mean they can plan a project, lead a team, or extrapolate HSEEP exercise evaluation practices to be effective for incident evaluation.  I suggest HSEEP be the requirement for the Type 2 position (which would correlate well to the position description), with additional training on project management and leadership supporting the Type 1 position.  While the note about the potential need for additional training is included, there is nothing here about operational experience, which I think is rather important.  Lastly, this seems to identify a need for a course and/or guidance specific to incident evaluation, which can and should use the principles of HSEEP as its foundation but identify the differences, best practices, and approaches to applying them to an incident or event.

I’d love to hear your thoughts on incident evaluation as well as the other positions being identified in the NQS. Do you participate in the national engagements and provide feedback?

© 2018 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC™


Failed Attempts to Measure NIMS Compliance – How can we get it right?

Yesterday the US Government Accountability Office (GAO) released a report titled Federal Emergency Management Agency: Strengthening Regional Coordination Could Enhance Preparedness Efforts.  I’ve been waiting a while for the release of this report, as I’m proud to have been interviewed for it as a subject matter expert.  It’s the second GAO report on emergency management I’ve been involved in during my career.

The report’s bottom line is an emphasis on a stronger role for the FEMA regional offices.  The GAO came to this conclusion through two primary discussions: one on grants management, the other on assessing NIMS implementation efforts.  The discussion of how NIMS implementation has historically been measured shows the failures of that system.

When the National Incident Management System (NIMS) was first created as a nationwide standard in the US via President Bush’s Homeland Security Presidential Directive (HSPD) 5 in 2003, the NIMS Integration Center (NIC) was established to make this happen.  This was a daunting, but not impossible, task, involving development of a standard (luckily, much of this already existed through similar systems), the creation of a training plan and curricula (again, much of this already existed), and encouraging something called ‘NIMS implementation’ by every level of government and other stakeholders across the nation.  This last part was the really difficult one.

As identified in the GAO report: “HSPD-5 calls for FEMA to (1) establish a mechanism for ensuring ongoing management and maintenance of the NIMS, including regular consultation with other federal departments and agencies and with state and local governments, and (2) develop standards and guidelines for determining whether a state or local entity has adopted NIMS.”

While there was generally no funding directly allocated to NIMS compliance activities for state and local governments, FEMA/DHS made NIMS compliance a required activity for eligibility under many of its grant programs.  (So let’s get this straight… if my jurisdiction is struggling to be compliant with NIMS, you will take away the funds that would help me get there?)  (The actual act of denying funds is something I heard a few rumors about, but none were actually confirmed.)

NIMS compliance was (and continues to be) a self-certification, with little to no effort at the federal level to actually assess compliance.  Annually, each jurisdiction would complete an online assessment tool called NIMSCAST (the NIMS Compliance Assistance Support Tool).  NIMSCAST ran until 2013.

NIMSCAST was a mix of survey-type questions… some yes/no, some with qualified answers, and most simply looking for numbers – usually the numbers of people trained in each of the ICS courses.  From FEMA’s NIMS website: “The purpose of the NIMS is to provide a common approach for managing incidents.”  How effective do you think the NIMSCAST survey was at gauging progress toward this?  The answer: not very.  People are good at being busy without actually accomplishing anything.  That’s not to say that many jurisdictions didn’t make good faith efforts to comply with the NIMS requirements (and thus were dedicated to accomplishing better incident management), but many were pressured and intimidated, ‘pencil whipping’ certain answers for fear of losing federal funding.  Even for those with good faith efforts, churning a bunch of people through training courses does not necessarily mean they will implement the system they are trained in.  Implementation of such a system requires INTEGRATION through all realms of preparedness and response.  While NIMSCAST certainly provided some measurable results, particularly in terms of the number of people completing ICS courses, that really doesn’t tell us anything about IMPLEMENTATION.  Are jurisdictions actually using NIMS and, if so, how well?  NIMSCAST was as much a show of being busy while not accomplishing anything as some of the activities it measured.  It’s unfortunate that this numbers game lasted almost ten years.

In 2014, the NIC (which now stands for the National Integration Center) incorporated NIMS compliance questions into the Unified Reporting Tool (URT), adding about a dozen questions to every state’s THIRA and State Preparedness Report submission.  Jurisdictions below the state level (unless they are Urban Area Security Initiative grant recipients) no longer need to provide any type of certification of their NIMS compliance (unless required by the state).  The questions asked in the URT, which simply check for a NIMS pulse, are even less effective at measuring any type of compliance than NIMSCAST was.

While I am certainly being critical of these efforts, I have acknowledged, and continue to acknowledge, how difficult this particular task is.  But there must be a more effective way.  Falling back on my roots in curriculum development, we must identify how we will evaluate learning early in the design process.  The same principle applies here.  If the goal of NIMS is to “provide a common approach to managing incidents”, then how do we measure that?  The only acceptable methodology for measuring NIMS compliance is one that actually identifies whether NIMS has been integrated and implemented.  How do we do that?

The GAO report recommends the evaluation of after action reports (AARs) from incidents, events, and exercises as the ideal methodology for assessing NIMS compliance.  It’s a good idea.  Really, it is.  Did I mention that they interviewed me?

AARs (at least those that are well written) provide the kinds of information we are looking for.  Does that information easily translate into numbers and metrics?  No.  That’s one of the biggest challenges with using AARs, which are full of narrative.  Another barrier to consider is how AARs are written.  The HSEEP standard for AARs is to focus on core capabilities.  The issue: there is no NIMS core capability.  That’s because NIMS/ICS encompasses a number of key activities we accomplish during an incident.  The GAO identified the core capabilities of operational coordination, operational communications, and public information and warning as the three with the strongest association to NIMS activities.

The GAO recommends that the assessment of NIMS compliance is best situated with FEMA’s regional offices.  The same recommendation comes from John Fass Morton, author of Next-Generation Homeland Security (follow the link for my review of this book).  Given the depth of analysis required to review AAR narratives, the people performing these assessments absolutely must have some public safety and/or emergency management experience.  To better enable this measurement (which will help states and local jurisdictions, by the way), there may need to be some modification to the core capabilities and to how we write AARs to help us better draw out some of the specific NIMS-related activities.  This, of course, would require several areas within FEMA/DHS to work together… which is something they are becoming better at, so I have faith.

There is plenty of additional discussion to be had regarding the details of all this, but it’s best we not get ahead of ourselves.  Let’s actually see what will be done to improve how NIMS implementation is assessed.  And don’t forget the crusade to improve ICS training!

What are your thoughts on how best to measure NIMS implementation?  Do you think the evaluation of AARs can assist in this?  At what level do you think this should be done – State, FEMA Regional, or FEMA HQ?

As always, thanks for reading!

© 2016 – Timothy Riecker

Updating ICS Training: Identification of Core Competencies

The crusade continues.  ICS training still sucks.  Let’s get enough attention on the subject to get it changed and make it more effective.

If you are a new reader of my blog, or you happened to miss it, check out this post from last June which should give you some context: Incident Command System Training Sucks.

As mentioned in earlier posts on the topic, the ICS-100 and ICS-200 courses are largely OK as they currently exist.  Although they could benefit from a bit of refinement, they accomplish their intent.  The ICS-300 course is where we rapidly fall apart, though.  Much of the ICS-300 is focused on the PLANNING PROCESS, which is extremely important (I’ve worked a lot as an ICS Planning Section Chief); however, there is knowledge that course participants (chief- and supervisor-level responders) need well before diving into the planning process.

First responders and other associated emergency management partners do a great job EVERY DAY of successfully responding to and resolving incidents.  The vast majority of these incidents are fairly routine and of short duration.  In NIMS lingo we refer to these as Type IV and Type V incidents.  Their lack of complexity doesn’t require a large organization, and most of that organization is dedicated to getting the job done (operations).  More complex incidents – those that take longer to resolve (perhaps days) and require a lot more resources, often ones we don’t deal with regularly – are referred to as Type III incidents.  Type III incidents, such as regional flooding or most tornadoes, are localized disasters.  I like to think of Type III incidents as GATEWAY INCIDENTS: certainly far more complex than the average motor vehicle accident, yet not hurricane-level.  The knowledge, skills, and abilities applied in a Type III incident, however, can be directly applied to Type II and Type I incidents (the big ones).

It’s not to say that what is done for a car accident, conceptually, isn’t done for a hurricane, but there is so much more to address.  While the planning process certainly facilitates proactive and ongoing management of the incident, there are other things that must be applied first.  With all that said, in any rewriting and restructuring of the ICS curriculum, we need to consider what the CORE COMPETENCIES of incident management are.

What are core competencies?  One of the most comprehensive descriptions of core competencies I found comes from the University of Nebraska – Lincoln, which I’ve summarized below.  While their description is written for a standing organization (theirs), these concepts easily apply to an ad-hoc organization such as those we establish for incident management.

Competency: The combination of observable and measurable knowledge, skills, abilities and personal attributes that contribute to enhanced employee performance and ultimately result in organizational success. To understand competencies, it is important to define the various components of competencies.

  • Knowledge is the cognizance of facts, truths and principles gained from formal training and/or experience. Application and sharing of one’s knowledge base is critical to individual and organizational success.
  • A skill is a developed proficiency or dexterity in mental operations or physical processes that is often acquired through specialized training; the execution of these skills results in successful performance.
  • Ability is the power or aptitude to perform physical or mental activities that are often affiliated with a particular profession or trade such as computer programming, plumbing, calculus, and so forth. Although organizations may be adept at measuring results, skills and knowledge regarding one’s performance, they are often remiss in recognizing employees’ abilities or aptitudes, especially those outside of the traditional job design.

When utilizing competencies, it is important to keep the following in mind:

  • Competencies do not establish baseline performance levels.
  • Competencies support and facilitate an organization’s mission.
  • Competencies reflect the organization’s strategy; that is, they are aligned to short- and long-term missions and goals.
  • Competencies focus on how results are achieved rather than merely the end result.
  • Competencies close skill gaps within the organization.
  • Competency data can be used for employee development, compensation, promotion, training, and new hire selection decisions.

So what are the CORE COMPETENCIES OF INCIDENT MANAGEMENT?  What are the knowledge, skills, and abilities (KSAs) that drive organizational success in managing and resolving an incident?  For this application in particular, we need to focus on WHAT CAN BE TRAINED.  I would offer that knowledge can be imparted through training, and skills can be learned and honed through training and exercises; abilities, however, are innate, so we can’t weigh them too heavily when considering core competencies for training purposes.

All in all, the current ICS curriculum, although in need of severe restructuring, seems to cover the knowledge component pretty well – at least in terms of ICS ‘doctrine’.  More knowledge needs to be imparted, however, in areas that are tangential to the ICS doctrine, such as emergency management systems, management of people in the midst of chaos, and other topics.  The application of knowledge is where skill comes in. That is where we see a significant shortfall in the current ICS curriculum.  We need to introduce more SCENARIO-BASED LEARNING to really impart skill-based competencies and get participants functioning at the appropriate level of Bloom’s Taxonomy.

Aside from the key concepts of ICS (span of control, transfer of command, etc.), what core competencies do you feel need to be trained to for the average management/supervisor level responder (not an IMT member)?  What knowledge and skills do you feel they need to gain from training?  What do we need a new ICS curriculum to address?

(hint: this is the interactive part!  Feedback and comments welcome!)

As always, thanks to my fellow crusaders for reading.

© 2016 – Timothy Riecker

Emergency Preparedness Solutions, LLC

FEMA National Preparedness System Updates

This afternoon EMForum.org hosted Donald ‘Doc’ Lumpkins, Director of the National Integration Center within the National Preparedness Directorate.  Doc had some great information on their current and near-future activities regarding updates to the National Incident Management System (NIMS) and new Comprehensive Preparedness Guides (CPGs) expected to be released this year.  This is great news, as we are always seeking additional national guidance and revisions that help us maintain standards of practice.

Regarding NIMS, the guiding document has not been revised since 2008.  Doc specifically mentioned that updates to NIMS will include:

  • Incorporating the National Preparedness Goal and the National Preparedness System
  • Expanding NIMS across all five mission areas (Prevention, Protection, Mitigation, Response, and Recovery)
  • Encouraging whole community engagement and understanding
  • Continued emphasis that NIMS is more than just the Incident Command System (ICS)
  • Integrating incident support structures (such as EOCs – more on EOCs later)
  • Integrating situational awareness content
  • Incorporating lessons learned from exercises and real-world events (Doc mentioned his office’s activity of culling through LLIS.gov to gain much of this information)
  • Including stakeholder feedback in the revision efforts

NIMS update activities will be conducted through the summer, with an expected release of the new document this fall.

As a significant component of the NIMS update, there will also be continued efforts to update the resource typing list.  Priority will be given to resources which are often requested.

The next topic of discussion was the Comprehensive Preparedness Guides (CPGs).  I was very excited to see a list of likely and potential CPGs either currently under development or expected to be developed soon.  These included:

  • Updating CPG 101
  • A CPG for Strategic Planning (This should shape up to be excellent guidance and essentially serve as a ‘catch-all’ for many of the strategic planning tasks we do in emergency management)
  • Incident Action Planning (Doc said this will not be anything new or a replacement of best practices such as the Planning P.  Rather this document will serve to capture these best practices and ensure currency and critical linkages)
  • Planning for mass casualty incidents
  • Social media (a critical aspect of emergency management that is still changing regularly; I don’t yet feel that we have a firm grasp on it and how best to use it)
  • Access/Re-Entry to disaster sites
  • Improvised Explosive Devices (crafting hazard-specific annexes)
  • EOC guidelines (I’m hoping this document, while outlining best practices, provides flexibility for different management models of EOCs)
  • Search and rescue management

I’ve come to greatly appreciate that the National Preparedness System is a blanket thrown over the five mission areas, recognizing that each mission area (again – Prevention, Protection, Mitigation, Response, and Recovery) must be prepared for at every level of government to achieve the greatest measure of effectiveness.  There are many critical linkages within preparedness that span most, if not all, of the mission areas, and the continued efforts of the National Preparedness Directorate seem to be headed in a good direction, incorporating the right people and information.  Along these lines, Doc mentioned that all of these efforts will utilize subject matter experts from across the country, with many drafts having public comment periods.  Be on the lookout for these (I’ll post them as I see them) and be sure to review and comment on them.

As a final note, this was the last broadcast for EMForum.  After 17 years, they are shutting down their program.  There has been no mention as to why.  While I’ve not attended every webinar, I do catch a few each year when the topic and/or speaker interests me.  The loss of EMForum is a loss to emergency management and to the spirit of information sharing we have.  Through EMForum there have been many great webinars, such as this one, where new programs and best practices are shared.  I’m hopeful the function EMForum has served in facilitating this will soon be replaced so we can continue to stay up to date on what is transpiring.

©2014 Timothy Riecker