The 2019 National Preparedness Report, or ‘How Are We Measuring Preparedness?’

FEMA recently released the 2019 National Preparedness Report.  Simply put, I’m confused.  Nothing in the report actually lines up with doctrine.  It leaves me wondering how we are actually measuring preparedness.  So what’s the issue?

While the National Preparedness Report is initially structured around the five mission areas (Prevention, Protection, Mitigation, Response, and Recovery), the only full accounting of the Core Capabilities in the report is a table on page 9 outlining the usage of grant funds per Core Capability.  Beyond that, the Core Capabilities are merely listed on the title page of each mission area within the detailed findings, with no detail on progress within any of them.  Absent that analysis, we are not seeing data on the progression of preparedness, which, per the National Preparedness Report itself, is measured through the lens of each of the Core Capabilities.

Matters are further confused on pages 45 and 48, in particular, where tables list the Community Lifelines alongside some sort of correlated ‘capabilities’ (noted with a lowercase ‘c’, and thus not the Core Capabilities).  These capabilities do not appear in any doctrine I can find or recall, including the components and subcomponents for each Community Lifeline provided in the Community Lifelines Toolkit.  For each of these the report provides some analytical data, but it’s unclear what that data is based upon.  The methodology provided early in the document does nothing to explain why this change in format occurred or where these specific data sets come from, much less why they deviate from the previous format and the standards provided through the National Preparedness Goal.

Some perspective… It would seem logical that the National Preparedness Report would be assessing our national state of preparedness relative to the National Preparedness Goal, as it has since its inception.  The National Preparedness Goal is structured around the five mission areas and the 32 Core Capabilities.  With the emergence of the Community Lifelines and their inclusion in the recent update of the National Response Framework, it makes sense that we will see Community Lifelines further integrated into standards, doctrine, and reports, but they have yet to be integrated into the National Preparedness Goal (the current version is dated 2015).  We have not yet seen a comprehensive crosswalk between the Community Lifelines and the Core Capabilities, but it should be recognized that there are certain aspects, even if you just examine the Response Mission Area, that don’t match up.

In an unrelated observation on the National Preparedness Report, the trend continues of citing after-action reports from the year without actually providing any analysis of lessons learned or of how those lessons are being applied across the nation.

Bottom line… while there are some valuable nuggets of information in this report, I find most of it confusing: it lacks a consistent format of its own, and it is inconsistent with the existing standard of measurement as defined by the National Preparedness Goal.  Why is this a big deal?  First, it’s a deviation from the established standard.  While the standard may certainly have room for improvement, the standard must first be changed before the metrics in the reporting can be changed.  Second, with the deviation from the standard, we aren’t able to measure progress over time.  All previous National Preparedness Reports have provided data within the scope of the Core Capabilities, while this one largely does not, which breaks the possibility of any trend analysis.  Third, no reasoning is provided for the capabilities (lowercase ‘c’) associated with each of the Community Lifelines in the report.  It’s confusing to the point of irrelevance, because the information provided is not within the existing lexicon used to measure practically everything in preparedness.

Simply put, this year’s report is even more disappointing than those provided in previous years.  In fact, since it doesn’t conform with the current standard, I’d suggest it’s not even valid.  This should be far better.

Thoughts?

© 2019 – Timothy Riecker, CEDP

Emergency Preparedness Solutions, LLC
