Ever think a problem was fixed, just to find that the solution was really more of a problem, or a totally different kind of problem? While this can certainly happen in our personal lives, I see it happen a lot in my professional life, and I’m sure you do as well. Through my tenure in emergency management, I’ve seen a lot of ill-informed assessments, poorly written plans, misguided training programs, bad hires or contracts, unwise equipment purchases, and exercises that could really be called damaging. Not only is the time, money, and effort put into developing these wasted (aside from learning how not to do them), they can have ramifications that create new issues needing to be solved, whether in the short term or down the road.
Poorly conducted assessments can result in a lot of problems. If the data, the analysis, or the conclusions are wrong, the consequences can be considerable, especially if that assessment was intended to inform other projects, such as plans, construction, hazard mitigation efforts, staffing, and more. I’ve seen people point to reports with the assumption that the data was complete, the analysis unbiased, and the conclusions correct – with something akin to blind obedience. When an assessment is used to justify spending and future efforts, we need to ensure that the assessment is carefully planned and executed. Similarly, we’ve all seen a lot of decisions based on no assessment at all. This can be just as dangerous.
Bad planning is a problem that has always plagued emergency management, and I fear it always will. Of course, there are some really stellar plans out there, but they seem to be the exception. There is an abundance of mediocre plans in existence, which I suppose are passable but in the end aren’t doing anyone any favors: they tend not to include much useful information, specifics on procedure, or job aids to support implementation of the plan.
Here’s an example of how disruptive bad plans can be: A few years ago, my firm was hired by a UASI to design, conduct, and evaluate a couple of exercises (one discussion-based, the other operations-based) to validate a new plan written for them by another firm. Since the exercises were to be based on the plan, I took a deep dive into it. I honestly found myself confused as I read. I forwarded the plan to a member of our project team to review and, quite unsolicited, received a litany of communications expressing how confounded he was by it. At the very best, it was disorganized and poorly thought out. The subject matter lent itself to a timeline-based progression, which the authors seemed to have started and then abandoned, resulting in a scattering of topic-based sections that were poorly connected. After conferring with that team member to develop some very specific points, I approached our client for a very candid conversation. I came to find out that the planning process recommended and established by CPG-101, NFPA 1600, and others was not used at all; instead, the firm that built the plan didn’t confer with stakeholders and delivered (late) a final product with no opportunity for the client to review and provide feedback. This is the kind of firm that gives other consulting firms a bad name. Working with the client, we restructured our scope of work, turning the tabletop exercise into a planning workshop, which we used to inform a full re-write of the plan, which we then validated through the operations-based exercise.
Having been involved in training and exercises for the entire duration of my career, I’ve seen a lot of ugly stuff. We’ve all been through training that is an epic waste of time – training that was clearly poorly written, wasn’t designed with the intended audience in mind, and/or didn’t meet the need it was supposed to. For the uninitiated, I’ll shamelessly plug my legacy topic of ICS Training Sucks. Possibly even worse is training that teaches people the wrong way to do things. Similarly, poorly designed, conducted, and evaluated exercises are not only a waste of time, but can be very frustrating, or even dangerous. Don’t reinforce negative behavior, don’t make things more complex than they are, don’t put people in danger, and DO follow established guidance and best practices. Finally, if you are venturing into unknown territory, find someone who can help you.
Equipment that’s not needed, has different capabilities than what is needed, is overpurchased, underperforms, goes untrained on, is poorly stored and maintained, becomes readily obsolete, and is never used. Familiar with any of this? It seems to happen with a lot of agencies. Much of this seems to stem from grant funding that has very specific guidelines and must be spent in a fairly short period of time. Those who have been around for a while will remember the weapons of mass destruction (WMD) preparedness program that started prior to 9/11 and was bolstered by post-9/11 program funding. The centerpiece of this program was equipment purchases. While some good came from this program, I witnessed a lot of wasted money and misguided purchases: equipment that wasn’t needed, bought for jurisdictions that didn’t need it or couldn’t sustain it, along with the supporting training and exercises to teach people how to use the equipment and keep them proficient. A lot of this circles back to poor (or non-existent) assessments used to inform these purchases, but the real culprit here is the ‘spend it or lose it’ mentality of grant surges like this. Foundational aspects of the program, such as defined need, sustainability, and interoperability, were often skewed or ignored in favor of simply spending the funds that were thrust upon jurisdictions. I really blame the poor implementations I saw and heard of at the state and local levels on the poor structuring of this program at the federal level.
There are so many other examples of poor implementations that cause problems: poorly built infrastructure, misguided hazard mitigation projects, and even poor responses. In the realm of response, I’ll draw on another example that I was involved in. Large disasters really do need to draw on a whole-community approach, which often leads agencies that aren’t used to large-scale and long-duration incident operations to get in over their heads. In one large disaster, I was hired to help lead a team assembled to fix just such an occurrence, charged with rescuing a functionally necessary program that had been managed into the ground by a well-intentioned but overly bureaucratic agency with a high degree of micromanagement. The time, money, and effort exerted to save this program from itself was fairly extensive and, in implementation, challenging, given the layers and nuances created by the agency that built it. In the end, their biggest issues were not listening to subject matter experts, some of whom were in their own agency, and, ultimately, a failure of executives to deal with very apparent problems.
Most emergency management agencies operate on very slim and limited budgets, so being efficient and effective is of great importance. Don’t waste limited money or the limited time of limited staff. Sometimes the things with the greatest impact are simple, but if executed poorly, the consequences can be high. Think things through and consult the right people. It makes a difference.
©2020 Timothy Riecker, CEDP