Vantage Point: Bridging the Digital (Flood Data) Divide

The first Technical Mapping Advisory Council (TMAC) to FEMA noted in its very first report (1996) that embracing the digital environment would improve efficiency in generating, managing, integrating, and distributing flood data. The National Research Council’s 2009 report, "Mapping the Zone: Improving Flood Map Accuracy", also stressed the significant advances made possible by a fully digital environment, finding that digital mapping creates opportunities to significantly improve communication of flood hazards and flood risks through maps and web-based products. That report further recommended that FEMA ensure new flood information, revisions, and Letters of Map Change be incorporated into the digital Flood Insurance Rate Maps (FIRMs) as soon as they become effective, something that can only be accomplished in a fully digital environment.

The present TMAC has also embraced this view of a digital future and its opportunities to update and disseminate data more rapidly and to present data in different formats for more effective communication with different user groups. At the same time, TMAC recognizes the need to establish accuracy and precision specifications for each data layer, and to provide documentation of source and methodology. Compilation standards for individual data layers and the information within them may already exist for many of those layers.

Regardless of the benefits of such changes, we must acknowledge that not everyone can fully utilize digital information–or even access it. Some communities in the National Flood Insurance Program have only the most basic computer hardware, software, and skills. The information-seeking public at large can be in the same situation or an even more computer-challenged one, and may be confused or feel disenfranchised as a result; public awareness and understanding of flood hazards and flood risks may require additional modes of information delivery. Even technical data users do not always have software compatible with the available datasets, or the capability to handle the large data files associated with flood studies.

Such an array of conditions presents challenges regarding the human and technological aspects of a fully digital world as we move toward that endpoint. This is true no matter what kind of data we are talking about, but particularly in a nationwide program affecting so many individuals and businesses. The following considerations may be specific to floodplain data, but they are pertinent when building any system for digital data storage and distribution:
• Access to information: We must acknowledge the varied abilities, or outright inability, of non-technical stakeholders to access data, much less use it. For these stakeholders, there must be an alternate means of access, whether by non-digital media or perhaps through viewers that can be used off-line. For more technical constituents, other factors pertain. While transparency and accessibility are critical, how do we protect the integrity of core data? Should all data be open to everyone? How much access should be provided, to whom, to which data, and to what depth within the database?
• Usefulness to stakeholders: The world changes, and so do its floodplains. Is the new data/product better (more detailed, more precise, more accurate, more understandable) than what exists? Does it answer the needs of users? Does it address multiple needs (mitigation planning, evacuation routing, siting of critical facilities, etc.)? Credibility is critical for implementing, enforcing, and accepting floodplain management decisions. Do stakeholders understand–and believe–the reasons for changes to flood zone designation or extent (horizontal and/or vertical)?
• Public education and outreach: Some stakeholders, especially non-technical property owners, may resist the all-digital display of flood risks. Others may believe that anything appearing on a computer screen is error-free. For these (and other) reasons, FEMA’s efforts in reaching various user groups must be ongoing and evolve as products and methods of presentation change.
• Appropriate and acceptable uses of data: Not all data is of the same level of accuracy, and therefore not all data is reliable or appropriate for every intended use. Users, both technical and non-technical, must be informed and educated regarding data quality and limitations. Full documentation of data accuracy should be available both inside and outside of metadata for better recognition by all users.
• Legal equivalence of different formats: Do different formats carry the same legal weight, or do modifications/extracts change that? While some products are clearly identified (such as the note in the title block of every FIRMette created on FEMA’s Map Service Center website), not all are.
• Timeliness of data/product availability: How is notification of changes accomplished? When are changes distributed? What are the repercussions of timing? The timeliness and coordination of digital data availability and distribution directly affect community floodplain management and welfare.
• Consistency and stability: Should data be usable for a predictable period of time? Currently, it changes constantly. Tracking versions is important to document both the effective date and the specific information altered, added, or deleted between versions (a minimal sketch of such a version record appears after this list).
• Modifications of procedures: Every change in procedure has a direct impact on data users, particularly as projects may be in progress when the modifications are announced or become effective. A clear plan for rolling out procedural modifications, with no retroactive application, is therefore imperative to avoid confusion.
• Proprietary versus public domain models and products: Various users encounter difficulties when faced with proprietary models and/or products. Some technical users may want to modify models to better accommodate and reflect local conditions but are unable to do so. Electronic data should be available in generic form to accommodate those communities and others without infrastructure, hardware, software, and/or expertise to use digital information in the currently available forms.
• Consistency of models: Different models can yield different results. To promote consistency, FEMA should provide guidance on selecting appropriate models under various circumstances.
• Costs of transitioning to a fully digital environment: While the long-term view shows the possibility of cost savings in data acquisition, analysis, dissemination, and storage, immediate costs are far from negligible. If fully informed of the incremental steps toward that goal, stakeholders can begin preparing their documents and data for integration into a seamless national system.
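
To make the version-tracking point concrete, here is a minimal sketch, written in Python, of what a revision record for a single digital flood-data layer might capture. Everything in it (the LayerRevision class, its fields, and the sample entries) is an illustrative assumption rather than any actual FEMA schema:

    from dataclasses import dataclass
    from datetime import date

    # Hypothetical record of one revision to a digital flood-data layer.
    # Field names are illustrative assumptions, not a FEMA schema.
    @dataclass
    class LayerRevision:
        layer_id: str          # which data layer this revision applies to
        version: str           # version label, e.g. "2.1"
        effective_date: date   # date the revision became effective
        changes: list          # what was altered, added, or deleted

    # Example: documenting what changed between two versions of a layer.
    revision = LayerRevision(
        layer_id="floodway-extent",
        version="2.1",
        effective_date=date(2016, 3, 15),
        changes=["Zone AE boundary extended along a sample stream reach",
                 "one Letter of Map Revision incorporated"],
    )
    print(revision.layer_id, "v" + revision.version,
          "effective", revision.effective_date)

Even this small set of attributes would let a user determine which version of a layer was effective on a given date and exactly what changed from the version before it.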

This topic and many others are addressed in detail in the current TMAC’s Annual Report. Look for it (and TMAC’s report on Future Conditions) on FEMA’s website at http://www.fema.gov/technical-mapping-advisory-council

Wendy Lathrop is licensed as a Professional Land Surveyor in NJ, PA, DE, and MD, and has been involved since 1974 in surveying projects ranging from construction to boundary to environmental land use disputes. She is a Professional Planner in NJ, and a Certified Floodplain Manager through ASFPM.
