Figure 1 - NOAA caught Rita’s eye in this GOES-12 satellite image at 1-km resolution. Courtesy of NOAA.

Hurricane Season

Sensing Mother Nature’s Changing Moods

By Rod Franklin, Reporter
Denver, Colo.


Sorting order from the ferocity of summer’s large tropical and sub-tropical storms is a developing art that merges human wit with quick-response experience, remote sensing technology and computer science. The events of 2004 and 2005 reminded the imaging community of how severely hurricanes and tsunamis can destroy food and water supplies while completely obstructing the ingress and egress routes by which first responders would logically hope to retrieve the imperiled, the wounded and the dead.

During and after such tempests, image modeling geographers try their best to superimpose known reality onto the windy, dark canvasses of the moment. It isn’t an easy trick. Ask anyone who watched hurricanes Katrina and Rita storm ashore like a pair of enraged sister shrews who were in no mood to accommodate any of the personnel in charge of GIS networks or rescue efforts.

Some who worked the front lines can remember all too well the ill-equipped staging areas and the occasional stressed-out jurisdictional spitting match — experiences that made already difficult situations even more so. In the months that have elapsed, they have had a chance to review the massive body of logistical intelligence collected from satellites and aircraft, as well as to analyze the more human aspects of the rescue response.

From a technical perspective, remote sensing’s contribution to last year’s efforts appears to have carried fewer technical glitches than some of the other chores necessary to stage command posts, deploy rescue teams and distribute supplies. Pilots and air controllers exercised their common-sense training when it came to flying in proximity to Mother Nature’s hissy fits. If anything, they may have done their job too well: Large organizations like the National Oceanic and Atmospheric Administration (NOAA) collected more images than could be digested by the makeshift networks set up to process them.

NOAA and 3001, Inc. (Fairfax, Va.) recorded so much imagery in the days after Katrina hit the coast that software experts “didn’t really know how to process it,” recalls Louis Demargne, director of corporate marketing for EarthData International Inc.

He isn’t exaggerating. On Aug. 31, NOAA posted more than 350 aerial images of the devastated area between Bay St. Louis and Pascagoula, Miss. The next day it produced an additional 1,450 aerial images, including those picturing the mess Katrina had made of New Orleans. On Sept. 5, NOAA added another 1,100 images to the repository.

EarthData was called in to help sift the various datasets in ways that could help rescuers delineate where houses and bridges had once stood. “They tasked us to correct the mosaic and extract information such as the debris line,” Demargne said.

To deal with the exigencies of emergency logistics planning, the company generally favored color infrared imaging. “It’s a little bit easier to produce a land cover map,” Demargne explained. “We can calculate the total amount of impervious surfaces and plug that into a hydrologic runoff model. What we did is provide them with the basic information that they needed to run their models.”
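By way of illustration, the short sketch below shows one simple form such a calculation can take: the rational method, Q = C·i·A, with the runoff coefficient weighted by the impervious fraction pulled from a land-cover map. The method, coefficients and acreage here are illustrative assumptions, not EarthData's actual model or figures.

```python
# Illustrative sketch only: peak runoff via the rational method (Q = C * i * A),
# with the runoff coefficient weighted by the impervious fraction derived from
# a land-cover classification. All numbers are assumed for the example.

def peak_runoff_cfs(area_acres, impervious_fraction, rainfall_intensity_in_hr):
    """Return peak runoff Q in cubic feet per second."""
    c_impervious = 0.90   # typical coefficient for pavement and rooftops
    c_pervious = 0.25     # typical coefficient for lawns and open ground
    c = impervious_fraction * c_impervious + (1.0 - impervious_fraction) * c_pervious
    return c * rainfall_intensity_in_hr * area_acres

# A 500-acre drainage area classified as 60 percent impervious, under a
# 2 in/hr design storm, yields roughly 640 cfs of peak runoff.
print(round(peak_runoff_cfs(500, 0.60, 2.0)), "cfs")
```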

For Dr. Nan Walker, director of the EarthScan Laboratory at Louisiana State University’s (LSU) Coastal Studies Institute in Baton Rouge, the better-established technologies again proved their mettle. What worked best for New Orleans were the multispectral modalities, although imagers had to wait for the clouds to clear before they could gather clean data. Radar data, for its part, proved very useful for mapping oil slicks.

According to most accounts, companies and organizations holding title to valuable stores of data banded together in a spirit of cooperation. Remembers Walker, “We got our radar imagery through NOAA, and everybody that usually wants to sell their imagery was willing to share it. I’m pretty sure that the big companies, once they see the light and the scale of a disaster, they open up their libraries.”

In pre- and post-storm analysis, some imaging technologies have proven more useful than others. LiDAR, for example, measures the physical distance from an airborne sensor to the first reflective surface below it, which in a flood is the top of the water rather than the ground beneath. That makes it less helpful in situations where the modeler wants to quantify standing storm water on a volumetric basis, so that rational pump-deployment decisions can be made, because a volume estimate also requires knowing the submerged terrain.
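A volumetric estimate of that kind needs two surfaces: the terrain under the water and the water level itself. The sketch below, with a made-up 3-by-3 elevation grid and cell size, shows the basic differencing step; it is a minimal illustration, not any agency's production workflow.

```python
import numpy as np

# Minimal illustration: estimate standing-water volume by differencing an
# observed flood stage against a pre-storm bare-earth elevation model.
# The 3x3 grid, 10 m cell size and 0.5 m stage are made-up example values.

cell_size_m = 10.0
ground_elev_m = np.array([[ 0.5,  0.3, 0.8],
                          [-0.2,  0.0, 0.4],
                          [-0.5, -0.1, 0.6]])   # pre-storm bare-earth elevations (m)
water_surface_m = 0.5                            # observed flood stage (m, same datum)

depth_m = np.clip(water_surface_m - ground_elev_m, 0.0, None)   # depth in each cell
volume_m3 = depth_m.sum() * cell_size_m ** 2
print(f"standing water: {volume_m3:.0f} cubic meters "
      f"({volume_m3 * 264.172:,.0f} gallons)")
```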

But new techniques are always coming to the fore. Jacqueline Mills, coordinator of LSU’s GIS Clearinghouse, said personnel working in the Lower Ninth Ward of New Orleans were grateful for the GPS-encoded digital video system loaned by the National Center for Geocomputation at the National University of Ireland. As window-mounted video cameras roll, the changing location of the vehicle can be traced onto a street network, greatly simplifying the chore of assessing damage, flood heights, and markings for search and rescue.

Figure 2 - Another satellite image showing Rita as it made landfall. To the right is Florida, and to the left is Mexico. (Courtesy of NOAA).
“This type of geospatial technology could readily replace the manually intensive paper-only recording approaches of Red Cross damage assessment teams,” Mills wrote in a recent correspondence. “The obvious benefit of this system is the speed and accuracy of data recording, and the dramatic reduction in human effort.”
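The core of the tracing Mills describes is simple to picture: each timestamped GPS fix from the rolling camera is snapped to the nearest street segment, so every video frame inherits a street. The fragment below is a hypothetical illustration of that matching step, with made-up coordinates and street names; it is not the Irish center's actual software.

```python
import math

# Hypothetical illustration: snap one GPS fix from a vehicle-mounted video log
# to the nearest street segment. Coordinates are in a local projected system
# (meters); the two streets and the fix are made-up example data.

streets = {
    "Street A": ((0.0, 0.0), (200.0, 0.0)),
    "Street B": ((0.0, 50.0), (200.0, 50.0)),
}

def distance_to_segment(p, a, b):
    """Distance from point p to the line segment a-b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

gps_fix = (120.0, 12.0)   # one timestamped position from the video log
nearest = min(streets, key=lambda name: distance_to_segment(gps_fix, *streets[name]))
print("frame snapped to:", nearest)   # -> Street A
```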

But flooding was by no means the whole of it. When Katrina struck, offshore oil platform valves and pipelines across a wide area of the Gulf region bent or snapped, fouling an estimated 534 square miles with crude oil, according to Skytruth (Shepherdstown, W.Va.), a nonprofit that uses remote imaging to expand awareness of environmental issues. Images Skytruth recorded on Sept. 2 showed slicks over an area calculated to contain about 38,000 gallons of spilled oil. That figure assumed the slicks were just 0.1 µm (1/10,000th of a millimeter) thick, the lower limit for detection on radar satellite images, so it is really a minimum.
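The minimum-volume logic is straightforward arithmetic, worked through below with the figures cited above and standard unit conversions; treating the 534 square miles as the slick area reproduces a number in the same range as the roughly 38,000 gallons Skytruth reported.

```python
# Worked version of the minimum-volume estimate described above.
slick_area_sq_mi = 534          # slick extent cited in the text
min_thickness_m = 1.0e-7        # 0.1 micrometer, the radar detection floor cited

SQ_M_PER_SQ_MI = 2_589_988.11   # square meters per square mile
GALLONS_PER_M3 = 264.172        # U.S. gallons per cubic meter

volume_m3 = slick_area_sq_mi * SQ_M_PER_SQ_MI * min_thickness_m
print(f"minimum spilled volume: about {volume_m3 * GALLONS_PER_M3:,.0f} gallons")
# -> roughly 37,000 gallons, consistent with the ~38,000-gallon figure above
```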

To get better estimates when mapping oil spills, fate/transport models (typically analytical solutions to simplified transport equations) can be constructed to account for sea currents, temperature, turbidity and other factors. Skytruth founder John Amos points to these exercises as good areas for further research by specialists in the thermal infrared and ultraviolet portions of the spectrum.
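The simplest of those transport equations have closed-form solutions. The sketch below evaluates one of them, the two-dimensional advection-diffusion solution for an instantaneous surface release carried by a uniform current; the release mass, current speed and diffusivity are illustrative assumptions, not values from Skytruth or NOAA.

```python
import math

# Illustrative sketch: analytical 2-D advection-diffusion solution for an
# instantaneous release of mass M (kg) at the origin, advected by a uniform
# current (u, v) in m/s and spread by horizontal diffusivity D in m^2/s.
# All parameter values are assumptions chosen for the example.

def surface_concentration(x, y, t, M=1000.0, u=0.3, v=0.1, D=5.0):
    """Concentration (kg per square meter) at point (x, y) and time t seconds."""
    if t <= 0:
        raise ValueError("t must be positive")
    r2 = (x - u * t) ** 2 + (y - v * t) ** 2
    return M / (4.0 * math.pi * D * t) * math.exp(-r2 / (4.0 * D * t))

# Concentration six hours after release, at the point the current has carried
# the center of the slick to (u*t, v*t):
t = 6 * 3600.0
print(surface_concentration(0.3 * t, 0.1 * t, t))
```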

Skytruth images showing lingering oil several weeks after the storm suggested that up to 100,000 gallons of oil were eventually released from leaking platforms and pipelines. With lives and freshwater quality at stake, however, other emergency response chores took precedence. Katrina tipped more than nine million gallons of oil out of containers in and around New Orleans, creating a hazard more ominous than the one wafting over the ocean’s surface. “The Coast Guard was quite rightly throwing its energy into those spills,” Amos said.

Figure 3 - This satellite image shows the dry areas of New Orleans in pink and flooded regions in blue (Courtesy of SPOT Image).
These spills amounted to little more than grease spots when compared with the 11,000 square miles of Alaska waters polluted in 1989 as a result of the Exxon Valdez accident. “Back in 1989 there wasn’t a whole lot of geospatial technology being used on the Exxon Valdez oil spill,” recalls Leslie Pearson, emergency response program manager for the state Department of Environmental Conservation in Anchorage. “We didn’t even have an Internet back then. It was a real piecemeal effort.” Remote imaging and GIS have since allowed responders to better prepare themselves for ocean oil spills of any size.

When it comes to errant water, there are roles reserved for most imaging resolutions at some point along the emergency response timeline. The IKONOS satellite owned by GeoEye of Dulles, Va., for example, was able to document the extent of damage along the coast of Sumatra from the Indian Ocean tsunami in late 2004 — information that was put to use for purposes of initial response planning.

“That imagery was quickly made available to the U.S. government, and some of it was released to the media,” said Mark Brender, vice president of communications and marketing. “It really provided a contextual assessment for policymakers even before ground-level reports were aggregated. You could see which airports still had runways. You could see how extensive the damage was, so you could gauge the level of the response.”

As the imaged resolutions of flooded regions become finer in any scenario, with satellite work giving way to aerial imaging, and finally to land-based GPS, response strategies and equipment planning can proceed at scales tailored more realistically to real-world needs.

Many of the lingering weaknesses in the technology have to do with human factors. Everyone remembers the anger and helplessness reflected in the faces of Katrina’s rooftop refugees. Some accounts have it that the mood in the staging areas was no less frenetic, with network and computer hardware experts in cramped offices tripping over the feet of GIS modelers, imaging specialists, data miners and others as they all rushed to produce maps and other logistical guidance.

Amid such chaos, emergency managers are coming to understand the promise of more formal operational and behavioral protocols. “Hurricane Katrina: GIS Response for a Major Metropolitan Area” was a post-mortem critique authored by several LSU faculty who worked the storm. It identified data organization and access, chaotic working environments and request tracking as the three areas most in need of improvement.

The report concluded as follows: “Two of the main findings that can enable greater efficiency in GIS response to a similar disaster are the need for 1) a specific response plan that includes the use of volunteers early in the response and designates these people well before a disaster; and 2) digital and paper copies of existing products for standard requests, such as road maps for the impacted areas, which save time, money, and personnel from duplicating efforts on a product that is already available.”

On a wider level, it remains to be seen if and when the remote imaging and GIS industries will shake hands in a serious way with the media. While the drama of the lone reporter standing amid sheets of rain while dodging assorted flying detritus continues to sell, newer uses of geospatial data are beginning to tease the idea of more detailed and practical reportage. How many viewers, for example, might opt to replace the reporter’s standup with a graphic illustrating the proximity of nursing home residents living next to the clogged storm culvert at the edge of town, or the homes situated under the quaking limestone-capped hilltop of saturated mud?

Figure 4 - Skytruth used existing data to show the proximity of the storm to undersea oil pipelines (Courtesy of Skytruth).
Theoretically, imagers and data modelers could pool their hardware and expertise with that of television networks, cable networks and independent stations and their meteorologists, which traditionally have been shackled by the one-dimensional limitations of video when it comes to reporting stories with graphic demographic data or image overlays.

A few creative ideas have surfaced. During Rita, the Federal Emergency Management Agency (FEMA) mapped the projected storm path to concentrated populations of children under the age of five just north of the coastline at the Texas-Louisiana border. Demographics were also used to pinpoint additional at-risk groups in the storm’s right-of-way, including populations of the elderly and the incarcerated.
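The overlay behind that kind of analysis can be sketched compactly: buffer the forecast track, then flag the census units inside the buffer that hold concentrations of the at-risk group. The example below is a hypothetical illustration with made-up track coordinates and block-group counts, not FEMA's data or software.

```python
import math

# Hypothetical illustration: flag census block groups whose centroids fall
# within a crude buffer of a forecast storm track and that hold concentrations
# of children under five. Track, buffer width and counts are made-up examples.

track = [(-94.1, 29.6), (-93.9, 30.2), (-93.7, 30.9)]   # forecast track (lon, lat)
buffer_deg = 0.5                                        # crude buffer, in degrees

block_groups = [
    {"id": "BG-001", "lon": -93.95, "lat": 30.15, "under_5": 412},
    {"id": "BG-002", "lon": -92.40, "lat": 30.40, "under_5": 655},
]

def distance_to_track(lon, lat):
    """Smallest distance (in degrees) from a point to any vertex of the track."""
    return min(math.hypot(lon - tx, lat - ty) for tx, ty in track)

at_risk = [bg["id"] for bg in block_groups
           if distance_to_track(bg["lon"], bg["lat"]) <= buffer_deg
           and bg["under_5"] > 300]
print(at_risk)   # -> ['BG-001']
```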

As the technologies mature, remote sensing and imaging can only add greater value to disaster relief data aggregation. The areas that appear to need attention at this writing include more formalized response protocols and more widespread implementation of imaging techniques that allow responders to gauge the severity of a storm’s damage at ground level in a timely manner.