In the panic and confusion of a disaster, much can be unclear.
In a flood, which building can be a refuge, and which needs to be evacuated? Which roads will be safe for rescue crews if the water keeps rising?
In a tornado or earthquake, can past events give clues about what kind of heavy equipment, cutting tools or rescue gear will be needed?
In a violent attack, how can local agencies collaborate with citizen witnesses who are immediately sharing critical information?
Open data can inform a reliable response to such uncertainty.
The U.S. Federal Emergency Management Agency (FEMA) met the challenge of the hurricane trifecta of Harvey, Irma and Maria by collaborating with the Department of Homeland Security and the Department of the Interior to create one-stop, open-data-driven portals for response and recovery efforts, built on the ArcGIS platform from Esri (Environmental Systems Research Institute).
Thanks to dozens of datasets, the portals offered information on which roads were impassable due to elevated flood waters, which pharmacies were open or closed due to storm effects, the locations of vulnerable populations, the sites of at-risk infrastructure hubs (such as power substations or fuel storage centres) and even the locations of available plug-in stations for electric vehicles. First responders may never before have had access to so much information.
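The specific services behind those portals aren't detailed here, but ArcGIS-hosted open data is typically exposed through a standard feature-service REST query endpoint. A minimal sketch of pulling one such layer, assuming a hypothetical road-closure service URL and field names rather than FEMA's actual layers, might look like this:

```python
# Minimal sketch: query one disaster-response layer from an ArcGIS
# feature service via its public REST API. The service URL and the
# field names ROAD_NAME / STATUS are hypothetical placeholders.
import requests

ROAD_CLOSURES_URL = (
    "https://services.arcgis.com/EXAMPLE_ORG/arcgis/rest/services/"
    "RoadClosures/FeatureServer/0/query"  # hypothetical layer
)

params = {
    "where": "1=1",      # no filter: return every closure record
    "outFields": "*",    # all attribute fields
    "f": "geojson",      # GeoJSON drops straight onto a web map
}

resp = requests.get(ROAD_CLOSURES_URL, params=params, timeout=30)
resp.raise_for_status()

for feature in resp.json()["features"][:10]:
    props = feature["properties"]
    print(props.get("ROAD_NAME"), "-", props.get("STATUS"))
```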
The U.S. federal portals also included links to crowdsourced open data solutions. Waze, a community mapping initiative that helps drivers find the best route to work on normal days, became part of the Harvey response by providing road-condition updates every two minutes in hurricane-ravaged areas.
Other tech companies demonstrated the versatility of their platforms through open data. The U.S.-based non-profit data.world aggregated datasets to produce an online list and interactive map of open shelters in Florida following Hurricane Irma.
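The aggregation step is conceptually simple: combine shelter records from multiple sources and publish them in a format any web map can render. A minimal sketch of that idea, assuming a hypothetical shelters.csv with name, status, lat and lon columns (not data.world's actual pipeline):

```python
# Minimal sketch: turn a CSV of shelter records into GeoJSON for a web
# map. The file name and column names are assumptions for illustration.
import csv
import json

features = []
with open("shelters.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row["status"].strip().lower() != "open":
            continue  # keep only shelters currently reported open
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                "coordinates": [float(row["lon"]), float(row["lat"])],
            },
            "properties": {"name": row["name"], "status": row["status"]},
        })

with open("open_shelters.geojson", "w", encoding="utf-8") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f, indent=2)

print(f"Wrote {len(features)} open shelters")
```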
The Humanitarian OpenStreetMap Team (HOT) has been doing earthquake response since the Haiti quake of 2010, crowdsourcing citizen input to produce post-quake maps of affected areas. The team led the Latin American OpenStreetMap response after the deadly Mexican quake of Sept. 7, mapping the damage to buildings, roads and other infrastructure through dozens of online projects.
The diversity of response — historic datasets, real-time observation, crowd-sourced reaction — suggests an emerging challenge facing first responders and recovery teams: too much information.
Canada, too, has faced natural disasters: hurricanes, tornadoes, wildfires, floods and earthquakes. Crowdsourced and online data tools such as Google Crisis Map have been used in response to such emergencies, but with emergency response and management the responsibility of provincial and municipal agencies, there is no national equivalent to the FEMA-organized portals.
“That’s a challenge for emergency managers hoping to realize the potential of datasets,” says Sarah Thompson, Projects Chair on the Board of Directors of the Ontario Association of Emergency Managers (OAEM).
In an article published this month on the OAEM website, Thompson writes that incorporating data management expertise is key to unlocking the untapped potential of emergency management programs, including tools such as Facebook’s Safety Check, which was heavily used following the Las Vegas shooting this month.
But a public assumption that social media applications will solve all problems underestimates the complexity of the available material and overlooks how limited the resources of emergency management agencies are for managing and analyzing such data.
She wrote, “Beginning this conversation with how to use social media, or open data (or other common data initiatives), ignores many initial planning steps. It’s like building a house without a foundation or building codes. In essence, such projects could end up without quality standards, a common operating picture, nor any assurance of stability or sustainability. Not to mention that it overlooks the potentially huge additional workload associated with management and integration of this sort of data.”
In the same post, Thompson notes challenges familiar to entrepreneurs hoping to use open data: varying formats for open data, limited mandates or a silo mentality among open data custodians, and a lack of expertise among potential clients in interpreting datasets.
But those challenges are also opportunities. In an email exchange with ODX, Thompson says that open data is becoming part of the toolkit of emergency response: “Emergency Operations centres, namely the planning section (situation awareness, in particular) and ops, have much to gain from the kind of intelligence that is possible from analyzing and processing open/big data.”
Some of these open data projects are starting small, such as the work by the Vancouver-area North Shore Emergency Management Office, which is adapting FEMA’s HAZUS loss-estimation program to predict potential earthquake and tsunami damage; the City of Toronto Open Data Program; and the flood hazard modelling being undertaken by the Toronto and Region Conservation Authority.
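HAZUS itself is a large modelling package, but the core idea of this kind of loss estimation can be sketched simply: fragility curves convert a ground-motion value into damage-state probabilities, which are then weighted by damage ratios and replacement cost. A minimal illustration with made-up parameters, not actual HAZUS values:

```python
# Minimal sketch of a HAZUS-style loss estimate for a single building.
# Lognormal fragility curves give the probability of reaching each
# damage state at a given shaking level (peak ground acceleration);
# expected loss is the probability-weighted sum of damage ratios times
# replacement cost. All parameters below are illustrative only.
from math import log, sqrt, erf

def lognormal_cdf(x: float, median: float, beta: float) -> float:
    """P(capacity <= x) for a lognormal fragility curve."""
    return 0.5 * (1.0 + erf(log(x / median) / (beta * sqrt(2.0))))

# P(damage >= state | PGA): (median PGA in g, beta), illustrative values
FRAGILITY = {"slight": (0.15, 0.6), "moderate": (0.30, 0.6),
             "extensive": (0.60, 0.6), "complete": (1.00, 0.6)}
DAMAGE_RATIO = {"slight": 0.02, "moderate": 0.10,
                "extensive": 0.40, "complete": 1.00}

def expected_loss(pga: float, replacement_cost: float) -> float:
    states = list(FRAGILITY)
    # Probability of reaching at least each damage state at this PGA.
    p_exceed = [lognormal_cdf(pga, *FRAGILITY[s]) for s in states]
    loss = 0.0
    for i, state in enumerate(states):
        # Probability of being in exactly this state (not a worse one).
        p_next = p_exceed[i + 1] if i + 1 < len(states) else 0.0
        loss += (p_exceed[i] - p_next) * DAMAGE_RATIO[state] * replacement_cost
    return loss

print(f"Expected loss: ${expected_loss(pga=0.4, replacement_cost=500_000):,.0f}")
```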
Thompson says there is an opportunity for federal and provincial support to guide data governance and help the emergency management sector harness today’s growing data resources. The need for one or more accessible open data platforms creates possibilities for a wide range of enterprising collaboration, as this area is “open to opportunities for innovation, and development of capability, in utilizing open data for emergency response. . . in Canada, I think we are just scratching the surface.”