Big Data: A Natural Solution for Disaster Relief

By Mike Smitheman, VP of Marketing at GoodData

Last Friday, a roughly 10,000-ton meteor streaked over northern Russia at about 40,000 miles per hour. The resulting shock wave injured an estimated 1,000 people. While scientists were focused on another, more massive object, the asteroid 2012 DA14, the Russian meteor slipped under the radar and caught experts off-guard.

With big data as common in science as it is everywhere else, could we have used better tools to see this coming? What’s the role of big data in natural disasters today?

The answer is a work in progress. NASA, for one, admits to currently having a big data problem. “(D)ata is continually streaming from spacecraft on Earth and in space, faster than we can store, manage, and interpret it,” writes NASA Project Manager Nick Skytland. “In our current missions, data is transferred with radio frequency, which is relatively slow. In the future, NASA will employ technology such as optical (laser) communication to increase the download rate and mean a 1,000x increase in the volume of data. This is much more than we can handle today, and this is what we are starting to prepare for now. We are planning missions today that will easily stream more than 24 TB a day. That’s roughly 2.4 times the entire Library of Congress – EVERY DAY. For one mission.”
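
To put those figures in perspective, here is a rough back-of-the-envelope sketch in Python. The 24 TB/day number comes from the quote above; the 300 Mbit/s radio-frequency link speed is purely an illustrative assumption, not an official NASA figure.

```python
# Rough back-of-the-envelope check of the data rates quoted above.
# The 24 TB/day figure comes from the NASA quote; the 300 Mbit/s
# radio-frequency link speed is an illustrative assumption.

TB = 10**12                       # one terabyte, in bytes (decimal convention)
daily_volume_bits = 24 * TB * 8   # "more than 24 TB a day" for a single mission
seconds_per_day = 24 * 60 * 60

sustained_gbps = daily_volume_bits / seconds_per_day / 10**9
print(f"Sustained downlink needed: ~{sustained_gbps:.2f} Gbit/s")  # ~2.22 Gbit/s

rf_link_gbps = 0.3                        # hypothetical radio-frequency downlink
optical_link_gbps = rf_link_gbps * 1000   # the quote's ~1,000x optical speed-up

for label, gbps in [("RF", rf_link_gbps), ("optical", optical_link_gbps)]:
    hours = daily_volume_bits / (gbps * 10**9) / 3600
    print(f"Hours to move one day of data over {label}: {hours:,.1f}")
```

Even under these simplified assumptions, a single mission would need a sustained downlink of more than 2 Gbit/s just to keep up, which is why the quote frames optical communication as a necessity rather than a luxury.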

NASA still needs to catch up with its data load. Other government agencies, meanwhile, are looking for ways to collaborate more effectively. The Department of Defense, for example, operates classified reconnaissance satellites around the world that also happen to be capable of detecting meteors both large and small. The DoD, however, is wary of sharing anything it deems classified, so efforts are still underway to fold that data into the broader scientific picture.

Real-Time Disaster Maps

Terrestrial challenges, on the other hand, are currently more amenable to big data. One of big data’s true strengths lies in crisis mapping: the process of using visualizations, footage, analysis and apps to build an evolving overview of a disaster. Google’s Superstorm Sandy Crisis Map tracked the course of last October’s storm, layering in video footage, evacuation routes and emergency aid centers. The UN commissioned the Digital Humanitarian Network to track the real-time effects of Typhoon Pablo in the Philippines; among other efforts, volunteers analyzed social media data to produce a detailed, real-time map of displaced people, fatalities, crop damage, broken bridges and more.
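
To make the aggregation step behind such a map concrete, here is a minimal Python sketch that bins geotagged reports into coarse grid cells by category. The sample reports, category names and 0.1-degree cell size are hypothetical illustrations, not the Digital Humanitarian Network’s actual pipeline.

```python
# Minimal sketch of crisis-map aggregation: bin geotagged social-media
# reports into a coarse grid by report type. All data and the cell size
# are illustrative assumptions.
from collections import Counter

reports = [
    # (latitude, longitude, category) -- hypothetical examples
    (7.21, 126.57, "bridge_damage"),
    (7.19, 126.55, "displaced_people"),
    (7.22, 126.58, "crop_damage"),
    (7.19, 126.56, "displaced_people"),
]

def grid_cell(lat, lon, cell_size=0.1):
    """Snap a coordinate to a grid cell so nearby reports aggregate together."""
    return (round(lat / cell_size) * cell_size, round(lon / cell_size) * cell_size)

crisis_map = Counter((grid_cell(lat, lon), category) for lat, lon, category in reports)

for (cell, category), count in crisis_map.most_common():
    print(f"cell {cell}: {count} report(s) of {category}")
```

A real deployment would sit this kind of aggregation behind a map layer and refresh it continuously as new reports stream in, which is what turns scattered social posts into an evolving overview.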

Relief Efforts

Done well, relief coordination can mean the difference between life and death. Big data has made it possible for relief organizations to respond more quickly and effectively.

To name a few examples: data analytics company Palantir worked with relief agencies to get rescuers data before they arrived on the scene, such as maps of fallen trees, power outages and gas shortages. NASA and the Open Science Data Cloud are working on Project Matsu, which uses Hadoop to analyze and timestamp satellite imagery, giving relief workers near-real-time disaster maps. And Google Person Finder reconnects disaster victims with their friends and families.
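
The Hadoop-style workflow described for Project Matsu can be sketched as a simple map/reduce pass over scene metadata: group images by region and keep the most recent capture time. The records, field names and local driver below are hypothetical stand-ins; the real system runs on a Hadoop cluster over actual imagery.

```python
# Sketch of the map/reduce pattern behind the description above: group
# satellite scenes by region and keep the newest capture, so relief workers
# see the freshest imagery. All records and field names are hypothetical.
from itertools import groupby
from operator import itemgetter

scenes = [
    # (region_id, capture_timestamp, image_uri) -- illustrative records
    ("tile_042", "2013-02-15T03:20:00Z", "s3://imagery/tile_042/a.tif"),
    ("tile_042", "2013-02-15T09:45:00Z", "s3://imagery/tile_042/b.tif"),
    ("tile_107", "2013-02-14T22:10:00Z", "s3://imagery/tile_107/a.tif"),
]

def map_phase(records):
    """Emit (key, value) pairs, as a Hadoop mapper would."""
    for region, ts, uri in records:
        yield region, (ts, uri)

def reduce_phase(pairs):
    """For each region, keep the newest scene (ISO timestamps sort lexically)."""
    for region, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield region, max(value for _, value in group)

for region, (ts, uri) in reduce_phase(map_phase(scenes)):
    print(f"{region}: latest scene {uri} captured {ts}")
```

The appeal of the pattern is that the same two small functions scale from this toy driver to a cluster processing terabytes of imagery, because each region’s scenes can be reduced independently.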

A Nascent Science

Big data’s contributions to crisis mapping and relief efforts have already made a real difference. Yet they’re just a drop in the bucket compared to the technologies that will emerge over the coming decade. We may have push alerts sent to our mobile phones, warning of natural disasters near our own location or near our loved ones. Rather than swarming the nearest location, disaster victims may be routed to the relief centers best able to accommodate them. With the right tools and communal effort, we may even be able to predict the trajectories of oncoming meteors.
