How Big Data is Aiding in Disaster Relief

Unfortunately, there’s just no way to prevent natural disasters from striking. Events like earthquakes, hurricanes, and floods can cause millions of dollars in damage, hundreds or thousands of fatalities and injuries, and displace entire regions. When infrastructure breaks down (or was never well established in the first place) in the wake of a disaster and communication becomes difficult, it is logistically tough to coordinate relief efforts and react quickly in a situation that demands immediate attention to save lives and prevent further damage. The world still struggles with disaster preparedness and relief, but new tools can help mitigate the impact of these events. In recent years, big data and artificial intelligence have guided aid agencies in preparing for and responding to natural disasters.

Predicting & Preparing for Disasters 

Predicting disasters is the first step in reducing damage and loss of life. Companies like Terra Seismic use satellite data and other factors to predict earthquakes with 90% accuracy, a huge advantage when dealing with this kind of natural disaster. Weather agencies can track patterns of storms and make predictions about which areas will be most vulnerable, helping to prepare residents for evacuation, if necessary.
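To make the idea of prediction a little more concrete, here is a minimal sketch of a risk classifier trained on synthetic "satellite-derived" features. The feature names, thresholds, and model choice are all hypothetical and far simpler than whatever companies like Terra Seismic or weather agencies actually run; this is only meant to show the shape of the workflow.

```python
# Illustrative sketch only: a toy risk classifier trained on synthetic data.
# Real forecasting systems use far richer data and models; the features here
# (deformation, thermal anomaly, seismicity rate) are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Hypothetical features per region, plus labels marking whether an event followed.
X = rng.normal(size=(500, 3))
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) > 1).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)

# Score a new region and report the estimated risk of an event.
new_region = np.array([[1.2, 0.3, 0.9]])
risk = model.predict_proba(new_region)[0, 1]
print(f"Estimated event risk: {risk:.0%}")
```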

Prior to the arrival of Hurricane Matthew in 2016, crews from the U.S. Geological Survey installed sensors in areas expected to experience heavy rain and wind from the storm. These sensors collected data to improve forecasting and to support disaster relief in the storm's wake, providing live updates on flooding and damage that allowed crews to plan relief efforts. The storm, which tragically killed over 800 people in Haiti, shows just how important ongoing hurricane monitoring is in countries all over the world. Some of these sensors are low-cost, but they still depend on supporting infrastructure to operate.
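The sketch below shows one simple way live sensor readings might be turned into flood alerts. The feed URL, JSON field name, and flood-stage threshold are all hypothetical placeholders rather than a real USGS interface; actual deployments read from agency-specific services.

```python
# Minimal sketch: poll a (hypothetical) water-level feed and raise an alert
# when the reading crosses an assumed flood stage for that gauge.
import time
import requests

SENSOR_FEED = "https://example.org/sensors/flood/12345"  # hypothetical feed
FLOOD_STAGE_FT = 9.0  # assumed flood stage for this gauge

def check_gauge():
    reading = requests.get(SENSOR_FEED, timeout=10).json()
    level = reading["water_level_ft"]  # assumed field name in the feed
    if level >= FLOOD_STAGE_FT:
        print(f"ALERT: gauge at {level} ft, flood stage exceeded")
    else:
        print(f"Gauge at {level} ft, below flood stage")

# Poll every 15 minutes during the storm.
while True:
    check_gauge()
    time.sleep(15 * 60)
```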

Government agencies and disaster relief organizations have made use of open-source platforms to upload helpful data and provide evacuation and relief information for area residents. During Super Typhoon Haiyan in the Philippines in 2013, volunteers, aid workers, and government sources uploaded 1.5 million updates to OpenStreetMap, an open-source map. This tool helped relief workers coordinate and assess damage.
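Because OpenStreetMap data is open, anyone can pull it back out through the public Overpass API. Here is a small sketch of such a query; the bounding box roughly covers the Tacloban area, and the "emergency"="shelter" tag is an assumption about how shelters might be mapped, not a guarantee of how local mappers tagged them.

```python
# Sketch: query crowdsourced OpenStreetMap data via the public Overpass API.
import requests

OVERPASS_URL = "https://overpass-api.de/api/interpreter"
query = """
[out:json][timeout:25];
node["emergency"="shelter"](11.1,124.9,11.4,125.1);
out body;
"""

response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
for element in response.json().get("elements", []):
    name = element.get("tags", {}).get("name", "unnamed")
    print(name, element["lat"], element["lon"])
```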

Even something as simple as pre-storm emergency supply buying can be improved by the use of big data. Wal-Mart used predictive analytics to its advantage prior to Hurricanes Frances and Sandy, after noticing that consumers were stocking up on items like strawberry Pop-Tarts and beer alongside standard emergency supplies. While these items are not crucial during a hurricane, stocking more of the shelf-stable products people actually buy before a storm helps ensure that more people have some kind of supplies on hand afterward.
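A toy version of that kind of demand analysis might look like the following: compare sales in the days before a storm to a normal baseline and flag items to overstock. The numbers are invented for illustration, not Wal-Mart data.

```python
# Toy pre-storm demand analysis with made-up sales figures.
import pandas as pd

sales = pd.DataFrame({
    "item": ["pop_tarts", "beer", "bottled_water", "flashlights", "milk"],
    "baseline_weekly_units": [120, 300, 500, 40, 800],
    "pre_storm_weekly_units": [840, 660, 2500, 200, 900],
})

# Lift = how much demand rises ahead of the storm relative to baseline.
sales["lift"] = sales["pre_storm_weekly_units"] / sales["baseline_weekly_units"]
to_overstock = sales[sales["lift"] > 2].sort_values("lift", ascending=False)
print(to_overstock[["item", "lift"]])
```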

Finding Victims of Disasters

For those who have lost track of family members in the aftermath of a disaster, big data has been helpful in tracking the whereabouts and conditions of missing people. Google Person Finder, an open-source platform for posting and updating information about disaster victims, was made available after the massive 2010 earthquake in Haiti. It has been used in other disasters since, receiving 5,300 updates in just two days following Nepal's earthquake in 2015.
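To give a sense of what such an update might contain, here is a purely illustrative record of the kind an open missing-person registry could accept. The field names are hypothetical and are not Google Person Finder's actual schema; the real service builds on the PFIF interchange format.

```python
# Illustrative only: a simplified missing-person record with hypothetical fields.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PersonRecord:
    full_name: str
    status: str               # e.g. "information_sought" or "is_alive"
    last_known_location: str
    note: str
    reported_at: datetime

record = PersonRecord(
    full_name="Example Name",
    status="is_alive",
    last_known_location="Kathmandu",
    note="Contacted family by phone on April 27",
    reported_at=datetime.now(timezone.utc),
)
print(record)
```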

Assessing Hurricane Damage with Social Media Analysis

Twitter proved to be a surprisingly useful big data tool during and after Hurricane Sandy, which struck the East Coast of the United States in 2012. FEMA organized groups of public and private agencies to analyze tweets made during that time to help workers strategize. Geeks Without Bounds was one of the nonprofits that analyzed hashtags in those tweets and used them to plot locations where aid and resources were most needed. This was one of the first instances of social media guiding the distribution of aid, and it helped FEMA assess damage and make more informed decisions. During the storm, geotagged photos were also used to estimate how many people were affected.
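A stripped-down sketch of that workflow is shown below: tally hashtags and pull out geotagged requests for help. The tweet dictionaries are made-up samples; a real pipeline would read from an archived dataset or Twitter's API rather than a hard-coded list.

```python
# Sketch: count hashtags and extract geotagged help requests from sample tweets.
from collections import Counter

tweets = [
    {"text": "No power in Red Hook #sandy #needhelp", "geo": (40.675, -74.009)},
    {"text": "Shelter open at PS 58 #sandyaid", "geo": None},
    {"text": "Flooding on Ave C #sandy #needhelp", "geo": (40.726, -73.978)},
]

hashtags = Counter(
    word.lower()
    for tweet in tweets
    for word in tweet["text"].split()
    if word.startswith("#")
)
print(hashtags.most_common(3))

# Geotagged requests can then be plotted to show where aid needs cluster.
need_locations = [
    t["geo"] for t in tweets
    if t["geo"] and "#needhelp" in t["text"].lower()
]
print(need_locations)
```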

Still in the Early Stages

Big data and predictive analytics have proven helpful in a myriad of applications over the years, but their use in climate change research, disaster preparedness, and relief is just beginning. Agencies are learning how to harness the power of data to keep us safe during storms, which will hopefully pave the way for better outcomes after natural disasters in the future.
