Demystifying Hadoop: Not All Problems Are Hadoop-able
When you hear about Big Data, Hadoop hype follows almost automatically. But people often ask me what Hadoop actually does. Created at Yahoo, building on techniques Google published for indexing the web, Hadoop is not a data warehouse or storage solution: it’s a tool that’s useful when information can be broken up, analyzed in pieces and put back together.
For example, if a chain of convenience stores needs to find out how many customers paid with MasterCard, Visa, American Express, or cash at the pump over the past year, Hadoop is a good fit: the work can be divided up and handled per location without affecting the big picture.
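To make that concrete, here is a minimal sketch of the divide-and-conquer model behind Hadoop's MapReduce, in plain Python. The store data, payment labels, and helper functions are illustrative assumptions, not a real Hadoop job; the point is only that each location's counts can be computed independently and then merged.

```python
from collections import Counter

# "Map" step: each store's transaction log is processed on its own,
# with no knowledge of any other store.
def map_payments(transactions):
    """Count payment types within a single store's data."""
    return Counter(t["payment"] for t in transactions)

# "Reduce" step: the partial counts from every store are merged
# into one chain-wide total.
def reduce_counts(partial_counts):
    total = Counter()
    for counts in partial_counts:
        total.update(counts)
    return total

# Hypothetical per-store logs.
store_a = [{"payment": "Visa"}, {"payment": "cash"}, {"payment": "Visa"}]
store_b = [{"payment": "MasterCard"}, {"payment": "cash"}]

totals = reduce_counts([map_payments(store_a), map_payments(store_b)])
print(totals)  # Counter({'Visa': 2, 'cash': 2, 'MasterCard': 1})
```

Because each map step touches only one store's slice of the data, the question "how many paid with Visa?" partitions cleanly, which is exactly what makes it Hadoop-able.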
However, if you’re working with data that requires examining the relationships and dependencies within it, you can’t look at it in pieces and still get the “big picture” of what the data is trying to tell you. Back to the previous example: this approach would fail if the chain wants to know which food and beverages are purchased together in rural versus urban locations, and how weather affects those buying patterns.
The hype around Hadoop makes it seem like a one-size-fits-all solution for leveraging big data, but the reality is that not all problems are Hadoop-able, and more and more business users are learning that. Jaikumar Vijayan of Computerworld wrote, “Hadoop isn’t enough anymore for enterprises that need new and faster ways to extract business value from massive datasets.”
Time is a major factor, but what about requiring an IT army to run Hadoop? Steve Rosenbush wrote in The Wall Street Journal about how GameStop CIO Jeff Donaldson “picked a more traditional approach for analyzing large amounts of customer data, because he didn’t want to manage the complexity of having his engineers learn Hadoop, or have to call in consultants for help.”
While Hadoop is an effective and low-cost tool for some companies, it is not an application, and it does not get business users any closer to the most critical part of Big Data: getting to the insight. Hadoop chops and dices and stores, but it does not make a consumable dish! It leaves users still searching for the value in their data, among them:
- Law enforcement and intelligence agencies seeking insight from their data to mitigate threats to public safety.
- Healthcare institutions trying to predict disease outbreaks or customize treatments to diseases.
- Retailers wanting insight into demand trends and customer buying patterns to serve their markets more profitably.
- Supply chain professionals wanting insight into their data to understand cause-and-effect across the nodes of the chain.
All these business problems require the ability to extract insight from data. Rather than break the data into pieces and store-n-query, organizations need the ability to detect patterns and gain insights from their data. Hadoop destroys the naturally occurring patterns and connections because its functionality is based on breaking up data. The problem is that most organizations don’t know that their data can be represented as a graph, nor do they see the possibilities that come with leveraging the connections within it.
Take healthcare, for example. You have nodes for people, medicines, symptoms and side effects. To determine the type of person most likely to have the fewest side effects from a certain medication, one needs to leverage the patterns of connections within the data, as opposed to breaking those connections apart into disparate clusters. This is the kind of analysis that is not going to lend itself to being partitioned in a Hadoop-able way.
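A minimal sketch of what "data as a graph" means here, in plain Python. The people, medicines, and side effects are invented placeholders; the point is that the analysis follows edges (person to medicine to side effect) rather than splitting the records into independent chunks.

```python
# Illustrative graph, stored as an adjacency map. Edges link people to the
# medicines they took, and medicines to their reported side effects.
edges = {
    "alice":  ["drug_x"],
    "bob":    ["drug_x", "drug_y"],
    "carol":  ["drug_y"],
    "drug_x": ["nausea", "headache"],
    "drug_y": ["headache"],
}

def side_effects_for(person):
    """Walk person -> medicine -> side-effect paths through the graph."""
    effects = set()
    for medicine in edges.get(person, []):
        effects.update(edges.get(medicine, []))
    return effects

# Which medicine is connected to the fewest side effects?
medicines = ["drug_x", "drug_y"]
safest = min(medicines, key=lambda m: len(edges.get(m, [])))
print(safest)  # drug_y
print(sorted(side_effects_for("bob")))  # ['headache', 'nausea']
```

Note that answering either question requires the whole graph at once: partition the nodes across machines, as Hadoop would, and the person-to-medicine-to-side-effect paths are cut mid-edge.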
Law enforcement agencies have data composed of people, organizations, places and words. These connection points, if not Hadoop-ed, can reveal valuable links to networks of interest and to the key influencers within those networks. But the value lies in the data that’s sparse, so it needs to be assessed all at once rather than as distributed fragments.
Though Hadoop is getting a lot of attention, it may not always be the best approach to crunching Big Data for strategic insights. “[Hadoop] is simply too slow for companies that need sub-millisecond query response times,” wrote Vijayan. Whether it’s healthcare, manufacturing, retail or banking, companies whose data can be represented as a “big picture” need effective solutions beyond the methods they are likely using now, such as spreadsheets built on guesstimates and intuition.
Emcien CEO Radhika Subramanian is a seasoned entrepreneur with decades of experience helping organizations utilize the insight buried within their data. Numerous associations recognize her as an innovator in analytics, with a proven track record with global giants such as Porsche, John Deere, NCR, Dell and more. Tweet to @RadhikaAtEmcien & join her Big Data Apps LinkedIn group.