Analytics, Big Data, Modeling, Predictive Analytics, Software

Big Data Is Not Enough

Dean Abbott
7 Min Read

Big data is the big buzzword in the world of analytics today. According to Google Trends, shown in the figure, searches for “big data” have been growing exponentially since 2010, though the growth is perhaps beginning to level off. Or take a look on Amazon.com for books with Big Data in the title sometime: the publication dates, for the most part, are in 2012 or 2013.

But what’s the key to unlock the big data door? In his interview with Eric Siegel on April 12, Ned Smith of Business News Daily starts with this apt insight: “Predictive Analytics is the ‘Open Sesame’ for the world of Big Data.” Big data is what we have; predictive analytics (PA) is what we do with it.

Why is the data so big? Where does it come from? We who do PA usually think of doing predictive modeling on structured data pulled from a database, probably flattened into a single modeling table by a query so that the data is loadable into a software tool. We then clean the data, create features, and away we go with predictive modeling.
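
As a rough sketch of that workflow, assuming the query has already flattened the data into a single modeling table, the steps might look like the following in Python with pandas and scikit-learn (the file name, column names, and the "churned" target are all hypothetical, chosen only for illustration):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical flattened modeling table: one row per customer, produced
# upstream by a query that joins and denormalizes the source tables.
df = pd.read_csv("modeling_table.csv")

# Basic cleaning: fill missing numeric values with each column's median.
numeric_cols = df.select_dtypes("number").columns.drop("churned")
df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())

# One simple derived feature as an example of feature creation.
df["spend_per_order"] = df["total_spend"] / df["order_count"].clip(lower=1)

# And away we go with predictive modeling on the prepared table.
X = df[list(numeric_cols) + ["spend_per_order"]]
y = df["churned"]
model = GradientBoostingClassifier().fit(X, y)
```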

But according to a 2012 IBM study, “Analytics: The real-world use of big data”, 88% of big data comes from transactions, 73% from log data, and significant proportions come from audio and from still and moving images. These are not structured data. Log files are often unstructured, containing nothing more than notes, sometimes freehand, sometimes machine-created, and therefore cannot be used without first preprocessing the data with text mining techniques. Those of us who have built models augmented with log files or other text data know how much work is involved in transforming text into useful attributes that can then be used in predictive models.
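
A minimal sketch of that text mining step, assuming scikit-learn and entirely hypothetical log notes and field names, might look like this:

```python
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical log extract: one free-text note per record.
logs = pd.DataFrame({
    "record_id": [101, 102, 103],
    "note": [
        "customer called about late delivery",
        "payment declined, retried with new card",
        "requested account cancellation",
    ],
})

# Text mining step: convert free-text notes into numeric term-weight
# attributes that can be joined back to the modeling table by record_id.
vectorizer = TfidfVectorizer(max_features=50, stop_words="english")
tfidf = vectorizer.fit_transform(logs["note"])
text_features = pd.DataFrame(
    tfidf.toarray(),
    columns=[f"note_tfidf_{t}" for t in vectorizer.get_feature_names_out()],
    index=logs["record_id"],
)
```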

Even the most structured of the big data sources, transactional data, often amounts to nothing more than dates, IDs, and very simple information about the nature of the transaction (an amount, a time period, and perhaps a label describing the transaction).

Transactional data is rarely used directly; it is usually transformed into a form more useful for predictive modeling. For example, rather than building models where each row is a web page transaction, we transform the data so that each row is a person (the ID) and the fields are aggregations of that person’s history for as long as their cookie has persisted; the individual transactions have to be linked together and aggregated to be useful.
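
A minimal pandas sketch of that aggregation, with hypothetical field names, could look like this:

```python
import pandas as pd

# Hypothetical raw transactions: one row per event for each person.
transactions = pd.DataFrame({
    "person_id": [1, 1, 2, 2, 2],
    "timestamp": pd.to_datetime([
        "2013-01-05", "2013-02-11", "2013-01-20", "2013-03-02", "2013-03-15",
    ]),
    "amount": [25.0, 40.0, 10.0, 0.0, 55.0],
})

# Link and aggregate each person's history so that each row is a person
# (the ID) and the fields summarize that person's transactions.
per_person = transactions.groupby("person_id").agg(
    n_transactions=("amount", "count"),
    total_amount=("amount", "sum"),
    avg_amount=("amount", "mean"),
    first_seen=("timestamp", "min"),
    last_seen=("timestamp", "max"),
)
per_person["days_active"] = (per_person["last_seen"] - per_person["first_seen"]).dt.days
```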

The big data wave we are experiencing is therefore not directly helpful for improving predictive models; we first need to determine the level of analysis needed to build useful models, i.e., what a record in the modeling data represents. The unit of analysis is determined by the question the model is intended to answer or, put another way, the decision the model is intended to improve within the organization. This is settled by defining the business objectives of the models, normally by a program manager or another domain expert in the organization, not by the modeler.

The second step in building data for predictive modeling is creating the features to include as predictors for the models. How do we determine the features? I see three ways:

  1. The analyst can define the features based on his or her experience in the field, or do research to find what others in the field have done through Google searches and academic articles. This assumes the analyst is, to some degree, a domain expert.
  2. The key features can be determined by other domain experts, either handed down to the analyst or elicited through interviews of domain experts by the analyst. This is better than a Google search because the answers reflect the organization’s perspective on solving the problem.
  3. The analyst can rely on algorithm-based feature creation. In this approach, the analyst merely provides the raw input fields and allows the algorithms to find the appropriate transformations of individual fields (easy) or multivariate combinations (more complex); see the sketch after this list. Some algorithms, and some software implementations of algorithms, can do this quite effectively. I see this third approach advocated implicitly by data scientists in particular.
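
A minimal sketch of the third, algorithm-based approach, assuming scikit-learn and hypothetical raw fields and target, generates candidate combinations of the inputs automatically and then screens them with a simple selection step:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import PolynomialFeatures
from sklearn.feature_selection import SelectKBest, f_classif

# Hypothetical raw input fields and a binary target.
rng = np.random.default_rng(0)
raw = pd.DataFrame(rng.normal(size=(500, 4)), columns=["f1", "f2", "f3", "f4"])
target = (raw["f1"] * raw["f3"] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Automatically generate multivariate combinations of the raw fields...
poly = PolynomialFeatures(degree=2, include_bias=False)
candidates = pd.DataFrame(
    poly.fit_transform(raw), columns=poly.get_feature_names_out(raw.columns)
)

# ...then let a univariate screen keep the most promising candidates.
selector = SelectKBest(score_func=f_classif, k=5).fit(candidates, target)
kept = candidates.columns[selector.get_support()]
print("Algorithm-selected features:", list(kept))
```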

In reality, a combination of all three is usually used, and I recommend all three. But features based on domain expertise almost always provide larger gains in model performance than algorithm-based (automatic) feature creation.

This is the new three-legged stool of predictive modeling: big data provides the information, augmenting what we have used in the past; domain experts provide the structure for how to set up the data for modeling, including what a record means and the key attributes expected to help solve the problem; and predictive analytics provides the muscle to open the doors to what is hidden in the data. Those who take advantage of all three will be the winners in operationalizing analytics.

First posted at The Predictive Analytics Times

(Big Data growth / shutterstock)
