Analytics | Big Data | Culture/Leadership | Data Management | Decision Management

New Technology Is Not an Easy Button for Big Data

By Bill Franks
6 Min Read

It is good to remember in today’s hype-filled big data world that there is no “easy” button for big data. In fact, in many ways, big data is quite difficult to deal with. Many organizations seem to be falling for the fallacy that simply implementing new tools or platforms will “automagically” solve their big data problems. Unfortunately, this isn’t the case.

For example, there is a common belief that MapReduce platforms such as Teradata Aster or Hadoop can tame big data in and of themselves. In reality, they don’t inherently enable new functionality or analytic logic. Rather, they allow you to scale certain kinds of functionality and analytic logic in a way that makes that functionality and logic much more powerful and widely applicable.
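
To make the distinction concrete, here is a minimal sketch in plain Python (my own illustration, not code from the original article). The analytic logic inside a word-count MapReduce job is just an ordinary map function and an ordinary reduce function; what a platform such as Hadoop contributes is the machinery to run that same logic across a cluster.

    # Minimal, single-machine sketch: the "analytic logic" of MapReduce-style
    # word counting is just two ordinary functions. A platform like Hadoop
    # supplies distribution and fault tolerance, not the logic itself.
    from collections import defaultdict

    def map_phase(line):
        # Emit (word, 1) pairs -- the same logic a cluster-side mapper would hold.
        for word in line.lower().split():
            yield word, 1

    def reduce_phase(word, counts):
        # Sum the counts for one key -- nothing a single machine couldn't do.
        return word, sum(counts)

    def local_mapreduce(lines):
        grouped = defaultdict(list)
        for line in lines:
            for key, value in map_phase(line):
                grouped[key].append(value)
        return dict(reduce_phase(k, v) for k, v in grouped.items())

    print(local_mapreduce(["big data is not easy", "big data takes work"]))
    # {'big': 2, 'data': 2, 'is': 1, 'not': 1, 'easy': 1, 'takes': 1, 'work': 1}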

This is an important distinction – and one I want to explore in detail.

Many organizations seem to be thinking of MapReduce as a magic bullet or “easy” button for handling big data. Just set up a system, and your big data problems are solved, right? Wrong. Once the system is in place, it is still necessary to develop the analytic processes that run against it.  There really is no shortcut here. If you want great analytics, you’re going to have to build your processes just like you always have. Organizations that don’t understand this fact will be disappointed when they realize they aren’t instantly getting the value they expected from their investment.

As I said earlier, MapReduce doesn’t inherently enable new functionality. When you hear about MapReduce environments, you will quickly come to a discussion of leveraging languages such as Java or Python. It just so happens that these languages have been around for quite a while and had strong followings before the concept of MapReduce came into existence. Most users of these languages have never used, and may never use, a MapReduce architecture as part of their work. However, they code away day to day, developing processes just like their big-data-focused counterparts.

What many people don’t take the time to think about is that whatever logic you develop today in Java to run in a MapReduce environment is something you could have written in Java years ago. The exact same code, the exact same output for a given piece of data. This is why I said that MapReduce doesn’t directly cause any new analytic logic to come into existence. Rather, MapReduce provides a highly scalable platform so that logic can be executed at a scale far surpassing what was possible in the past.
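As a hedged illustration of that point (my own sketch, with a hypothetical extract_domain function, mapper.py script name, and emails.txt input file): the ordinary Python below could have been written long before MapReduce existed, and it produces the same output for a given record whether it is called from a local script or wrapped as a Hadoop Streaming mapper. Only the plumbing around it changes.

    import sys

    def extract_domain(email):
        # Ordinary parsing logic -- it could have been written years before
        # MapReduce existed, and its output for a given record never changes.
        return email.strip().lower().split("@")[-1]

    if __name__ == "__main__":
        # Hadoop Streaming contract: read records from stdin, write
        # key<TAB>value pairs to stdout. Running this locally, e.g.
        #   cat emails.txt | python mapper.py
        # yields exactly what the cluster would emit for the same records.
        for line in sys.stdin:
            if line.strip():
                print(f"{extract_domain(line)}\t1")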

This last point is the value that MapReduce brings. Having a terrific facial recognition or text parsing algorithm doesn’t do much good if there is no way to scale the process to a big data environment. MapReduce provides that ability.  It lets organizations apply algorithms to a much wider base of problems and a much larger amount of data. It allows logic that wasn’t practical to build into your analytic processes to become practical.

This is no different from how parallel database platforms provide value. A Massively Parallel Processing (MPP) database system runs SQL just like a non-MPP system does. An MPP system doesn’t enable new functionality in the absolute sense, but it does provide the ability to scale a SQL process. As a result, it enables far more value to be derived and a much wider set of problems to be practically addressed than a non-MPP architecture does.
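
The same point can be sketched in code (again my own illustration, using sqlite3 purely as a stand-in single-node engine and a hypothetical sales table): the SQL statement itself is unchanged whether it is submitted to a single-node database or, through that system’s own driver, to an MPP warehouse. The MPP system’s contribution is executing it in parallel across many nodes.

    import sqlite3

    # The query text is engine-agnostic; an MPP warehouse would accept the
    # identical statement and simply execute it across many nodes in parallel.
    QUERY = """
    SELECT region, SUM(revenue) AS total_revenue
    FROM sales
    GROUP BY region
    """

    conn = sqlite3.connect(":memory:")   # single-node stand-in for the demo
    conn.execute("CREATE TABLE sales (region TEXT, revenue REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("east", 100.0), ("west", 250.0), ("east", 50.0)])
    for row in conn.execute(QUERY):
        print(row)                       # e.g. ('east', 150.0), ('west', 250.0)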

In summary, we can expect MapReduce to continue to be a force behind the taming of big data. But the onus will still be on the organizations that use it to develop and implement the required analytic processes, just as they always have. Many analytics that were theoretically possible but impractical will now be within reach, and that will lead to a lot of value. The key is to understand what the architecture will do for you, and not to underestimate the effort required to use it correctly. It will take work to get the benefits. There is no “easy” button for big data.

To see a video version of this blog, visit my YouTube channel.

Originally published by the International Institute for Analytics

Bill Franks is Chief Analytics Officer for The International Institute For Analytics (IIA). Franks is also the author of Taming The Big Data Tidal Wave and The Analytics Revolution. His work has spanned clients in a variety of industries for companies ranging in size from Fortune 100 companies to small non-profit organizations. You can learn more at http://www.bill-franks.com.
