Analytics, Collaborative Data, Commentary, Culture/Leadership

It’s Time for a New Definition of Big Data

MIKE20
6 Min Read

Contents
  • Big data that is very small
  • Large datasets that aren’t big
  • Defining big data

Two words seemingly on every technologist’s lips are “big data”.  The Wikipedia definition for big data is: “In information technology, big data consists of datasets that grow so large that they become awkward to work with using on-hand database management tools”.  This approach to describing the term constrains the discussion of big data to scale and fails to realise the key difference between regular data and big data.  The blog posts and books which cover the topic seem to converge on the same approach to defining big data and describe the challenges with extracting value from this resource in terms of its size.

Big data can really be very small and not all large datasets are big!  It’s time to find a new definition for big data.

Big data that is very small

Modern machines such as cars, trains, power stations and planes all have increasing numbers of sensors constantly collecting masses of data.  It is common to talk of having thousands or even hundreds of thousands of sensors all collecting information about the performance and activities of a machine.

Imagine a plane on a regular one-hour flight with a hundred thousand sensors covering everything from the speed of air over every part of the airframe through to the amount of carbon dioxide in each section of the cabin.  Each sensor is effectively an independent device with its own physical characteristics.  The real interest is usually in combinations of sensor readings (such as carbon dioxide combined with cabin temperature, and the speed of air combined with air pressure).  With so many sensors, the combinations are incredibly complex and vary with the error tolerance and characteristics of individual devices.

The data streaming from a hundred thousand sensors on an aircraft is big data.  However, the size of the dataset is not as large as might be expected.  Even a hundred thousand sensors, each producing an eight-byte reading every second, would produce less than 3GB of data in an hour of flying (100,000 sensors x 60 minutes x 60 seconds x 8 bytes).  This amount of data would fit comfortably on a modest memory stick!
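
That figure is easy to verify; a quick back-of-the-envelope calculation in Python, using the numbers above, reproduces it:

```python
# Back-of-the-envelope check of the in-flight data volume described above.
sensors = 100_000        # sensors across the airframe and cabin
reading_bytes = 8        # one eight-byte reading per sensor per second
seconds = 60 * 60        # a one-hour flight

total_bytes = sensors * reading_bytes * seconds
print(f"{total_bytes / 1e9:.2f} GB")  # 2.88 GB -- under 3GB, as stated
```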

Large datasets that aren’t big

We are increasingly seeing systems that generate very large quantities of very simple data.  For instance, media streaming is generating very large volumes with increasing amounts of structured metadata.  Similarly, telecommunications companies have to track vast volumes of calls and internet connections.

Even if these two activities are combined, and petabytes of data are produced, the content is extremely structured.  As search engines such as Google and relational databases have shown, datasets can be parsed extremely quickly if the content is well structured.  Even though this data is large, it isn’t “big” in the same way as the data coming from the machine sensors in the earlier example.
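
A minimal sketch of why structure matters, using Python’s built-in sqlite3 module (the schema and column names here are invented for illustration, not taken from any real telecoms system):

```python
# Well-structured records are easy to index and query, whatever the row count.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE calls (caller TEXT, callee TEXT, started INTEGER, duration_s INTEGER)")
db.executemany(
    "INSERT INTO calls VALUES (?, ?, ?, ?)",
    [("alice", "bob", 1_700_000_000 + i, 60) for i in range(10_000)],
)
db.execute("CREATE INDEX idx_caller ON calls (caller)")  # uniform rows make indexing trivial

# Because every row has the same shape, the same query works unchanged as the table grows.
print(db.execute(
    "SELECT COUNT(*), SUM(duration_s) FROM calls WHERE caller = ?", ("alice",)
).fetchone())  # (10000, 600000)
```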

Defining big data

If size isn’t what matters, then what makes big data big?  The answer lies in the number of independent data sources, each with the potential to interact.  Big data doesn’t lend itself well to being tamed by standard data management techniques simply because of its inconsistent and unpredictable combinations.
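
The scale of those combinations is easy to underestimate.  A short calculation, reusing the hundred-thousand-sensor aircraft from earlier, makes the point:

```python
# Combinations of independent sources grow combinatorially with the number of sources.
from math import comb

sensors = 100_000
for k in (2, 3):
    print(f"{k}-sensor combinations: {comb(sensors, k):,}")
# 2-sensor combinations: 4,999,950,000
# 3-sensor combinations: 166,661,666,700,000
```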

Another attribute of big data is its tendency to be hard to delete, making privacy a common concern.  Imagine trying to purge all of the data associated with an individual car driver from toll road data.  The sensors counting the number of cars would no longer balance with the individual billing records which, in turn, wouldn’t match payments received by the company.
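
A toy sketch of that imbalance (the variable names and numbers are invented purely for illustration):

```python
# Aggregate sensor counts and per-driver billing records are interdependent,
# so purging one driver's records leaves the remaining data inconsistent.
cars_counted = 3                                         # roadside sensors counted 3 cars
billing = {"driver_a": 1, "driver_b": 1, "driver_c": 1}  # per-driver trip records

assert cars_counted == sum(billing.values())             # the books balance

del billing["driver_b"]                                  # purge one driver's data
assert cars_counted != sum(billing.values())             # counts no longer reconcile
print("cars counted:", cars_counted, "| billed trips:", sum(billing.values()))
```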

Perhaps a good definition of big data, then, describes “big” in terms of the number of permutations of sources that make querying difficult (like the sensors in an aircraft) and the complex interrelationships that make purging difficult (as in the toll road example).

“Big” then refers to big complexity rather than big volume.  Of course, valuable and complex datasets of this sort naturally tend to grow rapidly, and so big data quickly becomes truly massive.

Tagged: big data