SmartData Collective
Business Intelligence

The Future of Data and Analytics – Bigger and Faster

Brett Stupakevich
3 Min Read
[Image: InformationWeek full issue, August 9, 2010 (business intelligence)]

When you’re talking about bytes, mega is “so yesterday” and peta is quickly going the way of “kilo.” More is better in lots of ways, but in forecasting the impact on data analytics and business intelligence, you’ll find the units of measure just keep getting bigger. Instead of terabytes in your data warehouse, get ready for exabytes (1 billion gigabytes) or yottabytes (one quadrillion gigabytes). No wonder InformationWeek devoted a recent cover story to “Big and Fast” – the need to get insights from huge data sets with ever-increasing speed and accuracy.
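The unit ladder is easier to keep straight with a quick sketch. A minimal illustration, assuming the decimal (SI) definitions of the prefixes the post uses:

```python
# Byte units mentioned in the article, expressed in gigabytes (decimal SI units).
GB = 10**9  # bytes in one gigabyte

units_in_bytes = {
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte": 10**18,   # 1 billion gigabytes
    "zettabyte": 10**21,
    "yottabyte": 10**24,  # one quadrillion gigabytes
}

for name, size in units_in_bytes.items():
    print(f"1 {name} = {size // GB:,} gigabytes")
```

Each step up the ladder is a factor of 1,000, which is why the jump from today's terabyte warehouses to exabyte scale is larger than it sounds.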


To us, the key idea in “Big and Fast” is the concurrence of both adjectives: working in conjunction, size and speed are going to dramatically remake data analytics as tools improve and uses proliferate. Databases will only get larger, more complex, and more interconnected. Doug Henschen reports that nearly half of the organizations surveyed by the Data Warehousing Institute (46 percent) plan to change their data warehousing platform by 2012.

The online resource center offers advice from eight companies – from Cabela’s to Catalina Marketing to Barnes & Noble – that have hundreds of stores and millions of online purchase records. In-database modeling and other features that Spotfire users already know about permit even very large data sets to be used more efficiently. We can no longer afford to copy data out of source warehouses, transfer it to a new application, perform a task, and then return an answer. If demands for Web bandwidth are any clue, and Google can plow through searches and deliver a set of results in under a second, then the expectation of speed AND power is only going to increase the pressure.

Henschen charts the future, including the replacement of disk drives with solid-state drives (SSDs), a storage technology that can deliver read/write/retrieve speeds up to 150 times those of disks. Yet costs that can run 10 times the expense of disk technology mean it will take years for data centers to make the switch. Laptops with SSDs instead of disks are an early indicator, and some manufacturers already tout the light weight, high speed, reliability, and durability that are coming our way.

Ready to face the future? You’ll want to practice pronouncing zettabyte and yottabyte – both measures of data will eventually replace exabyte. One Cisco study estimates global monthly Internet traffic as of March 2010 at 21 exabytes, but zettabyte- and yottabyte-scale data volumes have yet to be approached. When we reach those measurements, at least you’ll know the language … and hopefully you’ll have data analytics software that can keep pace.
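A back-of-the-envelope calculation puts that gap in perspective: at the cited rate of roughly 21 exabytes of global Internet traffic per month, how long until cumulative traffic reaches a single zettabyte? (A rough sketch that assumes the rate stays flat, which it certainly will not.)

```python
# How long until cumulative global traffic reaches one zettabyte,
# assuming the article's cited ~21 exabytes/month held constant?
EXABYTE = 10**18
ZETTABYTE = 10**21  # 1,000 exabytes

monthly_traffic_bytes = 21 * EXABYTE
months_to_zettabyte = ZETTABYTE / monthly_traffic_bytes  # ~47.6 months

print(f"~{months_to_zettabyte:.1f} months (~{months_to_zettabyte / 12:.1f} years)")
```

Roughly four years at a flat 2010 rate – and since traffic grows rather than holding steady, the zettabyte era arrives sooner than that simple division suggests.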

David Wallace
Spotfire Blogging Team

Tagged: data analytics
