
The Future of Data and Analytics – Bigger and Faster

Brett Stupakevich
3 Min Read

When you’re talking about bytes, mega is “so yesterday” and peta is quickly going the way of “kilo.” More is better in lots of ways, but in forecasting the impact on data analytics and business intelligence, you’ll find the units of measure just keep getting bigger. Instead of terabytes in your data warehouse, get ready for exabytes (1 billion gigabytes) or yottabytes (one quadrillion gigabytes). No wonder InformationWeek devoted a recent cover story to “Big and Fast” – the need to get insights from huge data sets with ever-increasing speed and accuracy.
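
For a rough sense of the scale being described, here is a small Python sketch (decimal prefixes, where 1 gigabyte = 10^9 bytes) that spells out the jump from terabytes to yottabytes:

    # Decimal byte units, expressed in gigabytes for comparison.
    units = {
        "terabyte": 10**12,
        "petabyte": 10**15,
        "exabyte": 10**18,
        "zettabyte": 10**21,
        "yottabyte": 10**24,
    }

    GIGABYTE = 10**9
    for name, size_in_bytes in units.items():
        print(f"1 {name} = {size_in_bytes // GIGABYTE:,} gigabytes")

    # 1 exabyte   = 1,000,000,000 gigabytes (the "1 billion gigabytes" above)
    # 1 yottabyte = 1,000,000,000,000,000 gigabytes (one quadrillion gigabytes)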

To us, the key idea in “Big and Fast” is the concurrence of both adjectives – working in conjunction, they are going to dramatically remake data analytics as tools improve and uses proliferate. Databases will only get larger, more complex and more interconnected. Doug Henschen reports that nearly half of the organizations surveyed by The Data Warehousing Institute (46 percent) plan to change their data warehousing platform by 2012.

The online resource center offers advice from eight companies – from Cabela’s and Catalina Marketing to Barnes & Noble – that have hundreds of stores and millions of online purchase records. In-database modeling and other features that Spotfire users already know about permit even very large data sets to be used more efficiently. We can no longer afford to copy data out of source warehouses, transfer it to a separate application, perform a task and then return an answer. If demands for Web bandwidth are any clue, and Google can plow through searches and deliver results in under a second, then the expectation of speed AND power is only going to increase the pressure.
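
To make the in-database idea concrete, here is a minimal sketch using Python’s built-in sqlite3 module as a stand-in warehouse; the table, columns and values are hypothetical, and this is not Spotfire’s actual API. It contrasts copying every row into the application with pushing the aggregation into the database and moving only the much smaller answer:

    import sqlite3

    # A toy "warehouse" of purchase records (hypothetical schema).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE purchases (store_id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO purchases VALUES (?, ?)",
        [(1, 19.99), (1, 5.00), (2, 42.50), (2, 7.25), (2, 3.10)],
    )

    # Slow pattern: copy every row out of the warehouse, then aggregate in the app.
    rows = conn.execute("SELECT store_id, amount FROM purchases").fetchall()
    totals_in_app = {}
    for store_id, amount in rows:
        totals_in_app[store_id] = totals_in_app.get(store_id, 0.0) + amount

    # In-database pattern: push the computation to the database and move
    # only the aggregated result across the wire.
    totals_in_db = dict(
        conn.execute("SELECT store_id, SUM(amount) FROM purchases GROUP BY store_id")
    )

    print(totals_in_app)  # {1: 24.99, 2: 52.85}
    print(totals_in_db)   # {1: 24.99, 2: 52.85}

On a table with millions of purchase records, the second query moves a handful of rows instead of the whole table, which is the efficiency the in-database approach is after.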

Henschen charts the future, including the replacement of disk drives with solid-state drives (SSDs), a storage technology that can deliver read/write/retrieve speeds roughly 150x those of disks. Yet costs that can run 10x those of disk technology mean it will take years for data centers to make the switch. Laptops with SSDs instead of disks are an early indicator, and some manufacturers already tout the light weight, high speed, reliability and durability that are coming our way.
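
Taking the article’s ratios at face value (roughly 150x the speed at 10x the cost), a back-of-the-envelope comparison looks like this; the baseline disk figures are normalized placeholders, not real benchmarks:

    # Hypothetical normalized baseline; only the 150x and 10x ratios
    # come from the article.
    disk_speed = 1.0   # relative read/write throughput
    disk_cost = 1.0    # relative cost per gigabyte

    ssd_speed = 150 * disk_speed
    ssd_cost = 10 * disk_cost

    # Raw speed favors SSD by 150x, cost per gigabyte favors disk by 10x,
    # which is the tension behind the slow migration described above.
    print(f"SSD speed per unit cost: {ssd_speed / ssd_cost:.0f}x baseline")    # 15x
    print(f"Disk speed per unit cost: {disk_speed / disk_cost:.0f}x baseline") # 1x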

Ready to face the future? You’ll want to practice pronouncing zettabyte and yottabyte – both measures of data will eventually replace exabyte. One Cisco study estimated global monthly Internet traffic at about 21 exabytes as of March 2010, but zettabyte and yottabyte amounts of data have yet to be approached. When we reach those measurements, at least you’ll know the language … and hopefully you’ll have data analytics software that can keep pace.

David Wallace
Spotfire Blogging Team
