Big Data, Business Intelligence, Data Management, Data Warehousing, Hadoop

Five Steps to Successfully Manage Multiple Data Platforms

Kevin_Petrie
5 Min Read

We data folks live in exciting times.

As we saw at the Strata + Hadoop show in San Jose, open source developers continue to deliver new ways to analyze high volumes of fast-moving data.

One of the hottest, Apache Kafka, can feed data from thousands of applications to emerging platforms like HBase and Cassandra. Enterprises can use Kafka message brokers to tap real-time data streams from myriad sources to address a range of use cases. They can:


  • Engage prospects that visit their websites based on highly granular activity tracking
  • Correlate transaction histories with store sensors and smartphone apps to make location-based retail offers to customers
  • Manage supply chains and product shipments based on real-time location checks, operational metrics and traffic patterns
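The decoupling that makes these use cases possible — many producers appending to a shared stream, many consumers reading at their own pace — is the core of Kafka's broker model. Kafka itself needs a running cluster, so as a rough illustrative sketch only (not Kafka's actual API), the pattern looks like this in-memory stand-in:

```python
from collections import defaultdict

class MiniBroker:
    """Toy stand-in for a Kafka-style message broker: producers append to
    named topics, and each consumer reads from an offset it tracks itself."""

    def __init__(self):
        self.topics = defaultdict(list)  # topic name -> ordered log of messages

    def produce(self, topic, message):
        # Producers only ever append; the log is immutable history
        self.topics[topic].append(message)

    def consume(self, topic, offset=0):
        # Consumers pull from their own offset, as in Kafka's consumer model
        return self.topics[topic][offset:]

broker = MiniBroker()
broker.produce("clickstream", {"user": "u42", "page": "/pricing"})
broker.produce("clickstream", {"user": "u42", "page": "/signup"})

# A downstream platform (say, an HBase or Cassandra loader) reads the stream
events = broker.consume("clickstream")
```

The key design point, which the sketch preserves, is that producers never know who consumes: the website activity tracker, the retail-offer engine, and the supply-chain monitor can all read the same log independently.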

Cool stuff!

But innovations like Kafka also raise questions about the role of the enterprise data warehouse (EDW), the reliable system of record that will often be a source but not a target for streaming use cases like the bulleted examples above. The short answer? EDWs must co-exist with more and more complementary platforms.

“Accept that the world will get more distributed,” Gartner VP Ted Friedman advised attendees at the Gartner Enterprise Information & Master Data Management Summit in Dallas in March.

Enterprises of all types are re-examining long-held assumptions about the EDW. Recently a global financial services organization in EMEA speculated to us that their Data Lake might become their central data “hub,” with EDWs serving as “spokes” for various lines of business.
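That hub-and-spoke idea reduces to a simple routing rule: all records land in the lake (the hub), and each line-of-business EDW receives only its filtered feed (a spoke). As a minimal sketch, with record fields and business-line names that are purely illustrative:

```python
# Hypothetical hub-and-spoke routing: the data lake is the hub, and each
# line of business pulls a filtered "spoke" feed into its own EDW.
lake = [
    {"line_of_business": "retail_banking", "amount": 120.0},
    {"line_of_business": "wealth_mgmt", "amount": 5000.0},
    {"line_of_business": "retail_banking", "amount": 42.5},
]

def feed_spoke(hub_records, line_of_business):
    """Select the hub records destined for one line-of-business warehouse."""
    return [r for r in hub_records if r["line_of_business"] == line_of_business]

retail_spoke = feed_spoke(lake, "retail_banking")
```

In practice the filter would be a governed view or extract job rather than a list comprehension, but the division of labor is the same: the hub holds everything, the spokes hold curated subsets.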

While the merits of such a strategy will vary by customer, we find the most successful enterprises follow some consistent guiding principles.

  • Hadoop is a must-have architecture component. Hadoop’s ability to cost-effectively process fast-growing volumes of structured and unstructured data has proven a powerful complement to the data warehouse. In many cases EDWs remain the system of record and Hadoop serves as the analytics testbed for new data types and use cases. But perhaps the most compelling reason to invest in a Data Lake now is to tap future innovation opportunities. Hadoop is the focal point for Apache open source contributions such as Spark, Kafka and Storm. Join the community today to capitalize on future game changers.
  • The customer is king. Analyst George Gilbert of Wikibon envisions the rise of “systems of engagement” that enable enterprises to identify, predict and shape individual customer experiences. To do this, they are analyzing historic and real-time customer activities across multiple channels to act most effectively on their 360-degree view. As a case in point, the Canada-based digital bank Tangerine integrates social media, emails and customer records on SQL Server to improve both real-time customer service and longer-term product offerings.
  • Data analytics can and should become a profit center. Ted Friedman and Debra Logan of Gartner predicted at the Gartner conference in Dallas that by 2020, half of enterprises will “successfully” link financial objectives to data and analytics, and 10 percent will “have a highly profitable business unit specifically for productizing and commercializing their information assets.” So while platforms will proliferate, leading enterprises will assign teams to put the pieces together to capitalize on digital insights.
  • Go to war for talent. Analytics initiatives are only as good as the people that drive them. The leaders are doing what it takes to win the war for talent in Apache Hadoop, Kafka and Spark. Even in today’s tight job market, it is easier to find and hire experts with the right skills than to develop that expertise exclusively in-house.
  • Automate. Automate. Automate. With so many new tasks requiring deep expertise, it is critical to take the manual labor out of repetitive work such as ETL and the creation and management of data warehouses wherever possible.
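The automation principle above is less about any one tool than about declaring a repetitive job once and replaying it without hand-holding. As a minimal sketch (the source system, fields and steps here are invented for illustration), an ETL run reduced to a single automated call:

```python
# Illustrative sketch of automating a repetitive ETL job: each step is a
# plain function, and the whole run is one call with no manual work.
def extract():
    # Stand-in for pulling raw rows from a source system
    return [{"sku": "A1", "qty": "3"}, {"sku": "B2", "qty": "5"}]

def transform(rows):
    # Normalize types so every load into the warehouse is consistent
    return [{"sku": r["sku"], "qty": int(r["qty"])} for r in rows]

def load(rows, warehouse):
    warehouse.extend(rows)
    return warehouse

def run_pipeline(warehouse):
    """One automated end-to-end run: extract -> transform -> load."""
    return load(transform(extract()), warehouse)

warehouse = []
run_pipeline(warehouse)
```

Once the pipeline is expressed this way, scheduling it nightly or triggering it on new data arrival is a one-line cron or orchestrator entry rather than an expert's afternoon.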

While some of these principles seem revolutionary, enterprises can and should take an incremental approach, continuously re-shaping and experimenting. 
