Exterior Design, Part II: Frequency

Rob Armstrong
Last updated: 2011/04/03 at 12:34 PM

To continue the theme of designing for an ever-changing set of requirements, I wanted to talk about the second item we all know will happen.  Exactly how and when it will happen is open for debate, but it will happen unless you prevent it.  The second known is that users will want to have the data more frequently.  For now I am referring to the same data, as opposed to increasing its granularity (that comes a bit later).  The best way to illustrate this is with a true story from my travels.

I was in South America visiting a retail account.  I asked about the frequency of the data they had to analyze.  The business lead in the meeting responded that they get data every month.  The IT lead responded that they get data every day.  This is quite a disparity.  After a bit of back and forth, the IT lead sighed heavily and said, “we load daily data every month.”  I had to shake my head as I pointed out that if that was the case, they only had monthly data.  Once again we have to take the user’s perspective: getting May 5th data in June does not help you make decisions on May 7th.  So what was the problem in getting data that supported the end users’ needs?

Editor’s note: Rob Armstrong is an employee of Teradata. Teradata is a sponsor of The Smart Data Collective.

In this case it was a matter of the historical processes put in place by the IT organization.  In order to load the data, the tables needed to be offline.  Since the total monthly load took over 48 hours to complete (including the summary table and index maintenance), they could not run the job frequently.  Much of this was a legacy problem, but no one ever bothered either to optimize the processes using newer techniques or to calculate that running the job with a much smaller daily amount would not require the long downtime.  Again, it was simply a case of “this is how we do it, and it is too hard to change.”
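
The arithmetic no one bothered to do is easy to sketch.  Here is a minimal illustration (the row counts and throughput figure are assumptions for illustration, not numbers from the account in the story):

```python
# Illustrative sketch: the same monthly volume loaded as one big
# offline batch versus thirty small daily batches. All numbers are
# assumed for illustration only.

DAILY_ROWS = 1_000_000        # assumed rows generated per day
ROWS_PER_HOUR = 625_000       # assumed load + index/summary maintenance throughput

def monthly_window_hours(days_in_month: int = 30) -> float:
    """One big batch: a month of rows in a single offline window."""
    return (DAILY_ROWS * days_in_month) / ROWS_PER_HOUR   # 48 hours

def daily_window_hours() -> float:
    """The same volume split into daily batches."""
    return DAILY_ROWS / ROWS_PER_HOUR                     # 1.6 hours

print(f"monthly load: {monthly_window_hours():.1f} h offline")
print(f"daily load:   {daily_window_hours():.1f} h offline")
```

Even before any real optimization, the daily window is roughly a thirtieth of the monthly one, which is exactly the calculation that was never made.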

So we can reasonably expect that users will ask for their data to become fresher until it reaches the point that critical business data is required in “real time.”  I actually do not like the term, as it means something completely different to different parts of an organization.  I like to say that the frequency of data loading should be the half-life of a user’s ability to respond to it.  Therefore, if a user can get a piece of data and take a direct action on it within 2 minutes, then they should be getting the data updated every minute.  This is not a hard and fast rule, but it at least establishes a starting point for the conversations.
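
As a rough illustration of that starting point (the helper below is hypothetical, not anything from the original post), the heuristic is just a halving:

```python
def recommended_load_interval(response_minutes: float) -> float:
    """Half-life heuristic: load data at half the interval in which
    a user can act on it."""
    return response_minutes / 2

# A user who can act on data within 2 minutes should see it
# refreshed every minute.
print(recommended_load_interval(2.0))  # -> 1.0
```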

Keeping our “exterior” perspective, it must also be noted that loading is not the end point.  The data has to be readily available to the end user, so any ancillary maintenance that takes it from loaded detail to being accessible to queries must also be included.

This gets to another one of those interesting paradoxes.  Things like indexes, summary tables, and cubes are supposed to improve performance for the end users.  Unfortunately, as the data frequency trends toward “real time,” those very same “performance optimizations” become a barrier and will actually increase the latency from data available to data accessible.
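
A toy model makes the paradox concrete (all timings below are assumptions for illustration): if each batch pays a roughly fixed maintenance cost for indexes, summaries, and cubes, then as batches shrink, that fixed cost dominates the latency from loaded to accessible.

```python
# Toy model: fixed per-batch maintenance overhead (index rebuilds,
# summary refreshes, cube updates) versus shrinking batch sizes.
# All timings are assumptions for illustration.

MAINTENANCE_MIN = 10.0        # assumed fixed maintenance cost per batch
RAW_LOAD_MIN_PER_DAY = 90.0   # assumed total raw load work per day

for batches_per_day in (1, 24, 288):   # daily, hourly, five-minute loads
    load = RAW_LOAD_MIN_PER_DAY / batches_per_day
    total = load + MAINTENANCE_MIN
    print(f"{batches_per_day:>3} batches/day: {total:6.2f} min "
          f"until accessible ({MAINTENANCE_MIN / total:.0%} maintenance)")
```

In this toy model, at five-minute loads the latency to accessibility exceeds the load interval itself, so the maintenance meant to speed up queries is now the very thing keeping the data out of users’ hands.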

So, as in the last installment, I’ll ask a question.  If you are in IT, how ready are you for the user community to ask that the data become “more frequent”?  If you load weekly, are you ready for daily?  If you load daily, are you ready for hourly?

If you are in the business arena, what could you do to increase the value of your processes if you got the data “more frequently”?  Better inventory management, better response to customer problems, or lag reduced enough that you can bring analytics into the call center?  Finally, if you are in an executive or leadership role, what would these improved processes mean in terms of increased revenue or decreased costs?

OK, there is one more readily known requirement to design for, and that will be the next blog topic.

Finally, thank you to all those who provided feedback and comments on my original efforts.  I trust that over the next few postings I’ll find my voice and give you all a place to pick up some useful ideas and insights.
