Data Quality · Data Warehousing · Exclusive

Exterior Design, Part II: Frequency

Rob Armstrong
7 Min Read

To continue the theme of designing for an ever-changing set of requirements, I want to talk about the second item we all know will happen.  Exactly how and when it will happen is open for debate, but it will happen unless you prevent it.  The second known is that users will want the data more frequently.  For now I am referring to the same data, as opposed to increasing its granularity (that comes a bit later).  The best way to illustrate this is with a true story from my travels.

I was in South America visiting a retail account.  I asked about the frequency of the data they had to analyze.  The business lead in the meeting responded that they get data every month; the IT lead responded that they get data every day.  This is quite a disparity.  After a bit of back and forth, the IT lead sighed heavily and said, “we load daily data every month.”  I had to shake my head as I pointed out that if that was the case, they only had monthly data.  Once again we have to take the user perspective: getting May 5th data in June does not help you make decisions on May 7th.  So what was the problem in getting data that supported the end users’ needs?


Editor’s note: Rob Armstrong is an employee of Teradata. Teradata is a sponsor of The Smart Data Collective.

In this case it was a matter of the historical processes put in place by the IT organization.  In order to load the data, the tables needed to be offline.  Since the total monthly load took over 48 hours to complete (including summary table and index maintenance), they could not run the job more frequently.  Much of this was a legacy problem, but no one ever bothered to optimize the processes with newer techniques, or to calculate that running the job with a much smaller daily volume would not require the long downtime.  Again, it was simply a case of “this is how we do it, and it is too hard to change.”
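The arithmetic alone makes the point.  Assuming load time scales roughly linearly with data volume (an assumption for illustration, not something the account had measured), a back-of-the-envelope calculation shows how small the daily window would have been:

```python
# If loading ~30 days of data takes 48 hours of table downtime,
# one day's slice should take roughly 1/30th of that.
# Assumes load time scales linearly with volume -- an assumption
# for illustration, not a measurement from the account in the story.
monthly_load_hours = 48
days_per_month = 30

daily_load_hours = monthly_load_hours / days_per_month
print(f"approx. daily load window: {daily_load_hours * 60:.0f} minutes")
# prints: approx. daily load window: 96 minutes
```

A 96-minute nightly window is a very different conversation than a 48-hour monthly outage, even before any optimization of the load process itself.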

So we can reasonably expect that users will ask for their data to become fresher until critical business data is required in “real time.”  I actually do not like that term, as it means something completely different to different parts of an organization.  I like to say that the frequency of data loading should be half the time it takes a user to respond to the data.  Therefore, if a user can get a piece of data and take a direct action on it within 2 minutes, then the data should be updated every minute.  This is not a hard-and-fast rule, but it at least establishes a starting point for the conversations.
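That rule of thumb can be sketched in a couple of lines (the function name is mine, purely for illustration):

```python
def suggested_load_interval(response_minutes: float) -> float:
    """Rule-of-thumb sketch: refresh the data at half the time a user
    needs to act on it.  Hypothetical helper, not any vendor's API."""
    return response_minutes / 2.0

# A user who can act on a piece of data within 2 minutes
# should see it refreshed roughly every minute.
print(suggested_load_interval(2.0))  # prints: 1.0
```

The point is not the division; it is that the load interval is derived from the user's response time, not from what the batch schedule happens to allow.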

Keeping our “exterior” perspective, it must also be noted that loading is not the end point.  The data has to be readily available to the end user, so any ancillary maintenance that takes it from loaded detail to query-accessible must also be included.

This gets to another one of those interesting paradoxes.  Things like indexing, summary tables, cubes, and whatnot are supposed to improve performance for the end users.  Unfortunately, as the data frequency trends toward “real time,” those very same “performance optimizations” become a barrier and will actually increase the latency from data available to data accessible.
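You can see a miniature version of this effect even in SQLite (standing in here for a real warehouse, purely for illustration): maintaining an index while loading makes the inserts themselves slower, which is exactly the overhead that starts to hurt once loads become frequent.

```python
import random
import sqlite3
import time

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two identical tables; only one carries an index during the load.
cur.execute("CREATE TABLE plain (id INTEGER, val REAL)")
cur.execute("CREATE TABLE indexed_t (id INTEGER, val REAL)")
cur.execute("CREATE INDEX idx_val ON indexed_t (val)")

rows = [(i, random.random()) for i in range(200_000)]

t0 = time.perf_counter()
cur.executemany("INSERT INTO plain VALUES (?, ?)", rows)
t_plain = time.perf_counter() - t0

t0 = time.perf_counter()
cur.executemany("INSERT INTO indexed_t VALUES (?, ?)", rows)
t_indexed = time.perf_counter() - t0

print(f"load without index: {t_plain:.2f}s, with index: {t_indexed:.2f}s")
```

The index speeds up the queries that come later, but it taxes every load; at monthly frequency nobody notices, while at minute-level frequency that tax sits directly on the path from loaded to accessible.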

So, as in the last installment, I’ll ask a question.  If you are in IT, how ready are you for the community to ask that the data become “more frequent”?  If you load weekly, are you ready for daily?  If you load daily, are you ready for hourly?

If you are in the business arena, what could you do to increase the value of your processes if you got the data “more frequently”?  Better inventory management, better response to customer problems, a lag small enough to put analytics into the call center?  Finally, if you are in an executive or leadership role, what would these improved processes mean in terms of increased revenue or decreased costs?

OK, there is one more readily known requirement to design for, and that will be the next blog topic.

Finally, thank you to all those who provided feedback and comments on my original efforts.  I trust that over the next few postings I’ll find my voice and give you all a place to pick up some useful ideas and insights.
