Stream vs. Batch Processing: Which One Is the Better Business Operations GPS?



Many organizations across industries leverage “real-time” analytics to monitor and improve operational performance. In practice, this means they capture and collect data in batches from various systems and analyze it through periodic, on-demand queries. By contrast, companies that leverage “streaming analytics” continuously collect and analyze data and automatically course-correct as events unfold – when there’s still an opportunity to positively impact the outcome.
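To make the contrast concrete, here is a minimal sketch of the streaming side, written in Python and not tied to any particular streaming platform; the event source, delay threshold and corrective action are all hypothetical:

```python
import random
import time
from typing import Iterator

DELAY_THRESHOLD_MIN = 30  # assumed delivery SLA, in minutes


def order_events(n: int = 10) -> Iterator[dict]:
    """Hypothetical source: yields order-fulfillment events as they occur."""
    for order_id in range(n):
        yield {"order_id": order_id, "delay_minutes": random.randint(0, 60)}
        time.sleep(0.1)  # simulate events arriving over time


def course_correct(event: dict) -> None:
    """Placeholder corrective action: re-route the order, alert an agent, etc."""
    print(f"Re-routing order {event['order_id']} "
          f"(running {event['delay_minutes']} min late)")


# Streaming: each event is evaluated the instant it arrives,
# so corrective action happens while the outcome can still change.
for event in order_events():
    if event["delay_minutes"] > DELAY_THRESHOLD_MIN:
        course_correct(event)
```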

When I describe how continuous visibility and real-time course correction work with respect to Operational Intelligence, I think about how a GPS works. When you’re driving, your GPS knows precisely where you are at any given moment and where you’re heading. It also continuously monitors your progress and perpetually makes small adjustments to your estimated arrival time. As you approach a plotted turn, it alerts you in real time, tells you how many feet away you are, and instructs you on which direction to turn. And if you miss the turn, it automatically revises your course to get you back on track.

The value that streaming analytics and continuous monitoring deliver hinges on the ability to anticipate your needs, recognize when they change, and proactively adjust as events unfold. By contrast, if you’re not leveraging streaming analytics and continuous monitoring, it’s akin to going online and printing out directions to your destination. Sure, mapping services offer step-by-step directions, plot a course on an actual map, and provide the distance and estimated drive time. However, if your course changes or something goes awry, a static sheet of paper isn’t terribly useful.

Whereas the GPS in this example represents streaming analytics and continuous monitoring, the static sheet of paper you print out before beginning your journey is analogous to how batch processing systems work. With Hadoop-based batch processing platforms, you run periodic queries on batches of accumulated data, and from these on-demand queries you derive “near real-time” insight. However, because the data is monitored at intervals rather than continuously, you can’t detect changes and problems fast enough to address them before they affect the customer and the broader organization.
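For comparison, here is the batch counterpart sketched in the same hypothetical terms; the scheduled interval and the query are illustrative only and are not meant to describe any specific Hadoop tooling:

```python
DELAY_THRESHOLD_MIN = 30  # same assumed SLA as above

data_lot: list[dict] = []  # events accumulate here between scheduled runs


def collect(event: dict) -> None:
    """Events are stored as they arrive but not analyzed yet."""
    data_lot.append(event)


def run_batch_query() -> list[dict]:
    """Periodic on-demand query over the whole lot (e.g. once an hour)."""
    late_orders = [e for e in data_lot if e["delay_minutes"] > DELAY_THRESHOLD_MIN]
    data_lot.clear()
    return late_orders


# Batch: problems surface only when the scheduled query fires,
# long after the events that caused them have already played out.
collect({"order_id": 1, "delay_minutes": 55})
collect({"order_id": 2, "delay_minutes": 5})
# ... the rest of the interval passes before the next run ...
for event in run_batch_query():
    print(f"Order {event['order_id']} was late -- detected only after the fact")
```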

Essentially, once you’re off track there’s no way to course-correct, because your insight will never be current enough to keep up with reality. Without streaming analytics, there is inherent latency built into the data analysis process.

For operational processes that rely on immediate detection and corrective action – customer experience management, fraud detection and supply chain management, for instance – batch processing simply isn’t good enough. Every second of data latency sets off a costly ripple effect through the organization and the customer base.

Leveraging streaming data provides you with the insight you need precisely when you need it – when you can still make a difference. By continuously monitoring and making adjustments along the way, you improve the customer experience, prevent fraud and increase supply chain efficiency. This is the heart of the value that streaming analytics delivers with respect to Operational Intelligence.
