In-Memory Analytics Tools To Take Center Stage In 2012
To play off a quote from the 1986 Tom Cruise blockbuster “Top Gun”: “I feel the need . . . the need for speed.” And when it comes to in-memory analytics, that’s the name of the game. According to Cindi Howson (@biscorecard), in-memory technology remained prominent in 2011 and will continue to capture our full attention in 2012, with its ability to provide speed-of-thought analysis on ever-increasing amounts of data.
Of course, as Forrester Research’s Boris Evelson (@bevelson) posts in a blog on the topic, not all in-memory analytics tools are created equal. Although Evelson’s blog dates back to Sept. 2010, the words continue to hold true today. As he notes, there are distinct differences between in-memory indexes, in-memory OLAP, in-memory ROLAP, and other approaches.
As Evelson further notes, while there is a raft of commodity features worth comparing, such as data integration and portal integration, it’s also important to evaluate the distinguishing features that various tools tout. These include compression ratios and load speeds for in-memory data, as well as the ability to combine advanced analytics with traditional BI reporting and analysis.
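To see why compression ratio matters when comparing tools, consider a toy sketch in Python (not tied to any particular vendor’s product): the better a tool compresses its in-memory data, the more rows fit in a fixed amount of RAM. Here zlib stands in for a column-store’s compression engine, and the sample data is invented for illustration.

```python
import zlib

# Hypothetical sample: 10,000 repetitive CSV rows, the kind of low-cardinality
# column data that compresses extremely well in analytic stores.
raw = ("2012-01-01,WIDGET,1\n" * 10_000).encode("utf-8")

# Compress the raw bytes; the ratio tells us how many times more data
# a fixed memory budget could hold in compressed form.
packed = zlib.compress(raw)
ratio = len(raw) / len(packed)
print(f"raw={len(raw)} bytes, compressed={len(packed)} bytes, ratio={ratio:.0f}x")

# Lossless round trip: decompressing recovers the original data exactly.
assert zlib.decompress(packed) == raw
```

Real in-memory engines use dictionary and run-length encodings tuned for columnar data, but the trade-off is the same: a higher compression ratio means more analyzable data per gigabyte of memory.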
Indeed, such attributes are critical differentiators for data scientists looking to leverage the full strengths of in-memory analytics tools. For instance, the more data a tool can load into memory, the more data users can manipulate without touching slower storage. And because in-memory analytics tools work off the server’s memory, report generation doesn’t require network access or disk I/O.
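A minimal sketch of that idea, using SQLite’s `:memory:` mode as a stand-in for an in-memory analytics engine (the table and rows are invented for illustration): once the data is loaded, the aggregate query below is answered entirely from RAM, with no disk I/O.

```python
import sqlite3

# An in-memory database: all rows live in RAM for the life of the connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 120.0), ("east", 80.0), ("west", 200.0)],
)

# Report-style aggregation served directly from memory.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 200.0), ('west', 200.0)]
```

Production in-memory tools add columnar layouts, compression, and distributed memory on top of this, but the core speed advantage is the same: the working set never leaves RAM.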
In turn, this helps to enhance speed and time to market for different applications or other actions being taken by decision makers (e.g. identifying and then filling the demand for a product among a particular customer segment or geography).
In-memory analytics tools that are able to concentrate more data on a limited or fixed amount of hardware are ultimately able to provide more bang for the buck.
A cio.com article on the topic offers a hypothetical example: a retailer monitors a customer’s real-time behavior while in a store or on a website, draws on historical data from its loyalty system about that customer’s spending patterns, and comes up with relevant offers the customer may respond to at that moment.
As the cio.com article points out, the concept of running databases in memory is nothing new. But given plunging memory prices, combined with the added pressure of having to beat competitors to market in today’s real-time business environment, doesn’t it make sense for companies to take advantage of all the technological and business opportunities that are available to them?