The ABCs of In-Memory Processing

A:  What is it?

In-memory processing is a fairly simple yet very powerful innovation.  Here’s how it works:

Retrieving data from disk storage is the slowest part of data processing. And the more data you need to work with, the more that retrieval step slows down the analytics process. The usual way of addressing this time problem has been to pre-process data in some way (cubes, query sets, aggregate tables, etc.) so the computer can “go get” a smaller number of records. But those approaches typically require guessing in advance what data should be selected, and how it should be arranged for analysis. If or when the analyst needs more or different data, it’s back to the drawing board.

In-memory processing eliminates the “go get” step completely, because for analytical purposes all the relevant data is kept in super-speedy RAM at all times, and therefore never has to be fetched from disk storage. So the time factor changes dramatically. Plus, it’s possible to see the data more flexibly and at a deeper level of detail, rather than in pre-defined high-level views.
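To make the contrast concrete, here is a rough sketch using Python’s built-in sqlite3 module (not any particular BI product): it runs the same ad-hoc aggregations against a disk-backed database and against an identical copy held entirely in RAM. The table, column names, and row count are made-up illustrations, and real in-memory analytics engines use far more sophisticated columnar, compressed structures; exact timings will also vary with operating-system caching.

```python
# A rough sketch, not any vendor's implementation: the same ad-hoc
# aggregations run against a disk-backed SQLite database and against an
# identical copy held entirely in RAM. Table/column names and the row
# count are invented for illustration. Creates (and deletes) a small
# local file named "sales_demo.db".
import os
import random
import sqlite3
import time

ROWS = 200_000

def populate(conn):
    """Load a table of detail-level rows -- no cubes or aggregate tables."""
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        ((random.choice("NESW"), random.random() * 100) for _ in range(ROWS)),
    )
    conn.commit()

def run_queries(conn):
    """Ask several questions that were not planned in advance, and time them."""
    start = time.perf_counter()
    for region in "NESW":
        conn.execute(
            "SELECT COUNT(*), AVG(amount) FROM sales WHERE region = ?",
            (region,),
        ).fetchone()
    return time.perf_counter() - start

# Disk-backed: every query may have to "go get" pages from storage.
if os.path.exists("sales_demo.db"):
    os.remove("sales_demo.db")
disk = sqlite3.connect("sales_demo.db")
populate(disk)
print(f"disk-backed: {run_queries(disk):.4f}s")
disk.close()
os.remove("sales_demo.db")

# In-memory: the data is loaded into RAM once, then queried freely.
mem = sqlite3.connect(":memory:")
populate(mem)
print(f"in-memory:   {run_queries(mem):.4f}s")
mem.close()
```

On a typical laptop the in-memory copy usually answers faster, and the gap tends to widen as the data and the number of ad-hoc questions grow. The point of the sketch is simply that nothing had to be pre-aggregated or guessed in advance for either set of queries.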

B:  Why does it matter?

Basically, in-memory processing allows data analytics to be more like natural thought. Humans typically acquire information over time, then recall that information selectively to solve problems and make decisions. For example, if you’re familiar with seven restaurants in your neighborhood and it’s time for dinner, you might consider which of them you’re in the mood for. You might compare the different cuisines (Chinese? Indian? Seafood? Burgers?), the relative locations (walking distance? parking problems?), and the price level (pocket change? budget buster?).

You would probably just flip through these factors in your mind and make a decision on the fly. Or you might collaborate with a dining companion. (“What are you in the mood for?”) But you probably would not make a table of all the relevant data, define the possible relationships among the different data items, memorize the table, retrieve the table from memory, write it down for display, and then begin your decision-making process, using only the facts in that specific table.

So here’s the money question:  If you had to go through all those steps to choose a restaurant . . . how often would you go out to eat?

Seriously.

Not only would the process itself take up a ridiculous amount of time, but your final decisions would also be based on limited data. And remember: if you want to add more data to the mix (new restaurants, user reviews, etc.), you have to start all over again.

It’s just the same in business. Fast, flexible access to large amounts of data offers the potential for lots of excellent analytics. Slow access to small amounts of rigidly organized data not only discourages users from doing analysis but also may produce less-than-wonderful results. And that sets up a vicious cycle.

C:  What next?

In-memory processing has evolved swiftly from “great idea” to a robust, ready technology that’s changing the BI landscape. Find out more with an accessible overview from top BI analyst Cindi Howson, then review her InformationWeek white paper “Insight at the Speed of Thought: Taking Advantage of In-Memory Analytics” for additional detail. (Registration is painless and free!)

In many organizations, a key driver for use of in-memory tools will be the need for predictive analytics.  So next week’s ABCs post will look at how BI helps businesses see into the future.

Spotfire Blogging Team

Image Credit: Microsoft Office Clip Art
