Reconciling In-Memory and Server-Based Analytics
Tuesday, 13 July 2010

By Rony Ross, Panorama Software

Business intelligence has been the top priority for most CIOs for the past five years. Their customers, the business users, want ever more information at their fingertips to support their decision making. As the amount of information grows exponentially and the business landscape changes at a faster pace, business users are eager for aggregated, filtered, focused information to help them run the business.

What an ideal situation for the CIO: customers who keep asking for more. But in practice, the demand for more analytics places CIOs in a tough spot.

Traditionally, in order to provide fast responses to the ad hoc questions coming from various users, the CIO had to set up a data warehouse (or at least a data mart), define the parameters relevant to the business users, define the metrics they wish to measure, and create a “Cube”: a multidimensional database with enough pre-aggregations to answer complex questions at lightning speed. This process requires specialized skills, server capacity and formal procedures, so as more users queued up for analytics, CIOs found it hard to deliver so many solutions concurrently.
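
To make the trade-off concrete, here is a minimal Python sketch of cube-style pre-aggregation; the sales table, its columns and the preaggregate helper are hypothetical illustrations, not any vendor's actual model. The expensive work of rolling up every combination of dimensions happens once, up front, so each later question is answered by a single lookup.

```python
# A minimal sketch of cube-style pre-aggregation over a hypothetical
# sales table with "region", "product" and "revenue" columns.
from itertools import combinations

sales = [
    {"region": "East", "product": "A", "revenue": 100.0},
    {"region": "East", "product": "B", "revenue": 250.0},
    {"region": "West", "product": "A", "revenue": 175.0},
]

def preaggregate(rows, dimensions, measure):
    """Build every roll-up over the given dimensions ahead of time,
    so later queries are answered with a single dictionary lookup."""
    cube = {}
    # All subsets of dimensions: (), (region,), (product,), (region, product)
    for r in range(len(dimensions) + 1):
        for dims in combinations(dimensions, r):
            for row in rows:
                key = (dims, tuple(row[d] for d in dims))
                cube[key] = cube.get(key, 0.0) + row[measure]
    return cube

cube = preaggregate(sales, ("region", "product"), "revenue")

# At query time, a complex-looking question becomes one lookup:
print(cube[(("region",), ("East",))])   # 350.0 -- total revenue in the East
```

The cost is the build step itself: every added dimension multiplies the number of roll-ups to compute and maintain, which is exactly the professional effort described above.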

In response to the business users’ needs, a couple of innovative vendors came up with the idea of “in-memory” analytics. Let’s avoid the need to build Cubes and maintain them. Memory is cheap. Let’s get all the data into memory and calculate results on the fly. Let’s offer these products directly to the business user, circumventing the CIO and their requirements for security, maintainability and enterprise-ability. Let’s build a departmental application, quick and dirty, a “silo” of information that the business user sorely needs right now.
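
By contrast, the in-memory approach skips the build step entirely. A rough sketch, again with hypothetical data and names: all rows sit in RAM, and each question triggers a fresh scan and aggregation on the fly.

```python
# A minimal sketch of in-memory analytics: hold the raw rows in RAM and
# compute each answer at query time, with no cube build step.
sales = [
    {"region": "East", "product": "A", "revenue": 100.0},
    {"region": "East", "product": "B", "revenue": 250.0},
    {"region": "West", "product": "A", "revenue": 175.0},
]

def query(rows, group_by, measure):
    """Scan every row held in memory and aggregate per group on the fly."""
    totals = {}
    for row in rows:
        key = tuple(row[d] for d in group_by)
        totals[key] = totals.get(key, 0.0) + row[measure]
    return totals

# Each question is a fresh scan -- fast while the data fits in memory:
print(query(sales, ("region",), "revenue"))   # {('East',): 350.0, ('West',): 175.0}
print(query(sales, ("product",), "revenue"))  # {('A',): 275.0, ('B',): 250.0}
```

Nothing is precomputed, so there is nothing to maintain, which is why this is so quick to stand up for a single department; the flip side, discussed below, is what happens when data and users outgrow a single machine's memory.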

And it worked. Business users voted for these solutions and got satisfactory results. At least at first.

Why “at first”? Because as the departmental application took off and more users started to tap in, the organization soon realized that these applications are silos: their data structures are proprietary and closed to other applications, the analyses they support are limited, and they do not scale beyond a certain threshold.

So what do we do now? How do we reconcile the need for fast, on-the-fly implementation with the need for integration, enterprise-ability, openness, security and scalability? How do we bring these two worlds together on common data structures, models and protocols?

The answer should be a system that lets users combine an in-memory module with server-based modules, where both are built on the same well-accepted data models and data-structure standards. A system where both modules are open to interaction with each other and with other applications. A system where the same security model can be enforced on both modules, and where, when one module runs out of scale and performance, the other can pick up the application and provide the additional speed, scalability and enterprise-ability.
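
One way to picture such a system, as a sketch rather than any product's actual design: both modules expose the same query interface, and a simple dispatcher spills work over to the server-based module once the in-memory one passes a size threshold. The class names, the row-count threshold and the dispatch rule here are all illustrative assumptions.

```python
# A hypothetical hybrid: two modules behind one query interface, with
# dispatch based on whether the data still fits the in-memory module.
class InMemoryModule:
    """Answers queries from rows held in RAM, up to a size threshold."""
    def __init__(self, rows, row_limit=1_000_000):
        self.rows, self.row_limit = rows, row_limit

    def can_handle(self):
        return len(self.rows) <= self.row_limit

    def run(self, group_by, measure):
        totals = {}
        for row in self.rows:
            key = tuple(row[d] for d in group_by)
            totals[key] = totals.get(key, 0.0) + row[measure]
        return totals

class ServerModule:
    """Stand-in for the server-based engine; a real one would dispatch
    the same logical query to a cube, e.g. over a protocol such as XMLA."""
    def run(self, group_by, measure):
        raise NotImplementedError("dispatch to the server-side engine here")

def answer(in_memory, server, group_by, measure):
    # Both modules expose the same run() interface, so the caller never
    # changes; only the executing module does.
    engine = in_memory if in_memory.can_handle() else server
    return engine.run(group_by, measure)

sales = [{"region": "East", "revenue": 100.0},
         {"region": "East", "revenue": 250.0},
         {"region": "West", "revenue": 175.0}]

print(answer(InMemoryModule(sales), ServerModule(), ("region",), "revenue"))
# {('East',): 350.0, ('West',): 175.0} -- served from memory; past the
# row limit, the same call would fall through to the server module.
```

Because both modules share one interface, and, in the system the article calls for, one data model and one security model, the application never changes when the workload outgrows memory.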

Rony Ross is founder, chairman and CTO of Panorama Software, a business intelligence application provider.



