A Case for Big Data Analytics as a Service

Business Analytics has assumed such strategic importance that it has evolved from a purely IT activity into a core business activity, touching functions such as marketing and R&D and extending into corporate and business strategy. With this shift, the focus has moved away from platforms and technologies toward speed-to-insight, and the number of companies delivering Analytics as a Service via cloud-based platforms is growing by the day.

Shift of focus

The strategic goal of a Big Data strategy is to derive insights and make better data-driven decisions faster than the competition. This core competency rests on three facets: Infrastructure, Analytics, and Visualization (Consumption). A strategic evaluation of these three elements reveals that Infrastructure and Visualization capabilities can be achieved at a fraction of the cost through partnerships with external agencies or through outsourcing, whereas domain-specific Analytics is a capability that has to be cultivated over time to evolve into a strategic asset.

However, not all companies have the budget or resources to cultivate these capabilities in-house. Furthermore, narrowly company- or domain-specific analytics capabilities restrict an organization's creativity in deploying its asset base in innovative ways. A broader set of analytics capabilities that can be applied to specific scenarios is therefore warranted.

With the emergence of cloud-based analytics services, companies can now spend more time analyzing data and less on hardware and software administration. Before cloud services, a data warehousing or business intelligence project meant months of hardware and software acquisition, customization, and implementation. Hence, Big Data Infrastructure, Analytics, and Visualization delivered as a service are the elements most likely to catalyze Big Data adoption in the coming years.

The missing part of the puzzle

According to Adrian Gardner, Director of the IT & Communications Directorate at NASA Goddard Space Flight Center, commoditization and packaging are the missing parts of the current Big Data product portfolio. In Gardner's words: "Companies will have to morph for big data. The companies that flow with information will survive, but we're going to be overwhelmed."

Technology as an enabler

The concept of leveraging external talent for internal data analytics has existed for decades. Web servers today can handle large bandwidths and process terabytes of data in a fraction of the time, even without a cloud. With cloud technology, it is now possible to progressively develop analytics applications that process large datasets online in real time, as required, and deliver results through diverse channels on an ad hoc basis.

Furthermore, Internet users, especially Gen Y talent, have much faster broadband connectivity than earlier generations and are perennially on the lookout for challenging problems to solve. Through social networking and the Internet, this talent can connect and collaborate over networks such as Kaggle to tackle the most challenging of tasks.

In other words, technology has enabled talent distributed across the globe to work on problems specific to a business entity. It is beyond the reach, interest, and capability of any single organization to recruit and retain all of this talent in-house; many of these people may not even be located in the same country or work in the same industry or function. Hence, these technological developments and socio-demographic constraints present a strong argument for implementing Analytics as a Service, whereby globally distributed talent can be sourced on the fly, regardless of demographic and geographic constraints.

Arguments for Analytics as a Service

The forces in favor of Analytics as a Service are manifold, but the following issues faced by most organizations make a strong case:

  1. Many organizations lack the time, resources, or analytical expertise (Data Scientists) to solve Big Data challenges in-house.
  2. Companies are swamped with internal data and operate within established structures that make innovating on existing frameworks challenging.
  3. Internal Big Data projects can experience schedule slippages, cost overruns, and similar setbacks due to the lack of prior experience in Big Data delivery.
  4. Without prior experience in Big Data Analytics, the problem appears too difficult to solve internally, or the steps to a pragmatic solution are considered beyond the organization's capabilities or overly complicated.
  5. The financial costs associated with in-house analytics investments are prohibitive, despite the value such an approach could deliver.

Architecture

Running Analytics as a Service, particularly in a cloud environment, offers several benefits. The underlying infrastructure can capture data from a range of sources, such as SQL databases, other public clouds, Hadoop clusters, NoSQL databases, and mobile applications.
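
As a minimal sketch of this kind of multi-source ingestion, the following Python snippet normalizes records from a SQL table and a NoSQL-style export into a single stream for downstream analytics. The table, the export, and all field names are illustrative stand-ins for production databases or clusters, not any particular platform's API:

```python
import sqlite3
import json

# --- Hypothetical sources: an in-memory SQLite table and a newline-delimited
# --- JSON export stand in for production SQL and NoSQL systems.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("c1", 19.99), ("c2", 5.00)])

nosql_export = '{"customer_id": "c1", "event": "app_open"}\n' \
               '{"customer_id": "c3", "event": "purchase"}\n'

# Pull rows from the relational source into plain dicts.
conn.row_factory = sqlite3.Row
sql_records = [dict(r) for r in conn.execute("SELECT * FROM orders")]

# Parse the NoSQL export line by line.
nosql_records = [json.loads(line) for line in nosql_export.splitlines() if line]

# Normalize both feeds into one record stream, tagged by origin.
records = ([{"source": "sql", **r} for r in sql_records] +
           [{"source": "mobile", **r} for r in nosql_records])
print(f"Ingested {len(records)} records from 2 sources")
```

In a real deployment, the same normalization step would sit behind connectors pointing at the live databases, clusters, and application feeds listed above.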

Analytics services that expose a RESTful API enable Data Scientists to aggregate, mash up, analyze, and extract intelligence from data gathered from diverse information sources. Such an approach allows Data Scientists to evaluate important attributes such as context, sentiment, and geographical coordinates. Data analysis as a service can be performed either with popular proprietary statistical software packages or with algorithms developed in open-source programming languages such as Perl and Python.
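
To make the RESTful pattern concrete, here is a minimal client-side sketch in Python. The endpoint URL, payload shape, and response fields are all hypothetical assumptions for illustration; an actual service would document its own contract:

```python
import json
import urllib.request

# Hypothetical endpoint of a cloud analytics service; not a real API.
ANALYTICS_URL = "https://analytics.example.com/v1/sentiment"

def score_documents(docs):
    """POST a batch of documents and return the service's sentiment scores."""
    payload = json.dumps({"documents": docs}).encode("utf-8")
    req = urllib.request.Request(
        ANALYTICS_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        # Assumed response shape: {"scores": [{"id": ..., "sentiment": ...}]}
        return json.loads(resp.read())["scores"]

# Each document carries text plus context attributes (here, coordinates),
# mirroring the kinds of attributes discussed above.
docs = [
    {"id": 1, "text": "The new dashboard is fantastic.", "geo": [51.5, -0.1]},
    {"id": 2, "text": "Checkout keeps timing out.", "geo": [40.7, -74.0]},
]
for score in score_documents(docs):
    print(score["id"], score["sentiment"])
```

Because the client only depends on the HTTP contract, the provider is free to run the analysis with proprietary statistical packages or open-source algorithms behind the same interface.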

Conclusion

Today, the dominant design of Big Data and Analytics revolves around the Design-to-Build (D2B) paradigm, which places pieces of technology or module blueprints at the core of a solution. This approach is rather static: it depends on specific questions posed a priori, for which only a limited scope of analyses is performed.

What is required are solutions based on a Design-to-Use (D2U) paradigm that places information users at the core of solution design. With the need for predictive analytics and the growing demand for business insights hidden in data, innovative business cases could then be formulated.

Thus, stovepipe D2B implementations cannot cater to the scaling complexities posed by evolving consumer behaviors, performance metrics, correlations, and model parameters. Truly scalable analytics can only be delivered when Big Data applications work in conjunction with other applications and algorithms on an ad hoc basis.
