Big Data in Insurance – Is the Industry Ready to Take Advantage?

The need for information has created a disruptive term that has taken the world of data by storm.  “Big Data” carries the same weight that “service-oriented architecture” did a few years back.  Analytics has been the mainstay of business processing for decades; the term “Big Data”, however, has added several new dimensions to the world of data.  Big Data enables companies to create new products and services, enhance existing ones, and in many cases invent entirely new business models.  All this has become possible not because Big Data has the answers, but because people have started asking the right questions.  The ability to look for those answers will become a key basis for competitive growth and, in many cases, a fight for survival.  Strategic thought leaders therefore need to build Big Data capabilities aggressively.

While technologies such as DBMS, columnar databases, database appliances, in-memory analytics, text analytics, and content analytics have long been in existence, newer technologies such as Hadoop clusters, NoSQL, and MapReduce have gained popularity due to the low capital cost of initial setup.  However, the need for large-scale storage systems and better data processing capabilities, combined with a limited pool of skilled manpower, remains a major drawback when it comes to adopting these new technologies.
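To make the MapReduce idea concrete, here is a minimal sketch in plain Python of the map and reduce phases applied to claims records; the field names and sample records are illustrative assumptions, not data from any real carrier system, and a real Hadoop job would distribute these phases across a cluster.

```python
# Minimal sketch of the MapReduce pattern using plain Python.
# Records and field names are illustrative only.
from collections import defaultdict

claims = [
    {"line_of_business": "auto", "paid_loss": 1200.0},
    {"line_of_business": "home", "paid_loss": 8300.0},
    {"line_of_business": "auto", "paid_loss": 450.0},
]

# "Map" phase: emit (key, value) pairs from each record.
mapped = [(c["line_of_business"], c["paid_loss"]) for c in claims]

# "Shuffle/reduce" phase: group by key and aggregate.
totals = defaultdict(float)
for key, value in mapped:
    totals[key] += value

print(dict(totals))  # e.g. {'auto': 1650.0, 'home': 8300.0}
```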

So what does it all really mean to insurance carriers?

Insurance companies have traditionally operated in silos, building their systems and applications from the ground up.  As new processes, products, and technologies emerged, more silos were created due to the lack of integration among the existing ones.  The traditional methods employed by insurance companies for risk, actuarial, and product analyses, reserving, market penetration, customer churn, and the like are being pushed to the back seat, and more modern, accurate, penetrating, and conclusive methods are driving business priorities.  Providing consumers with a personalized and timely digital experience requires mass churning of data in the background, with all the necessary analytics.  These disruptive changes are slowly shifting the focus from actuaries to data scientists, who must find that “gold nugget” of information in the piles of dirt.  Purely reactive and even proactive approaches have become a thing of the past; it is now about predictive analytics that allow businesses to be ready for action should an event take place.  In many cases, an accurate prediction can change the course of the event itself.
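As a rough illustration of what such predictive analytics look like in practice, the sketch below trains a logistic regression model to score customer churn risk.  It assumes scikit-learn and NumPy are available and uses synthetic, standardized features (tenure, claim count, premium change) as stand-ins for a carrier's real policyholder attributes.

```python
# Illustrative churn-prediction sketch with synthetic data; not a production model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                                    # tenure, claims, premium change
y = (X[:, 1] - X[:, 0] + rng.normal(size=500) > 0).astype(int)   # 1 = churned (synthetic label)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print("hold-out accuracy:", round(model.score(X_test, y_test), 3))
print("churn probability for one policyholder:", model.predict_proba(X_test[:1])[0, 1])
```

The point is less the specific algorithm than the workflow: score every policyholder before renewal so retention actions can be taken ahead of the event, rather than after it.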

The emergence of real-time location data has created an entirely new set of location-based services, from navigation to pricing property and casualty insurance based on where, and how, people drive their vehicles.  Sensors in the vehicle can now also indicate how well it is being maintained, and maintenance history data from major vehicle repair shops can provide insights into the type of work done.  The scope can also be expanded from regular insurance data (ISO, ACORD, LexisNexis, Marshall-Swift, D&B, Acxiom, comparative raters) to newer input sources and data types such as telematics data, smart grid data for properties, social networks, blogs, Department of Insurance data, police reports, content streams, geo-spatial data, and weather patterns.  As there is more interaction with consumers and distribution channels (agents, brokers, clubs, etc.), more data is generated.  Each interaction, whether mobile or web based, creates a pattern that may be visible only as a needle in the haystack.  The ability to prepare, segment, analyze, model, prove, fine-tune, and then feed results back into the mainstream ecosystem of business applications is going to be a key differentiator in company performance.  The ability to predict an event or an action will also reduce reaction time, saving both time and money.
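The following sketch shows one hypothetical way telematics trip data could be turned into a relative driving-risk score.  The trip fields, weights, and normalization are purely illustrative assumptions for the example and do not represent an actuarial rating plan.

```python
# Hypothetical telematics scoring sketch; weights and fields are assumptions.
trips = [
    {"miles": 12.4, "hard_brakes": 2, "night": False, "max_speed_over_limit": 5},
    {"miles": 48.0, "hard_brakes": 7, "night": True,  "max_speed_over_limit": 18},
]

def trip_risk(trip):
    score = 0.0
    score += 0.5 * trip["hard_brakes"]            # harsh braking events
    score += 0.1 * trip["max_speed_over_limit"]   # speeding severity
    score += 2.0 if trip["night"] else 0.0        # night-driving surcharge
    return score / max(trip["miles"], 1.0)        # normalize per mile driven

driver_score = sum(trip_risk(t) for t in trips) / len(trips)
print(f"relative driving-risk score: {driver_score:.3f}")
```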

The business needs of all insurance companies can be summarized as follows:

  1. Identifying the right customer for the right product at the right price at the lowest risk
  2. Reducing claims costs and revenue leakage through more efficient processing and earlier fraud detection (a brief sketch follows below)
  3. Getting to the “new” market sooner – the new market is not just geographical but also demographic

These business needs must be met while providing an optimal customer experience and without compromising the financial objectives of the organization.  Catering to them will require tremendous amalgamation and analysis of the various sources of data described earlier.
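For business need 2, here is a minimal fraud-triage sketch that flags claims whose paid amount is far above the median for their coverage and that were filed soon after policy inception.  The thresholds, fields, and sample claims are illustrative assumptions; a real program would combine many more signals.

```python
# Minimal fraud-triage sketch; thresholds and records are illustrative assumptions.
from statistics import median

claims = [
    {"id": "C1", "coverage": "auto", "amount": 900,   "days_since_inception": 400},
    {"id": "C2", "coverage": "auto", "amount": 14500, "days_since_inception": 12},
    {"id": "C3", "coverage": "auto", "amount": 1100,  "days_since_inception": 730},
]

med = median(c["amount"] for c in claims)
flagged = [
    c["id"] for c in claims
    if c["amount"] > 5 * med and c["days_since_inception"] < 30
]
print("claims routed to special investigations:", flagged)  # ['C2']
```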

Easier said than done?

Processing Big Data is not devoid of hurdles either.  There are some major challenges that need to be addressed for insurance carriers to stay in the race.

Application Modernization: Insurance companies have traditionally operated in silos.  To take better advantage of the new data elements and attributes, legacy applications may have to be modernized or replaced.  The modernization is not limited to the legacy applications themselves, but extends to the way they are integrated with other applications.  Batch processes have made way for more real-time, event-driven approaches to integrating information, which in turn creates the need for real-time analysis.  (Amazon: “20% of people who saw this bought that…” sound familiar?)  Analytical tools have also been transformed over the last decade; reports have given way to dashboards and other forms of visualization.
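To illustrate the batch-to-event shift, the sketch below reacts to each policy event as it arrives rather than waiting for a nightly batch run.  The event shapes and handler logic are assumptions for the example; in practice the queue would be a message broker rather than an in-process structure.

```python
# Sketch of event-driven integration; event shape and handler are illustrative.
import queue

events = queue.Queue()
events.put({"type": "policy_bound", "policy_id": "P-1001"})
events.put({"type": "claim_opened", "policy_id": "P-1001", "claim_id": "C-77"})

def handle(event):
    if event["type"] == "claim_opened":
        # Trigger real-time analytics (e.g., fraud scoring) the moment a claim opens.
        print(f"scoring claim {event['claim_id']} in real time")
    else:
        print(f"refreshing dashboards for {event['policy_id']}")

while not events.empty():
    handle(events.get())
```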

Data Quality: The old adage comes to mind: GIGO (garbage in, garbage out).  Technologies do not make an iota of difference if data quality is not addressed up front.  Faster processing with newer technologies will only make us fail faster if the data is not current and correct.
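A simple data-quality gate illustrates the point: reject or quarantine records that fail basic checks before they reach downstream analytics.  The rules and record layout below are assumptions for the example, not a full data-quality framework.

```python
# Small data-quality gate; validation rules are illustrative assumptions.
from datetime import date

def validate(record):
    errors = []
    if not record.get("policy_id"):
        errors.append("missing policy_id")
    if record.get("paid_loss", 0) < 0:
        errors.append("negative paid_loss")
    if record.get("loss_date") and record["loss_date"] > date.today():
        errors.append("loss_date in the future")
    return errors

record = {"policy_id": "", "paid_loss": -250.0, "loss_date": date(2099, 1, 1)}
print(validate(record))  # ['missing policy_id', 'negative paid_loss', 'loss_date in the future']
```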

Storage:  As every interaction with a business, customer, or employee across multiple channels creates new forms of data, the volume of data generated for analytical purposes grows exponentially every day.  Such large volumes require massive storage, and there is a pressing need to sift through the data to understand which portions need to be stored and which need to be purged.  Data retention and governance play a very important role in managing the costs of retaining data.
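As a toy illustration of a retention rule, the sketch below partitions records into keep and purge sets based on a fixed retention window.  The seven-year window, the reference date, and the record layout are assumptions for the example only; real retention policies are driven by regulation and governance.

```python
# Illustrative retention sketch; window, reference date, and records are assumptions.
from datetime import date, timedelta

RETENTION = timedelta(days=7 * 365)
as_of = date(2015, 1, 1)  # fixed reference date so the example is deterministic

records = [
    {"id": "R1", "closed_on": date(2006, 3, 10)},
    {"id": "R2", "closed_on": date(2013, 6, 5)},
]

keep  = [r for r in records if as_of - r["closed_on"] <= RETENTION]
purge = [r for r in records if as_of - r["closed_on"] > RETENTION]
print("keep:", [r["id"] for r in keep], "purge:", [r["id"] for r in purge])
```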

Resources:  The market has been glutted with new products and technologies.  Choosing the right set of solutions is obviously important; even more important is the ability to find skilled resources to support the solution.  Technologies such as Hadoop, MapReduce, Cassandra, Mahout, Scala, and R are relatively new, or have at least gained importance only in the last few years.  Moreover, data scientists are much sought after for their ability to spot, visualize, and tell a compelling story about the data analyzed.  Skilled resources in these areas can be not only scarce but also expensive, making the overall Big Data initiative quite an expensive affair.

Solutions Design

Insurance carriers need comprehensive solutions capable of providing the answers that actuaries, marketing and product managers, underwriters, and fraud investigators have been asking of their respective functions.  We need easier ways to integrate transactional systems, data warehouses / data marts, social data, and telematics data so that the data can be analyzed in innovative ways.  The solution should offer business-facing ways to apply statistical techniques such as regression, pattern recognition, social data aggregation, text analytics, and sentiment analysis (via natural language processing) to provide a 360-degree view of an event, person, or system.  The solution should be data driven, so that data scientists can sift through large volumes of data to identify patterns, isolate the noise, create models, prove hypotheses, and finally integrate the model with the front-end solution.
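For the text-analytics piece, here is a deliberately simple lexicon-based sentiment sketch.  The positive and negative word lists and sample reviews are illustrative assumptions; a real implementation would use a proper NLP library and trained models rather than hand-picked word lists.

```python
# Toy lexicon-based sentiment sketch; word lists and reviews are illustrative.
POSITIVE = {"fast", "helpful", "fair", "easy"}
NEGATIVE = {"slow", "denied", "rude", "confusing"}

def sentiment(text):
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

reviews = [
    "Claims adjuster was helpful and the payout was fair",
    "Process was slow and my claim was denied",
]
for r in reviews:
    print(sentiment(r), "->", r)
```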

The good news is that the above design principles are not impossible to achieve. Saama’s sixthSENSE solution is a good start. It does not have all the answers yet, but it can give insurance carriers a jump start on leveraging Big Data. A brief overview of the functional capabilities of sixthSENSE is shown below.

The figure below represents the functional capabilities of the sixthSENSE solution and, at the bottom, the results that can be generated from a Big Data solution such as sixthSENSE.

[Figure: functional capabilities of the sixthSENSE solution]

We can discuss the impact of Big Data on each of the above solution areas in future posts. The key is to identify the area where Big Data makes the most sense for your business and start working on it before it is too late.

Further Reading

1)  Saama SixthSense – Download a Point of View Presentation
