Game Monetization: Recognize and Reward Your Most Valuable Players

January 15, 2012

(First installment in a series of player LTV and game monetization articles. The following is an excerpt from Chapter Four of ‘Game Loyalty’, an upcoming book by the founders of Core Analytics, LLC)

Not all players are created equal. This holds true across the entire gaming industry. Video gamers look and act differently from social gamers. Players drawn to collection-style casual games (e.g. Farmville) look different from those who gravitate toward skill-based games (e.g. Angry Birds). Hard-core FPS gamers have different demographics and motivations from regular casino patrons. The gamers around the blackjack table are not identical to the gamers playing the slot machines. There is not only great variance in the demographic and behavioral characteristics of gamers, but also wide variance in both player profitability and player Lifetime Value (LTV).

From a game developer’s perspective, understanding who is playing and how those players interact with the game and with each other is the foundation for data-driven decisions about patches, new characters, enhancements, new features, virtual good offerings, pricing, timing of updates, prioritization of bug fixes, localized version differences, platform choices, new game development strategy and so on. From a game marketer’s perspective, advertising and creative decisions can also be optimized at the player micro-segment level. This holds reasonably true across the entire game industry. Both video and social game developers benefit from understanding who their most valuable player segments are, how those segments generate revenue (subscription, virtual good consumption, in-game advertising response), and how they play the game. Developers can then optimize the game experience for valuable players, differentiate game versions by player segment type, optimize virtual good design and introduction, and serve up the most relevant advertising for each player type.

In terms of social game monetization and measurement, in-game progress is generally tied to a player’s personal profile. One difference with social games is how players interact with their real-world networks through social platforms, in contrast to traditional video game environments, which remain largely virtual. (Players do meet at game conferences and in real life, and a great many relationships and marriages grow out of initial virtual video game relationships; the comparative longevity of these relationships is a discussion for another chapter.) The basic LTV modeling approach for video games and social games is necessarily different: video games currently operate largely on a subscription model, while many social games are based on a freemium model where revenues stem from virtual good purchases and response to advertising. The video game industry is slowly moving toward a more social-game-like monetization model, and LTV calculation approaches will likely start to converge across the game space.
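The subscription-versus-freemium distinction above can be made concrete with a minimal sketch. All figures, function names and the simple churn-based lifetime assumption below are illustrative, not from the book:

```python
# Illustrative contrast between the two monetization models discussed above.
# Assumes a constant per-period churn rate, so expected lifetime in months
# is 1 / churn. All inputs are made-up example values.

def subscription_ltv(monthly_fee: float, monthly_churn: float) -> float:
    """Classic subscription LTV: fee times expected lifetime in months."""
    return monthly_fee / monthly_churn

def freemium_ltv(avg_purchase: float, purchases_per_month: float,
                 ad_revenue_per_month: float, monthly_churn: float) -> float:
    """Freemium LTV: virtual-good purchases plus ad-response revenue,
    accrued over the same expected lifetime."""
    monthly_value = avg_purchase * purchases_per_month + ad_revenue_per_month
    return monthly_value / monthly_churn

print(subscription_ltv(14.99, 0.05))        # $14.99/mo at 5% churn -> 299.8
print(freemium_ltv(2.50, 0.8, 0.30, 0.10))  # mixed revenue at 10% churn -> 23.0
```

Even this toy version shows why the models diverge: subscription LTV hinges almost entirely on retention, while freemium LTV is equally sensitive to purchase frequency and ad response.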

How, then, to best measure player LTV? Core Analytics (TM) and Game Loyalty have incorporated player LTV metrics into the BrandMeter (TM) Game Module. These metrics were developed and selected through a rigorous literature review of LTV calculations, with the end result incorporating the best of these approaches for game type, genre and market. Below is an abbreviated review of academic journal articles on LTV measurement. The section that follows discusses the advantages of each approach, cherry-picks the best characteristics of traditional LTV calculation approaches, and translates the outcome to the nuances specific to the video and social game industry.

Customer LTV has been defined as the ‘total net income expected from customers’. In two related 2002 and 2003 papers in Data Mining and Knowledge Discovery and SIGKDD, Rosset and his group of researchers applied LTV modeling to a telecommunications scenario. They treat LTV as a function of value over time, tenure (length of service) and a discounting factor; each is modeled separately and the three models are then combined into an overall LTV measure. The group focuses primarily on modeling tenure (the expected length of the customer’s relationship with the brand) and experiments with survival and hazard functions, both estimated from current and historical data. The discounting factor is typically calculated through either an exponential-decay or threshold-function approach, and value is constructed with other traditional methods.
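The three-part decomposition described above can be sketched as a single sum over future periods. The specific choices below (flat per-period value, geometric survival derived from a constant churn rate, exponential-decay discounting) are simplifying assumptions for illustration, not the paper's exact models:

```python
# Minimal sketch of an LTV decomposition in the Rosset style:
# LTV = sum over periods t of value(t) * survival(t) * discount(t).

import math

def ltv(value_per_period: float, churn_rate: float,
        discount_rate: float, horizon: int) -> float:
    total = 0.0
    for t in range(1, horizon + 1):
        survival = (1.0 - churn_rate) ** t       # tenure (LOS) component
        discount = math.exp(-discount_rate * t)  # exponential-decay discounting
        total += value_per_period * survival * discount
    return total

# $10 of value per month, 5% monthly churn, small discount rate, 3-year horizon.
print(round(ltv(10.0, 0.05, 0.01, 36), 2))
```

Swapping in a fitted survival curve for the geometric term, or a threshold function for the discount term, recovers the other variants the authors discuss.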

Rosset explains that the spectrum of modeling approaches to tenure (LOS) is wide. A simple method that observes an average churn rate of 5% and sets S(t) = 0.95^t is functionally as appropriate as a more complex model that feeds individual characteristics, contract differences and other individual and life-event factors into a Proportional Hazards model with hundreds of predictor variables. The danger of the latter is either a lack of data to support the method, or missing local, current effects by building on older data; the approach used in the paper uses only one month of data. Since pure parametric approaches tend to produce non-smooth functions that spike around contract renewal dates, semi-parametric approaches, such as the Cox Proportional Hazards model, are more flexible. Another option is a neural net where each tenure output has its own node and tenure is indicated at a monthly or categorical level; in certain data sets this neural net approach has outperformed the proportional hazards models. It is also possible to use survival analysis or a standard supervised learning model where tenure is a predictor and churn is the response. The authors note that the Kaplan-Meier approach, which handles censored data, would also be appropriate, but there may not be enough churn events in recent data to reflect current trends; for this reason, non-parametric estimates of hazard rates are calculated for tenure modeling. Value, in turn, can be modeled with forecasting, trend analysis and time series, and current average value can also be used at a segment level.
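A toy version of the non-parametric hazard estimation just described: estimate a per-period hazard rate from observed churn counts, then build the survival curve as a running product of (1 - hazard). The at-risk and churned counts below are made-up illustrative data:

```python
# Non-parametric tenure model: empirical hazard per period, then a
# Kaplan-Meier-style product-of-survival estimate of S(t).

def survival_curve(at_risk, churned):
    """at_risk[t]: players active at the start of period t;
    churned[t]: how many of those left during period t."""
    s, curve = 1.0, []
    for n, d in zip(at_risk, churned):
        hazard = d / n if n else 0.0  # empirical hazard rate for period t
        s *= (1.0 - hazard)           # survival is the product of (1 - hazard)
        curve.append(s)
    return curve

curve = survival_curve([1000, 900, 810], [100, 90, 81])
print([round(s, 3) for s in curve])  # 10% churn each period -> [0.9, 0.81, 0.729]
```

Note how a flat 10% empirical hazard reproduces exactly the simple S(t) = 0.9^t case; the non-parametric estimate earns its keep when the hazard varies by period, e.g. spiking around renewal dates.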

Another article, from the Journal of Managerial Issues in 2004, by Pfeifer and his co-authors, approached the customer LTV problem by differentiating between customer lifetime value and customer profitability. They assume that customer LTV is important because it is a primary determinant of customer acquisition spending strategy. Pfeifer and team argue that the two terms are not interchangeable, and that confusion exists in the industry around customer value and customer profitability measures. Most agree that value is a discounted measure of future dollar amounts associated with the customer. Where a more standard definition of customer lifetime value revolves around a time-value-of-money measure using discounting and traditional present value technique, Pfeifer suggests a finance approach, focusing on customer ‘valuation’ by determining net present value.
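The distinction drawn above can be sketched numerically: profitability is a backward-looking sum of realized margins, while the finance-style valuation discounts expected future cash flows to a net present value. The cash-flow figures and discount rate below are illustrative assumptions, not figures from the article:

```python
# Customer profitability vs. customer valuation (NPV), per the distinction
# discussed above. All inputs are hypothetical example values.

def customer_profitability(past_margins):
    """Backward-looking: realized profit to date, no discounting."""
    return sum(past_margins)

def customer_npv(future_cash_flows, discount_rate):
    """Forward-looking: net present value of expected future cash flows."""
    return sum(cf / (1.0 + discount_rate) ** t
               for t, cf in enumerate(future_cash_flows, start=1))

print(customer_profitability([12.0, 15.0, 9.0]))         # 36.0 realized
print(round(customer_npv([10.0, 10.0, 10.0], 0.10), 2))  # 24.87 at 10% discount
```

The gap between the two numbers is exactly the confusion Pfeifer flags: a customer can look profitable historically while the discounted value of their expected future cash flows tells a different story.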