AOL Advertising on the Need for Speed in Yield Optimization

AOL Advertising is one of the most innovative firms in the online advertising industry. If you haven’t noticed, it’s likely because they consider many of their innovations to be trade secrets, and they have therefore been reluctant to speak publicly about them in the past. New CEO Tim Armstrong is changing that with a mandate that encourages AOL employees to promote their innovations. In a recent example of this, AOL Advertising’s Michael Kamprath and Tracy La gave a presentation at ad:tech NY on their use of Price/Volume Analytics for Advanced Bid Optimization and Inventory Management, a yield optimization technique used to improve media planning. Price/Volume curves plot the expected impression volume available for a range of bids in a targeted market and help answer questions like:

  • How should I bid when entering a market?
  • How should I adjust my bids when my objectives change?
  • What bid will generate the most profit?
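
To see how such a curve gets used, here is a toy sketch in Python; the curve, the bid grid, and the value-per-impression figure are all invented for illustration and are not AOL’s numbers:

```python
# A price/volume curve maps a CPM bid to the impression volume expected
# to be winnable at that bid in a targeted market. All figures are made up.
pv_curve = {0.50: 200_000, 1.00: 550_000, 1.50: 800_000, 2.00: 900_000}
value_per_impression = 0.0016  # assumed value of one impression ($1.60 eCPM)

def expected_profit(bid, volume):
    # Value of the impressions won, minus the media cost of winning them.
    return volume * value_per_impression - volume * (bid / 1000)

best_bid = max(pv_curve, key=lambda bid: expected_profit(bid, pv_curve[bid]))
print(f"Most profitable bid: ${best_bid:.2f} CPM "
      f"(expected profit ${expected_profit(best_bid, pv_curve[best_bid]):,.2f})")
```

The other two questions fall out of the same curve: an entry bid comes from reading the curve at a desired starting volume, and a change in objectives simply means re-running the maximization with a different profit function or constraint.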

Here are the slides for your reference:

As Michael mentioned during his presentation, price/volume curves are simple to understand and use. In this context, however, they are quite difficult to construct, due to:

  • Large data volumes
  • Analytical complexity
  • Time pressures (the “need for speed”)

On the topic of large data volumes, Michael mentioned that their ad servers generate between 5TB and 10TB of data per day, and that their use of price/volume curves requires analyzing between one and four weeks of this data. That works out to somewhere between 35TB and 280TB of data per analysis. To put that into perspective, Michael noted that simply reading 35TB of data off a typical home computer’s hard drive takes 26 hours.
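
The arithmetic is easy to verify. In the sketch below, the decimal terabyte and the sustained read rate are my assumptions, not stated specs (the quoted 26-hour figure implies roughly 375 MB/s sustained):

```python
TB = 10**12  # assuming decimal terabytes

low_tb = 5 * 7      # 5TB/day over 1 week   -> 35TB
high_tb = 10 * 28   # 10TB/day over 4 weeks -> 280TB
print(f"Data analyzed per run: {low_tb}-{high_tb} TB")

read_mb_per_s = 375  # assumed sustained sequential read rate
hours = (low_tb * TB) / (read_mb_per_s * 10**6) / 3600
print(f"Scanning {low_tb} TB at {read_mb_per_s} MB/s: ~{hours:.0f} hours")
```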

On analytical complexity, Michael mentioned that it stems in part from the need to support complex market targeting and frequency capping. He described an innovative algorithm AOL Advertising developed for constructing price/volume curves that essentially instantiates their ad server decision model (the algorithm that determines which ad to serve in response to each impression request) in a Netezza system (note that Netezza is my employer). The algorithm simultaneously resolves many thousands of complex targets across multiple campaigns in a very short period of time, generating multiple price/volume curves from a single scan of the impression data. This is important because, as with many classes of terabyte-scale data analysis, data transfer speeds have not kept pace with growing data volumes and have become a significant performance bottleneck, so minimizing disk reads and data movement has a big impact on overall performance.
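
AOL’s actual decision model and its Netezza implementation are not public, so the Python sketch below illustrates only the single-scan pattern itself; every campaign, targeting rule, bid grid, and impression field in it is hypothetical:

```python
from collections import defaultdict

# Hypothetical campaigns: each has a targeting predicate and a grid of
# candidate CPM bids at which we want a volume estimate.
campaigns = {
    "sports_18_34": {
        "target": lambda imp: imp["vertical"] == "sports" and imp["age"] < 35,
        "bid_grid": [0.50, 1.00, 1.50, 2.00],
    },
    "finance_retarget": {
        "target": lambda imp: imp["segment"] == "finance_visitor",
        "bid_grid": [1.00, 2.00, 3.00],
    },
}

def build_curves(impressions):
    """Build every campaign's price/volume curve in one pass: each
    impression record is read once and applied to all matching campaigns."""
    volume = defaultdict(int)  # (campaign, bid) -> winnable impression count
    for imp in impressions:                       # the single scan
        for name, c in campaigns.items():
            if c["target"](imp):                  # resolve this campaign's target
                for bid in c["bid_grid"]:
                    if bid >= imp["clearing_price"]:  # we would win at this bid
                        volume[(name, bid)] += 1
    return volume

log = [  # two stand-in records for the terabytes of impression data
    {"vertical": "sports", "age": 24, "segment": "", "clearing_price": 0.80},
    {"vertical": "news", "age": 41, "segment": "finance_visitor",
     "clearing_price": 1.60},
]
for (name, bid), count in sorted(build_curves(log).items()):
    print(f"{name}: bid ${bid:.2f} -> {count} impression(s)")
```

In a data warehouse, the same pattern would presumably be expressed set-wise, with one pass over the impression table feeding every campaign’s aggregation at once, rather than as nested loops.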

On the “need for speed,” Michael described AOL Advertising’s experience evaluating multiple technology platforms in pursuit of the best possible performance, which matters because they construct thousands of price/volume curves each day. The time required to generate a single price/volume curve varied widely by approach:

  • Legacy technology: more than 1 day
  • 180-node Hadoop cluster: less than 1 day
  • Netezza: 1 to 5 minutes

The important point here is that the first two approaches were simply not good enough: at thousands of curves per day, a process that takes anywhere near a day per curve cannot keep up. This is yet another case where increased analytic performance translates directly into increased business value for a digital media solution. In other words, the firm that analyzes the most data the fastest wins.

Photo credit: YtseJam Photography
