- How should I bid when entering a market?
- How should I adjust my bids when my objectives change?
- What bid will generate the most profit?
Here are the slides for your reference:
As Michael mentioned during his presentation, price/volume curves are not complicated to understand and use. However, in this context, they are quite complicated to construct, due to three factors: large data volumes, analytical complexity, and the need for speed.
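For readers who haven't worked with them, a price/volume curve simply maps each candidate bid price to the volume of impressions that bid would have won over some historical window. Here is a minimal sketch of the basic construction (my own illustration, not AOL's implementation; the second-price-style auction model and field names are assumptions):

```python
from bisect import bisect_right

def price_volume_curve(clearing_prices, candidate_bids):
    """Return {bid: number of impressions the bid would have won}."""
    prices = sorted(clearing_prices)
    # Assumption: a bid "wins" any impression whose clearing price is at or below it.
    return {bid: bisect_right(prices, bid) for bid in candidate_bids}

# Example: clearing prices (dollars CPM) observed for five impressions.
curve = price_volume_curve([0.80, 1.20, 1.25, 2.00, 3.10],
                           candidate_bids=[1.00, 2.00, 3.00])
print(curve)  # {1.0: 1, 2.0: 4, 3.0: 4}
```

The curve itself is easy; the hard part, as the rest of the talk makes clear, is doing this over terabytes of impressions, thousands of times a day, under complex targeting rules.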
On the topic of large data volumes, Michael mentioned that their ad servers generate between 5TB and 10TB of data per day, and that their use of price/volume curves requires analysis of between one and four weeks of this data. That translates into somewhere between 35TB and 280TB of data to analyze. To put that into perspective, Michael noted that simply reading 35TB of data off a typical hard drive on a home computer takes 26 hours.
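The arithmetic behind those figures (my own back-of-the-envelope check, not from the slides):

```python
TB = 10**12  # decimal terabytes

low  = 5 * TB * 7     # 5 TB/day over 1 week   -> 35 TB
high = 10 * TB * 28   # 10 TB/day over 4 weeks -> 280 TB
print(low / TB, high / TB)  # 35.0 280.0

# Sequential read rate implied by the 26-hour figure for 35 TB:
print(round(low / (26 * 3600) / 10**6), "MB/s")  # ~374 MB/s
```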
On analytical complexity, Michael explained that this stems in part from the need to support complex market targeting and frequency capping, and he described an innovative algorithm AOL Advertising developed for constructing price/volume curves that essentially instantiates their ad server decision model (the algorithm that determines which ad to serve in response to each impression request) in a Netezza system (note that Netezza is my employer). The algorithm resolves many thousands of complex targets across multiple campaigns in a very short period of time, generating multiple price/volume curves with a single scan of the impression data. This matters because, as with many classes of terabyte-scale data analysis, data transfer speeds have not kept pace with growing data volumes and have become a significant performance bottleneck, so minimizing disk reads and data movement has a big impact on overall performance.
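The slides don't spell out the algorithm, but the single-scan idea can be sketched roughly as follows (the targeting predicates, field names, and curve construction below are my assumptions, not AOL Advertising's implementation, and frequency capping is omitted): each impression is read once, matched against every campaign's targeting rules, and used to update that campaign's price distribution, from which all of the curves are then derived.

```python
from bisect import bisect_right
from collections import defaultdict

# Hypothetical campaign targets: campaign id -> targeting predicate.
targets = {
    "camp_a": lambda imp: imp["geo"] == "US",
    "camp_b": lambda imp: imp["geo"] == "US" and imp["site"] == "sports",
}

def build_curves(impressions, targets, candidate_bids):
    """One pass over the impression log, one price list per campaign."""
    prices = defaultdict(list)
    for imp in impressions:                    # single scan of the data
        for camp, matches in targets.items():  # resolve every target per impression
            if matches(imp):
                prices[camp].append(imp["clearing_price"])
    curves = {}
    for camp, plist in prices.items():
        plist.sort()
        curves[camp] = {bid: bisect_right(plist, bid) for bid in candidate_bids}
    return curves

impressions = [
    {"geo": "US", "site": "sports", "clearing_price": 1.10},
    {"geo": "US", "site": "news",   "clearing_price": 0.90},
    {"geo": "UK", "site": "sports", "clearing_price": 1.50},
]
print(build_curves(impressions, targets, candidate_bids=[1.00, 2.00]))
# {'camp_a': {1.0: 1, 2.0: 2}, 'camp_b': {1.0: 0, 2.0: 1}}
```

In a system like Netezza the same pattern would presumably be expressed as set-based SQL running next to the data rather than a Python loop; the point of the sketch is only that one pass over the impressions is enough to feed every curve.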
On the “need for speed,” Michael described AOL Advertising’s experience evaluating multiple technology platforms in pursuit of the best possible performance, which matters because they construct thousands of price/volume curves each day. The time required to generate a single price/volume curve with each approach was as follows:
- Legacy technology: more than 1 day
- 180-node Hadoop cluster: less than 1 day
- Netezza: 1 to 5 minutes
The important point here is that the first two approaches were simply not good enough, primarily because this is yet another case where increased analytic performance translates directly into increased business value for a digital media solution. In other words, the firm that analyzes the most data the fastest wins.
Photo credit: YtseJam Photography