PDN runs targeted campaigns that reach shoppers when they are most receptive to your message, in hundreds of shopping-related partner apps, both while they are planning shopping trips and while they are actually in-store.
We engage shoppers via Add2List ad units that get your brand onto their shopping lists, prompting them when they go shopping, both that week and for months to come. Dozens of brands, large and small, use us to drive brand recognition and sales.
When running campaigns, PDN targets and optimizes for engagement across a range of factors. We know this optimization process is critical to getting good performance, and that it takes thought, time, and effort from our Ad Ops team even using our platform. In this study, we wanted to get a concrete idea of just how impactful it is.
Optimization of many of these factors is often constrained by campaign goals. For example, geography is not something we are usually free to optimize; rather, it is something we have to target based on the campaign requirements. Sometimes we have control over the creative and messaging, and sometimes we don't. Time of Day, Application, and Frequency, however, are almost always optimizations we control.
Time of Day. First, we looked at Time of Day. We see 69.3% better performance during the peak campaign time of day versus the worst TOD. This varies somewhat depending on the geography of the campaign, but there are always best times, and they appear to be driven by key shopping times that vary location by location.
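To make the peak-versus-worst comparison concrete, here is a minimal sketch of how such a lift is computed. The hourly rates below are invented for illustration, not PDN data:

```python
# Hypothetical hourly engagement rates (engagements per 1,000 impressions)
# for one campaign; the numbers are illustrative, not real PDN data.
hourly_rates = {8: 4.1, 10: 5.2, 12: 6.6, 15: 5.0, 18: 7.1, 21: 4.2}

best = max(hourly_rates.values())   # peak time of day
worst = min(hourly_rates.values())  # worst time of day

# Lift of the peak TOD over the worst TOD, as a percentage.
lift_pct = (best - worst) / worst * 100
print(f"Peak-vs-worst TOD lift: {lift_pct:.1f}%")
```

With these invented rates the peak hour (18:00) outperforms the worst hour (8:00) by roughly 73%, which is the same kind of gap our study measured.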
Application. Optimizing which apps we deliver into had an even bigger impact. We see 100X better performance between the best and worst apps for a given brand, and that ranking changes on a daily basis: some apps perform well, then poorly, then perform well again. We see this impact irrespective of the frequency cap we are using. It is not yet clear what causes it, and we are studying it further, so look out for more from us on this in the future. The bottom line, for now, is that our platform monitors for this and we make constant optimizations to get you the best performance possible across all apps.
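A minimal sketch of the kind of daily monitoring and reweighting this implies. The app names, rates, and floor value are hypothetical, not PDN's actual platform logic:

```python
# Yesterday's per-app engagement rates (engagements / impressions).
# App names and numbers are hypothetical.
app_rates = {"recipes_app": 0.012, "coupons_app": 0.0007, "lists_app": 0.019}

# Reweight today's delivery toward the better performers, keeping a small
# floor so every app still receives probe traffic -- performance can
# rebound from one day to the next, so no app is written off entirely.
FLOOR = 0.05  # minimum share of delivery per app (assumed value)
total = sum(app_rates.values())
weights = {
    app: FLOOR + (1 - FLOOR * len(app_rates)) * rate / total
    for app, rate in app_rates.items()
}
print(weights)  # shares sum to 1.0, skewed toward the strongest apps
```

Rerunning this daily is what lets the allocation follow apps as they swing between good and bad performance.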
Frequency. In our study, within broad ranges, frequency had an irregular impact on engagement performance. Our working theory was that there should be a sweet spot: it takes a certain number of exposures to get an engagement, after which additional exposures just dilute the engagement rate until some reorder interval has passed, typically a week or a month depending on the product. We could not gather evidence to support this notion. As a result, we advise being cautious with underexposure, because that is where we actually saw the biggest hit to engagement rates.
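One way to test a sweet-spot theory like this is to bucket users by exposure count and compare engagement rates across buckets. A hypothetical sketch, with invented sample records rather than study data:

```python
from collections import defaultdict

# (user_id, exposures_that_week, engaged) -- invented sample records.
records = [
    ("u1", 1, 0), ("u2", 1, 0), ("u3", 3, 1), ("u4", 3, 0),
    ("u5", 5, 1), ("u6", 5, 1), ("u7", 8, 1), ("u8", 8, 0),
]

# Engagement rate per exposure bucket. A true sweet spot would show a
# clear peak followed by dilution; underexposure shows up as depressed
# rates in the lowest buckets.
buckets = defaultdict(lambda: [0, 0])  # exposures -> [engagements, users]
for _, exposures, engaged in records:
    buckets[exposures][0] += engaged
    buckets[exposures][1] += 1

rates = {k: e / n for k, (e, n) in sorted(buckets.items())}
print(rates)
```

In our actual data, a plot of these per-bucket rates showed no consistent peak, but the low-exposure buckets reliably underperformed, which is what drives our caution about underexposure.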
Our platform and team detect and optimize around these factors on an ongoing basis, and it is this, more than anything, that lets us increase performance deep into the lifecycle of a campaign: over 200% in the example below.