How User Acquisition Advertising Has Changed


Google UAC

Are you ready to have an algorithm do your job? Ready or not, it’s happened. Late in 2017, Google moved all new app install campaigns over to Google UAC (Universal App Campaigns). About a month later, they turned off any Search, Display and YouTube app promo campaigns that had been running. If you wanted to run mobile app install campaigns on Google, you were going to run them via Google UAC. And that meant letting Google’s algorithm manage many of the functions in your campaigns.

Facebook didn’t take long to follow suit. Their move came in February 2018, when they rolled out what we call their “best practices” update. Facebook’s changes weren’t as forced as Google’s, but they were just as consequential. The shift means, essentially, that we’ve handed control of many of our primary advertising tasks over to the algorithms. That frees us up to focus on things like creative and audience expansion, but it also means we’ve given up a lot of control.

User Acquisition Manager vs. Algorithms

Fortunately, giving the algorithms this much control – and having algorithms sophisticated enough to do this work in the first place – actually has a lot of upsides. Now that so much of the work of a user acquisition manager has been moved over to the algorithms:

  • It’s possible for less-experienced advertisers to get results comparable to those of their more advanced peers. This means more advertisers can profitably use the platforms.
  • Many (if not most) third-party advertising tools have become less necessary. Ad tech tools had been a significant competitive advantage before, but now both Facebook and Google UAC offer comparable capabilities for free.
  • Most of the companies we work with used to spend 10% of their budgets on Google. Now they spend 30-50% of their budgets on Google. A little competition is always a good thing.

How to Manage Campaigns Under Google UAC and Facebook’s Best Practices

Prior to February 2018, Facebook allowed advertisers to run a lot of ads. The ads could have overlapping audiences, and there were no penalties for making frequent bid changes, even if they were made every few hours. Ads could be paused and budgets could be modified all the time.

Our internal AdRules tool allowed us and the companies we work with to edit bids, budgets and pause rules with the speed and precision of a high-frequency trading desk. We would get early indications of how an ad or ad set was performing, and then modify settings to either maximize that ad or ad set’s exposure or to kill the ad/ad set if it was underperforming. We can still do all that and more with AdRules, but how we use that tool has shifted.
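To make that concrete, here is a minimal Python sketch of the kind of early-signal rule we’re describing. Every name and threshold in it is a hypothetical illustration, not AdRules’ actual interface.

```python
# A hypothetical early-performance rule of the kind an ad tech tool can run on
# a schedule. Names and thresholds are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class AdSetStats:
    spend: float        # spend so far today, in dollars
    installs: int       # attributed installs so far today
    target_cpi: float   # what we're willing to pay per install


def early_action(stats: AdSetStats, min_spend: float = 50.0) -> str:
    """Suggest an action based on early cost-per-install signals."""
    if stats.spend < min_spend:
        return "wait"                   # not enough data yet
    cpi = stats.spend / max(stats.installs, 1)
    if cpi > 1.5 * stats.target_cpi:
        return "pause"                  # kill clear underperformers early
    if cpi < 0.7 * stats.target_cpi:
        return "raise_bid"              # maximize exposure for winners
    return "hold"


print(early_action(AdSetStats(spend=120.0, installs=10, target_cpi=5.0)))  # -> pause
```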

In short, prior to February 2018, optimization was done through thousands of small actions like these, controlled by the advertiser or a third-party ad tech tool.

Then everything changed. All of those micromanagement moves started to incur penalties, and it became clear that Facebook would reward advertisers for running their campaigns according to the best practices outlined in Facebook’s “Blueprint Certification.” Those best practices include:

  • Running fewer ads. Prior to February, we had sometimes been running thousands of ads for each account. After February, we saw that some of our clients who had been running far fewer ads (around 50) were doing well.
  • Using fewer campaigns with minimal audience overlap. Before, we had used many campaigns (thousands of them) with many different settings, and we did detailed audience targeting and segmentation. After the best practices shift, it worked better to rely on the Facebook algorithm to sift through audiences and settings to find the right prospective customers. Broad targeting with no overlapping audiences, combined with Facebook’s Value Optimization (VO) and App Event Optimization (AEO), outperformed our old approach. So now we let Facebook predict the quality of audiences and segments, and run fewer than 100 campaigns at a time.
  • Managing for the concept of “significant edits.” After February, if an advertiser paused a campaign, changed a campaign’s budget by more than 40%, or changed an ad’s bid by more than 30%, the campaign would be moved from optimized mode back into what’s called the “learning” phase or mode (a minimal sketch of this check appears after this list).
  • All ads start out in the “learning” mode, then graduate to the optimized phase. Completing the learning phase typically requires about 50 conversions per ad set per week. Once that’s done, the campaign shifts into optimization mode, and CPMs tend to drop by 30%. Clearly, that’s far more efficient, so there are real incentives to avoid significant edits and keep campaigns in optimized mode.
  • Google’s implementation of this principle is even more pronounced than Facebook’s. Google UACs are all but “set it and forget it” at this point. It takes some money and patience for Google’s algorithm to get good results, but it usually happens within a couple of days.
  • Shifting from Mobile App Installs (MAI) to Value Optimization (VO) and App Event Optimization (AEO). Before, we used MAI as the campaign goal. Now we use VO and AEO as campaign goals instead.
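The “significant edits” rule, in particular, lends itself to an explicit check. Here is a minimal Python sketch of the kind of guard an internal tool could run before applying a change; the 40% and 30% thresholds are the ones cited above, while the function itself is an illustrative assumption, not part of Facebook’s API.

```python
# Hypothetical guard for the "significant edits" rule described above.
# Thresholds follow the article; nothing here is an official Facebook API.

def is_significant_edit(old_budget: float, new_budget: float,
                        old_bid: float, new_bid: float,
                        pausing: bool = False) -> bool:
    """Return True if a proposed change would likely reset the learning phase."""
    if pausing:
        return True
    budget_change = abs(new_budget - old_budget) / max(old_budget, 0.01)
    bid_change = abs(new_bid - old_bid) / max(old_bid, 0.01)
    return budget_change > 0.40 or bid_change > 0.30


# A 50% budget increase trips the rule; a 10% bid tweak does not.
print(is_significant_edit(1000, 1500, 2.00, 2.00))  # True
print(is_significant_edit(1000, 1000, 2.00, 2.20))  # False
```

Any change that trips a check like this is worth batching, shrinking, or skipping, because it sends the campaign back into the learning phase. As for the shift to value-based goals, Facebook explained the thinking when it introduced Value Optimization: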

“When marketers advertise with Facebook, they want to build campaigns that ultimately drive efficient return on their ad spend. Yet managing and optimizing their campaigns previously required using proxy metrics such as clicks, impressions, and installs to gauge whether or not a campaign had driven meaningful business impact. Today we’re introducing value optimization so that marketers can focus their campaigns on anticipated purchase value.

Value optimization works by using the purchase values sent from the Facebook pixel to estimate how much a person may spend with your business over a seven-day period. The ad’s bid is then automatically adjusted based on this estimation, allowing campaigns to deliver ads to people likely to spend more with your business at a low cost.”

– Facebook

This makes sense; it’s revenue we want – not necessarily just app installs. So if the algorithm can crunch the data well enough to show us not just who’s most likely to install an app, but who’s most likely to purchase from it, that’s a big help.
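As a rough illustration of the idea – not Facebook’s actual model, which isn’t public – here is a sketch of value-adjusted bidding: scale a base bid by a user’s predicted seven-day purchase value relative to the average.

```python
# Simplified sketch of the idea behind Value Optimization: bid more for people
# predicted to spend more. The prediction inputs and the scaling rule are
# assumptions for illustration; Facebook's real system is internal to Facebook.

def value_adjusted_bid(base_bid: float,
                       predicted_7d_value: float,
                       average_7d_value: float,
                       cap: float = 3.0) -> float:
    """Scale the bid by how a user's predicted 7-day spend compares to average."""
    multiplier = min(predicted_7d_value / max(average_7d_value, 0.01), cap)
    return round(base_bid * multiplier, 2)


# A user predicted to spend $20 against a $5 average earns a higher bid,
# capped so one optimistic prediction can't blow up spend.
print(value_adjusted_bid(base_bid=1.50, predicted_7d_value=20.0, average_7d_value=5.0))  # 4.5
```

In Facebook’s case all of this happens inside the delivery system, which is exactly the work VO hands over to the algorithm.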

  • Using value-based Lookalike Audiences. When we add in another new Facebook tool, value-based Lookalike Audiences, we can let the algorithm find not just people who are likely to purchase, but people who are likely to make a big purchase – for example, a $20 purchase versus a 99-cent purchase.


This also illustrates how the new algorithm-driven customer targeting allows us to vastly expand our audiences. If the algorithm is smart enough to handpick people, there’s no need for narrowly-defined audiences. Cast the net wide, and give Facebook or Google plenty of people to choose from. They’ll find the right ones.
