2019 Creative Testing Best Practices

  • by Brian Bowman | July 15, 2019
  • Facebook Advertising

Facebook and Google are better at advertising than you are. 

Don’t take it personally. It’s just that you can’t crunch 10,000 data points in a minute. And you can’t keep recalculating those 10,000 data points over and over – hour after hour – for weeks, much less years, on end. 

That doesn’t mean you’re obsolete. It doesn’t mean you’re out of a job (yet). It just means it’s time to let the machines do what they do best, while we humans do what we do best. 

Fortunately for us, AI and machine learning algorithms still can’t make good creative assets. They aren’t that great at strategically testing creative. And they can’t do a competitive creative analysis yet, either. Or develop a creative strategy.

So that’s what we recommend user acquisition managers focus on now: Creative. Creative analysis, creative testing, all fueling a data-driven creative strategy. Let the machines handle the quantitative aspects of user acquisition while we focus on what only humans can do.  

 

How We Got Here

Over the last few years, the machine learning algorithms at Facebook and Google have become very efficient at buying media and managing bids, placements, and audiences. As a result, the levers user acquisition managers once used for campaign optimization are being taken away; the campaign optimization work humans have been doing for years can increasingly be automated. 

But one major area is still well within human control: Creative. 

Facebook and Google’s machine learning algorithms can’t develop creative. That may change in the future, but for now, humans are still the undisputed champions at creative development, strategy, and creative testing. 

 

A Brief History Of How User Acquisition Advertising Became Automated

 

Google

Google App Campaigns launched in 2015 (as “Google Universal App Campaigns” at the time). From the start, it was something of a black box. All the app advertising campaigns on the platform were automatically switched over to the new campaign type, and Google Ads’ machine learning algorithm took control of placements, bids, and audience selection. 

If you were okay with your campaigns becoming automated, this was all good news. As Google put it at the time, “All you need to do is provide some text, a starting bid, and budget, and let us know the languages and locations for your ads. Our systems will test different combinations and show ads that are performing the best more often, with no extra work needed from you.”

Facebook

Facebook took a different approach. Instead of doing everything in one sweep, they have incrementally taken away certain levers from advertisers, and they’ve usually made those changes optional (with the notable exception of Campaign Budget Optimization, which will become mandatory for most advertisers as of September 1, 2019).

But Facebook has been moving toward algorithm-controlled campaigns for a while. AEO (App Event Optimization), VO (Value Optimization), and LTV (lifetime value) targeting were among the first steps. Advertisers could specify those types of goals and then let the algorithm figure out how best to get new users for a price set by the advertiser. But even with that level of automation, it was (and is) still a human picking which type of optimization strategy to use. When Campaign Budget Optimization goes into effect in September, humans may no longer be picking the bidding strategy.

With those new goal types, Facebook’s algorithm was controlling ad placements, audiences, and when (and on which devices) ads were shown. Facebook also made it possible for advertisers to let the algorithm create new lookalike audiences for them once the advertiser had defined a “seed” audience. 

There was some creative testing involved in all this, too. Facebook, like Google, also developed a way to test creative elements via Dynamic Creative. This was complemented by Dynamic Language Optimization, where the algorithm picked which languages to run ads in based on performance. 

Facebook’s Simplified Campaign Structure

So user acquisition has been slowly taken over by algorithms for quite a while. Facebook’s new recommended simplified campaign structure, which came out barely a month ago, is yet another example of how rapidly user acquisition is becoming automated. 

The net of all this is that while Google is getting more complex – with ad groups, value bidding, similar audiences, and media library and asset reporting due out soon – Facebook is getting easier. 

We expect these two platforms will basically meet in the middle between total automation and human campaign control around Q1 2020, when Facebook’s Campaign Budget Optimization and Dynamic Creative Optimization really kick in. So for the next nine to twelve months or so, humans will still be managing a fair amount. But after that, the machines will take over. 

 

As User Acquisition Becomes Automated, What Are the Roles for Machines, and for Humans?

Machines are very good at quantification – crunching the numbers. As they become able to take over more and more of the campaign optimization work, will they push human user acquisition managers out of a job?

We don’t think so. If you’re able to learn new skills and you can pivot effectively, you’ll probably be okay. And honestly, most good UA managers already have those two skills down anyway. 

So here’s the deal: We expect that human UA managers probably have about nine to twelve months to enhance their skills so they can be ready for when the machines really do take over the bulk of user acquisition advertising. 

Quantitative Creative Testing

In the next few months, UA managers should focus on math-based creative testing (which we call quantitative creative testing), establish a thorough competitive ad research methodology, and develop a robust A/B testing strategy and system so they can test ads as efficiently and affordably as possible. 

Creative teams also need to learn how to do robust creative audits and how to develop effective creative strategies. They have to become data-driven, rather than brand-driven.

The old dichotomy between creative, right-brained people and quantitative, left-brained people needs to be bridged, and maybe even merged. These two groups need to learn how to speak each other’s language. People who can speak both “left-brain” and “right-brain” will become even more valuable to high-performance advertising teams. If you can bridge those two skill sets and ways of thinking, you don’t have to worry about losing your job. You might even get a raise. 

Third-Party Adtech Tools Are Becoming Obsolete

But that’s only one aspect of this shift to automation. The evolution of Google and Facebook’s ad platforms has had one other massive consequence: it has made third-party adtech tools basically obsolete… or at the very least, crushed their value. Now that anyone can access world-class advertising AI via Google or Facebook advertising, having a fancy adtech tool is no longer much of an advantage. 

This automation has one other major consequence: because Facebook and Google’s ad platforms increasingly optimize themselves, they’ve made it possible for almost anyone to advertise profitably. So increasingly you don’t need to be an advertising whiz to get good ROAS. 

This means businesses can hire less-skilled people. It also means that exponentially more advertisers can succeed with advertising on these platforms. Automation may end up being the best move Google and Facebook have ever made to grow their advertising user base.

So now that we’ve covered what’s happened with Google and Facebook – and how advertising automation will affect advertisers, user acquisition managers, adtech tools, and the platforms themselves – let’s shift to the present. 

Here’s what you need to be doing right now if you want to run a competitive advertising program in Q3 2019. 

 

Creative Testing Best Practices for Q3 2019

User acquisition advertising is evolving rapidly. Every few months for the last few years, either Facebook or Google has made significant changes to their platforms that have made it more and more possible to automate user acquisition advertising. And because these changes are available to everyone, competition has increased. Any competitive advantage third-party adtech tools had given is gone. 

This has leveled the playing field. The one thing the machines have not automated, or even started to automate – creative – ends up being a UA manager’s last competitive advantage. 

This makes every aspect of creative vital to success. 

 

Most Ads Fail

Creative excellence isn’t easy. High-performance, control-beating creative is rare. In our experience – earned over more than a billion dollars of user acquisition ad spend – usually only one out of twenty ads can beat the current “control” (the top-performing ad). 

The reality is, most ads fail. The chart below shows the results of about 600 different ads, with spend distributed based on performance. As you can see, out of those 600 ads, only a handful were responsible for the lion’s share of results. 

This extremely high failure rate shapes advertising budgets and advertising testing. Because 95% of creative fails, if you can’t test ads quickly and affordably, your campaign performance is going to be crippled. 

But testing alone isn’t enough. You also have to generate enough creative to fuel that testing machine. Because 19 out of 20 ads fail, you don’t just need one new piece of creative; you need 20. 

And because creative fatigues so quickly, you don’t need 20 new creatives every year or so. You need 20 new creative concepts every month, or possibly even every week. 

 

How to Test UA Creative in Q3 2019 Quickly and Affordably 

So 95% of new ads will fail to out-perform the “control” (the current top-performing ad) in an advertiser’s account. This is true regardless of the platform, the industry, or anything else. It’s true for your competitors, too. 

And so if most ads fail, the best way to find a breakout, control-beating ad is to test a lot of creative. 

The secret is to do that testing affordably and quickly, because there are a lot of ways to test creative, and dozens of variables in any given ad to test. If you don’t have a disciplined, methodical practice for testing, it’s easy to blow way too much time and way too much money on testing creative and still not have a winner. 

Prototype Ads

We use “prototype” ads to get around those pitfalls. Prototype ads are run for short periods of time (bursts, really) to determine whether they will perform. These ads may bend or even break brand guidelines, but this doesn’t matter much because they’ll typically get only 25,000-50,000 impressions. 

We overcome most of the limitations of statistical significance with prototype ads, too, because we aren’t looking for minor performance differences – we’re looking for 10x, even 100x results. Breakout ads. So we don’t need to accrue as many conversion events as we would if we were looking for minor differences in performance. Discerning a winner that performs 25% or more better than the control takes much less traffic than discerning a winner that performs only 2-5% better.
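
To make that concrete, here’s a minimal sketch in Python of the standard two-proportion sample-size approximation (95% confidence, 80% power); the 1% conversion rate is an assumption for illustration, not a benchmark:

```python
import math

def impressions_per_arm(base_cvr: float, lift: float,
                        z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate impressions needed per ad ("arm") to detect a relative
    lift in conversion rate, using the two-proportion sample-size formula
    at 95% confidence (z_alpha) and 80% power (z_beta)."""
    p1 = base_cvr
    p2 = base_cvr * (1 + lift)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Assuming a hypothetical 1% conversion rate per impression:
print(impressions_per_arm(0.01, 0.25))  # ~28,000 impressions to detect a 25% lift
print(impressions_per_arm(0.01, 0.05))  # ~640,000 impressions to detect a 5% lift
```

Under those assumptions, a 25% lift is detectable within the 25,000-50,000-impression bursts described above, while a 2-5% lift would need well over twenty times the traffic per test.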

Of course, most of the prototype ads we test will fail. But when a prototype ad does perform, we’ll clean it up a bit to make it more brand compliant, and then test it again with a larger audience. 

Prototype ads work because they let us test dozens of new concepts at a time, hopefully with very few creative restrictions. 

That is one key requirement for prototype ads: Creative freedom. Advertisers need to give both internal and external creative teams enough freedom to develop the kind of bold new concepts that become breakout ads. 

20% Concepts / 80% Variations

Prototype ads don’t just work for brand new concepts. They can also be used for variations. When we test variations, we take a winning ad and use it basically as a template. Then we test dozens of slight variations of the ad to see if we can squeeze better results from it, or at least get it to last a little longer before ad fatigue sets in. 

About 80% of what we test are variations. This minimizes the losses that come with testing big, bold new concepts, but it still leaves room to test enough of those concepts to keep ads fresh and to keep creative teams from getting stuck in a rut. 
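
As a rough illustration of how that 80/20 split might be operationalized, here’s a sketch; the function, batch size, and idea labels are hypothetical, not a description of any particular tool:

```python
def build_test_slate(variation_ideas, new_concepts,
                     batch_size=20, variation_share=0.8):
    """Fill a testing batch with roughly 80% variations of the current
    winners and 20% brand-new concepts."""
    n_variations = round(batch_size * variation_share)
    return (variation_ideas[:n_variations]
            + new_concepts[:batch_size - n_variations])

variations = [f"control_v{i:02d}" for i in range(1, 17)]  # tweaks to the winner
concepts = ["ugc_testimonial", "gameplay_fail", "before_after", "meme_format"]
print(build_test_slate(variations, concepts))  # 16 variations + 4 new concepts
```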

Creative Testing Best Practices

We’ve tested hundreds of thousands of ads. Based on that experience, we’ve developed a creative testing methodology that allows us to find winners faster and more affordably than is traditionally possible. 

Our core methodology right now (it is always evolving) is as follows, in this order:

  • Creative Audit
  • Competitive Audit
  • Creative Strategy
  • Creative History of Winners and Losers
  • Winner Variation Testing
  • Concept Refresh
  • Asset Folders For Winning Ads

Here are the details for each of these best practices:

Creative Audit

We hate repeating work, and we hate repeating mistakes even more. So before we create new ads or develop a new strategy, we’ll dig deep into the prior performance of a user acquisition advertising account, with particular focus on the creative assets. 

Doing an audit like this will help us avoid repeating the same tests and mistakes an advertiser has made before. It will also give us valuable information about what has worked, and what might work going forward. 

The end result of a creative audit is a shared, documented analysis of what has worked and what has failed over the past six months. We focus on videos and images, ad copy, and which concepts and variations have performed best and worst. 

The goal is to identify key winning concepts and to fuel ideas for which approaches we want to test with the creative strategy. Then we’ll create a living document where winners and losers are visually documented so both creative teams and UA managers (internal or external) can learn from each other. 

 

Competitive Audit

Competitors’ ads are a bank vault of creative insights – if you know what to look for. The good news is that there’s a nearly endless supply of tested concepts. The bad news is 95% of your competitors’ ads fail, too. 

Facebook’s new Ad Library is a great way to see which ads your competitors have been running. But it lacks conversion, impression, and interaction data – all metrics essential to evaluating ads. So we also use tools like Social Ad Scout, Connect Explore, SocialPeta, AdSpy and other resources to get that information. 

Effectively evaluating competitors’ creative is a valuable skill. But it’s usually not ideal to have either a true creative or a true quant do it. Ideally, you want someone with a balance of both right and left brain thinking. Someone with a psychology background and good math skills might fit the profile. They need to be able to interpret the metrics on competitors’ ads but also be able to see the psychology behind why certain ads are working. 

A good creative evaluator will be able to see trends in both creative and data. They’ll come up with a hypothesis about why winning ads work, and then incorporate that hypothesis into a client’s creative strategy and testing protocol. 

Doing competitive audits like this can make a huge difference in long-term performance. This is a high-value activity UA managers would do well to put more time into. 

A good competitive audit will include:

  • An analysis of competitors’ ad concepts based on spend, conversions, engagements, and any other meaningful metrics you’re especially interested in. 
  • Documentation of how competitors have used the following elements:
    • Ad copy
    • Buttons
    • End/start cards
    • Calls to action
    • Messaging
    • Offers
    • Use of animation/motion
    • Colors
    • Backgrounds
    • Text placement
    • Characters
    • Logos and/or stickers
    • New elements
    • Old elements used in a new way

The final competitive analysis document will include screenshots of all these elements. It’s usually best to organize them into a spreadsheet – see the sketch after this list – so you can sort the data by:

  • Competitor
  • The ads they’re running
  • What you’ve noticed about those ads
  • What test ideas you’ve gotten by reviewing their ads
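
If you’d rather keep that log in code than in a spreadsheet, a minimal sketch might look like this (the column names and rows are hypothetical examples, assuming pandas is available):

```python
import pandas as pd

# Hypothetical competitive-audit log: one row per competitor ad reviewed.
audit = pd.DataFrame([
    {"competitor": "AcmeApp",
     "ad": "UGC dance-challenge video",
     "notes": "caption-heavy, hook in the first 3 seconds",
     "test_idea": "try a caption-heavy opening on our control concept"},
    {"competitor": "BetaGame",
     "ad": "static before/after image",
     "notes": "explicit offer in the headline",
     "test_idea": "test an explicit offer in our headline copy"},
])

# Sort and filter the same way you would sort the spreadsheet.
print(audit.sort_values("competitor").to_string(index=False))
```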

If you’re in need of a few truly “out of the box” ideas for new ads, look to other industries or niches. Find breakaway, 100x ads and then analyze them like you would a competitor’s ads. This is especially effective if you can find an industry or niche your highest value audience has a particular affinity for. 

 

Creative Strategy

This is a plan based on the creative audit and the competitive audit. Its goal is to reduce failure across the team (for both UA managers and creatives). It should be an evolving map of how to do testing and messaging. 

A creative strategy lays out what the team (or teams) plans to do, the timeframe they intend to do it in, and the ideas they’re going to test. It usually includes:

  • a rough budget of ad spend and testing investments
  • a rough calendar of when tests and creative will launch
  • specifics about what the creative strategy’s key goals are

A good creative strategy is a living document, in that it will probably change over time, but it provides everyone (on both internal and external teams) a blueprint to work from.

 

Creative History of Winners and Losers

This is a document of everything we’ve tested, why we tested it, and the results of each test. It’s updated weekly, and every new version is shared with our client, our creative team and any other interested parties. 

The purpose of this document is to minimize repeated tests and mistakes and to build on what’s worked, regardless of which team is doing the testing or creative development. It’s a way to share our work product and how we developed it. 

 

Winner Variation Testing

We’ve mentioned how 80% of what we test are variations of winning ads, and how 20% are completely new concepts. 

Here’s one other way to use that framework. Once we have a winning ad, we’ll test every element it’s made of. This allows us to figure out which elements or combinations of elements are making the ad work. 

It takes quite a lot of tests and a fair amount of money to break up and test an ad like this, which is why we don’t do it for every ad – only 100x, breakout winning ads get analyzed this way. The information we learn from the analysis is vital to developing breakout ads going forward. It’s also fantastic for something called a “concept refresh.”

 

Concept Refresh

This is a new practice we’ve been getting excellent results with. Concept refreshes allow us to keep an ad alive almost indefinitely – if we know which elements to keep the same and which to vary. Once we’ve done the winner variation testing described above, we know which elements matter most. So we can keep the anchor elements that are driving results, and refresh everything else in the ad. 

Asset Folders For Winning Ads

Whenever you get a winner, all the files – the videos, the Sketch files, the Illustrator files, the Photoshop files, the music files, all of it – get dumped into a single folder. When you create variations, use the elements from that winning ad’s folder, not the files from other variation tests. 

This may seem like a small thing, but tiny differences can have big effects. 

Don’t get sloppy with file naming conventions, either. Pick a naming convention and stick with it. That lets both internal and external teams quickly find the files they need, without wasting time hunting for the right version. 
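
As one example of what such a convention might look like (the fields and separators here are assumptions, not a prescribed standard):

```python
def asset_name(concept: str, variation: int, platform: str,
               size: str, shipped: str, ext: str) -> str:
    """Build a predictable file name from concept, variation number,
    platform, dimensions, date shipped, and extension."""
    return f"{concept}_v{variation:02d}_{platform}_{size}_{shipped}.{ext}"

print(asset_name("dance_challenge", 3, "fb", "1080x1080", "20190715", "psd"))
# -> dance_challenge_v03_fb_1080x1080_20190715.psd
```

Whatever scheme you pick, the point is that anyone on any team can tell what a file is without opening it.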

Conclusion

Creative development, testing, and strategy are still best done by human beings. The algorithms at Google and Facebook may be able to test creative elements, but they still can’t create those elements. They can’t do competitive analysis, either. And they can’t plan out a coherent creative strategy. 

If you’re an acquisition manager, we recommend you focus on expanding your skills in those areas. You don’t necessarily have to become creative, but you do need to show creatives how to become data-driven, and you need to be able to distill and interpret data for them so they can deliver better creative. 

But also keep your eye out for new opportunities. Machine learning is a powerful tool, but it flunks if you ask it to solve problems it hasn’t encountered before, or to interpret data in ways it hasn’t been programmed to handle. 

So if, say, a machine learning algorithm was asked to optimize an ad for augmented reality, or if it was asked to optimize voice-enabled ads, it wouldn’t perform well. 

A human, however – a smart, agile user acquisition manager – might be able to take what they’ve learned and apply that knowledge to the new situation. 

Humans have always been good at this. We’re great at adapting. And it’s a skill we’re going to need going forward.

 
