Video Ad Creative Best Practices

What’s the best way to get more results from your user acquisition advertising? We say it’s video ad creative.

After spending over $3 billion on user acquisition advertising in the last four years, we’ve found that our biggest wins come from creative testing. Drill down into which creative elements drove those results, and it breaks out like this:

  • 60% videos and images
  • 30% text
  • 10% headlines and calls to action

While videos and images share first place for their impact on results, videos tend to outperform images. You can see this in almost any study of social media ads or content from the last few years: video content almost always outperforms still images. (Side note: we’ve found that Facebook Carousel ads work great for B2B apps and websites.)

With ads, the likelihood of a video beating a still image goes even higher. This is why Facebook and Google (and creative partners like ourselves) have gone to such lengths to make adding motion to ads easier. Facebook launched its “Video Creation Kit,” and we’ve spent a lot of time developing our own best practices for adding motion to still images. AdRules offers a video editor and bulk sheet editing inside our Creative Studio, so you can create thousands of videos for localization, resizing, and quick additions of end cards and start cards.

Image: from our guide to creating Facebook video ads using still images (https://www.consumeracquisition.com/guide-to-creating-facebook-video-ads-using-still-images/)

 

So creative testing delivers the most improvements for user acquisition campaigns, and video delivers the bulk of those creative testing improvements. If you could do only one thing to improve your advertising, then, it would be to create better videos.

That’s exactly what we’re going to cover in this post.

 

Quantitative Creative Testing

Quantitative creative testing for user acquisition campaigns is essential now, of course. And we’ve written about it in several posts. So let’s approach video advertising best practices by breaking video ads down into their essential parts. Those would include:

  • Ad copy
  • Buttons
  • End/start cards
  • Calls to action
  • Messaging
  • Offers
  • Use of animation/motion
  • Colors
  • Backgrounds
  • Text placement
  • Video ratio/length

Here’s how to optimize each of those elements:

 

Ad copy

We strongly recommend checking your competitors’ ads to see which ad copy they’re using over and over again. A simple twist on what’s been working for them could help your ads a lot.

But while your competitors’ ads can be helpful, there’s one golden rule to follow: Use emotion.

Use emotion as often and as strongly as you can. Emotion rules over rationality, especially for game and lifestyle apps.

This is basically a new spin on that old ad copy rule of “sell benefits, not features.” The most fundamental (and powerful) benefit ever is how a product or service makes people feel. So always focus on emotions. They’re the ultimate product experience.

Here’s how Gardenscapes leverages emotion and empathy to urge users to come back to play their game.

Image: @Gardenscapes ads, via the Facebook Ads Library

 

Buttons

Speaking of old-school advertising tricks – if you had to distill any ad down to two things, they would be the headline and the call to action. The ad copy in a video ad serves as the headline. Buttons are the call to action.

Once again, checking your competitors’ ads can be helpful, but don’t stop there. There are plenty of ways to test buttons. Try using “my” on your buttons instead of “you” – this is yet another old direct response trick that still works in 2019.

Whatever you do, make the copy on your buttons clear. Confused people don’t take action. And you’re pitching to people who are scrolling through a river of information; even the slightest whiff of confusion is enough to suppress conversion rates.

 

Start and end cards

Not using these as part of your video advertising? We think you’re missing out.

Here’s why start and end cards work so well:

  • They tell the story of the ad.
  • They create a powerful first-frame visual.

 

Start Cards

Start cards are simple. They usually include just the name of the app. Try that, but also try going a step further. Adding a call to action can help a lot.

Some apps use calls to action or phrases that introduce gameplay or explain to the user what they should do, almost like a mini-tutorial. For example, Gardenscapes will use a CTA like “Save Your Garden!” or “Make a Choice to Save Them!”

Some apps try to draw in the consumer’s attention by using phrases such as “The Best Matching Game” or “It’s Harder Than It Looks!” on their start cards. These kinds of phrases draw the consumer to watch the video instead of scrolling past it.

While start cards are not as important as end cards, they serve an important purpose: They can stop people from scrolling past your ad when it shows up on their timeline or social media feed. And if people never see past the start card, they’ll never see any other part of your ad.

Image: via @socialpeta

 

End Cards

End cards pique interest in the game with a call to action and the brand slogan.

They work in part thanks to a psychology principle called “the recency effect.” When consumers are looking for new apps, they are more likely to remember apps whose ads use end cards.

Most end cards have the app’s name and a call to action like “Accept the Challenge!” or “Try it Yourself!” Some include a button, too, with copy like “Download Now!” or “Play Now!”

Image: @homescapes ad, via @socialpeta
Image: a Gardenscapes ad

 

Many advertisers also include prompts to download their app from the App Store or Google Play, but we don’t recommend that. Including the platform logos frequently drops the conversion rate by 10-15%.

You can try blurring out the gameplay in the background of the end card. We’ve seen it work well, and lots of gaming apps use that tactic.

 

Text placement, fonts, colors, and emojis

How text looks and where it’s placed can have a big effect on conversion rates. Again, you can get some ideas from your competitors’ ads in the Facebook Ads Library and other tools, but we like to position text towards the top and bottom of the screen and to use bright, pure colors for optimal response rates.

These aspects of an ad are important in another way, too. They’re good examples of elements best chosen by Creative teams. User acquisition managers or teams may want to manage these aspects of ads:

  • Calls to action
  • Messaging
  • Buttons
  • Headers and Footers

But Creative teams should be given authority to pick the text attributes we’ve just mentioned, plus background images, button colors, fonts, video aspect ratios, and ad lengths.

Image: @luckyday ad, via @socialpeta

 

Video ad ratio

Landscape, square, or vertical? It may seem like a small choice, but this aspect of your ads can have a huge effect on performance.

We recommend every advertiser use at least these three aspect ratios in their campaigns:

Image: from Consumer Acquisition’s “Mobile Apps Unlocked” presentation deck

 

Of course, creating videos for every ratio and placement size is a lot of work. This is why only your highest-performing video ads should be made into every possible size and aspect ratio. Otherwise, you’ll just waste a lot of time and budget creating endless versions of low-performing videos.

That said, because Facebook and Google’s media buying is mostly automated now, any video you run may be seen across a huge array of properties. This is part of why we recommend using videos so much – we’ve found that 45% of total impressions on iOS are from video ads.

Also consider creating more than one video, even if you can’t afford robust video ad creative testing. We’ve found that adding just two videos to almost any campaign will increase conversions by 25%. And the more you spend, the more videos you need: even advertisers spending $50,000-$75,000 per month should create at least four new high-performing videos per month.

Google has one other interesting recommendation: make videos that flex in length. Even if you stick with one video aspect ratio, create three versions of that same video ad: one that’s 10 seconds long, another that’s 15, and one that’s 30. Then let the algorithm figure out where to show the different ads.

 

Video Ad Creative Conclusion

If you haven’t been using many – or any – video ads for your user acquisition campaigns, that needs to change right now. Videos convert. They’re more than worth the extra investment.

Or maybe you’re a step ahead of that. But if you’ve only been creating one or two videos at a time, it’s time to upgrade your advertising to develop more video assets, stat. You need to be testing multiple ads, and then when you find a winner, make multiple versions of that winning ad in different aspect ratios and in different video ad lengths.

Giving Facebook and Google good, flexible creative like that lets the algorithms do their work far more effectively. It will get your user acquisition campaigns a considerably better return on ad spend, too.

 

Media Buying Best Practices For Creative Testing

In 2019, the success or failure of user acquisition advertising campaigns comes down to creative testing.  

Facebook and Google’s shifts toward automation have removed most of the advantages that third-party adtech tools used to deliver. Those platforms’ automation of bid and budget management and audience selection has leveled the playing field even more.

But creative is still an opportunity. The algorithms can test different creative elements of ads, but they cannot create those elements. Creative is still best done by human beings. 

Trouble is, the vast majority of creative fails: 95% of new creative will not beat the current control. And if an ad can’t beat the control, there’s no point in running it. It’ll only cost you money.

So the real competitive advantage lies not just in creative, but in testing creative – to identify creative winners as quickly as possible, and for the lowest amount of spend per variation. 

 

Creative Testing

We’ve created and tested tens of thousands of ads in the last two years. We’ve profitably managed over $1 billion in ad spend. Here’s where our biggest wins have come from: 

  • 60% creative testing
  • 30% audience expansion
  • 10% everything else

Creative is the differentiator. Creative is where the bulk of the big wins are. And from those creative wins, here’s which elements tend to win the most:

  • 60% videos
  • 30% text
  • 10% headlines and calls to action

That gives you an idea of where to start your tests. But it’s not all you need to know. Take Facebook user acquisition advertising, for example. It has several hidden challenges, including:

  • Multiple strategies for testing ads – It’s nice to have choices, but they can complicate things. You can test creative on Facebook with their split-test feature, or by setting up one ad per ad set, or by setting up many ads within an ad set. Which one you pick will affect your testing results.
  • Data integrity – The data for each of your tests won’t come in evenly. Some ads will get more impressions than others. The CPM for different ads and ad sets will vary. This makes for noise in the data, which makes it harder to determine the winning ad.
  • Cost – Testing has an extremely high ROI, but it can also have a very high investment cost. If you don’t set up your creative testing right, it can be prohibitively expensive.
  • Bias – Facebook’s algorithm prefers winning ads. And because you’ll be running your control against each new ad, the system will favor the winning ad. This skews the data even more and makes it harder to establish which ad won.

Running tests in Google Ads has many similar challenges, but it should get easier soon, when Google App Campaigns launches “asset reporting” (expected toward the end of July 2019).

 

Perfect Versus Cost-Effective Creative Testing

Let’s take a closer look at the cost aspect of creative testing – and how to overcome it. 

In classic testing, you need a 95% confidence level to declare a winner. That’s nice to have, but getting a 95% confidence level for in-app purchases will end up costing you $20,000 per variation.

Here’s why: To reach a 95% confidence level, you’ll need about 100 purchases. With a 1% purchase rate (which is typical for gaming apps) and a $200 cost per purchase, you’ll end up spending $20,000 for each variation in order to accrue enough data for that 95% confidence level.

And that’s actually the best-case scenario. Because of the way the statistics work, you’d also have to find a variation that beats the control by 25% or more for it to cost “only” $20,000. A variation that beat the control by 5% or 10% would have to run even longer to achieve a 95% confidence level.
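To make that arithmetic concrete, here is a minimal sketch of the sample-size math using statsmodels. The 1% purchase rate and $200 cost per purchase come from the example above; the 80% power figure is our own assumption, since only the 95% confidence level is specified.

```python
# Sketch of the cost-per-variation math above (not production tooling).
# Assumptions: 1% baseline purchase rate and $200 cost per purchase (from
# the text), two-sided test at alpha=0.05, and 80% power (our assumption).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

BASELINE_RATE = 0.01         # typical purchase rate for gaming apps
COST_PER_PURCHASE = 200.0    # dollars
solver = NormalIndPower()

for lift in (0.25, 0.05):    # a variation that beats the control by 25% vs. 5%
    variant_rate = BASELINE_RATE * (1 + lift)
    effect = proportion_effectsize(variant_rate, BASELINE_RATE)
    users = solver.solve_power(effect_size=effect, alpha=0.05, power=0.8)
    purchases = users * BASELINE_RATE
    cost = purchases * COST_PER_PURCHASE
    print(f"+{lift:.0%} lift: ~{users:,.0f} users per arm, "
          f"~{purchases:,.0f} purchases, ~${cost:,.0f} per variation")
```

Under these assumptions, the 25% lift needs on the order of 100-150 purchases per variation, in line with the $20,000 figure above, while the 5% lift balloons to thousands of purchases and hundreds of thousands of dollars per variation.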

That’s a deal killer for a lot of businesses. Few advertisers can afford to spend $20,000 per variation, especially if 95% of new creative fails to beat the control. 

So what do you do? 

You move the conversion event you’re targeting up a little in the sales funnel. For mobile apps, instead of optimizing for purchases, you optimize for impression-to-install rate (IPM). For websites, you optimize for impression-to-top-funnel conversion rate.

 

Why impression to action rate?

The obvious concern here is that ads with high CTRs and high conversion rates for top-funnel events may not be true winners for down-funnel conversions and ROI / ROAS.  But while there is a risk of identifying false positives with this method, we’d rather take this risk than the risk and expense of optimizing for bottom-funnel metrics.

If you decided to test for bottom-funnel performance anyway:

  1. You would be substantially increasing the spend per variation and you’d introduce substantial risk into your portfolio’s metrics.
  2. Or you’d need to rely on fewer conversions to make decisions, which runs the risk of identifying false positives.

Here’s one other benefit: When we’re optimizing for IPM (installs per thousand impressions), we’re effectively optimizing for relevance score. 

As you know, a higher relevance score (Quality Rank, Engagement Rank or Conversion Rank) comes with lower CPMs and access to higher-quality impressions. Ads with higher relevance scores and lower revenue per conversion will often outperform ads with lower relevance scores and higher revenue per conversion because Facebook’s algorithm is biased towards ads with higher relevance scores.

So optimizing for installs works better than optimizing for purchases on several levels. Most importantly, it means you can run tests for $200 per variation because it only costs $2 to get an install. For many advertisers, that alone can make more testing possible. There just aren’t a lot of companies that can afford to test if it’s going to cost $20,000 per variation. 

Here are a few other best practices to make the whole creative testing system work:

  • Facebook’s split-testing feature – We mentioned earlier that there are several different ways to test ads, even within Facebook. Skip the other options and just use their split-testing feature.
  • Always test against a top-performing control – If you don’t test every variation against your control, you’ll never know if the new ad will actually beat your control. You’ll only know how the new ad performed compared to the other new ads you tested, which doesn’t actually help.
  • Only test on Facebook’s news feed – There are 14 different placements available in Facebook’s ad inventory. Testing all of them at once creates a lot of noise in the test data as each placement has different CPMs, conversion rates, and CTRs. So don’t do that. Keep the data clean and just test for the news feed.
  • Optimize for app installs – This is a major lever in getting your costs down. It may not be a perfect solution, but it works well enough.
  • Aim for 100 installs minimum – You need at least 100 installs to reach statistical significance. We can bend the rules of statistics a bit to find winning ads faster and cheaper, but we cannot break the rules entirely.
  • Use the right audience – Use an audience that’s considered high quality so it’s representative of performance at scale, but also one that isn’t being used elsewhere in your account. This minimizes the chance that audiences used in test cells could be exposed to other ads that are running concurrently in other ad sets.
  • Consistent data drives higher confidence in results – Never judge a test by its first day of results. Look at aggregate performance for variations and for stability across day-to-day results. If test data is consistent and winners and losers are no longer changing day-over-day, your test results will be far more reliable than if cumulative variation performance is still changing day by day.

The data below shows the winner changing on the last day, which is an indicator that additional data would increase our confidence in the results. The winner (orange line) had a poor day 1, but a strong day 2 and day 3. This is an indicator that the results were still changing at the time of test completion.

Ad Build Results for creative testing
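One way to operationalize that stability check: if you log daily results per variation, a short script can recompute the cumulative leader each day and flag tests whose winner is still changing. This is an illustrative sketch only; the column names (“day”, “variation”, “installs”, “impressions”) and the two-day window are our own assumptions.

```python
# Illustrative day-over-day stability check for a creative test.
# Input: one row per (day, variation) with installs and impressions.
import pandas as pd

def daily_leaders(daily: pd.DataFrame) -> pd.Series:
    """Return the cumulative-IPM leader as of each day of the test."""
    installs = daily.pivot(index="day", columns="variation",
                           values="installs").cumsum()
    impressions = daily.pivot(index="day", columns="variation",
                              values="impressions").cumsum()
    ipm = installs / impressions * 1000   # installs per 1,000 impressions
    return ipm.idxmax(axis=1)             # leading variation as of each day

def winner_is_stable(daily: pd.DataFrame, window: int = 2) -> bool:
    """True if the same variation has led for the last `window` days."""
    return daily_leaders(daily).tail(window).nunique() == 1
```

If a check like `winner_is_stable` comes back False on the final day, as in the chart above, treat the result as provisional and let the test accrue more data.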

How to Test “Predicted Winners”

If you follow all those best practices, you may have a few new ads that have performed well (and reliably) against the control. You’ll have what we call “predicted winners.”

We know these newly-tested predicted winners performed well against the control in a limited test. What we don’t know is what will happen if we really increase their exposure and test them against purchases and for ROAS. Will they continue to perform?

To find that out, each winning variation should be launched into existing ad sets. These variations should be allowed to compete for impressions versus other top ads. This will allow us to verify whether these new predicted winners are holding up at scale.

Run through this process enough, and you’ll finally have found that precious thing… an ad that beats your control.

Pro Tips for Mobile User Acquisition Testing

  1. Get back to testing – Your new winners will soon fatigue and performance will deteriorate. The best way to offset performance fatigue is to replace old creative with new creative winners.
  2. Do more competitive creative analysis – If you’re a new media buyer (or even an experienced one), spend every bit of free time you have doing competitive creative analysis. It’s a high-return activity that will help you generate better ads to run your tests with.
  3. Don’t trash the near winners – Sometimes in tests, we’ll have ads that were within 10% of the control’s performance but didn’t quite beat it. We don’t just kill those “near winners.” We’ll send them back to the creative team so they can tweak those ads just a bit to improve performance.

 

Conclusion for Creative Testing

Quantitative creative testing has one of the highest ROIs of any business activity. No matter who you are, or how much creative testing you’re doing, do more of it. It’s the single best way to improve the ROAS for your accounts.

2019 Creative Testing Best Practices

Facebook and Google are better at creative advertising than you are. 

Don’t take it personally. It’s just that you can’t crunch 10,000 data points in a minute. And you can’t keep recalculating those 10,000 data points over and over – hour after hour – for weeks on end, much less years.

That doesn’t mean you’re obsolete. It doesn’t mean you’re out of a job (yet). It just means it’s time to let the machines do what they do best, while we humans do what we do best. 

Fortunately for us, AI and machine learning algorithms still can’t make good creative assets. They aren’t that great at strategically testing creative. And they can’t do a competitive creative analysis yet, either. Or develop a creative strategy.

So that’s what we recommend user acquisition managers focus on now: Creative. Creative analysis, creative testing, all fueling a data-driven creative strategy. Let the machines handle the quantitative aspects of user acquisition while we focus on what only humans can do.  

 

How We Got Here

Over the last few years, the machine learning algorithms at Facebook and Google have grown very efficient at buying media and managing bids and placements and audiences. As a result, the levers that user acquisition managers had used for campaign optimization are increasingly being taken away. The campaign optimization work that humans have been doing for years can increasingly be automated. 

But one major area is still well within human control: Creative. 

Facebook and Google’s machine learning algorithms can’t develop creative. That may change in the future, but for now, humans are still the undisputed champions at creative development, strategy, and creative testing. 

 

A Brief History Of How User Acquisition Advertising Became Automated

 

Google

Google App Campaigns launched in 2015 (as “Google Universal App Campaigns” at the time). From the start, it was something of a black box. All the app advertising campaigns on the platform were automatically switched over to the new campaign type, and Google Ads’ machine learning algorithm took over the controls of placement, bids, and audience selection.

If you were okay having your campaigns become automated, this was all good news. As Google put it at the time, “All you need to do is provide some text, a starting bid, and budget, and let us know the languages and locations for your ads. Our systems will test different combinations and show ads that are performing the best more often, with no extra work needed from you.”

Facebook

Facebook took a different approach. Instead of doing everything in one sweep, they have incrementally taken away certain levers from advertisers, and they’ve usually made those changes optional (with the notable exception of Campaign Budget Optimization, which will become mandatory for most advertisers as of September 1, 2019).

But Facebook has been moving toward algorithm-controlled campaigns for a while. AEO (App Event Optimization), VO (Value Optimization), and LTV (lifetime value) targeting were among the first steps. Advertisers could specify those types of goals and then let the algorithm figure out how best to get new users for a price set by the advertisers. But even with that level of automation, it was (and is) still a human picking which type of optimization strategy they want to use. When Campaign Budget Optimization goes into effect in September, humans may no longer be picking the bidding strategy.

With those new goal types, Facebook’s algorithm was controlling ad placements, audience, and when and which device ads showed on. Facebook also made it possible for advertisers to let the algorithm create new lookalike audiences for them once the advertiser had defined a “seed” audience. 

There was some creative testing involved in all this, too. Facebook, like Google, also developed a way to test creative elements via Dynamic Creative. This was complemented by Dynamic Language Optimization, where the algorithm picked which languages to run ads based on performance. 

Facebook’s Simplified Campaign Structure

So user acquisition has been slowly taken over by algorithms for quite a while. Facebook’s new recommended simplified campaign structure, which came out barely a month ago, is yet another example of how rapidly user acquisition is becoming automated.

The net of all this is that Google is getting more complex (with ad groups, value bidding, Similar Audiences, and a media library and asset reporting due out soon), while Facebook is getting easier.

We expect these two platforms will meet in the middle between total automation and human-controlled campaigns around Q1 2020, when Facebook’s Campaign Budget Optimization and Dynamic Creative Optimization really kick in. So for the next nine to twelve months or so, humans will still be managing a fair amount. But after that, the machines will take over.

 

As User Acquisition Becomes Automated, What Are The Roles For Machines, and for Humans?

Machines are very good at quantification – crunching the numbers. As they take over more and more campaign optimization work, will they push human user acquisition managers out of a job?

We don’t think so. If you’re able to learn new skills and you can pivot effectively, you’ll probably be okay. And honestly, most good UA managers already have those two skills down anyway. 

So here’s the deal: We expect that human UA managers probably have about nine to twelve months to enhance their skills so they can be ready for when the machines really do take over the bulk of user acquisition advertising. 

 

Quantitative Creative Testing

In the next few months, UA managers should focus on math-based creative testing (which we call quantitative creative testing), establish a thorough competitive ad research methodology, and develop a robust A/B testing strategy and system so they can test ads as efficiently and as affordably as possible.

Creative teams also need to learn how to do robust creative audits and how to develop effective creative strategies. They have to become data-driven, rather than brand-driven.

The old dichotomy between creative, right-brained people and quantitative, left-brained people needs to be bridged, and maybe even merged. These two groups need to learn how to speak each other’s language. People who can speak both “left-brain” and “right-brain” will become even more valuable to high-performance advertising teams. If you can bridge those two skill sets and ways of thinking, you don’t have to worry about losing your job. You might even get a raise. 

Third-party Adtech Tools Obsolete

But that’s only one aspect of this shift to automation. The evolution of Google and Facebook’s ad platforms has had one other massive consequence: it has made third-party adtech tools basically obsolete… or at the very least, crushed their value. Now that anyone can have access to world-class advertising AI via Google or Facebook advertising, having a fancy adtech tool is no longer much of an advantage.

This automation has one other major consequence: because Facebook and Google’s ad platforms are increasingly automated, they’ve made it possible for almost anyone to advertise profitably. Increasingly, you don’t need to be an advertising whiz to get good ROAS.

This means businesses can hire less skilled people. It also means that exponentially more advertisers can get success with advertising on these platforms. Automation may end up being the best move Google and Facebook have ever made to grow their advertising user base.

So now that we’ve covered what’s happened with Google and Facebook, and how advertising automation will affect advertisers, user acquisition managers, adtech tools, and the platforms themselves, let’s shift to the present.

Here’s what you need to be doing right now if you want to run a competitive advertising program in Q3 2019. 

 

Creative Testing Best Practices for Q3 2019

User acquisition advertising is evolving rapidly. Every few months for the last few years, either Facebook or Google has made significant changes to their platforms that have made it more and more possible to automate user acquisition advertising. And because these changes are available to everyone, competition has increased. Any competitive advantage third-party adtech tools had given is gone. 

This has leveled the playing field. The last thing the machines have not automated or started to automate – creative – ends up being a UA manager’s last competitive advantage. 

This makes every aspect of creative vital to success. 

 

Most Ads Fail

Creative excellence isn’t easy. High-performance, control-beating creative is a rare thing. In our experience, after spending over $3 billion on user acquisition advertising, usually only one out of twenty ads can beat the current “control” (the top-performing ad).

The reality is, most ads fail. The chart below shows the results of about 600 different ads, with spend distributed based on performance. As you can see, out of those 600 ads, only a handful were responsible for the lion’s share of results.

creative ad spend

The extremely high failure rate of most creative shapes advertising budgets and advertising testing. Because 95% of creative fails, if you can’t test ads quickly and affordably, your campaign performance is going to be crippled. 

But testing alone isn’t enough. You also have to generate enough creative to fuel that testing machine. Because 19 out of 20 ads fail, you don’t just need one new piece of creative; you need 20.

And because creative fatigues so quickly, you don’t need 20 new creatives every year or so. You need 20 new creative concepts every month, or possibly even every week. 

How to Test UA Creative in Q3 2019 Quickly and Affordably

So 95% of new ads will fail to out-perform the “control” (the current top-performing ad) in an advertiser’s account. This is true regardless of the platform, the industry, or anything else. It’s true for your competitors, too. 

And so if most ads fail, the best way to find a breakout, control-beating ad is to test a lot of creative.

The secret is to do that testing affordably and quickly, because there are a lot of ways to test creative and dozens of variables to test in any given ad. If you don’t have a disciplined, methodical practice for testing, it’s easy to blow way too much time and way too much money on testing creative and still not have a winner.

 

Prototype Ads

We use “prototype” ads to get around those pitfalls. Prototype ads are run for short periods of time (bursts, really) to determine whether they will perform or not. These ads may bend or even break brand guidelines, but that doesn’t matter too much because they’ll typically get only 25,000-50,000 impressions.

We overcome most of the limitations of statistical significance with prototype ads, too, because we aren’t looking for minor performance differences – we’re looking for 10x, even 100x results. Breakout ads. So we don’t need to accrue as many conversion events as we would if we were looking for minor differences in performance. Discerning a winner that performs 25% better than the control takes much less time than discerning a winner that performs only 2-5% better.
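As a sanity check on those burst sizes, the same power math from the earlier sketch can be inverted to ask: what is the smallest lift a 25,000-50,000 impression burst can reliably detect? The 1% baseline conversion rate, 95% confidence, and 80% power are illustrative assumptions carried over from that sketch.

```python
# Minimum detectable lift for a prototype-ad burst (illustrative only).
import math
from statsmodels.stats.power import NormalIndPower

BASELINE = 0.01   # assumed baseline conversion rate per impression
solver = NormalIndPower()

for impressions in (25_000, 50_000):
    # Solve for the detectable effect size (Cohen's h), then convert it
    # back to a conversion rate via the arcsine transform h is based on.
    h = solver.solve_power(nobs1=impressions, alpha=0.05, power=0.8)
    rate = math.sin(math.asin(math.sqrt(BASELINE)) + h / 2) ** 2
    print(f"{impressions:,} impressions: detectable rate ~{rate:.2%} "
          f"(roughly a +{rate / BASELINE - 1:.0%} lift)")
```

Under these assumptions, a burst that size can only confirm lifts somewhere north of 10-15%, which is exactly the point: prototype ads are built to surface breakouts, not to split hairs.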

Of course, most of the prototype ads we test will fail. But when a prototype ad does perform, we’ll clean it up a bit to make it brand compliant, and then test it again with a larger audience.

Prototype ads work because they let us test dozens of new concepts at a time, hopefully with very few creative restrictions. 

That is one key requirement for prototype ads: Creative freedom. Advertisers need to give both internal and external creative teams enough freedom to develop the kind of bold new concepts that become breakout ads. 

 

20% Concepts / 80% Variations

Prototype ads don’t just work for brand-new concepts. They can also be used for variations. When we test variations, we take a winning ad and use it basically as a template. Then we test dozens of slight variations of the ad to see if we can squeeze better results from it, or at least get it to last a little longer before ad fatigue sets in.

80% of what we test is a variation. This minimizes the losses that happen when you test big, bold new concepts, but it still leaves room to test enough of those new concepts to keep ads fresh and to keep creative teams from getting stuck in a rut. 

concepts & variations

 

Creative Testing Best Practices

We’ve tested hundreds of thousands of ads. Based on that experience, we’ve developed a creative testing methodology that allows us to find winners faster and more affordably than is traditionally possible.

Our core methodology right now (it is always evolving) is as follows, in this order:

  • Creative Audit
  • Competitive Audit
  • Creative Strategy
  • Creative History of Winners and Losers
  • Concept Refresh
  • Winner Variation Testing
  • Asset Folders For Winning Ads

Here are the details for each of these best practices:

 

Creative Audit

We hate repeating work, and we hate repeating mistakes even more. So before we create new ads or develop a new strategy, we’ll dig deep into the prior performance of a user acquisition advertising account, with a particular focus on the creative assets. 

Doing an audit like this will help us avoid repeating the same tests and mistakes an advertiser has made before. It will also give us valuable information about what has worked, and what might work going forward. 

The end result of a creative audit is a clear, shared record of what has worked and what has failed over the past six months. We focus on videos and images, ad copy, and which concepts and variations have performed best and worst.

The goal is to identify key winning concepts and to fuel ideas for which approaches we want to test with the creative strategy. Then we’ll create a living document where winners and losers are visually documented so both creative teams and UA managers (internal or external) can learn from each other. 

Competitive Audit

Competitors’ ads are a bank vault of creative insights – if you know what to look for. The good news is that there’s a nearly endless supply of tested concepts. The bad news is 95% of your competitors’ ads fail, too. 

Facebook’s new Ads Library is a great way to see which ads your competitors have been running. But it lacks conversion data, impressions, and interaction data – all metrics essential to evaluating ads. So we also use tools like Social Ad Scout, Connect Explore, SocialPeta, AdSpy and other resources to get that information. 


Effectively evaluating competitors’ creative is a valuable skill. But it’s usually not ideal to have either a true creative or a true quant do it. Ideally, you want someone with a balance of both right and left brain thinking. Someone with a psychology background and good math skills might fit the profile. They need to be able to interpret the metrics on competitors’ ads but also be able to see the psychology behind why certain ads are working. 

A good creative evaluator will be able to see trends in both creative and data. They’ll come up with a hypothesis about why winning ads work, and then incorporate that hypothesis into a client’s creative strategy and testing protocol. 

Doing competitive audits like this can make a huge difference in long-term performance. This is a high-value activity UA managers would do well to put more time into. 

 

A good competitive audit will include:

1. An analysis of competitors’ ad concepts based on spend, conversions, engagements, and any other meaningful metrics you’re especially interested in. 

2. Documentation of how competitors have used the following elements:

  • Ad copy
  • Buttons
  • End/start cards
  • Calls to action
  • Messaging
  • Offers
  • Use of animation/motion
  • Colors
  • Backgrounds
  • Text placement
  • Characters
  • Logos and/or stickers
  • New elements
  • Old elements used in a new way

3. The final competitive analysis document will include screenshots of all these elements. It’s usually best to organize them into a spreadsheet (a minimal sketch follows this list) so you can sort the data by:

  • Competitor
  • The ads they’re running
  • What you’ve noticed about those ads
  • What test ideas you’ve gotten by reviewing their ads
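If it helps to bootstrap that spreadsheet, here is a minimal sketch of one possible layout. Every column name and the sample row are invented for illustration; adapt them to whatever your spy tools actually expose.

```python
# Minimal sketch of a competitive-analysis tracking sheet.
# Column names and the sample row are illustrative, not prescriptive.
import csv

COLUMNS = ["date", "competitor", "ad_reference", "what_we_noticed",
           "test_idea"]

with open("competitive_audit.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerow({
        "date": "2019-07-15",
        "competitor": "Gardenscapes",   # example advertiser from this post
        "ad_reference": "Facebook Ads Library permalink goes here",
        "what_we_noticed": "start-card CTA reads like a mini-tutorial",
        "test_idea": "try a tutorial-style start card on our hero video",
    })
```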

If you’re in need of a few truly “out of the box” ideas for new ads, look to other industries or niches. Find breakaway, 100x ads and then analyze them like you would a competitor’s ads. This is especially effective if you can find an industry or niche your highest value audience has a particular affinity for. 

 

Creative Strategy

This is a plan based on the creative audit and the competitive audit. Its goal is to reduce failure across the team (for both UA managers and creatives). It should be an evolving map of how to do testing and messaging. 

A creative strategy lays out what the team(s) plan to do, the timeframe they intend to do it in, and the ideas they’re going to test. It usually includes:

  • a rough budget of ad spend and testing investments
  • a rough calendar of when tests and creative will launch
  • specifics about what the creative strategy’s key goals are

A good creative strategy is a living document, in that it will probably change over time, but it provides everyone (on both internal and external teams) a blueprint to work from.

 

Creative History of Winners and Losers

This is a document of everything we’ve tested, why we tested it, and the results of each test. It’s updated weekly, and every new version is shared with our client, our creative team, and any other interested parties. 

The purpose of this document is to minimize repeated tests and mistakes and to build on what’s worked, regardless of which team is doing the testing or creative development. It’s a way to share the results of our work product and how we developed that product. 

 

Winner Variation Testing

We’ve mentioned how 80% of what we test are variations of winning ads, and how 20% are completely new concepts. 

Here’s one other way to use that framework. Once we have a winning ad, we’ll test every element it’s made of. This allows us to figure out which elements or combinations of elements are making the ad work. 

It takes quite a lot of tests and a fair amount of money to break up and test an ad like this, which is why we don’t do it for every ad – only 100x, breakout winning ads get analyzed like this. The information we learn from the analysis is vital to developing break-out ads going forward. It’s also fantastic for something called a “concept refresh.”

Concept Refresh

This is a new practice we’ve been getting excellent results with. Concept refreshes allow us to keep an ad alive almost forever – if we know which elements to keep the same and which to vary. After we’ve done the winner variation testing mentioned above, we know which elements matter most. So we can keep the anchor elements that are driving results and refresh everything else in the ad.


Asset Folders For Winning Ads

Whenever you get a winner, all the files – the videos, the Sketch files, the Illustrator files, the Photoshop files, the music files, all of it – get dumped into a folder. When you create variations, use the elements from the winning ads folder. Not the files from other variation tests. 

This may seem like a small thing, but tiny differences can have big effects. 

Don’t get sloppy with file naming conventions, either. Pick a naming convention and stick with it. That allows both internal and external teams to easily access the files they need quickly, without having to waste time looking for the right file. 
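As an example, here is one hypothetical naming convention, enforced with a quick check. Every token in the pattern is our own invention, not an industry standard; the point is simply that a machine-checkable convention keeps winner folders searchable.

```python
# Hypothetical asset naming convention:
#   <app>_<concept>_v<NN>_<ratio>_<length>s_<lang>.<ext>
# All tokens below are invented for illustration.
import re

NAME_RE = re.compile(
    r"^[a-z0-9]+"             # app name
    r"_[a-z0-9-]+"            # creative concept
    r"_v\d{2}"                # variation number
    r"_(1x1|4x5|9x16|16x9)"   # aspect ratio
    r"_\d+s"                  # video length in seconds
    r"_[a-z]{2}"              # language code
    r"\.(mp4|mov|psd|ai|sketch)$"
)

assert NAME_RE.match("matchgame_failplay_v03_9x16_15s_en.mp4")
assert not NAME_RE.match("final_FINAL_new2.mp4")  # the kind of name to avoid
```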


 

Creative Testing Best Practices Conclusion

Creative development, testing, and strategy are still best done by human beings. The algorithms at Google and Facebook may be able to test creative elements, but they still can’t create those elements. They can’t do competitive analysis, either. And they can’t plan out a coherent creative strategy. 

If you’re an acquisition manager, we recommend you focus on expanding your skills in those areas. You don’t necessarily have to become a creative, but you do need to show creatives how to become data-driven, and you need to be able to distill and interpret data for them so they can deliver better creative.

But also keep your eye out for new opportunities. Machine learning is a powerful tool, but it flunks if you ask it to solve problems it hasn’t encountered before, or to interpret data in ways it hasn’t been programmed to do.

So if, say, a machine learning algorithm was asked to optimize an ad for augmented reality, or if it was asked to optimize voice-enabled ads, it wouldn’t perform well. 

A human, however – a smart, agile user acquisition manager – might be able to take what they’ve learned and apply that knowledge to the new situation.

Humans have always been good at this. We’re great at adapting. And it’s a skill we’re going to need going forward.

For more information, read or download our most recent whitepaper, 2020 Creative Testing: Why Is The Control So Hard To Beat? Or enjoy this funny video on creative testing.

2019 July Highlight Reel

In an effort to showcase our creative more frequently, we are producing highlight reels of our best work. Our 2019 July Highlight Reel showcases some of our favorite ads we’ve produced for clients across the gaming, e-commerce, entertainment, financial services, and automotive verticals. Check it out!

Creative Drives Performance!


Facebook and Google’s new algorithms have simplified media buying, making ad creative the primary driver of performance. Our in-house Creative Studio has managed over $3 billion in ad spend and produced over 300,000 videos for the world’s largest mobile games and apps. As a result, we can deliver ad creative at scale for these channels, as our 2019 July Highlight Reel shows.

 

About Consumer Acquisition

Founded in 2013, Consumer Acquisition provides creative services, fully-managed user acquisition services, and SaaS tools for social advertisers. We understand the greatest ROI comes from creative testing. In addition, we have an unmatched ability to produce videos and images at scale. Our creative studio supports Facebook, Instagram, Google App Campaign, Snap, Pinterest, and IAB formats. And, our AdRules platform also supports Facebook and Google App Campaign.

In January 2019, we announced that we’re now offering Google App Campaigns for managed services. We also expanded our managed services offerings and tiered them for advertisers of all budget sizes. And we enhanced AdRules with new workflow automation features: AdBuilder Express and Audience Builder Express. The platform offers the industry’s lowest fees: 60 days free, then 0.7% of spend.

Google Premier Partner

In May 2019, we were awarded a Google Premier Partner Badge, recognizing Consumer Acquisition as a leading social advertising and creative solution. Consumer Acquisition is Google’s only Partner in North America with both Creative and Premier certifications. The Google Partner Program is a highly selective program designed to help customers identify the highest performing and most reliable companies for their Google App Campaign advertising and creative needs.

A company must meet several requirements in order to qualify as a Google Premier Partner. According to Google, “Achieving Partner status means that your company has demonstrated AdWords skill and expertise, met Google ad spend requirements, delivered company agency and client revenue growth, and sustained and grown its client base.” Premier Partners are held to an even higher standard: they must demonstrate expert-level AdWords knowledge and performance in order to achieve this badge.

Facebook Marketing Partner

In addition to being a Google Premier Partner, we are a Facebook Marketing Partner badged in Creative and AdTech. So we’re uniquely positioned to help you with Facebook and Google App Campaigns, creative, and UA optimization. Contact us if you’d like to discuss how we can help with Creative or User Acquisition services.

Learn more about our Creative Studio.

Facebook and Google App Campaign Best Practices for Competitive Analysis of a Competitor’s Best Ads, Using the Latest Ad Spying Tools

When we start working with a new client, we begin by evaluating the ads of their top competitors. Why? Because competitive analysis is the single most important step to generate better creative for your user acquisition ads.

It’s not an easy task. Creating a winning ad is very hard – only one in twenty new ads will usually beat a control. With a 5% success rate, you face a tremendous amount of work to uncover a fresh winning concept.

Also, we’re not looking to create just “another ad.” Ad creative is critical to the success of your campaigns, and we want to create a 10x or 100x ad that will blow the doors off all prior results. Usually, that means we’ll have to test a lot of ads, but we’ll also have to start off with some strong new concept ads, too. Competitive analysis is especially good for developing these types of bold new concepts – completely new approaches that haven’t been tried before.

Because high-performance ads are so challenging to create, it makes sense to use every trick you’ve got. And the best trick is competitive analysis.

Your competitors have already poured a ton of resources, ad spend, and testing time into creating their own high-performance ads, and they’re failing 95% of the time, too. Why not “borrow” from their best ads and customize them to create fresh ideas for your campaigns?

Once you know how to find and identify your competitors’ high-performing ads, you’ll have an endless supply of tested concepts ready to either make new ads from or use in your ad tests.

It matters who you choose to run the competitive analysis

We’ve found that not everyone can do the analysis well. User acquisition managers typically aren’t good enough at evaluating creative concepts, and the creative team normally isn’t good enough with analytics to assess that part of ad performance.

The ideal person for this work will have both quantitative and creative skills, or you can create a team of two or more people so together they can have the optimal skill set.

Once you’ve got your creative analyst (or your creative analyst team), the objective will be to reject as few ideas as possible so you can grow your creative library – be data-driven, not opinion-driven.

To do this, you’ll need a few tools. The first and best two tools are free and easily accessible. They’re even on Facebook.

Use Facebook’s new ads library tool, and other competitive analysis tools

Facebook’s new ads library tool just launched, and it’s an aggressive move towards transparency for the ad platform, sparked by ongoing public calls for exactly that.

The tool will let you see: 

  • Every ad that’s active now or that has ever been active since May of 2018.
  • How much a page has spent on Facebook ads.
  • Which pages’ ads reference a particular keyword.

You can also run daily, weekly, monthly and quarterly reports for different searches, and you don’t even need a Facebook account to access the information.

Clearly, this is a goldmine of competitive information. You want to be reviewing your competitors’ ads for trends like:

  • Messaging
  • Offers
  • Use of motion
  • Calls to action
  • Colors and backgrounds
  • Text placement
  • Characters
  • Logos and/or stickers
  • Anything that looks like a new element in an ad, or an old element used in a new way

The Facebook ads library tool may become even more useful soon. That’s because Facebook developer accounts can now have access to the ads library API. This, of course, means developers can build tools to sift through ads and data, and means we might be able to spot trends faster, and build better ads and better tests.

One thing Facebook is holding back on is data. We still can’t see performance data about the particular ads, just the ads themselves. But it’s fairly safe to assume that if your competitors are smart, and an ad has been running for more than a couple of days, it’s doing well.

Facebook ads library leapfrogs what used to be one of our favorite competitive analysis tricks: To go to the Facebook page of a competitor and click the “Info and Ads” link in the left-hand column. You can still do that, but the new ads library is vastly more powerful.

But what about that performance data? You may still be able to get it from third-party tools. We recommend all of these Facebook ad spying tools, and we use them ourselves:

  • Social Ad Scout
  • PowerAdSpy
  • Connect Explore
  • SocialPeta
  • AdSpy

 

Create a Table for Competitive Analysis

As you review competitors’ ads, make a log of what you find. It’s probably best to organize your research first by competitor, then by the ads they’re running, then by what you notice, and then by what you specifically want to create or test. Create a table for each competitor so you can add rows for dates.

The table below could be a starting point. Adapt as necessary, and consider adding another column for screenshots or even captured videos of ads that especially interest you.

competitive analysis

 

Keeping a log like this takes time, but you don’t have to have a table for every competitor. And if you set aside even one hour a week to do a review like this, you’ll have all the new ideas you can handle. 

If you’ve got the time, consider following the ads of a few non-competitive advertisers whose work you admire. Sometimes really great ideas come from outside your niche, though don’t expect miracles: outside-the-box ad concepts can work really well, but they often flop pretty hard, too. Use your testing methodology to minimize how much exposure you give any ad until it has proven itself.

Conclusion

Why miss out on a ton of actionable information – proven ads that are working for other advertisers in your niche? Doing even basic competitive analysis can be highly profitable, especially now that Facebook’s ads library makes it so much easier.

For more information on how to conduct an effective competitive analysis of ad creative, contact us!

 

Read Our Whitepapers

Creative & UA Best Practices For Facebook, Google, TikTok & Snap ads.
