Mobile App Creative Optimization For 2021
- by Brian Bowman | November 4, 2020
- Facebook Advertising
What is the best way to guarantee success in mobile app advertising? It is not your brand, and it is not your budget. It is great creative testing and optimization. Over the last seven years, we have developed a world-class creative optimization method that’s given a competitive advantage to thousands of advertisers. It is more powerful than machine learning or branding. And ultimately, more valuable than simply spending more ad budget.
Why? Because creative and creative optimization is the last advantage for any mobile app advertiser. Over the last few years, Google and Facebook’s increasing reliance on automation has taken away most of the competitive advantages of adtech tools and the granular campaign management levers used by humans. “Creative is the only major lever remaining to influence performance for mobile app advertisers,” as Dustin Engel explains in his article, The Faster, Better, Cheaper Mandate in Extraordinary Times.
But not just any creative can cut it. Advertisers need high-performance creative that performs as well as or better than their current winning ad. And since only one in ten or twenty ads is good enough to beat a winning creative, advertisers need a lot of new creative concepts – not variations – and a cost-effective way to test and optimize them.
Creative Optimization: Why Efficiency is Essential
So efficient, accurate, scalable creative testing is the single best competitive advantage any advertiser has.
But it is harder to do than you would think.
Take efficiency: Anyone who has ever run a simple split test knows there is a downside to running tests. Ad variations that do not convert as well as your control cost money to test, and they lose money. Sometimes a lot of money.
If you test a lot of ads, you can cumulatively lose a big chunk of budget simply by testing ads that do not perform as well as your control. We recommend that advertisers allocate 10-15% of their monthly budget to A/B testing and assume that 85-95% of that test budget will produce a ROAS of $0.00 – zip – nada – nothing!
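As a back-of-the-envelope illustration, here is what that recommendation implies in dollars. Only the 10-15% test allocation and the 85-95% zero-ROAS failure rate come from the recommendation above; the $100,000 monthly budget and the midpoint defaults are illustrative assumptions:

```python
# Back-of-the-envelope estimate of expected zero-ROAS spend from creative testing.
# The 12.5% test share and 90% failure rate are midpoints of the article's
# recommended 10-15% and 85-95% ranges; the budget figure is an assumption.

def expected_test_loss(monthly_budget: float,
                       test_share: float = 0.125,    # midpoint of 10-15%
                       failure_rate: float = 0.90):  # midpoint of 85-95%
    """Return (test budget, expected spend on ads that produce $0.00 ROAS)."""
    test_budget = monthly_budget * test_share
    return test_budget, test_budget * failure_rate

budget, loss = expected_test_loss(100_000)
print(f"Test budget: ${budget:,.0f}, expected zero-ROAS spend: ${loss:,.0f}")
# Test budget: $12,500, expected zero-ROAS spend: $11,250
```

In other words, on an illustrative $100,000 monthly budget, roughly $11,000 of test spend is expected to return nothing – which is exactly why the efficiency of the testing process matters so much.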
This is why we recommend you pay close attention to competitive trends and player profiles. Paying attention to competitors can raise your success rate from roughly 5% to 17%. And when done properly, it will provide you an endless supply of tested concepts to try. This approach allows us to find the big new ideas that can lead to huge improvements, but to find them in a way that minimizes failure. While Pablo Picasso said "a good artist will borrow but a great artist will steal," you can't rely on simply copying competitive concepts. You need to look broadly across the Facebook / Google / TikTok ecosystem to understand which creative trends are gaining momentum, then filter those concepts through user profiles/motivations and a title's universe of assets and creative restrictions to come up with original ideas.
But that is only half the battle.
Not only do underperforming ads lose money, but just testing itself requires a certain amount of ad spend. The example below shows how getting to statistical significance can cost $20,000 for each version of creative tested. But our shortcut method, IPM creative testing, can find a winner for 1% of the cost that statistical significance would require.
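The article does not publish the math behind its cost figures, but the standard two-proportion sample-size formula shows why reaching statistical significance gets expensive at typical mobile-app conversion rates. The 1.0% and 1.2% install rates below are illustrative assumptions, not figures from the article:

```python
import math

# Rough per-variant sample size for a two-proportion A/B test at roughly
# 95% confidence (z = 1.96) and 80% power (z = 0.84), using the classic
# normal-approximation formula. Input rates are illustrative assumptions.

def sample_size_per_arm(p_control: float, p_variant: float,
                        z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Users needed per arm to detect p_variant vs p_control."""
    p_bar = (p_control + p_variant) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_control * (1 - p_control)
                                      + p_variant * (1 - p_variant))) ** 2
    return math.ceil(numerator / (p_variant - p_control) ** 2)

# e.g. detecting a lift from a 1.0% to a 1.2% install rate
print(sample_size_per_arm(0.010, 0.012))  # tens of thousands of users per arm
```

At low base rates and small lifts, each variant needs tens of thousands of users, and the required sample grows with the square of the shrinking effect size – multiply that by dozens of concepts per month and the cost of full significance testing escalates quickly.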
How do we do this? Basically, our internal creative testing methodology is designed to look for big wins early on. If the tests are managed well, and we can control when the algorithm tries to play favorites, we can cut the time it takes to find a new winner. This means we save money by not running ad spend through underperforming ads, but more importantly, it means we can test more creative and test it faster.
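The actual internal methodology is not published, but the "look for big wins early" idea can be sketched as a simple early-stopping rule on IPM (installs per mille, i.e. installs per thousand impressions). Everything below – the function names, the 5,000-impression minimum, and the margin parameter – is an illustrative assumption:

```python
# Illustrative sketch of IPM-based early creative testing (not the article's
# actual proprietary method): stop spending on any test ad whose IPM trails
# the control once it has accumulated enough impressions to judge.

def ipm(installs: int, impressions: int) -> float:
    """Installs per thousand impressions."""
    return installs / impressions * 1000

def keep_testing(ad_installs: int, ad_impressions: int,
                 control_ipm: float,
                 min_impressions: int = 5_000) -> bool:
    """True while the ad is too early to judge, or is beating the control's IPM."""
    if ad_impressions < min_impressions:
        return True  # too early to call; keep spending
    return ipm(ad_installs, ad_impressions) >= control_ipm

print(keep_testing(40, 4_000, control_ipm=12.0))  # True: below impression minimum
print(keep_testing(40, 6_000, control_ipm=12.0))  # False: 6.67 IPM < 12.0, cut it
print(keep_testing(90, 6_000, control_ipm=12.0))  # True: 15.0 IPM >= 12.0, promote
```

The design trade-off is deliberate: a rule like this will occasionally kill a slow-starting winner, but it frees budget to test far more concepts, which is the point of the approach described above.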
This means we can have fresh, high-performance creative always running, with minimal dips in performance between one top-performing ad and the next. We can sidestep most of the uneven performance triggered by creative fatigue.
These are some of the tactics that allow us to test more creative faster. They’ve led to the bulk of performance improvements in our clients’ accounts.
Basically, we know creative testing has the single highest return on investment of any activity in app businesses. And therefore, we tell every advertiser to do more testing. No matter who you are, or how much you are testing, do more.
Betatyping: Facebook’s New Framework for Creative Optimization
So, given how powerful creative optimization can be, we were eager to see Facebook reveal a new update of their own creative testing framework. They call it “betatyping.”
“Betatyping is a way to intentionally experiment, uncover fertile creative territories and drive success. Looked at from a different angle, Betatyping is a creative and measurement framework to equip advertisers to answer their most pressing business questions, the building blocks of a campaign, by directly tapping into the pulse of their audiences on Facebook.”
Facebook breaks its betatyping framework down into four elements: Ask, Make, Learn, and Adapt.
Here is how they describe each element:
- Ask: Craft hypotheses based on what you are trying to learn and the outcome measures that will determine success.
- Make: Design experiments and creative assets based on your hypothesis and what you are trying to learn.
- Learn: Analyze results and insights from the experiment based on primary KPIs and secondary diagnostics.
- Adapt: Strategically and creatively determine how the learnings will be implemented.
So, does betatyping work? Yes. Facebook draws on a case study from McDonald’s to show how powerful betatyping can be.
Betatyping in Action: McDonald’s Sweden Case Study
McDonald’s Sweden wanted more app downloads. They hypothesized “that app installs would be higher for creative featuring relatable, real-life moment-based deals versus their business-as-usual straightforward deal offers.”
So, they ran an experiment and got some impressive results: an 82% cost reduction per app install compared to previous campaigns.
But that is only the first step. Per Facebook's new testing approach, McDonald's Sweden went on to learn from the results of this experiment and to adapt their creative development strategy going forward so it reflected their new "relatable, real-life moment-based deals" creative strategy.
They have been loving it ever since. Betatyping, that is. This is one of the core aspects of this new framework: It is ongoing. As Facebook explains, "It's that continual cycle of hypothesizing, testing, learning and adapting that uncovers fertile creative territories and drives long-term success."
We could not agree more. Mobile app creative optimization cannot be a “one and done” exercise. It must be built into the creative development process from the ground up. But with betatyping, this idea of ongoing optimization is central. It is a creative testing framework that “leads to accumulating valuable insights about a brand and business, beyond what works in one single campaign.”
That last part is key. Betatyping is a far more strategic, and even "meta," approach to creative testing than standard A/B split tests. What Facebook is describing here does not come from just running split tests that compare the performance of a hundred different creative elements, like colors, hero shots, calls to action, and the like.
What Facebook is talking about here is on a different level. They are describing a test of an entire creative approach, not just optimizing one isolated ad.
That said, this framework could be used to optimize individual creative elements. A company could posit a hypothesis like "bold colors will outperform muted colors" and test it with this framework, but they would be missing the larger opportunity here.
This is both the power of this framework and a reason to use it carefully. The whole framework hinges on something that many advertisers have not nailed yet.
The ultimate success of betatyping comes down to posing the right hypothesis and to choosing the right KPI to measure it with.
Betatyping Still Needs an Underlying Creative Strategy
Without understanding the goals of your advertising, and your best plan for how to achieve those goals, the betatyping model might not help as much as you would hope.
Therefore, one of the first things we do when we work with a new account is to see:
- what they have done in the past (creative audit)
- what their competitors are doing (competitive audit)
This sort of historical analysis and competitive research gives us a framework for a data-driven creative strategy. It also gives us a deep enough understanding of their account so we can pose hypotheses worth testing.
For example, another creative strategy best practice we like to use is the concept of “player profiles.” “Player profiles” are basically a way to segment gaming audiences not so much by demographics, but by what motivates different audiences to play a game.
This sort of creative strategy analysis would be an ideal complement to Facebook’s betatyping approach. You could use each player profile as the hypothesis for the test. Then run your experiments to see if the data from creative performance proved you had defined each player correctly.
If the creative experiments showed you had defined those player profiles correctly, you could then confidently adapt your current creative strategy to align with those player motivations and the six "gaming emotional hooks."
Left brain and right brain thinking for creative optimization
There is one other reason why we like this new betatyping framework so much. It marries data and creative so well. This is exactly the left-brain/right-brain mindset we’ve been advocating for user acquisition managers to embrace.
So welcome to yet more evidence of the blend of data and creativity in current user acquisition advertising – and the proof that this is where UA is heading. With this new betatyping framework, we have been given yet another tool to shift into User Acquisition 2.0.
Consumer Acquisition Special Offers
View over 1.5 million video ads from competitive apps and see which creatives drive performance.
See how your mobile app KPIs perform vs. industry benchmarks. Uncover your performance vs. competitors and see KPIs like CTR, CPM, CPC, CPI, IPM, Conv%, country breakdowns, and much more. https://www.consumeracquisition.com/mobile-app-industry-benchmarks/