This can either speed up or slow down the learning process. We need access to an existing account to perform a creative audit and media buying audit. This way, we can see what has worked and what hasn’t worked historically, and put together a creative and media strategy tied to your media budget.
As we work through the creative and media buying strategy doc, we conduct calls to discuss what worked and what didn't, and make adjustments. The strategy docs are constantly updated based on performance, network changes, and other external factors.
We will need access to the primary account for a creative and media buying audit. Without access, we can't see what has worked or failed historically, and we are far more prone to failure. If we start with a new account, we'll need to train Facebook's or Google's algorithm on AEO (App Event Optimization) or VO (Value Optimization) bidding, wasting both time and money retraining the algorithm versus gaining access to your already optimized account(s).
Regardless of whether we get access to a new or existing account, we'll conduct a competitive analysis.
Assuming a Facebook or Google account is already running, we start by restructuring your account according to best practices. We launch your best-performing ads to establish a baseline. We run ad copy tests first; while those tests are running, we generate creative briefs internally. We then send those briefs to our editors, review the resulting creative internally, and once it's ready, send it to you for feedback.
We will launch our initial campaigns/ad sets using your best-performing videos, images, and audiences to establish a baseline. Most likely we will also set up our first ad copy tests.
While that is happening, we will create initial videos and provide them to you for review and feedback. Once approved, we'll run A/B tests to establish their performance. The above process typically takes 5-7 days.
The number of creatives generated is tied to your available budget, your tolerance for non-converting spend, and the size of your audiences.
Based on metrics, we iterate on designs and modify strategy.
Machine learning has simplified the optimization algorithms behind Facebook and Google App Campaigns. With automated bidding, creative has become the primary driver of performance, but…
Only 5% of creative will outperform the best ad in your portfolio. That means only ~1 out of 20 concepts performs well.
Most clients are unable to spend enough money in a given month to test 20 new videos and absorb all of that non-converting spend. As such, creative testing must be spread out over more than 30 days, which is one of the primary reasons 30-day tests almost always fail.
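To make that math concrete, here is a back-of-envelope sketch. The ~5% win rate comes from the numbers above; the per-creative test spend and the monthly test budget are illustrative assumptions, not client figures.

```python
# Back-of-envelope creative testing math.
# WIN_RATE is from the text; the budget and per-test spend are
# illustrative assumptions only.

WIN_RATE = 0.05                # ~1 in 20 new concepts beats the portfolio's best ad
PER_CREATIVE_TEST = 15_000     # assumed spend needed for a meaningful read per concept
MONTHLY_TEST_BUDGET = 100_000  # assumed slice of media budget reserved for testing

# Expected concepts tested before finding one winner
# (geometric distribution: mean = 1 / p)
expected_tests_per_winner = 1 / WIN_RATE

# How many concepts the monthly budget can actually test
tests_per_month = int(MONTHLY_TEST_BUDGET // PER_CREATIVE_TEST)

# Months needed, on average, to surface a single winner
months_to_winner = expected_tests_per_winner / tests_per_month

print(f"Expected tests per winner: {expected_tests_per_winner:.0f}")   # 20
print(f"Concepts testable per month: {tests_per_month}")               # 6
print(f"Expected months to find a winner: {months_to_winner:.1f}")     # 3.3
```

Under these assumed numbers, surfacing a single winner takes roughly a quarter, not 30 days, which is the point of spreading testing out.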
Our objective with creative testing is to move quickly and limit non-converting spend by prototyping ads.
It would be very helpful to let us work quickly to uncover winning concepts, then enhance them with colors, fonts, and materials to reach brand compliance once a concept is proven.
In fact, it would be more financially efficient if you could approve imperfect ads (perhaps with incorrect fonts or colors), allow us to launch them quickly, gather learnings, and adjust the next set of ads. This allows us to explore more creative concepts quickly, uncover winners, and ultimately deliver better results.
To illustrate the scope of the creative testing challenge, here is a gaming company that spends ~$15MM on Facebook.
As you can see, the vast majority of ads bomb quickly. That is why 30-day tests fail so often.
How many creatives can be produced and tested in 30 days? This question is nearly impossible to answer in the abstract. The number of videos we produce is governed by the media test budget we have to spend and your appetite for financial risk and non-converting spend.
We've found that, on average, new creative has a 95% failure rate. That means only 1 out of 20 videos becomes a winner, which results in a lot of non-converting spend during media testing.
For example, if we guaranteed 10 videos per month, that number may be too high for a budget of, say, $200K/month, as we may not be able to run them all while hitting your ROAS targets.
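A rough feasibility check of that example: the $200K budget and 10-video guarantee come from the scenario above, while the fraction of spend that can safely go to unproven ads and the per-video test cost are illustrative assumptions.

```python
# Feasibility check for a guaranteed monthly creative count.
# MONTHLY_BUDGET and GUARANTEED_VIDEOS come from the example in the
# text; TEST_FRACTION and PER_VIDEO_TEST are assumptions.

MONTHLY_BUDGET = 200_000
GUARANTEED_VIDEOS = 10
TEST_FRACTION = 0.20     # assume at most 20% of spend can go to unproven ads
                         # without dragging blended ROAS below target
PER_VIDEO_TEST = 10_000  # assumed minimum spend for a meaningful read per video

test_budget = MONTHLY_BUDGET * TEST_FRACTION           # portion available for testing
videos_supported = int(test_budget // PER_VIDEO_TEST)  # videos that budget covers

feasible = videos_supported >= GUARANTEED_VIDEOS

print(f"Test budget: ${test_budget:,.0f}")                 # $40,000
print(f"Videos the budget supports: {videos_supported}")   # 4
print(f"10-video guarantee feasible: {feasible}")          # False
```

Under these assumed numbers, the budget supports only 4 test videos a month, which is why a fixed 10-video guarantee can conflict with ROAS targets.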
Just as important, if we produced a lot of creative quickly, we'd be making assumptions about what works instead of using data to guide our process. We could end up with poor performers we would never have designed had we had the data before creative ideation began.
Most importantly, we find a data-driven, quantitative creative testing process very productive. We gather data to understand what is and isn't working, and use it to guide our creative process. Ultimately that saves your media dollars and our time (win-win).