The mobile game industry is fast-paced and aggressive - someone is always waiting to replicate your game or your creatives. Maintaining efficiency and speed while testing keeps you ahead of the game as you discover new opportunities for optimization. Using the right technology and approach to testing can help you improve your creative strategy and attract the best kind of users across each channel.
Here, John Wright, Head of Customer Success at ironSource Luna, shares his top four testing tips to boost your creative performance as efficiently as possible.
1. Automate creative production for more chances of success
Having enough creatives for testing is essential to keep improving performance and finding opportunities for optimization. To make sure you have a big enough bank of ads to keep up the pace of testing, your process for creative production needs to be efficient. Traditionally, this process looks like a UA manager requesting ads from the creative team. After they’re designed, the UA manager manually uploads the creatives to each channel, lets them run to gather data, analyzes performance, then adjusts their brief for new ones to be built - and then the process repeats. This limits production speed because you’re testing 3 creatives every 2-3 days, then manually pausing, uploading, and reviewing them. In the end, it can take you three weeks to test just 20 videos on an SDK network.
To design more creatives faster, you can automate the process. Technology like Facebook's dynamic creative optimization automatically generates versions of your video creatives and optimizes delivery toward whichever performs best. So instead of three weeks to test 20 videos, automation lets you test upwards of 100 creatives per week.
Luna, meanwhile, has a dynamic playable optimization feature that applies this automation technology to playable ads. Your creative team can design one playable file, upload it to Luna, and let the platform do the work of testing different versions and optimizing toward whichever has the highest IPM (installs per mille, or installs per 1,000 impressions).
For example, GameJam increased their creative output by 4x in 48 hours using Luna Playable. With the tool, they ran multivariate tests on 73 playable concepts and 1,200 variations quickly, easily, and entirely in-house. This led to a 15% increase in IPM and drove over 2 million installs for their game.
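To make the selection mechanics concrete, here's a minimal sketch of how you might rank playable variants by IPM yourself - the variant names and numbers are hypothetical, and this is not Luna's actual implementation or API.

```python
# Minimal sketch of IPM-based variant ranking (hypothetical data, not Luna's API).
# IPM = installs per mille, i.e. installs per 1,000 impressions.
from dataclasses import dataclass

@dataclass
class Variant:
    name: str
    impressions: int
    installs: int

    @property
    def ipm(self) -> float:
        # Guard against dividing by zero for variants with no delivery yet
        return 1000 * self.installs / self.impressions if self.impressions else 0.0

variants = [
    Variant("playable_variant_a", impressions=120_000, installs=540),
    Variant("playable_variant_b", impressions=118_000, installs=690),
    Variant("playable_variant_c", impressions=95_000, installs=610),
]

best = max(variants, key=lambda v: v.ipm)
print(f"Top variant by IPM: {best.name} ({best.ipm:.1f} installs per 1,000 impressions)")
```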
2. Test macro concept changes and micro iterations
Starting at the macro level before diving into the micro helps you squeeze the most juice from your creatives and identify more opportunities for improvement. Taking this approach is an excellent starting point for homing in on what you should be iterating on, instead of trying to test everything at once.
As you begin testing, zoom out first and try many different concepts (that’s the macro) - this often results in a big impact on performance. For example, Codigames tested two concepts for their playable ad tutorial. One featured a barbershop environment and another included a sleeping character that showcased obvious emotions. The version featuring the sleeping character had a 60% engagement rate, compared to 54% for the other concept - it went on to achieve over 100 million impressions.
Once you identify the top-performing concept, you can start iterating on the details and refining your creative (that’s the micro). Testing features like the background, color, and length can help optimize performance. Since these are often small changes, it’s important to test many variations quickly to identify what elements move the needle and to shorten the learning curve.
Tastypill used Luna Elements to increase their concept testing capabilities - within one month, they produced at least 20 Elements-built playables each day. Applying the learnings from each iteration to the next let them keep improving performance and identify a top-performing playable that drove 33M impressions and had an ER of 65%.
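When the engagement-rate gap between two concepts is only a few percentage points, it's worth sanity-checking whether the lift is real before crowning a winner. Below is a rough sketch using a two-proportion z-test - the 60% vs. 54% rates echo the Codigames example above, but the impression counts and the test itself are illustrative assumptions, not part of any Luna workflow.

```python
# Quick check of whether an engagement-rate lift between two concepts is
# statistically meaningful (two-proportion z-test). Illustrative numbers only.
from math import sqrt
from statistics import NormalDist

def er_lift_significant(eng_a, imp_a, eng_b, imp_b, alpha=0.05):
    """Return (z, p_value, significant) comparing concept A vs. concept B engagement rates."""
    p_a, p_b = eng_a / imp_a, eng_b / imp_b
    p_pool = (eng_a + eng_b) / (imp_a + imp_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imp_a + 1 / imp_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided
    return z, p_value, p_value < alpha

# e.g. 60% ER vs. 54% ER over 5,000 impressions each (hypothetical sample sizes)
z, p, sig = er_lift_significant(3_000, 5_000, 2_700, 5_000)
print(f"z={z:.2f}, p={p:.4f}, significant={sig}")
```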
3. Visualize and compare data
Always use creative KPIs to drive decision-making. This can seem like a no-brainer, but I've seen many cases where a studio spent a huge budget on creatives that seemed to attract a lot of users. Many of those users churned quickly after installing the app, ROAS ended up very low, and in the end the studio paused the ad. Visualizing and comparing metrics from the entire funnel (like CTR vs. ROAS) from the beginning would have shown that these creatives drove high click rates but low retention - letting you identify underperforming ads before wasting that spend.
Measure user quality by looking at lower-funnel metrics like:
- Retention rate
- ARPU (average revenue per user)
- ARPDAU (average revenue per daily active user)
- ROAS (return on ad spend)
And look at top-of-funnel metrics to analyze the effectiveness of the ad, like:
- CTR (click-through rate)
- Time to engage (TTE)
- Engagement rate (ER%)
- Number of engagements (#E)
- Experiences completed
- CVR (conversion rate)
- IPM (installs per mille)
Compare metrics from both the upper and lower parts of the funnel to get a full view of creative performance and spot opportunities for optimization.
These metrics higher up the funnel confirm the performance of the ad itself so you can identify the themes and features that work best. Then you can look at the data further down the funnel to see how tweaking each of these elements affects user quality. Analyzing creative testing performance by looking at KPIs across the entire funnel ensures you’re taking a data-backed approach to optimization - and doing so with as much efficiency and as little wasted ad spend as possible.
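As an illustration of what that full-funnel comparison might look like in practice, here's a small sketch that flags creatives with strong top-of-funnel numbers but weak down-funnel returns. The creative names, metrics, and thresholds are all hypothetical, and real cut-offs will vary by genre and monetization model.

```python
# Sketch of a full-funnel comparison that flags creatives which look good up top
# (high CTR/IPM) but attract low-quality users (weak D7 ROAS). Hypothetical data.
creatives = [
    {"name": "puzzle_fail_v3", "ctr": 0.042, "ipm": 18.0, "d1_retention": 0.31, "d7_roas": 0.12},
    {"name": "timer_rush_v1",  "ctr": 0.025, "ipm": 11.5, "d1_retention": 0.44, "d7_roas": 0.27},
    {"name": "story_intro_v2", "ctr": 0.031, "ipm": 14.2, "d1_retention": 0.40, "d7_roas": 0.22},
]

CTR_FLOOR, ROAS_FLOOR = 0.03, 0.20  # example thresholds, tune per title

for c in creatives:
    if c["ctr"] >= CTR_FLOOR and c["d7_roas"] < ROAS_FLOOR:
        verdict = "clicky but low-quality users: pause or rework"
    elif c["d7_roas"] >= ROAS_FLOOR:
        verdict = "healthy across the funnel: candidate for more spend"
    else:
        verdict = "weak at the top: iterate on the hook"
    print(f"{c['name']:15s} CTR={c['ctr']:.1%} IPM={c['ipm']:.1f} "
          f"D7 ROAS={c['d7_roas']:.0%} -> {verdict}")
```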
Using Luna Control, Ludia aggregated their UA campaign data from all channels into one place so they could easily pull their creative insights and compare campaigns to spot opportunities for improvement. Having all of their metrics easily accessible helped them identify key optimization opportunities - and risks they should avoid.
“We had a dashboard with data that we could trust, aggregating across multiple channels and sources to really get to the root of what was working and what wasn’t.”
- Taylor Lundgren, User Acquisition Director at Ludia
4. Allocate your budget wisely
How and where you spend your budget is incredibly important. For starters, you want to allocate enough budget when testing - the more you spend, the faster you’ll be able to understand what’s performing (or not).
Once the ads are live, you can quickly re-allocate spend for these creatives across your marketing channels. Get a comprehensive view of your creatives across all (yes, all) channels so you can clearly compare which ones are worth the spend and which aren’t. Then you can adjust spend accordingly and get the most out of your budget.
For example, cocone used Luna Control to pull performance data from multiple UA channels, analyze metrics based on specific features, and get actionable insights - all in one platform. As a result, they reduced the time they spent analyzing creative data by 50%. With a clearer and more comprehensive view of performance, they re-allocated their budget, which helped them beat their CPI goal by 20%.
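As a simplified illustration of the re-allocation logic, the sketch below splits a fixed budget across channels in proportion to each channel's observed ROAS and drops channels below a floor. The channel names, numbers, and the proportional rule itself are assumptions for illustration, not a prescribed formula or any platform's API.

```python
# Toy example of re-allocating a fixed budget across channels in proportion to
# each channel's observed ROAS. Channels below a minimum ROAS are paused.
def reallocate(budget, channel_roas, roas_floor=0.1):
    """Split `budget` across channels proportionally to ROAS, skipping channels under the floor."""
    eligible = {ch: r for ch, r in channel_roas.items() if r >= roas_floor}
    total = sum(eligible.values())
    return {ch: round(budget * r / total, 2) for ch, r in eligible.items()}

channel_roas = {"sdk_network_a": 0.35, "facebook": 0.22, "sdk_network_b": 0.08}
print(reallocate(10_000, channel_roas))
# -> {'sdk_network_a': 6140.35, 'facebook': 3859.65}; sdk_network_b gets paused
```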
Succeed across channels
Each UA channel performs differently because each has its own characteristics and audience. End cards, for example, tend to perform well on SDK networks since they automatically appear at the end of a creative. But they usually don't have high conversion rates on Facebook because users need to click through to see them - that extra step adds friction to the funnel.
No matter the channel, though, testing is always the best approach to optimizing your creatives. Keep the tips above in mind and adapt them for each channel to boost creative performance more quickly and easily.