This is the third episode of our series on ad creative optimization. Check out the previous episodes here and here. This time round, our host, Melissa Zeloof, welcomes Noa Eckstein, Director of Creative Performance at Playworks. They discuss all things in-ad data, including Playworks' process for optimizing ad creatives for some of the industry's biggest names.
Listen to the full episode here, or read the transcript below.
What the heck is in-ad data?
1:45 Noa - “About three and a half years ago, before Playworks started its activity, the first playable mobile game ads emerged. It was a game changer, and a life changer for everyone who loves data. Before that, we had impressions, clicks, installs, and post-install events. But once the playable ad came to life, we had a whole new data world. Now, we have the entire user funnel within the ad.”
Breaking the ice on ICEBREAKER
2:59 Noa - “I created the ICEBREAKER framework because I wanted to convey, in a straightforward and very simple way, the most important principles you need to take into account when optimizing your creatives. A big part of what I do is making data more approachable to designers, developers, and business people who aren’t really used to working with data and trying to get actionable insights from it.
To make data-driven in-ad decisions, know your KPI. You always need to know what you’re optimizing towards, because it’s a huge world out there and you have many options.
Do proper research and always know what you tested before: what won, what lost, what your competitors are doing, and what trends are emerging in the industry, because when something is really successful, everyone learns from it and uses it. Basically, do your research and maintain a regular routine of analyzing, optimizing, and analyzing again. Don’t let ad fatigue kick in.”
5:06 Noa - “To put on my operations and production hat: playable ads are more challenging. Unlike end cards and videos, which can each be created end-to-end by a single designer, playable ads require a team. We use game designers, a designer, a developer, and then we need QA. We need to be more in sync, and it takes longer. From a production and operational point of view, they are the most challenging. Even performance-wise, they are more challenging than other creatives, because you need to deliver a full experience and a feeling. A huge team of people creates a game and tries to convey an experience and a feeling, and then we need to cut it down to 30 seconds max of gameplay, convey that same feeling, and get the user to install the game.
We take the full gameplay, separate it into the different features and subgenres within it, and test each one separately to find what works best for each game.”
Analyzing step by step
7:50 Noa - “We always need to understand what the client’s KPI is. We always start by analyzing the creative that we want to improve or create a video for.
Then we try to understand what problem we’re trying to solve. It’s basically a kind of puzzle. You have a problem: for example, the creative is not scaling in rewarded video (RV) or interstitial, or we’re trying to crack a specific genre, or maybe everything is great but the client wants better user quality, and that’s something we need to solve.
We first focus on the main problem, and then we dive into the data. First, we always analyze the CTR, CVR, and IPM, just to establish a benchmark. Then, we dive into the in-ad data and analyze the context.
After we take the data and analyze it, we compare it to the benchmarks … After that, we find the places where we can improve … and then we try to understand the user journey and create actionable insights and action items for how the creative can improve.
For example, in a puzzle game, we gave the user a choice between puzzle types such as crosswords and word puzzles. Most users chose crosswords, so we took that and created a whole creative around it, because we knew it was the user preference.”
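The benchmark step Noa describes, pulling CTR, CVR, and IPM before diving into the in-ad data, can be sketched in a few lines. This is a minimal sketch assuming the common ad-tech definitions (CTR = clicks / impressions, CVR = installs / clicks, IPM = installs per 1,000 impressions); exact formulas can vary by network, and the figures below are hypothetical.

```python
def funnel_metrics(impressions: int, clicks: int, installs: int) -> dict:
    """Benchmark metrics for a creative, using the common definitions:
    CTR = clicks / impressions, CVR = installs / clicks,
    IPM = installs per 1,000 impressions (definitions assumed, not
    taken from any specific network's docs)."""
    return {
        "CTR": clicks / impressions,
        "CVR": installs / clicks,
        "IPM": installs * 1000 / impressions,
    }

# Hypothetical creative: 100,000 impressions, 2,500 clicks, 250 installs
metrics = funnel_metrics(100_000, 2_500, 250)
print(metrics)  # CTR 2.5%, CVR 10%, IPM 2.5
```

Comparing numbers like these against your own historical benchmarks, per genre and per placement, is what makes the subsequent in-ad analysis meaningful.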
Mistakes are going to happen
12:53 Noa - “I see a lot of shooting in the dark. I see a lot of cases where our clients, specifically app developers, test the same stuff over and over again even though it’s not working, and you can see that in the analysis. People also don’t keep a routine and will turn off creatives after one day, which is really fast; sometimes they don’t even have enough data, because they just have this new creative and want to test it. I also see people limiting creativity and testing only one gameplay, when most of the time there are several features in a game that can be deconstructed to test different things. Basically, be more creative in your iterations. I think there’s also arrogance: finding a winning creative and not trying to find additional ones. We know there’s ad fatigue, and we know you need to iterate your creatives. It’s super important that even if you find something that is working really well for you, you try something else. If you keep trying, you can find the next big thing or the next game changer.”
Frequency of analysis
15:46 Noa - “I don’t think you need to look at it all the time. Anyone who does proper A/B testing knows that you shouldn’t look at the data until at least a week after it went live, because before that you can see different results every time. The more traffic you get, the more frequent your iterations can be and the faster you can analyze the data. I think the rule of thumb for the average app is once a week.”
Users are changing
18:00 Noa - “I see that creatives are getting more and more aggressive and CTRs are getting higher and higher. As an industry leader, we have strict rules and guidelines to limit aggressiveness. Every draft creative we receive is tested by a really great QA team, and everything is tested on several devices. We make sure there are no creatives with bugs or issues running on our network. And because we do that process, we can also make sure that no creative aggressive enough to damage the user experience will run on our network. I think we need to understand that IPM is built from CTR and CVR. It’s basically a synergy between the two: finding the balance and increasing the CTR without lowering the CVR to the extent that the IPM goes down as well. I think if we can achieve the same IPM with a lower CTR and a higher CVR, it will be beneficial to all parties involved.”
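Noa’s closing point, that IPM is built from CTR and CVR, follows from simple arithmetic: installs/impressions = (clicks/impressions) × (installs/clicks), so IPM = CTR × CVR × 1000. A short sketch (assuming those standard definitions, with made-up example rates) shows how two very different creatives can land at the same IPM:

```python
def ipm(ctr: float, cvr: float) -> float:
    # installs/impressions = (clicks/impressions) * (installs/clicks),
    # so installs per 1,000 impressions = CTR * CVR * 1000
    return ctr * cvr * 1000

# Hypothetical aggressive creative: high CTR, but misleading clicks hurt CVR
aggressive = ipm(ctr=0.05, cvr=0.04)
# Hypothetical balanced creative: lower CTR, higher CVR
balanced = ipm(ctr=0.02, cvr=0.10)
# Both come out at roughly 2.0 installs per 1,000 impressions
print(aggressive, balanced)
```

This is why increasing CTR alone can be a wash, or worse: if the extra clicks convert poorly enough, IPM drops, while the lower-CTR, higher-CVR creative delivers the same installs with a better user experience.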