
The DCO Differentiator – How to Drive ROI with Retargeting Creatives

Andrew Hunter Whiteside (Guest Author) Dec 11, 2018

Mobile ads are the campaign element advertisers have the most control over, in the sense that they create or approve the content their users see (while other elements, such as bidding, are, or should be, automated for optimal results). Finding the right user at the right time and price means nothing unless you show them the right creative. However, effectively leveraging creatives is also one of the campaign components marketers struggle with the most.

 

So how can app marketers leverage ads to drive incremental sales? What must they keep in mind for efficient and insightful A/B testing?

 

 

An ad is not an image.

A display ad is not simply an image; it's actually made up of several creative elements:

  • Call to Action Button Color
  • Colors
  • Copy
  • Design
  • Font Size
  • Font Style
  • Images
  • Logo

This means advertisers running dynamic creatives can (and should) treat each of these elements as an optimization variable.
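To make this concrete, here is a minimal sketch (not from the original post) of how creative elements could be modeled as optimization variables, with each ad variant being one combination of element values; the element names and options below are purely illustrative.

```python
from itertools import product

# Hypothetical options for a few creative elements (illustrative values only).
elements = {
    "cta_color": ["brand_orange", "white"],
    "image":     ["product_shot", "lifestyle_photo"],
    "copy":      ["Order now", "20% off today"],
}

# Every combination of element values is one candidate ad variant.
variants = [dict(zip(elements, combo)) for combo in product(*elements.values())]

print(f"{len(variants)} possible variants")  # 2 * 2 * 2 = 8
for variant in variants[:3]:
    print(variant)
```

Even a handful of elements multiplies into many possible variants, which is why the next question – how many to actually test at once – matters.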

 

How many different ad variations should you test at the same time?

With the reliable transparency programmatic provides, it is much easier to understand which ads are working and why. This is key: why does a certain ad perform better than another? An overload of variants – that is, testing too many variations at once – can make that question hard to answer.

Since each ad element is a data point, you don't want to test too many different ads at the same time. We've found that testing 3-5 different ad variations gives the most insight within a reasonable time frame. (Note: for apps with a lower volume of active users, it's advisable to test up to 3 variations.)

Creative tests should reach statistical significance before they are acted on. Marketers testing too many ad variations at the same time may find the resulting data is not conclusive enough to support informed decisions. If user experience weren't a factor, ad tests could run for longer periods and potentially cover more variables at once. However, running the same ads for long periods results in ad fatigue (and, most likely, irritated users).
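As an illustration of what "statistically significant" can look like in practice, here is a minimal sketch of one common way to compare the CTR of two ad variants – a two-proportion z-test. The choice of test and the numbers in the example are assumptions for illustration, not something prescribed in this post.

```python
from math import erf, sqrt

def ctr_significance(impr_a, clicks_a, impr_b, clicks_b):
    """Two-proportion z-test on the CTR of two ad variants.

    Returns the z statistic and a two-sided p-value; a small p-value
    (e.g. below 0.05) suggests the CTR difference is unlikely to be noise.
    """
    ctr_a = clicks_a / impr_a
    ctr_b = clicks_b / impr_b
    pooled = (clicks_a + clicks_b) / (impr_a + impr_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impr_a + 1 / impr_b))
    z = (ctr_a - ctr_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p_value

# Illustrative numbers: variant A got 420 clicks on 150,000 impressions,
# variant B got 510 clicks on 148,000 impressions.
z, p = ctr_significance(150_000, 420, 148_000, 510)
print(f"z = {z:.2f}, p = {p:.4f}")  # p is well below 0.05, so B's higher CTR looks real
```

The fewer impressions each variant receives, the longer it takes for a difference like this to clear the significance bar – which is exactly why spreading budget across too many variants delays conclusions.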

 

How long should creative tests run for?

We typically recommend running tests for 3-4 weeks; after that period, ad performance drops anyway. A 3-4 week window gives you sufficient time to gather data on the different ad elements without tiring users by showing them the same ads.

[Chart] Source: Internal Jampp Data for a Food Vertical App [Aug-Sep 2018]

 

What metrics should marketers consider?

The great thing about programmatic campaigns is that you can track an ad’s performance from impression to the completion of the desired in-app event.

 

  • Impressions – Impression-level data helps you gauge how large, and therefore how statistically meaningful, the sample is over which you are calculating CTR. It's good to have this information so you know how much weight to give percentage-based metrics.
  • Click-Through Rate (CTR) – This is a key metric for measuring ad performance: how many of the users who saw the ad clicked on it?
  • Event Conversion Rate (ECR) – This metric helps you understand how efficiently clicks convert into in-app events.
  • Cost Per Action (CPA) – Evaluating CPA per ad can offer further insight into what attracts your successfully retargeted users. A short calculation combining these metrics follows below.
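As a quick reference, here is a minimal sketch of how these metrics relate to the raw funnel counts, assuming the usual definitions (CTR = clicks / impressions, ECR = events / clicks, CPA = spend / events); the function name and figures are illustrative only.

```python
def funnel_metrics(impressions, clicks, events, spend):
    """Compute CTR, ECR and CPA for a single ad variant.

    Assumed definitions: CTR = clicks / impressions,
    ECR = events / clicks, CPA = spend / events.
    """
    return {
        "CTR": clicks / impressions,
        "ECR": events / clicks,
        "CPA": spend / events,
    }

# Illustrative numbers only.
print(funnel_metrics(impressions=200_000, clicks=900, events=120, spend=360.0))
# -> {'CTR': 0.0045, 'ECR': 0.1333..., 'CPA': 3.0}
```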

 

Keep in mind: any time you are considering events, your test sample is going to be smaller. Think of the app funnel: there are always fewer in-app events than there are clicks. It's relatively easy to gather a significant number of clicks to evaluate CTR, but for some apps it can be hard to gather a significant number of in-app events during the test period. Metrics such as ECR and CPA are therefore better suited to apps with high DAU.
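As a rough, back-of-the-envelope illustration of why event-level metrics require more traffic (the rates below are assumed for illustration, not taken from this post):

```python
def impressions_needed(target_events, ctr, ecr):
    """Estimate the impressions required to observe `target_events` in-app events,
    given an expected CTR (clicks per impression) and ECR (events per click)."""
    return target_events / (ctr * ecr)

# With an assumed CTR of 0.5% and ECR of 10%, observing 200 in-app events
# takes roughly 400,000 impressions per variant.
print(f"{impressions_needed(200, ctr=0.005, ecr=0.10):,.0f} impressions")
```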

 

What ad elements have the most impact?


Having run multiple A/B tests across different campaigns on Jampp's App Retargeting Platform, as well as specific experiments to determine the impact of each element, we've found that images and colors are the ad components with the most significant impact on CTR. In contrast, elements such as font style and font size barely moved the needle.


It makes sense, too: our brains process images much faster than text, so changing the photograph, background color, or button color can have a big impact.

 

How do you test different variations continuously within the brand guidelines?

One of the main challenges brands face when testing creatives is designing significantly different ads while respecting brand guidelines. However, this need not be an issue.

Advertisers reading this post may be thinking that, while images and colors have the most impact, they are also the trickiest elements to update. Nobody expects you to use different brand colors; on the contrary, using brand colors has a positive impact, as users are more likely to quickly recognize your brand.

However, changing the way the colors are used in the ad can make a big difference. The same goes for images – sometimes it's as simple as changing the way the image appears in the ad: a circular crop, a full-size image, and so on.

 

A Few Examples

A Fashion/Shopping App may test the same design using a photo of an item versus a photo of somebody wearing a chic outfit.


A Food Delivery App might use the same design showing photos of different dishes.


A Travel App might leverage photography, using the same photo in slightly different layouts.


 

Key Takeaways

Dynamic Creative Optimization (DCO) tools can help you identify your best ads more quickly, apply machine learning to your creatives, and maximize the metrics you care about.

 

  • Ads can make a difference – trust the data. Ads play a key role in getting users back to the app and guiding them to complete specific in-app actions, such as purchases, orders, and bookings. But the “prettiest” ad won’t necessarily be the best-performing one. Test different variations and find out what resonates with your users.
  • Steady does it. It’s better to test a few variations frequently than to test too many at the same time. Testing 3-4 variations for 3-4 weeks will allow you to gain performance insights quickly, and updating ads regularly will go a long way towards engaging your users and, subsequently, driving more sales.
  • Diversify! Every ad (no matter how awesome) will see its CTR drop over time. It’s important to refresh your ads to avoid inflicting ad fatigue on your users. You may even find that, after running designs with photos for a while, a text-only ad performs better, simply because it is different and therefore grabs users’ attention. Don’t stick to one winner. Keep testing and learning. Design, test, refresh.

 

Bonus: Holidays and Special Dates


Finally, leverage the holidays! Holidays and special dates are a great way to test different elements. These ads typically run for a shorter period of time, but they tend to be more engaging, as users recognize that the ad is current and relevant to what’s top of mind for them.

 
