
Media Deep Dive Part II: Network Testing for Success

Shani Rosenfelder | Jan 24, 2018

Like good soundmen, app marketers should add the call “Testing, testing, 1, 2, 3!” to their repertoire. After all, constantly testing new media sources to increase reach and improve performance is a key component of the app marketing checklist.

In our previous Media Deep Dive blog post, we looked at the average number of networks used by apps based on their level of ad spend. This time, we segmented apps by their number of non-organic installs over an 18-month period in an attempt to understand the prevalence and success rate of test campaigns.

What did we find?

  • 43% of test campaigns are successful; all you need are a few wins to drive growth
  • Test campaign success is distributed across many media sources, with over 50% of media sources delivering a success rate above 40%
  • Testing is common: on average, over 30% of an app’s campaigns are test campaigns
 
Adopting an ‘always be testing’ mindset

In today’s fast-paced, hyper-dynamic mobile advertising space, remaining static ranks high on the list of the worst things a marketing manager can do. You always need to be thinking: How can I improve my reach? How can I lower my costs and increase my revenue? Often, the answer demands a fair amount of testing.

Turning the levers up and down, or even off, is an integral part of the job. If your media partners know they are being tested, they will try harder, whether it’s on an ongoing basis or as part of a more significant incrementality test or a head-to-head test against another network. After all, churn is a significant pain for media companies, so they will go the extra mile to keep your business.

Testing usually comes with lower media costs, so there’s less to lose and much to gain. As marketers introduce new players into their marketing mix, often competing for the same budgets, vendors are pushed to deliver more advertiser-friendly pricing.

It is no wonder that in some companies, testing is entrenched in policy. It’s part of a company’s culture, and it usually demonstrates a healthy attitude towards business overall.

How did we define test campaigns in our analysis? We know that testing means different things to different advertisers, and there is no single source of truth. That said, we applied the following conditions (a rough code sketch of this filter follows the list):

  • The media source did not generate any installs for the app for at least six consecutive months during the 18-month period
  • The media source had at least three months of install data for the app in question
  • The app worked with at least three networks, and each media source delivered at least 500 monthly installs for the app
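
To make these conditions concrete, here is a minimal sketch of the filter in Python with pandas. The column names (app_id, media_source, month, installs) and the exact reading of each condition are illustrative assumptions, not the code behind the actual analysis:

```python
# A minimal sketch of the test-campaign filter described above. The input is
# a hypothetical DataFrame with one row per app, media source, and month over
# the 18-month window; column names are assumptions for illustration.
import pandas as pd

def longest_zero_streak(monthly_installs: pd.Series) -> int:
    """Length of the longest run of consecutive zero-install months."""
    is_zero = monthly_installs.eq(0)
    # Label each run of equal values, then count the size of each zero run
    runs = is_zero.groupby((is_zero != is_zero.shift()).cumsum()).sum()
    return int(runs.max()) if len(runs) else 0

def is_test_campaign(group: pd.DataFrame) -> bool:
    """Apply the three conditions to one (app_id, media_source) pair."""
    monthly = group.set_index("month")["installs"].sort_index()
    active = monthly[monthly > 0]
    return (
        longest_zero_streak(monthly) >= 6   # inactive >= 6 consecutive months
        and len(active) >= 3                # >= 3 months of data for the app
        and active.mean() >= 500            # >= 500 monthly installs threshold
    )

def flag_test_campaigns(df: pd.DataFrame) -> pd.Series:
    """Boolean flag per (app_id, media_source) pair."""
    # Only apps working with at least three networks are considered
    eligible = df.groupby("app_id")["media_source"].transform("nunique") >= 3
    return df[eligible].groupby(["app_id", "media_source"]).apply(is_test_campaign)
```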

 

Insight #1: 43% of test campaigns are successful

According to our data, enough test campaigns succeed to warrant, at the very least, further investigation. We labeled a test campaign as successful if it had at least three months’ worth of data and its traffic increased for at least two consecutive months after the test began (a rough sketch of this labeling follows).
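
Continuing the illustrative pandas sketch from the definition above (same hypothetical columns), the success label could look like this:

```python
# A sketch of the success label, reusing the hypothetical monthly-install
# DataFrame from the test-campaign filter above; illustrative only.
def is_successful_test(group: pd.DataFrame) -> bool:
    monthly = group.set_index("month")["installs"].sort_index()
    active = monthly[monthly > 0]
    if len(active) < 3:                      # need three months' worth of data
        return False
    # Traffic must increase for at least two consecutive months after the start
    increases = active.diff() > 0
    return bool((increases & increases.shift(1, fill_value=False)).any())
```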

  • We can see that for most apps, the success rate is between 41% and 60%, with the 21%-40% range not far behind
  • Almost a quarter of large apps have an impressive 61%-80% test success rate

 

Ultimately, all you need are a few successful tests. Scaling those winning campaigns is where you achieve growth. The more you test, the greater your chance of finding aces for your app.

The fact that testing is largely successful means it’s worth the effort, but it does require resources, both time and funding, and how much depends on your app’s size.

From a technical standpoint, testing is easier if you work with an attribution provider since all you have to do is set up attribution links and enable relevant postbacks. In minutes, you should be up and running.
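
For illustration only, an attribution link for a test campaign might look something like the line below; the app ID, media source, and campaign values are placeholders, and the exact parameters depend on your provider and setup:

```
https://app.appsflyer.com/com.example.app?pid=newnetwork_int&c=network_test_q1
```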

Beyond measurement, larger apps have more manpower and financial resources. They are often savvier advertisers with plenty of experience setting up tests (pinpointing creatives, targeting the best audiences, and allocating the right budget) and closely measuring them. However, getting things through legal and finance can be a hassle. For smaller apps, the opposite is true: setup is harder, while legal and finance restrictions are lighter.

 

Insight #2: Success is distributed across many media sources

To demonstrate that the app-level success rate shown above is not limited to a small number of media sources, we also examined success rates at the media-source level and found the following:

  • Overall, over 50% of media sources delivered a success rate exceeding 40%
  • The largest concentration of media sources running successful tests for their clients was found between the 21% and 60% success rates
  • Among large apps, almost 1 in 5 media sources delivered an impressive 61%-80% success rate

 

Insight #3: Over 30% of the average app’s campaigns are test campaigns

With so many tests proving successful, it is no wonder that testing is quite common. Let’s take a look at the numbers:

  • We can see that across the board, app marketers understand the importance of testing: over 30% of media sources on average are in a testing phase, and their share is relatively similar across different app sizes
  • The larger the app, the higher its number of test campaigns: large apps run 2x and 4x more tests than medium and small apps, respectively; this is largely due to resource availability

 

Things You Should Know Before Testing

Research. Read, investigate, ask peers, join a community. There are many experienced, knowledgeable, and friendly marketers out there, so don’t be shy. Also, check out industry reports on media networks like our very own AppsFlyer Performance Index, which has been steering marketers in the right direction since 2015.

Get key info from media partners. Are they self-serve or managed? Is their inventory direct or non-direct? What level of transparency do they offer? How much of their inventory is “exclusive”? Which ad formats do they offer (make sure you see an actual preview, not just the design specs)?

Clearly define legal terms. It is highly recommended to include an opt-out clause in the contract that clearly states when and under which terms a test can end. In addition, make sure the contract specifies the type of inventory offered in the campaign (direct or non-direct).

Define data-driven goals for success. Set a minimum sample size (a rough sizing sketch follows below), and consider working with a clean variant to isolate noise; for example, run your test in a single country so you can get conclusive results faster. Whether you have short-term goals (retention, or early funnel events like login, tutorial completion, or flight search) or longer-term goals (monetization), make sure success is based on data from a trusted attribution provider. Lastly, ensure the time frame given to the media source to reach the goals is well defined.
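
For a sense of scale, here is a back-of-the-envelope sizing sketch in Python using the standard two-proportion sample-size formula; the retention numbers, function name, and use of scipy are illustrative assumptions, not part of the study:

```python
# A rough sample-size sketch using the standard two-proportion formula;
# the baseline and target rates below are made-up illustrations.
from scipy.stats import norm

def min_installs_per_variant(p_base: float, p_test: float,
                             alpha: float = 0.05, power: float = 0.8) -> int:
    """Installs needed in each variant to detect a lift from p_base to p_test."""
    z_alpha = norm.ppf(1 - alpha / 2)        # two-sided significance level
    z_beta = norm.ppf(power)                 # desired statistical power
    variance = p_base * (1 - p_base) + p_test * (1 - p_test)
    n = (z_alpha + z_beta) ** 2 * variance / (p_base - p_test) ** 2
    return int(n) + 1

# e.g., detecting a day-7 retention lift from 20% to 25%
print(min_installs_per_variant(0.20, 0.25))  # roughly 1,100 installs per variant
```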

 

In short, ‘always be testing, 1, 2, 3’ to find your optimal sound for the best performance.