
How to analyze A/B test results with a winning variant

We haaaaave a winner! Great success, but what's next? Analyzing the A/B test results of a winning variant helps you better understand what led to the improved performance. Here's how.

Metrics

The metrics that determine the success of the campaign are usually set before the campaign starts. Depending on the objective of the campaign, you can choose to follow the CVR, the CPM, or the number of installs, for example. Some common metrics for ASO A/B testing include app store impressions, app page views, conversion rate, download rate, and user engagement. The key is to identify the key performance indicators (KPIs) that are relevant to your specific goal.
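
As a rough illustration of how a CVR comparison between two variants can be quantified, here is a minimal Python sketch. The install and page-view numbers are hypothetical, and the two-proportion z-test is just one common way to check whether the difference is statistically meaningful, not a specific Geeklab feature:

```python
import math

def conversion_rate(installs, page_views):
    """CVR: installs divided by product-page views."""
    return installs / page_views

def two_proportion_z_test(installs_a, views_a, installs_b, views_b):
    """Two-sided z-test comparing the CVRs of two variants; returns (z, p_value)."""
    p_a = installs_a / views_a
    p_b = installs_b / views_b
    p_pool = (installs_a + installs_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from the normal distribution
    return z, p_value

# Hypothetical campaign numbers, purely for illustration.
z, p = two_proportion_z_test(installs_a=320, views_a=10_000, installs_b=410, views_b=10_000)
print(f"CVR A: {conversion_rate(320, 10_000):.2%}, CVR B: {conversion_rate(410, 10_000):.2%}")
print(f"z = {z:.2f}, p = {p:.4f}")
```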

User behavior

What is even more insightful is analyzing user behavior. A heat map, for example, shows the most clicked assets on the product page. The heat map data helps you determine the importance of specific elements. Typically the first screenshot is the most impactful, as it is the one seen upon arrival. It therefore naturally needs the most focus, but this analysis helps determine the importance of the rest of the set.
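
To make that concrete, here is a minimal sketch of how raw tap events could be tallied per asset. The event names and counts are hypothetical; a real heat map tool does this aggregation for you:

```python
from collections import Counter

# Hypothetical tap events collected from the product page, one entry per tap.
click_events = [
    "screenshot_1", "screenshot_1", "icon", "screenshot_2",
    "screenshot_1", "description", "screenshot_2", "screenshot_1",
]

# Rank the assets by how often they were tapped to see which elements draw attention.
for asset, taps in Counter(click_events).most_common():
    print(f"{asset}: {taps} taps")
```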

Considering the different user segments, exploring and decisive users, can also bring valuable insights. Is your current audience more exploring or decisive? Do they scroll through the store page, or do they click download instantly, without seeing the screenshots or reading the description? What should you focus on depending on how fast your audience is to download? By segmenting users and analyzing their behavior separately, you can gain a more nuanced understanding of how the winning variant impacts the different user groups.
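
A minimal sketch of this kind of segmentation, assuming hypothetical per-visit data where "decisive" users install without scrolling and "exploring" users scroll first:

```python
import pandas as pd

# Hypothetical per-visit data: whether the user scrolled before deciding, and whether they installed.
visits = pd.DataFrame({
    "variant":   ["A", "A", "A", "B", "B", "B"],
    "scrolled":  [False, True, True, False, True, True],
    "installed": [True, False, True, True, True, False],
})

# Decisive users install (or leave) without scrolling; exploring users scroll first.
visits["segment"] = visits["scrolled"].map({False: "decisive", True: "exploring"})

# Conversion rate per variant and segment shows which group the winning variant actually moved.
print(visits.groupby(["variant", "segment"])["installed"].mean())
```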

Analyze A/B test results

To create a holistic analysis of the results, keep the objectives of the campaign in mind: which goals were met, which weren't, and which elements affected that. The metrics and numbers show that something either happened or it didn't, while user behavior helps you understand why it did or did not happen. Are there learnings from prior tests that should be taken into account? Does this test open up more questions for further testing?

It is good to remember that the analysis doesn't stop here. Monitoring the performance of the store page continues outside testing, so you can sustain good performance and act fast if the numbers start to drop.

What’s next

Finding a winning variant doesn't mean you are obligated to change the product page assets accordingly. It is rather a guideline that shows how a different variation performs. When testing for a live app, building the store page based on the winner is usually suggested. When A/B testing to validate a concept, on the other hand, the results give you data to back the decisions made throughout development. For example, locking in on a pirate theme for a game early on could cost a lot of money if a space theme would have performed significantly better. It is therefore best to test early to lose small and win big.

It is good to view A/B testing as an iterative process. The biggest gains come from constantly testing for the best-performing assets or concepts. A best practice is to have a clear structure, a schedule, and strong hypotheses to back the tests up.

Head on over to our previous blog post about what to look for in your A/B test campaign results to read more.

