What can you learn from your A/B test's campaign results? You have set up a campaign on Geeklab, it has ended and collected all the data, and this is where you get to the juicy part. Whether you are validating a concept or optimizing the store page of a live app, the results will guide you towards the right decisions about your next steps. This is a rundown of the key points in reading results and what can be analyzed from them. Let's dive right in.
Overview
The overview shows each variant's conversion rate and the split between decisive and exploring users. The probabilities tab shows the probability of each variant winning, again utilizing the Bayesian model and previously gathered data. If a clear winning variant has been found, it will be shown below the bar chart. Sometimes a winner is not found due to an insufficient number of impressions per variant; this will be shown underneath the bar chart along with the number of impressions still missing.
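Geeklab's exact model is not detailed here, but a common way to estimate each variant's probability of winning is a Beta-Binomial simulation: give each variant a Beta posterior over its conversion rate and count how often each one comes out on top across random draws. A minimal sketch, with hypothetical install and impression counts:

```python
import numpy as np

def prob_of_winning(installs, impressions, prior=(1, 1), samples=100_000):
    """Probability that each variant has the highest true conversion
    rate, estimated from Beta-Binomial posteriors via Monte Carlo."""
    rng = np.random.default_rng(0)
    # One column of posterior draws per variant:
    # Beta(prior_a + installs, prior_b + non-installs)
    draws = np.column_stack([
        rng.beta(prior[0] + inst, prior[1] + imp - inst, samples)
        for inst, imp in zip(installs, impressions)
    ])
    wins = np.bincount(draws.argmax(axis=1), minlength=len(installs))
    return wins / samples

# Hypothetical counts for variants A and B
print(prob_of_winning(installs=[120, 150], impressions=[4000, 4100]))
```

This also illustrates why a winner sometimes cannot be called: with few impressions the posteriors overlap heavily, so neither variant's probability of winning gets close to certainty.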
To inspect the results per variant more closely, scroll down to the breakdown of variants. The data points show the statistics for the variant in question. From there, it is easy to see how the time to install varies, or how the users split between decisive and exploring. These data points give valuable insight into the behavior of the audience.
Traffic
The traffic tab goes into more detail about the conversion rate and about installs versus impressions over time. The graph shows how the results change over time, revealing, for example, the effect of seasonality or traffic source. Additionally, in the case of concept validation, it is possible to survey demographics to learn more about the audience; those results can also be found here.
Visuals
This tab lets you look further into the differences between the screenshots in each variant. You can inspect the conversion rate per variant in the line chart, and further down, a detailed breakdown of each screenshot per variant shows the results in numbers. Additionally, if the store page includes a video, you can see its engagement and conversion. The conversion rate per video is derived by dividing the number of installs by the number of users who watched that video up to a specific second.
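As a worked illustration of that formula, here is a minimal Python sketch; the function name and the drop-off numbers are hypothetical, not Geeklab's API:

```python
def video_conversion_at(second, installs, viewers_per_second):
    """Conversion rate at a given second of the video: installs divided
    by the number of users who watched at least up to that second."""
    viewers = viewers_per_second[second]
    return installs / viewers if viewers else 0.0

# Hypothetical drop-off curve: index = second, value = viewers remaining
viewers = [1000, 940, 860, 700, 520, 410]
print(video_conversion_at(5, installs=48, viewers_per_second=viewers))  # ~0.117
```

Because the viewer count shrinks as the video plays, the same number of installs yields a higher conversion rate at later seconds, which is worth keeping in mind when comparing points on the curve.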
Engagement
This is the part where you get to see how users behave on your store page: which screenshots attract the most attention, whether the description catches the reader's eye, and where on the page users spend the most time. Furthermore, a breakdown of the number of clicks each element gets is displayed to help determine where users engage and what makes them engage.
Things to consider
Here’s a list of things to consider and analyze to make the most out of your results:
- How do the users behave, and which of the user segments, decisive or exploring, plays the bigger role in the overall conversion rate?
- How are the individual screenshots in each variation performing? Which are the most important ones to pay attention to?
- In addition to the conversion rate or IPM (installs per mille, i.e. installs per 1,000 impressions), focus on the user behavior insights. A lot can be analyzed from the users' actions on the lookalike store page. (A quick sketch of these metrics follows this list.)
- Combine quantitative and qualitative data. On Geeklab, it is possible to use SurveyLab to actually get to know your audience.
- If the store page includes a video, take a look at its overall performance, both engagement- and conversion-wise. Pay attention to how long the video holds the user's attention.
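For reference, the two headline metrics mentioned above are simple ratios. A minimal sketch, with hypothetical counts:

```python
def conversion_rate(installs, impressions):
    """Share of impressions that converted into installs."""
    return installs / impressions

def ipm(installs, impressions):
    """Installs per mille: installs per 1,000 impressions."""
    return 1000 * installs / impressions

# Hypothetical variant results: 150 installs from 4,100 impressions
print(conversion_rate(150, 4100))  # ~0.037, i.e. 3.7%
print(ipm(150, 4100))              # ~36.6
```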
With these things in mind, analyzing your A/B test's campaign results will guide you towards your next step, whether that is making decisions or testing some more. If you are reading this before setting up a campaign – no worries. Head on over here to get started and come back to this when needed.
PS. Are you a follower already?