Fluent in Conversion Rate Optimisation?
Have you heard of ASO? – No? Don’t worry, you’re not the only one.

I work at an agency specialising in CRO, alongside a team of international experts in data analysis, CRO and digital marketing. The other day I was talking to a colleague whom I admire for his broad knowledge of online marketing and digital growth. I mentioned that I was planning to write an article about how all these ASO tools are overrated for keyword optimisation… when he looked at me and asked:

“Sorry, just one second… What is ASO?”
Me: “App Store Optimisation…”
Him: “Ah, interesting! What is that? I’ve never heard of it!”

And then it hit me: I was in a geek bubble. Even if you’re a digital specialist, you might not know about this yet, just like me before I started diving into the world of mobile acquisition and conversion.

So what is ‘App Store Optimisation’ and why should you care?

App Store Optimisation (ASO) is, as the name suggests, only relevant to apps; it has no direct connection to website optimisation. If you have an app and you want people to find it when they search for you, or for anything describing you or your product, in the Apple App Store, Google Play or any other store that offers mobile apps for download, then you want to appear high up in the search results.

Sounds familiar? This is basically a version of SEO, just inside app stores. You optimise your keywords to rank higher than your competitors and get more click-throughs, which in turn becomes another factor that improves your ranking. This is the first of the two pillars of ASO: keyword search optimisation.

The second pillar, and probably the more interesting one for A/B testers, is the CRO part of ASO. Once visitors have clicked on a search result or on your ad, they land on your store listing page. You could call this the landing page of the app store world, and this is where you can start testing to improve your numbers, i.e. to increase the number of installs for your app. Just as in web CRO, this should be the first step: lift the store conversion rate, which in turn lowers your ad spend or maximises free acquisition through organic search.

How do you test on app store listings?

Unlike website testing, integrating and using testing tools for app store experiments comes with quite a few limitations, since you are dependent on what the store providers allow, meaning Apple and Google (and a few others, which I won’t elaborate on today).

Testing @ Apple
Apple is – who would have thought – very protective of its app store and its appearance. If you have ever submitted an app to Apple, you will know how strict the guidelines are, from app submission to the approval procedure. Everything needs to go through Apple’s quality assurance, and it might take several rounds of re-submitting your app before you go live.

This might be why there is no straightforward way to A/B test your store listing in the Apple App Store. You can change elements over time and perform a time series analysis, but that doesn’t give the compared variations the same context and is therefore a rather unpopular testing method. There are a few A/B testing tool providers for the Apple App Store, such as Splitmetrics or Storemaven. They let you split test different variations of your store listing to see which one makes more people click the install button.


Img source: Splitmetrics

What makes marketers hesitate here is the fact that the experiment runs on an additional page that either looks exactly like the store listing (control) or has one element changed (variation), and is shown to visitors before they even land on the real page:

Img source: Storemaven

So if visitors convert on the test page, they are then redirected to the actual page, where they have to convert again, which is not a great user experience.

Img source: Splitmetrics

This process obviously confuses users and loses installs, yet it also yields insights. For some marketers, those insights would have to be very valuable to justify the conversion loss caused by testing.

Testing @ Google
Google makes it easier and lets you test directly in its store. Google Play offers the possibility to test almost every element of your store listing as soon as your app is published. A common workaround for app store testing is therefore to test in Google Play first and later apply the winning variations to Apple. However, keeping in mind the different target groups, user behaviour and experiences across the two stores, it might be valuable to do both.

Which elements can be tested in Google Play?
You can test all graphical elements of your listing as well as the descriptions.

Usually only one element should be tested at a time, so that any change in results can be attributed correctly. This requires prioritising your test ideas by expected impact and implementation effort, as in the sketch below. Graphical changes, especially above the fold, tend to have the biggest impact, while copy changes need to be aligned with your keyword strategy and usually don’t affect the conversion rate (CVR) as dramatically as, for example, an icon change. However, everything is context dependent and can only be proven by testing!
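
As a rough illustration of what such a prioritisation could look like, here is a minimal sketch that scores test ideas by expected impact and implementation effort. The ideas, scores and weighting are all invented for the example; use whatever scoring model fits your team.

```python
# Minimal sketch: rank hypothetical store listing test ideas by
# expected impact vs. implementation effort (all values are invented).
test_ideas = [
    {"name": "New icon concept",                  "impact": 9, "effort": 3},
    {"name": "Redesigned feature graphic",        "impact": 7, "effort": 4},
    {"name": "In-app vs. marketing screenshots",  "impact": 6, "effort": 5},
    {"name": "Short description with clear CTA",  "impact": 4, "effort": 1},
    {"name": "Rewritten long description",        "impact": 2, "effort": 2},
]

# Simple priority score: more impact is better, more effort is worse.
for idea in test_ideas:
    idea["score"] = idea["impact"] / idea["effort"]

for idea in sorted(test_ideas, key=lambda i: i["score"], reverse=True):
    print(f"{idea['name']}: score {idea['score']:.1f}")
```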

How should you prioritise testing?

1. Icon
You want to run your icon tests first, right after publishing your app, to establish the best-converting icon in the stores and on your users’ phones. Changing an icon at a later stage in your app’s history might not be as simple as in the beginning. However, never stop optimising: keep testing new ideas, even if that means updating the icon on the phones of a one-million-user base.


Img source: Tech Crunch

2. Feature graphic
Similar to a hero banner image on your website, this has a huge impact on download numbers and should be high up on your list. Also test whether having a promo video vs. not having one works for you.

3. Screenshots
Get creative with hypotheses for your screenshots and test them. You might want to test real in-app screenshots vs. marketing-designed screenshots. The first screenshot has the biggest impact; the last few matter much less, since users don’t usually go through all of them.

4. Short description
Call to action!
Include as many relevant keywords as possible and ask the visitor to take action.

5. Long description
In my experience, changes here have the lowest impact on the CVR, if any. However, it always depends on the app, the context and the extent of the change.

Unlike most website testing tools, Google Play calculates statistical significance at a 90% confidence level only. Keep this in mind when analysing your results.
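
If you want to be stricter than Google’s default, you can always sanity-check the raw numbers yourself. Here is a minimal sketch (in Python, with invented install counts) that builds a normal-approximation confidence interval for the uplift between two listing variations at both 90% and 95% confidence:

```python
# Minimal sketch: re-check a listing experiment at your own confidence level
# using a normal-approximation interval for the difference of two conversion
# rates. All install counts below are invented for illustration.
from statistics import NormalDist

def diff_confidence_interval(conv_a, n_a, conv_b, n_b, confidence=0.90):
    """Two-sided interval for (rate_b - rate_a) of two independent proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = ((p_a * (1 - p_a)) / n_a + (p_b * (1 - p_b)) / n_b) ** 0.5
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # two-sided quantile
    diff = p_b - p_a
    return diff - z * se, diff + z * se

# Hypothetical numbers: control vs. new icon variation, 10,000 visitors each.
for conf in (0.90, 0.95):
    low, high = diff_confidence_interval(480, 10_000, 540, 10_000, conf)
    significant = low > 0 or high < 0  # interval excludes "no difference"
    print(f"{conf:.0%} CI for uplift: [{low:+.4f}, {high:+.4f}] significant={significant}")
```

With these made-up numbers the uplift passes at 90% but not at 95%, which is exactly the kind of nuance worth checking before declaring a winner.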

In the end, it all comes down to having a plan while staying flexible and innovative in your testing strategy – just like in website optimisation!

Do you want to know more about this topic? Contact Daniela for more information or questions.
