A/B Test Implementation Checklist
Launch an A/B test to measure the impact of a specific change.
Test and measure the impact of every configuration or strategy change on your KPIs and business goals.
Why should you do it?
A/B testing allows you to create two alternative search experiences with different settings and use real user traffic to determine which performs better. You can put your merchandising and other strategic hypotheses to the test and base your decisions on user data.
A/B testing relevance configurations
Algolia A/B Testing helps you uncover the best-performing relevance strategy and lets you iterate on your relevance configuration directly from the dashboard. You can test and see how each of your changes affects your business. With A/B testing, you can make more data-driven decisions and lower the risk of negative business impact before fully rolling out your new relevance settings. You can iterate as many times as you want without code changes.
You can A/B test any of the following relevance configurations:
- Textual relevance settings: plurals, stop words, synonyms
- Business relevance settings: custom ranking (for example, whether to use sales margins or product popularity for ranking)
- User-centric relevance settings: Personalization, AI re-ranking
- Manual relevance settings: merchandising rules
Note: Before you set up any A/B tests, you first need to send events to Algolia. There are several methods you can choose from for sending events, and you might need the assistance of your engineering team for this step.
Once your engineering team has set up sending click and conversion events to Algolia, you can create a new test.
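As a rough sketch of what that engineering step involves, the snippet below builds a click event in the shape the Algolia Insights API (`POST https://insights.algolia.io/1/events`) expects. The index name, user token, query ID, and object IDs are placeholder assumptions, and the actual send (an authenticated HTTP POST, or a call through one of Algolia's client libraries) is omitted:

```python
# Sketch: build a "click after search" event payload in the shape of the
# Algolia Insights API (POST /1/events). All identifiers below (index name,
# userToken, queryID, objectIDs, positions) are placeholder assumptions.

def build_click_event(index, user_token, query_id, object_ids, positions):
    """Return one Insights click event tied to a prior search via queryID."""
    return {
        "eventType": "click",
        "eventName": "Product Clicked",
        "index": index,
        "userToken": user_token,
        "queryID": query_id,          # returned by the search response
        "objectIDs": object_ids,      # which results were clicked
        "positions": positions,       # 1-based positions in the result list
    }

# The Insights endpoint accepts a batch of events under an "events" key.
payload = {
    "events": [
        build_click_event(
            "products", "user-123", "query-id-from-search-response",
            ["sku-42"], [3],
        )
    ]
}
```

The `queryID` is what lets Algolia attribute the click back to a specific search, which is required for CTR and CVR to be computed per variant.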
For example, suppose the digital merchandiser of an ecommerce website wants to test the impact of a new Personalization strategy.
The first step is to configure the variants: two different scenarios of the personalization strategy. Variant A is usually the configuration used in your live index, while variant B contains the configuration you want to test. In this case, variant A has Personalization turned off and variant B has it turned on.
Next, the digital merchandiser configures the percentage of traffic to route from variant A to variant B, and the duration of the test. As a best practice, we recommend keeping the A/B test running for at least two weeks.
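The variant, traffic split, and duration settings above can be sketched as a test definition in the shape of Algolia's Analytics API A/B test endpoint (`POST /2/abtests`). This is an illustrative assumption of the payload, not a verified integration: the test name, index name, 50/50 split, and the use of `enablePersonalization` as the variant B search parameter are all placeholders you would adapt to your own setup.

```python
from datetime import datetime, timedelta, timezone

# Sketch: a two-variant A/B test definition in the shape of Algolia's
# Analytics API (POST /2/abtests). Names, indices, and the traffic split
# are illustrative assumptions; the request itself is not sent here.

def build_ab_test(name, index, traffic_to_b, duration_days=14):
    """Variant A keeps (100 - traffic_to_b)% of traffic; B gets the rest.

    Both variants use the same index; variant B overrides search
    parameters to turn Personalization on (assumed parameter name).
    """
    end_at = datetime.now(timezone.utc) + timedelta(days=duration_days)
    return {
        "name": name,
        "variants": [
            {
                "index": index,
                "trafficPercentage": 100 - traffic_to_b,
                "description": "Control: Personalization off",
            },
            {
                "index": index,
                "trafficPercentage": traffic_to_b,
                "description": "Test: Personalization on",
                "customSearchParameters": {"enablePersonalization": True},
            },
        ],
        "endAt": end_at.strftime("%Y-%m-%dT%H:%M:%SZ"),
    }

ab_test = build_ab_test("Personalization strategy test", "products", 50)
```

Note how the two-week minimum recommendation maps to `duration_days=14`, and how the two `trafficPercentage` values must always sum to 100.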
The test will run until the configured period is complete, at which point the digital merchandiser can evaluate the results based on several metrics:
- Total searches and users: the sample sizes underlying the A/B test
- CTR and CVR: the click-through and conversion rates for each variant
- Significance score: can you draw a decisive conclusion from the test?
- Impact of the new relevance setting: did it increase or decrease CTR and CVR?