📖 Definition
A Multipage test enables you to test a new version of one or several elements across the user's journey, that is, on different pages that don't share the same structure (such as the homepage, the product pages, and the basket page). These elements may have a different layout depending on the page structure.
As with A/B tests, after analyzing the results of your test, you need to decide which version performed best against the goal you wanted to reach (e.g., increasing the number of clicks or pages viewed, or decreasing the bounce rate). You can then apply the winning changes directly to your website.
⚙️ Configuration
Here are some tips for a successful set-up:
- MAIN INFORMATION: we recommend establishing a hypothesis for your test based on the following model: if I apply [this change on my webpage] to [this audience], then [this behavior will change], which will help reach [this goal]. Add this hypothesis to the description field.
  In the Pages section, enter the URL you want to load in the editor for each page or group of pages. Each page corresponds to a step of the user's journey you want to test. You need to include at least two different pages (e.g., the product pages and the basket page).
- EDITOR: you must adapt the modification to each page you have configured. Each page has the same number of variations, because you are building an entirely new user journey.
  If the element you add or modify is clickable, always set up action tracking on it so you can follow the performance of your campaign (e.g., the cross that closes the pop-in, the CTA inside the pop-in). Don't forget to set up this action tracking on each page.
- GOALS: because the primary goal determines which variation takes precedence over the others, for Multipage tests your primary goal should, exceptionally, reflect the outcome of the whole user journey, depending on the main objective of your test: retention (number of viewed pages), loyalty (revisit rate), or conversion (transaction rate). A sketch of how this read-out works is shown after this list.
  You should also choose the action tracking related to the elements you have modified as secondary goals, since this is the user behavior most likely to be affected by the modifications you made in the editor.
- TARGETING: the target pages (Where section) must be different for each page, as each relates to a specific step of the user's journey. The segment(s) (Who section) must be the same for each page.
  Don't forget to configure the targeting for each page. If one or several sections (Who, Where, How) share the same configuration, you can use the Replicate targeting option.
- TRAFFIC ALLOCATION: traffic allocation must be identical for each variation. For example, avoid a distribution such as original 20%, variation 1 50%, variation 2 30%.
- QA: QA is a fundamental step to make sure the modifications display correctly on all targeted devices and that your action tracking has been configured correctly. For more information, refer to Using the QA assistant.
- Before launching your test into production (that is to say, making it visible to your audience), make sure the QA mode is disabled, the targeting of your test has been saved and traffic allocation is evenly divided between the variations.
- We recommend that you let a test run for at least 15 days before analyzing it.
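To illustrate how the primary goal drives the read-out once the test has run (the analysis step mentioned above), here is a minimal, purely illustrative Python sketch. It does not use AB Tasty's API or reporting format: the variation names, visitor counts, and the two-proportion z-test are assumptions chosen for the example, and in practice you would read these figures from your campaign's reporting.

```python
from math import sqrt

# Hypothetical campaign results (not AB Tasty data): unique visitors per
# variation and how many of them completed a transaction (conversion goal).
results = {
    "original":    {"visitors": 10_000, "transactions": 310},
    "variation_1": {"visitors": 10_000, "transactions": 365},
}

# Traffic is evenly allocated, so visitor counts should be comparable.
rates = {name: r["transactions"] / r["visitors"] for name, r in results.items()}
for name, rate in rates.items():
    print(f"{name}: transaction rate = {rate:.2%}")

# Simple two-proportion z-test to check whether the difference between the
# original and variation 1 is likely to be more than random noise.
a, b = results["original"], results["variation_1"]
p_pool = (a["transactions"] + b["transactions"]) / (a["visitors"] + b["visitors"])
se = sqrt(p_pool * (1 - p_pool) * (1 / a["visitors"] + 1 / b["visitors"]))
z = (rates["variation_1"] - rates["original"]) / se
print(f"z-score = {z:.2f}  (|z| > 1.96 is roughly significant at the 95% level)")
```

This kind of comparison on the primary goal is what determines the winning variation in your reporting; the sketch only makes the underlying arithmetic explicit.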
💡 Use cases
Multipage tests can be used in the following cases:
| 🖊️ Action / modification | 🎯 Goal(s) |
| --- | --- |
| Changing the color of the 'Add to cart' CTA, visible on the product pages and list pages (quick buy) | Reducing user hesitation around the CTA and easing the add-to-cart action |
| Adding delivery fee information on different pages (every 5 items in the product list, in the product description on the product pages, and as a banner on the cart page) | Monitoring the impact of more transparent delivery fee information across the customer's journey |
| Replacing indoor pictures with outdoor photographs throughout the website | Finding the best way to shoot your models, based on your audience's appetite for real-life photographs |
Need additional information?
Submit your request at product.feedback@abtasty.com
Always happy to help!