Running your campaign through the QA process lets you view it under the same conditions as your visitors will. This way, you can make sure that visual changes display correctly, that targeting criteria are met, and that click tracking is recorded and visible in the campaign reporting section.
When you check that your variations apply correctly across all browsers, you may not be assigned to your campaign or to all of its variations.
If you cannot view your changes, it may be for one of the following reasons:
- The AB Tasty tag is out of date,
- There is a targeting error,
- You are assigned to the original version.
The AB Tasty tag is out of date
When you launch your campaign to run it through the QA process, the tag is updated automatically. However, if you make changes once the campaign has started, the tag isn't updated automatically. If the tag is out of date, changes made to the campaign will not be visible in QA mode, even though they have been saved in the interface.
In this case, click the green arrow button in the top right-hand section of the interface (available on every step except the variation editor) to force a tag update. If you are in the variation editor, pause and restart the campaign from the left sidebar.
There is a targeting error
In production, you may not be assigned to a campaign at all, for instance if the targeting criteria aren't met. The most frequent errors are due to incorrect page targeting, a targeted IP address that doesn't match your own, or a targeted device that differs from the one you are running the QA process on.
To correct these errors, return to the targeting step of your campaign and check the following:
- Make sure the targeted IP address matches your own. If necessary, you can add several IP addresses.
- Make sure the targeted pages include the page you want to view.
- Make sure the targeted device (if applicable) matches the one you are using to run the QA process.
- Open a new private browser window or delete your cookies before refreshing the page.
To find out which campaigns you are assigned to, enter the ABTasty.results command via the Chrome console. The list of campaigns is displayed, along with the variation you are assigned to.
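For reference, a minimal sketch of this check (the command comes from the text above; the exact shape of what it prints may vary with the tag version):

```js
// Run in the browser console on a page where the AB Tasty tag has loaded.
// Lists the campaigns you are assigned to and, for each one,
// the variation you were bucketed into.
ABTasty.results
```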
You are assigned to the original version
In production, you may be assigned to the campaign’s original version, particularly if traffic allocation is set to a 50/50 ratio.
In this case, to avoid seeing the original version, set traffic allocation to 100% on the variation you want to see.
👍 Once you've finished running the QA process, change the traffic split back to 50/50 so that it is evenly divided between the variations.
To find out which variation you are assigned to, enter the ABTasty.getTestsOnPage command via the console. The list of campaigns is displayed, along with the variation you are assigned to.
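Again as a minimal sketch (the command name comes from the text above; depending on the tag version, it may need to be invoked as a function, i.e. ABTasty.getTestsOnPage()):

```js
// Run in the browser console on the targeted page.
// Displays the campaigns active on the current page,
// along with your assigned variation.
ABTasty.getTestsOnPage
```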
In most cases, if you cannot see your changes in QA mode, it is for one of these reasons. If not, the problem lies with the changes themselves (for instance, if they weren't made on the correct selector); in that case, please contact your dedicated CSM.
I'm assigned to the campaign or variation
Before you start a test, we recommend that you go over the following points to avoid wasting time debugging it when it’s running:
- Check that the AB Tasty tag is present on all the site’s pages. If your site is not fully tagged, and one of your tests targets a page where our tag is not present, your modifications will not be applied to that page.
- Verify your test's targeting; this is the most common source of error. You need a good understanding of the various URL formats on your site to configure URL targeting correctly. Use the “Exactly matches,” “Contains,” “Regular Expression,” “Include,” and “Exclude” operators judiciously to target the right URLs: your targeting should be neither too broad nor too narrow for the intended scope (see the regex sketch after this list).
- Verify the other targeting options and check for any excluded IP addresses at the test or account level.
- Use the preview feature available in the menu of each variation to test your changes on any page or browser.
- Run a “live” check on your production site by changing the targeting options: set your test to target only visitors who have a specific cookie (e.g. check-test=1), allocate 100% of the traffic to one of your variations, and start the test. Load the page you wish to check, then delete your cookies. Use your browser’s development console to recreate the cookie that lets you see the test by entering the following command: document.cookie = "check-test=1"; Reload the page to see the test and check that everything works properly (see the console sketch after this list).
- Make sure you have properly defined the test’s objectives, whether they are click objectives (tracking action) in the editor, or URL objectives in the report interface. Remember that even though you can create URL objectives after the fact, you can’t do the same for tracking actions, which must be defined before the test is run.
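To illustrate the URL-targeting point above, here is a hedged example (example.com and its /products/ paths are hypothetical): a regular expression narrow enough to match product pages without also catching their sub-pages.

```js
// Hypothetical pattern: match https://example.com/products/<id> exactly,
// with an optional trailing slash, but not deeper sub-pages.
const productPage = /^https:\/\/example\.com\/products\/\d+\/?$/;

productPage.test("https://example.com/products/123");         // true
productPage.test("https://example.com/products/123/reviews"); // false
productPage.test("https://example.com/products/");            // false
```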
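The cookie-based check from the list can also be written out as a console sketch (the check-test=1 cookie name comes from the example above; the path=/ attribute is an assumption that helps when the tested page is not at the site root):

```js
// 1. After deleting your cookies, the test should no longer display,
//    since the targeting now requires the cookie.
// 2. Recreate the cookie from the development console:
document.cookie = "check-test=1; path=/"; // path=/ is an assumption
// 3. Reload the page: with 100% of traffic allocated to your variation,
//    your changes should now be visible.
location.reload();
```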
If you start your test and see errors in the display, keep the following elements in mind when you are debugging:
- If you change one of your tests while it is running, you must validate the publication of the changes by clicking on the “Publish” button at the top right of the editor. Without this validation, your changes will not be taken into account.
- After you have started a test or validated a change, it may take up to 30 seconds for the script containing your changes to be updated on our CDN. So wait at least 30 seconds before checking whether your changes have had the desired effect.
- Your browser’s cache content and cookies may disrupt test debugging. Delete them before trying to see why your changes aren’t showing up.
- As a general rule, if you wish to make more advanced changes to your pages, such as changing the block layout, we recommend making these changes with JavaScript/jQuery code rather than with the graphical editor’s features. You can make your changes in fewer steps and avoid creating duplicate sets of changes that might overlap or cancel each other out (see the sketch below).
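As an illustration, a minimal jQuery sketch (assuming jQuery is available on the page; #sidebar and #main are hypothetical selectors) that reorders two blocks in a single statement instead of several stacked editor changes:

```js
// Move the (hypothetical) sidebar after the main content block
// in one operation, rather than chaining several editor moves.
$('#sidebar').insertAfter('#main');
```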