AB Tasty is an intuitive A/B testing solution with advanced features. It allows you to modify your website's pages in a visual editor, without writing code or needing technical knowledge. You can then measure which versions of your pages produce the best results for your objectives (pageviews, registrations, purchases, etc.). This quick start guide will walk you through creating your first A/B test.
Using AB Tasty
There are five steps to setting up a test with AB Tasty:
- Setting up the AB Tasty tag on all the pages of your site.
- Creating the different versions of your pages ("variations") in AB Tasty’s WYSIWYG editor.
- Configuring the test options (in particular, the test targeting and the share of traffic allocated to each variation).
- Defining which indicators should be measured.
- Running the test and reading the results.
Setting up the AB Tasty tag
The tag must be installed on every page of your site. Once it has been added, you don't need to worry about it again: the same tag runs all your future tests. Where do I find this tag and where do I install it?
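In practice, installing the tag usually means pasting it into the `<head>` of every page. The snippet below is only an illustrative placeholder — the real tag, with your account identifier, is provided in your AB Tasty account:

```html
<head>
  <!-- Hypothetical placeholder: paste the actual tag copied
       from your AB Tasty account here. The URL below is NOT real. -->
  <script type="text/javascript"
          src="https://example.com/abtasty-tag-YOUR_ACCOUNT_ID.js"></script>
</head>
```

Loading the tag early in the page helps variations apply before the original content becomes visible.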
You can also consult our help section on technical questions if you have further questions.
Creating your variations with the editor
After the AB Tasty script has been added to your pages, log in to AB Tasty with your credentials. You will reach the Dashboard, where you can create your first test and manage all your tests going forward. Learn more about the Dashboard and the available options.
To create your first test, click on the "Create" button. You will be invited to enter the URL of the page to be modified. After you click on the “Save” button, you are sent to the WYSIWYG editor, which will allow you to create your variations, modify your pages, and manage the test settings. Learn more about the WYSIWYG editor.
A first variation, which is an exact copy of the original version, is created by default. You can navigate between the variations by using the tabs at the top of the editor. You can create as many variations as you wish by clicking on the “New Variation” button.
Click on the "Variation 1" tab to activate it in the editor. As you move your mouse over the page, elements are highlighted when you hover over them. Left-click to select one. A contextual menu appears, offering several options to modify the selected element or refine your selection.
Among the proposed modifications, you can:
- Move the selected element,
- Delete the element or hide its content,
- Resize the element,
- Change the text or HTML content of the element,
- Change the styles associated with the element,
- Change the selected links or images,
- Add elements (paragraphs, images, HTML code, etc.).
With the selection features, among other things, you can:
- Expand the current selection,
- Reduce the selection,
- Select elements that are similar to the one that is selected.
Once you have refined the selection, you can apply a modification to it.
Discover all the modifications offered in the editor.
Configuring your test options
After you have prepared all of your variations, you need to configure the options that apply to the test as a whole. These are available from the left-hand menu.
They allow you to:
- Target the pages on which your changes should be applied (e.g., every page using a given template),
- Target the visitors who should see the test (e.g., only visitors arriving from one of your sponsored links),
- Assign a percentage of your traffic to each variation.
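Conceptually, assigning percentages to variations amounts to a weighted random draw: each visitor lands in a variation with probability equal to that variation's share of traffic. A minimal sketch of the idea (function and variable names are my own, not AB Tasty's API):

```javascript
// Hypothetical sketch of percentage-based traffic allocation.
// "allocations" maps variation names to percentages summing to 100.
function pickVariation(allocations, randomValue = Math.random()) {
  const scaled = randomValue * 100; // a point in [0, 100)
  let cumulative = 0;
  for (const [variation, percent] of Object.entries(allocations)) {
    cumulative += percent;
    if (scaled < cumulative) return variation; // falls in this band
  }
  // Fallback for rounding issues: return the last variation.
  const names = Object.keys(allocations);
  return names[names.length - 1];
}

// Example: 50% original, 25% each for two variations.
console.log(pickVariation({ original: 50, variation1: 25, variation2: 25 }, 0.6));
// → "variation1" (0.6 × 100 = 60 falls in the 50–75 band)
```

In a real tool the draw is made once per visitor and stored (e.g., in a cookie), so a returning visitor keeps seeing the same variation.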
To target your test, click on the "Targeting" link. A window opens where you can define targeting criteria at the page level or the visitor level.
You can target a test by URL, traffic source, visitor behavior, and more. Discover all the targeting methods offered.
Finally, to define the percentage of your traffic allocated to each variation, click on the "Traffic Allocation" link. A window appears in which sliders let you assign a specific percentage to each variation.
If you have other questions, see our help section on setting up tests.
Setting up indicator tracking
Before you run your test, you need to define the objectives you wish to measure. In AB Tasty, you can measure three types of objectives:
- Engagement objectives, which measure visitors' engagement with your content through indicators like bounce rate, time spent on the site, and number of pageviews. These indicators are calculated automatically by AB Tasty; you don't need to set them up. Learn more about the engagement indicators.
- Click objectives. You can easily measure the click rate on important elements you want to track, such as your action buttons (e.g., an add-to-cart button). To do so, select the element in the editor and choose "Expert > Click tracking". Give your click objective a name, and it will automatically appear in the report interface once your test is running. Learn more about adding click tracking.
- URL objectives. Similar to Google Analytics' URL objectives, these record that an objective has been met when a visitor views a specific page (e.g., the checkout confirmation page). To create this type of objective, go to the test's report interface and click on the "New objective" button. A window opens where you can indicate the URL formats of your objective pages. Learn more about adding a URL objective.
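A URL objective is essentially a rule matched against the URL of each page a visitor views: when the rule matches, a conversion is counted. The sketch below is hypothetical (AB Tasty's actual match options may differ) and supports two simple rules, exact match and substring match:

```javascript
// Hypothetical sketch of URL-objective matching. A conversion is
// recorded when the visited URL satisfies the objective's rule.
function urlObjectiveMet(visitedUrl, objective) {
  switch (objective.match) {
    case "exact":
      return visitedUrl === objective.pattern;
    case "contains":
      return visitedUrl.includes(objective.pattern);
    default:
      return false; // unknown rule type: no conversion recorded
  }
}

// Example: count a conversion on any checkout confirmation page.
const objective = { match: "contains", pattern: "/checkout/confirmation" };
console.log(urlObjectiveMet("https://shop.example.com/checkout/confirmation?id=42", objective));
// → true
```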
Reading the test results
After you have run your test, the results are shown in real time in the report interface. To see them, from the Dashboard, click on the “Report” button that corresponds to your test. The report interface is displayed in a new window like the one shown below.
The main information displayed on the screen will allow you to quickly analyze your test results:
- Test name
- Launch date, number of visitors tested, and test status (active, paused)
- Test result sharing tools (send by email, PDF and Excel export, comment area, public address to access the results without an account)
- List of objectives, with the following information for each:
  - Type of objective (action tracking, URL objective)
  - List of variations, showing for each variation:
    - Number of visitors
    - Number of conversions recorded
    - Corresponding conversion rate
    - Percentage of improvement over the original version
    - Statistical reliability of the data
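The conversion figures in the report follow directly from the counts above: each variation's conversion rate is its conversions divided by its visitors, and the improvement compares that rate to the original's. A sketch of the arithmetic (statistical reliability, typically a significance test, is beyond this sketch):

```javascript
// Sketch of the report arithmetic (not AB Tasty's exact formulas).
// Conversion rate = conversions / visitors.
function conversionRate(conversions, visitors) {
  return visitors === 0 ? 0 : conversions / visitors;
}

// Improvement = relative change of the variation's rate vs. the original's.
function improvement(variationRate, originalRate) {
  return originalRate === 0 ? 0 : (variationRate - originalRate) / originalRate;
}

const original = conversionRate(40, 1000);   // 4% conversion rate
const variation = conversionRate(50, 1000);  // 5% conversion rate
console.log(`${(improvement(variation, original) * 100).toFixed(1)}%`);
// → "25.0%" uplift over the original
```

Note that a 25% uplift on small visitor counts may not be statistically reliable, which is why the report shows a reliability figure alongside each rate.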
You can also refine the test results by filtering your data on various criteria, such as traffic origin or visitor behavior. Use the drop-down menu below to adjust the filters for the analyses you want to perform.
See our online help section to learn more about reading and interpreting your test results.