🛠️ [Troubleshooting] There are differences in data between AB Tasty and my analytics 

Many parameters can affect your data analysis and prevent you from comparing data accurately between two tools. For example: 

  • The data that you compare must cover the same scope of analysis
  • Your web analytics tool’s configuration must match the configuration of your tests in AB Tasty
  • The technical and network infrastructure (probes, bots, and crawlers) needs to be taken into consideration as it can affect data collection and create discrepancies in your metrics

 

Calculations

 

It’s important to understand that AB Tasty is not a web analytics tool (like Google Analytics or AT Internet). Its job is to help you decide between several versions of your page by comparing their performance, based on indicators that AB Tasty calculates identically regardless of the differences between the versions. The calculation method is specific to the AB Tasty solution.

 

Web analytics tools each have their own specificities and calculation methods. It is therefore not possible to simply compare AB Tasty data (for example, on unique visitors) with that provided by Google Analytics or AT Internet.

 

AB Tasty measures unique visitors and unique conversions. Before attempting to compare indicators provided by our reporting and that of your analytics tool, you must make sure these metrics are comparable.

 

A common mistake is to compare our visitors metric with Google Analytics’ default visits (sessions) metric. If you do this, you’ll end up with inconsistent data. Instead, you should compare AB Tasty’s figures with the unique visitor numbers in Google Analytics.

 

This calculation also has an impact on the measured goals, because we use metrics that avoid duplicating conversions. If a user converts repeatedly on your site (for example, by purchasing twice during the analysis period), AB Tasty will only record a single conversion.

 

For example, to compare a button click goal set up in AB Tasty, you must implement and monitor the unique events metric in Google Analytics so that the data is comparable.
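To illustrate the principle, here is a minimal sketch in TypeScript of per-visitor conversion deduplication. The ConversionEvent shape is hypothetical and this is not AB Tasty’s actual implementation; it only shows why two purchases by the same visitor count as one conversion.

```typescript
// Hypothetical event shape, for illustration only.
interface ConversionEvent {
  visitorId: string;
  timestamp: number;
}

// Count at most one conversion per visitor, as described above.
function countUniqueConversions(events: ConversionEvent[]): number {
  const converted = new Set<string>();
  for (const event of events) {
    converted.add(event.visitorId); // a repeat purchase adds nothing new
  }
  return converted.size;
}

// Visitor "a" purchases twice but is counted once:
const events: ConversionEvent[] = [
  { visitorId: "a", timestamp: 1 },
  { visitorId: "a", timestamp: 2 },
  { visitorId: "b", timestamp: 3 },
];
console.log(countUniqueConversions(events)); // 2
```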




Configuration

 

⭐ Tip: Before running A/B testing campaigns, we recommend performing a calibration test, called an A/A test.

This will help you confirm:

  • The random distribution of traffic according to the selected traffic modulation
  • The reporting of URL-type and click-type targets
  • The consistency of the data with your web analytics tool

In certain cases, differences may appear between your results in AB Tasty and your analytics tool. Differences of around 10% are acceptable; beyond that, check that your scopes of analysis are the same and that you are comparing identical criteria.

 

Try asking yourself these questions:

 

  • Do you know the exact settings of your web analytics tool?
  • Does your tagging plan contain specific characteristics that affect the measurement of the indicators?
  • Have you excluded certain IP addresses in your web analytics tool that are not excluded in AB Tasty?
  • Is your conversion funnel properly configured in the web analytics tool?
  • Does the URL of your target in the conversion funnel take all possible cases into account? You must ensure that the definition of the conversion target in AB Tasty is identical to that of your web analytics tool.
  • Have you enabled cross-domain tracking on the analytics side? (See the sketch below.)
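If you use Google Analytics 4, cross-domain tracking is enabled through the gtag.js linker setting. A minimal sketch, assuming gtag.js is already loaded on the page and with placeholder domains:

```typescript
// Declared here because gtag is provided globally by the gtag.js snippet.
declare function gtag(...args: unknown[]): void;

// Placeholder domains: replace with the domains your visitors cross between.
gtag('set', 'linker', {
  domains: ['example.fr', 'checkout.example.fr'],
});
```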




Comparison of data from different scopes

 

You may find discrepancies when comparing data between AB Tasty and a web analytics tool. This is because these tools’ scopes are sometimes quite different.

 

Here are some of the most common reasons for these errors:

 

  1. Different tagging plans: AB Tasty’s tag and your analytics tool’s tag are not deployed in the same way on your site. Depending on the age of your site, the work that’s been done on it, and its various versions, the tagging plan’s complexity can differ.
  2. The absence of the AB Tasty tag on certain pages will directly affect the recorded data.
  3. Targeting options change the analysis scope and can narrow what is measured. For example:
    • You target a test on the homepage
    • Your objective is to measure the conversion rate of a variant of this page
    • If the visitor does not see your homepage but arrives on your site via a deeper page and then converts, this conversion will be counted in your analytics tool but not in AB Tasty, as the visitor will not have been included in the test

      It is therefore not appropriate to compare conversions in the two tools. In your web analytics tool, you must restrict your analysis to those users who saw the home page, to compare like with like.
      Overall, you need to be more vigilant in the comparison of data if you use restrictive targeting in your tests.
  4. Incorrectly configured target URLs: make sure you have entered the URL format of your target confirmation page correctly and that you have exhaustively taken all cases into account. It is not uncommon for certain URL formats to be overlooked when configuring objectives, in which case no conversion will be recorded for them.

For example, you have entered the following URL format: “URL must be equal to http://www.example.fr/confirmation-target”.

Ask yourself whether this page can be accessed:

  • Via the SSL protocol? In that case, conversions performed on https://www.example.fr/confirmation-target will not be taken into account
  • Without mention of a sub-domain? In that case, the URL http://example.fr/confirmation-target will not be taken into account
  • With additional URL parameters that you haven’t anticipated? In this case, conversions performed on http://www.example.fr/confirmation-target?parameter=a-specific-case will not be recorded.

You should therefore map out the possible scenarios for these URLs beforehand. You can then use the various operators offered by AB Tasty (exactly equal to, contains, regular expression, etc.) to configure your target URLs correctly, as in the sketch below.
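As a sketch of the regular expression operator, the pattern below matches all of the URL variants listed above (http or https, with or without the www sub-domain, with or without query parameters); adapt it to your own confirmation page.

```typescript
// Matches http or https, an optional "www." sub-domain, and optional query parameters.
const target = /^https?:\/\/(www\.)?example\.fr\/confirmation-target(\?.*)?$/;

const urls = [
  "http://www.example.fr/confirmation-target",
  "https://www.example.fr/confirmation-target",
  "http://example.fr/confirmation-target",
  "http://www.example.fr/confirmation-target?parameter=a-specific-case",
];

for (const url of urls) {
  console.log(url, target.test(url)); // all four print "true"
}
```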

 



Take traffic modulation parameters into consideration

 

If you try to compare visitors and conversions between tools, remember to readjust your calculations according to the chosen traffic modulation. In a test with two variations (original and variant), the traffic modulation is 50/50 by default and all of your visitors are included in the test.

 

If you modulate the traffic and apply, for example, an allocation of 70% to the original and 30% to the variant, only 60% of your traffic will be included in your test (30% for the original and 30% for the variant, so that the compared samples are the same size).

 

While conversion rates are not affected by this modulation (the orders of magnitude will be the same as in your web analytics tool), the number of conversions will necessarily differ.
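As a quick arithmetic check, the sketch below computes the share of traffic actually included in a two-variation test, assuming (as described above) that both samples are capped at the smaller allocation so they stay the same size.

```typescript
// Share of total traffic included in the test for a two-variation allocation,
// assuming both samples are capped at the smaller allocation.
function includedTrafficShare(originalShare: number, variantShare: number): number {
  const sampleShare = Math.min(originalShare, variantShare);
  return 2 * sampleShare; // one equal-sized sample per variation
}

console.log(includedTrafficShare(0.5, 0.5)); // 1.0 -> 100% of traffic is tested
console.log(includedTrafficShare(0.7, 0.3)); // 0.6 -> 60% of traffic is tested
```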




Absence of indicators on the web analytics side

 

Certain targeting options specific to AB Tasty are not visible in your web analytics tool, so you cannot compare this data. For example, if you target a specific type of page based on its content rather than its URL (say, only the pages containing a badge with a promotional offer), you will retrieve conversion data for a more restricted population of users.

 

Unless you have a very precise analytics tagging plan, you will probably not have specific indicators for this population of users in your analytics tool.

 



Time range

 

The time range over which you compare data affects the differences that are visible in your tools. Take the example of comparing results for the same day: on the AB Tasty side, a conversion is linked to a visitor for the total length of the test (several days or weeks). If you filter AB Tasty reporting on a single day, the displayed number of unique visitors will correspond to the number of new visitors who were assigned to the test on that day.

 

The number of conversions displayed will, however, correspond to the conversions made by these new visitors from that day onward, even if a conversion takes place later.

 

Because AB Tasty is not a web analytics tool in the sense of Google Analytics, we do not display the number of conversions carried out on a single day, but the number of conversions completed over time by users who entered the test on a given date.
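To make the date logic concrete, here is a minimal sketch using a hypothetical VisitorRecord shape: filtering on a single day keeps the visitors assigned to the test that day, and counts their conversions whenever they happen.

```typescript
interface VisitorRecord {
  visitorId: string;
  assignedOn: string;         // day the visitor entered the test, "YYYY-MM-DD"
  convertedOn: string | null; // day of conversion, possibly later than assignedOn
}

// Report for one day: visitors assigned that day, plus all of their conversions.
function reportForDay(records: VisitorRecord[], day: string) {
  const cohort = records.filter((r) => r.assignedOn === day);
  return {
    uniqueVisitors: cohort.length,
    conversions: cohort.filter((r) => r.convertedOn !== null).length,
  };
}

const records: VisitorRecord[] = [
  { visitorId: "a", assignedOn: "2024-05-01", convertedOn: "2024-05-01" },
  { visitorId: "b", assignedOn: "2024-05-01", convertedOn: "2024-05-07" }, // later conversion still counts
  { visitorId: "c", assignedOn: "2024-05-02", convertedOn: null },
];
console.log(reportForDay(records, "2024-05-01")); // { uniqueVisitors: 2, conversions: 2 }
```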




Technical infrastructure and network

 

In very specific cases, it is possible that we either do not collect all user data or, on the contrary, collect more data than the analytics tools you use.

 

If the user has a very slow internet connection

 

AB Tasty prevents a test from launching if the page hosting your changes takes more than two seconds to load after the AB Tasty tag requests it. On slower connections (such as 3G), this prevents the visitor from seeing a flickering effect.

 

Mobile users 

 

If your mobile audience is significant, it may create discrepancies with the data from your analytics tool. In this case, it's better to exclude mobile users from your test and compare only the desktop traffic in your web analytics tool.

 

Use of probes

 

If you use probes to mimic the behavior of your users (process tests, debugging, etc.) but their settings differ between third-party tools, this may also create discrepancies with the data from your analytics tool. A common issue arises when probes were configured so as not to distort the data reported by the web analytics tool but were not updated when AB Tasty was integrated; AB Tasty then reports more data than your web analytics tool (see the sketch below).
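As a sketch of keeping both tools’ scopes aligned, the snippet below skips all tracking tags for probe traffic. The user-agent patterns are placeholders for whatever your own probes actually send; the point is to apply the same rule to every tool.

```typescript
// Placeholder patterns: replace with the user agents your probes really use.
const PROBE_PATTERNS = [/my-monitoring-probe/i, /bot|crawler|spider/i];

function isProbe(userAgent: string): boolean {
  return PROBE_PATTERNS.some((pattern) => pattern.test(userAgent));
}

if (typeof navigator !== "undefined" && !isProbe(navigator.userAgent)) {
  // Load the AB Tasty tag and the analytics tag here, so both tools
  // see the same filtered population.
}
```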

 

Heads up ⚡

The latest version of AB Tasty’s tag automatically excludes users of older browsers (Internet Explorer 8 or earlier). These browsers represent a tiny share of web traffic, but if a large share of your users still uses them, this could also explain discrepancies.

 
