BigQuery - Daily exports from AB Tasty to BigQuery

💡 Good to know

This feature is currently in Early Adoption. Please contact your CSM to enroll in the Early Adopter program.

 

Google BigQuery is a serverless, highly scalable, and fully managed data warehouse with a built-in query engine that enables scalable analysis of petabytes of data.

The Google BigQuery integration allows you to export any data collected by AB Tasty’s tracking system to a Google BigQuery dataset on a daily basis.

This article walks you through configuring the connector and then the export.

 

The connector

Step 1: Activate BigQuery 

Go to your Google Cloud console and activate BigQuery.

 

Step 2: Create a service account

The service account is used to generate your credentials (in JSON format). More information on service accounts here.

  1. Go to the service accounts page
  2. Click on “Create service account”
  3. Enter a name and a description
  4. Add a role: you need to add “BigQuery Admin”
  5. Click on “Done” to validate.

 

Step 3: Create and export keys

Now that the service account is created, we will create the credential keys and export them.

  1. Click on the newly created service account
  2. Go to the “Keys” tab, then click “Add key” > “Create new key”
  3. Select “JSON” and click “Create”
  4. Download the key.

 

The content of the key should look like this: 
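Below is a reconstructed example with placeholder values; your actual key contains real identifiers and a full private key:

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef0123456789abcdef01234567",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "your-service-account@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/your-service-account%40your-project-id.iam.gserviceaccount.com"
}
```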

 

Step 4: Create a new dataset

  1. Go back to BigQuery
  2. In the Explorer menu, choose your GCP project, click the three dots, then click “Create dataset”

 

  3. Give it an ID and a location (no other mandatory options)
  4. Click “Create dataset”

Your dataset is now created and should appear in the Explorer menu. Click on your dataset to display its details.

More information on how to create a BigQuery dataset can be found here.
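If you prefer to create the dataset programmatically, here is a minimal sketch using the google-cloud-bigquery Python client; the key path, project ID, dataset ID, and location are placeholders to replace with your own values:

```python
from google.cloud import bigquery

# Authenticate with the service account key downloaded in Step 3
# ("path/to/key.json" is a placeholder path).
client = bigquery.Client.from_service_account_json("path/to/key.json")

# "your-project-id" and "abtasty_exports" are placeholder names.
dataset = bigquery.Dataset("your-project-id.abtasty_exports")
dataset.location = "EU"  # the location you will enter in the AB Tasty connector

# exists_ok=True makes the call safe to rerun if the dataset already exists.
dataset = client.create_dataset(dataset, exists_ok=True)
print(f"Created dataset {dataset.full_dataset_id} in {dataset.location}")
```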

 

Step 5: Set up the connector in AB Tasty

  1. In AB Tasty, go to the Integration Hub page > Databases > BigQuery > Set up connector
  2. Enter a name for the connector
  3. Enter the dataset location (info can be found in the details of the created dataset)
  4. Enter the dataset ID (shown in the details of the created dataset). The full ID has the form project_id.dataset_id: copy and paste the part to the right of the “.”
  5. Enter the project ID where your dataset is located: copy and paste the part to the left of the “.”
  6. Choose Service account as the Authorization Method
  7. JSON credentials: paste the content of the key (JSON file) that was downloaded when you created the credentials.
  8. Click on “Test connection”
  9. Validate by clicking on “Next step”.


💡 Good to know

You will get an error message if one of the fields contains an error.
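If “Test connection” fails and you want to check the credentials outside AB Tasty, a quick sanity check is to fetch the dataset with the same key (a sketch reusing the placeholder names from the previous steps):

```python
from google.cloud import bigquery

# Same placeholder key path and dataset ID as in the earlier sketch.
client = bigquery.Client.from_service_account_json("path/to/key.json")

# Fails with NotFound or Forbidden if the dataset ID, the project ID,
# or the service account's "BigQuery Admin" role is wrong.
dataset = client.get_dataset("your-project-id.abtasty_exports")
print(dataset.dataset_id, dataset.location)
```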

 

Your connector is now set up, and you can proceed to set up your export.

 

The export 

To set up your daily export, please refer to the guide: Databases integration.

Step 1: Generate your payload

Refer to the Databases article to create your payload.

Step 2: Set up the export

  1. Export name: give your export an explicit name so you can easily retrieve it in AB Tasty
  2. Name of the table: the name of the table we will create in your BigQuery dataset
  3. Data explorer query: paste the payload of your Data Explorer query here
  4. Click “Save and create”.

The Google BigQuery integration is now complete, and you will soon see the data flowing into your dedicated database (it can take up to 2–3 hours, depending on the size of your report).
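Once the first export has run, you can sanity-check the new table from the BigQuery console or programmatically; in this sketch, your_export_table stands for the table name you chose in step 2:

```python
from google.cloud import bigquery

client = bigquery.Client.from_service_account_json("path/to/key.json")

# "your_export_table" is a placeholder for the table name chosen in step 2.
query = """
    SELECT COUNT(*) AS row_count
    FROM `your-project-id.abtasty_exports.your_export_table`
"""
for row in client.query(query).result():
    print(f"Rows exported so far: {row.row_count}")
```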

 

💡 Good to know

The export is activated upon creation, and new data will be appended to the existing table daily.

You can only have one export per day on the free plan. When creating a new export, all previous ones will be automatically deactivated.
Please contact your KAM to upgrade your plan if you need more exports.
