This feature is currently available in Early Adoption. Please contact your CSM to enroll in the Early Adopter program.
Snowflake is a cloud data warehouse that can store and analyze all your data records in one place. It automatically scales its compute resources up and down to load, integrate, and analyze data.
The connector
Step 1: Create the worksheet
From your Snowflake console, complete the following steps:
- Create a Snowflake account
- Go into your account and create a new Snowflake SQL worksheet
- Copy the code below (replace the annotated fields: role, username, password, etc.)
-- set variables (these need to be uppercase)
set abt_snowflake_role = 'YOUR_ROLE'; // field to be defined by you
set abt_snowflake_username = 'YOUR_USER_NAME'; // field to be defined by you
set abt_snowflake_warehouse = 'YOUR_WAREHOUSE'; // field to be defined by you
set abt_snowflake_database = 'YOUR_DATABASE'; // field to be defined by you
set abt_snowflake_schema = 'YOUR_SCHEMA'; // field to be defined by you
-- set user password
set airbyte_password = 'your_password'; // field to be defined by you
begin;
-- create your role
use role securityadmin;
create role if not exists identifier($abt_snowflake_role);
grant role identifier($abt_snowflake_role) to role SYSADMIN;
-- create your user
create user if not exists identifier($abt_snowflake_username)
password = $airbyte_password
default_role = $abt_snowflake_role
default_warehouse = $abt_snowflake_warehouse;
grant role identifier($abt_snowflake_role) to user identifier($abt_snowflake_username);
-- change role to sysadmin for warehouse / database steps
use role sysadmin;
-- create your warehouse
create warehouse if not exists identifier($abt_snowflake_warehouse)
warehouse_size = xsmall
warehouse_type = standard
auto_suspend = 60
auto_resume = true
initially_suspended = true;
-- create your database
create database if not exists identifier($abt_snowflake_database);
-- grant your warehouse access
grant USAGE
on warehouse identifier($abt_snowflake_warehouse)
to role identifier($abt_snowflake_role);
-- grant your database access
grant OWNERSHIP
on database identifier($abt_snowflake_database)
to role identifier($abt_snowflake_role);
commit;
begin;
USE DATABASE identifier($abt_snowflake_database);
-- create schema for Airbyte data
CREATE SCHEMA IF NOT EXISTS identifier($abt_snowflake_schema);
commit;
begin;
-- grant Airbyte schema access
grant OWNERSHIP
on schema identifier($abt_snowflake_schema)
to role identifier($abt_snowflake_role);
commit;
- Paste the code into the console and run it
- Once the script has finished, refresh the page. In the left-hand panel, a new entry will appear with the database name you set in the script:
set abt_snowflake_database = 'YOUR_DATABASE'; // field to be defined by you
In the above example, the name of the database is YOUR_DATABASE
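Optionally, before moving on, you can confirm from the same worksheet that the script created everything as expected. This is only a minimal sketch, assuming you kept the placeholder names from the script (YOUR_ROLE, YOUR_WAREHOUSE, YOUR_DATABASE, YOUR_SCHEMA); replace them with your own values:
-- optional check: list the objects created by the script
show roles like 'YOUR_ROLE';
show warehouses like 'YOUR_WAREHOUSE';
show databases like 'YOUR_DATABASE';
show schemas like 'YOUR_SCHEMA' in database YOUR_DATABASE;
-- confirm the privileges granted to the role
show grants to role YOUR_ROLE;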
Step 2: Get the Snowflake host
On the Snowflake homepage, click your account name at the bottom left of the page and copy the account URL.
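The account URL generally follows the pattern https://<account_identifier>.snowflakecomputing.com, but copy the exact value displayed in the Snowflake UI. If you cannot locate it, the sketch below (a hedged example, not part of the official setup) retrieves the account identifier and region from a worksheet:
-- optional: look up the account identifier and region behind the account URL
select current_account(), current_region();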
Step 3: Set up the connector
- Go to the Integrations page > Data Warehouse > Snowflake > Set up connector
- Enter a name for the connector
- Host: the account URL you copied in Step 2
- Role: the role we created in the previous script
- Warehouse: the warehouse we created in the previous script
- Data Warehouse: the database we created in the previous script
- Authorization method: choose username and password
- Username: the username we created in the previous script
- Password: the password we created in the previous script
- Loading Method: choose internal staging
If one of the fields contains an error, you will get an error message.
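If the connector cannot connect, you can sanity-check the credentials from a Snowflake worksheet before retrying. A minimal sketch, assuming you log in as the user created by the script (its default role and warehouse are set by the script) and kept the placeholder names; replace them with your own values:
-- optional check: run while logged in as the user created by the script
select current_user(), current_role(), current_warehouse();
-- the database and schema used by the connector should be accessible
use schema YOUR_DATABASE.YOUR_SCHEMA;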
Your connector is now set up, and you can proceed to set up your Export.
The export
To set up your daily export, please refer to the guide: Data Warehouse integrations: General information.
Step 1: Generate your payload
Refer to the Data Warehouse article to create your payload.
Step 2: Set up the export
- Export name: the name of your export; give it an explicit name so you can easily retrieve it in AB Tasty
- Name of the table: the name of the table we will create in your Snowflake database
- Data exporter query: paste the payload of your Data Explorer query here
- Click save and create.
The Snowflake integration is now complete, and you will soon see data flowing into your dedicated Data Warehouse (it can take up to 2–3 hours, depending on the size of your report).
The export is activated upon creation, and new data is appended to the existing table daily.
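Once the first export has run, you can check the data directly from a Snowflake worksheet. A minimal sketch, assuming you are connected as the user created by the script and that YOUR_TABLE stands for the table name you entered in the export setup (a hypothetical placeholder):
-- optional: confirm that the export lands in the expected table and grows daily
select count(*) as row_count
from YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE;
-- preview a few rows
select *
from YOUR_DATABASE.YOUR_SCHEMA.YOUR_TABLE
limit 10;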