A guide on how to send event data to your own Google BigQuery instance.
Please note that the event pipeline is not a self-service product. We set it up for you on your own infrastructure in an individual project, tailored to your needs. This guide is therefore only relevant for existing customers.
With our event pipeline, you will get full access to the underlying raw data in Google BigQuery.

1. Set up a BigQuery project

Create a Google Cloud Platform (GCP) project if you do not have an existing one. You'll find more details here.

2. Set up a Service Account

Create a new service account in BigQuery under IAM & Admin > Service Accounts and assign it the BigQuery Data Editor role. Next, create a JSON key for the service account and download it.
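If you prefer the command line over the Cloud Console, the same setup can be done with the gcloud CLI. This is a sketch under assumptions: the account name event-pipeline and the key file name key.json are placeholders, and MY_GCP_PROJECT stands for your project ID.

```shell
# Create the service account (the name "event-pipeline" is just an example).
gcloud iam service-accounts create event-pipeline \
  --project=MY_GCP_PROJECT \
  --display-name="Event Pipeline"

# Grant it the BigQuery Data Editor role on the project.
gcloud projects add-iam-policy-binding MY_GCP_PROJECT \
  --member="serviceAccount:event-pipeline@MY_GCP_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

# Create and download a JSON key for the account.
gcloud iam service-accounts keys create key.json \
  --iam-account="event-pipeline@MY_GCP_PROJECT.iam.gserviceaccount.com"
```

The contents of key.json are what you later pass as the service account credentials.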

3. Optional: Create a dataset

A dataset will be created automatically for you if it doesn't exist. If you prefer, you can also create the dataset in BigQuery yourself. We recommend that EU customers choose the location EU.
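If you do create the dataset yourself, the bq CLI (part of the Google Cloud SDK) is a quick alternative to the Console. The project and dataset names below are placeholders:

```shell
# Create the dataset in the EU location (recommended for EU customers).
bq --location=EU mk --dataset MY_GCP_PROJECT:MY_DATASET
```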

4. Activate the Google BigQuery destination

The destination type is google_bigquery:
curl -X 'POST' \
  '' \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "Google BigQuery",
    "type": "google_bigquery",
    "config": {
      "projectId": "MY_GCP_PROJECT",
      "dataset": "MY_DATASET",
      "serviceAccount": "SERVICE_ACCOUNT_JSON"
    }
  }'
You can optionally specify custom table names for sessions, events, and entities by setting the config parameters tableSessions, tableEvents, and tableEntities.
All tables will be created automatically for you.
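For example, a payload that overrides all three table names could look like this. The table names my_sessions, my_events, and my_entities are placeholders; writing the payload to a file first keeps the curl call short.

```shell
# Example destination payload with optional custom table names.
cat > destination.json <<'EOF'
{
  "name": "Google BigQuery",
  "type": "google_bigquery",
  "config": {
    "projectId": "MY_GCP_PROJECT",
    "dataset": "MY_DATASET",
    "serviceAccount": "SERVICE_ACCOUNT_JSON",
    "tableSessions": "my_sessions",
    "tableEvents": "my_events",
    "tableEntities": "my_entities"
  }
}
EOF
```

You can then send this file with the same curl call as above by replacing the inline body with -d @destination.json.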