Google BigQuery
A guide on how to send event data to your own Google BigQuery instance.
With our Google BigQuery destination, you get full access to the underlying raw event data.

1. Set up a BigQuery project before adding it as a destination in elbwalker:

Create a Google Cloud Platform (GCP) project if you don't already have one. See the Google Cloud documentation for details.

2. Set up a Service Account

Create a new service account in the Google Cloud console under IAM & Admin > Service Accounts and grant it the BigQuery Data Editor role. Then create a JSON key for the service account.
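The same setup can be sketched with the gcloud CLI; the service account name (elbwalker-bq), project ID (MY_GCP_PROJECT), and key file name (key.json) are placeholders, not values elbwalker prescribes:

```shell
# Create a service account in your GCP project (names are placeholders).
gcloud iam service-accounts create elbwalker-bq \
  --project=MY_GCP_PROJECT

# Grant it the BigQuery Data Editor role.
gcloud projects add-iam-policy-binding MY_GCP_PROJECT \
  --member="serviceAccount:elbwalker-bq@MY_GCP_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

# Create the JSON key you will paste into the destination config.
gcloud iam service-accounts keys create key.json \
  --iam-account=elbwalker-bq@MY_GCP_PROJECT.iam.gserviceaccount.com
```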

3. Optional: Create a dataset

A dataset will be created automatically for you if it doesn't exist. If you prefer, you can also create the dataset in BigQuery yourself. We recommend that EU customers choose the EU location.
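If you create the dataset yourself, one way is the bq command-line tool; project and dataset names are placeholders:

```shell
# Optional: create the dataset manually, pinned to the EU location
# (recommended for EU customers). MY_GCP_PROJECT and MY_DATASET are
# placeholders for your own project ID and dataset name.
bq mk --location=EU --dataset MY_GCP_PROJECT:MY_DATASET
```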

Activate the Google BigQuery destination

API

The destination type is google_bigquery.
```shell
curl -X 'POST' \
  'https://app.p.elbwalkerapis.com/projects/PROJECTID/destinations' \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "Google BigQuery",
    "type": "google_bigquery",
    "config": {
      "projectId": "MY_GCP_PROJECT",
      "dataset": "MY_DATASET",
      "serviceAccount": "SERVICE_ACCOUNT_JSON"
    }
  }'
```
You can optionally specify custom table names for sessions, events, and entities by setting the config parameters tableSessions, tableEvents, and tableEntities.
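For scripted setups, the request payload above can be built programmatically. This is a minimal sketch; build_destination_payload and its table_names parameter are illustrative helpers, not part of the elbwalker API:

```python
import json

def build_destination_payload(project_id, dataset, service_account_json,
                              table_names=None):
    """Build the JSON body for the destinations endpoint (sketch).

    table_names may carry the optional config parameters, e.g.
    {"tableSessions": ..., "tableEvents": ..., "tableEntities": ...}.
    """
    config = {
        "projectId": project_id,
        "dataset": dataset,
        "serviceAccount": service_account_json,
    }
    if table_names:
        config.update(table_names)
    return {
        "name": "Google BigQuery",
        "type": "google_bigquery",
        "config": config,
    }

payload = build_destination_payload(
    "MY_GCP_PROJECT", "MY_DATASET", "SERVICE_ACCOUNT_JSON",
    table_names={"tableEvents": "my_events"},
)
print(json.dumps(payload, indent=2))
```

The resulting dict can then be sent as the `-d` body of the curl call above with any HTTP client.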
Web App
  • Navigate to destinations and add a new Google BigQuery destination.
  • Specify your Google Cloud project ID and dataset, as well as a JSON service account key with the BigQuery Data Editor role.
  • Activate the Google BigQuery destination with the toggle and you're done βœ…
All tables will be created automatically for you.
Got questions? Contact us, we can help πŸ€“