Configuration
A guide on how to send event data to your own Google BigQuery instance.
Please note that the event pipeline is not a self-service product. We set it up for you in an individual project on your own infrastructure, based on your needs. This guide is only relevant to customers.
With our event pipeline, you will get full access to the underlying raw data in Google BigQuery.
Create a Google Cloud Platform (GCP) project if you do not have an existing one. You'll find more details here.
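If you prefer the command line, a project can also be created with the gcloud CLI. A minimal sketch, using a placeholder project ID:

# Placeholder project ID; replace with your own
gcloud projects create my-event-pipeline-project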
Create a new service account in your GCP project under IAM & Admin > Service Accounts and grant it the BigQuery Data Editor role. Next, create a JSON key for the service account. A dataset will be created automatically for you if it doesn't exist, but you can also create one in BigQuery yourself. We recommend that all EU customers choose the EU location for the dataset.
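The same setup can also be done from the command line with the gcloud and bq CLIs. A minimal sketch; the service account name, project ID, and dataset name below are placeholders:

# Create the service account (placeholder name)
gcloud iam service-accounts create elbwalker-pipeline --project=MY_GCP_PROJECT

# Grant it the BigQuery Data Editor role
gcloud projects add-iam-policy-binding MY_GCP_PROJECT \
  --member="serviceAccount:elbwalker-pipeline@MY_GCP_PROJECT.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataEditor"

# Create a JSON key for the service account
gcloud iam service-accounts keys create key.json \
  --iam-account=elbwalker-pipeline@MY_GCP_PROJECT.iam.gserviceaccount.com

# Optional: create the dataset yourself, e.g. in the EU location
bq --location=EU mk --dataset MY_GCP_PROJECT:MY_DATASET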
API
The destination type is google_bigquery.
curl -X 'POST' \
  'https://app.p.elbwalkerapis.com/projects/PROJECTID/destinations' \
  -H 'Authorization: Bearer TOKEN' \
  -H 'Content-Type: application/json' \
  -d '{
    "name": "Google BigQuery",
    "type": "google_bigquery",
    "config": {
      "projectId": "MY_GCP_PROJECT",
      "dataset": "MY_DATASET",
      "serviceAccount": "SERVICE_ACCOUNT_JSON"
    }
  }'
You can optionally specify custom table names for sessions, events, and entities by setting the config parameters tableSessions, tableEvents, and tableEntities, as shown below. All tables will be created automatically for you.
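For illustration, the config object from the request above might then look like this; the table names are placeholder values:

"config": {
  "projectId": "MY_GCP_PROJECT",
  "dataset": "MY_DATASET",
  "serviceAccount": "SERVICE_ACCOUNT_JSON",
  "tableSessions": "my_sessions",
  "tableEvents": "my_events",
  "tableEntities": "my_entities"
}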