# Google BigQuery
Load event data into Google BigQuery for large-scale analytics, reporting, and machine learning workloads.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| project_id | string | Yes | The Google Cloud project ID that contains the target BigQuery dataset. |
| dataset_id | string | Yes | The BigQuery dataset ID where the table resides. |
| table_id | string | Yes | The BigQuery table ID to insert rows into. The table must already exist with a compatible schema. |
| service_account_json | secret | Yes | The full JSON key file content for a GCP service account with BigQuery Data Editor permissions. |
| location | select | Yes | The geographic location of the BigQuery dataset. |
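Before installing, it can help to sanity-check the configuration values against the table above. The sketch below is a hypothetical pre-flight check, not a validation the install API performs; the rules (non-empty strings, parseable service-account key) are assumptions:

```python
import json

# Required fields from the configuration table above.
REQUIRED_FIELDS = ("project_id", "dataset_id", "table_id",
                   "service_account_json", "location")

def validate_config(config: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means the
    config looks sane. These checks are assumptions, not enforced rules."""
    problems = []
    for field in REQUIRED_FIELDS:
        value = config.get(field)
        if not isinstance(value, str) or not value.strip():
            problems.append(f"missing or empty field: {field}")
    key = config.get("service_account_json", "")
    if key:
        try:
            # GCP service-account key files carry "type": "service_account".
            if json.loads(key).get("type") != "service_account":
                problems.append('service_account_json is not a "service_account" key')
        except ValueError:
            problems.append("service_account_json is not valid JSON")
    return problems
```

Run it on your intended config before calling the install endpoint; any returned strings point at fields to fix.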
## Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Google BigQuery or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default configuration.
## API Setup
```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/google_bigquery/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Google BigQuery",
    "variant": "default",
    "config": {
      "project_id": "my-gcp-project",
      "dataset_id": "datafly_events",
      "table_id": "events",
      "service_account_json": "<service account key file JSON>",
      "location": "US"
    },
    "delivery_mode": "server_side"
  }'
```

## Delivery
Events are delivered server-side from your Datafly Signal infrastructure directly to Google BigQuery. No client-side scripts are loaded for this integration.
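Conceptually, server-side delivery maps each event onto a row in BigQuery's streaming-insert request body (the `tabledata.insertAll` REST payload). The sketch below illustrates that mapping; the event shape and the use of `event_id` as the deduplication key are assumptions, not a documented Datafly Signal contract:

```python
def to_insert_all_payload(events: list[dict]) -> dict:
    """Wrap events in a tabledata.insertAll request body.

    Each row carries an insertId, which BigQuery uses for best-effort
    deduplication of retried inserts. Reusing the event's own id as the
    insertId is an assumption about the event shape.
    """
    return {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [
            {"insertId": event["event_id"], "json": event}
            for event in events
        ],
    }

payload = to_insert_all_payload([
    {"event_id": "evt_1", "name": "page_view", "user_id": "u42"},
])
```

The `json` object in each row must match the target table's schema, which is why the table must already exist with compatible columns.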
See the Google BigQuery documentation for credential setup instructions and full API details.