Google BigQuery

Load event data into Google BigQuery for large-scale analytics, reporting, and machine learning workloads.

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| project_id | string | Yes | The Google Cloud project ID that contains the target BigQuery dataset. |
| dataset_id | string | Yes | The BigQuery dataset ID where the table resides. |
| table_id | string | Yes | The BigQuery table ID to insert rows into. The table must already exist with a compatible schema. |
| service_account_json | secret | Yes | The full JSON key file content for a GCP service account with BigQuery Data Editor permissions. |
| location | select | Yes | The geographic location of the BigQuery dataset. |
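
As a sketch, a complete configuration might look like the following; the project, dataset, table, and location values here are placeholders, and the key file content is elided:

```json
{
  "project_id": "my-gcp-project",
  "dataset_id": "datafly_events",
  "table_id": "events",
  "service_account_json": "<contents of the downloaded service account key file>",
  "location": "US"
}
```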

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Google BigQuery or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/google_bigquery/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Google BigQuery",
    "variant": "default",
    "config": {
      "project_id": "my-gcp-project",
      "dataset_id": "datafly_events",
      "table_id": "events"
    },
    "delivery_mode": "server_side"
  }'
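
The same install request can be issued from Python with only the standard library. This is a minimal sketch mirroring the curl call above; the endpoint, token, and config values are placeholders for your own deployment:

```python
import json
import urllib.request

# Placeholder endpoint and token; substitute values for your deployment.
API_URL = "http://localhost:8084/v1/admin/integration-catalog/google_bigquery/install"
TOKEN = "YOUR_TOKEN"

payload = {
    "name": "Google BigQuery",
    "variant": "default",
    "config": {
        "project_id": "my-gcp-project",
        "dataset_id": "datafly_events",
        "table_id": "events",
    },
    "delivery_mode": "server_side",
}

def build_request():
    # Encode the JSON body and attach the auth header, as in the curl example.
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=data,
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request()
# urllib.request.urlopen(req) would send the request; omitted here.
print(req.get_method(), req.full_url)
```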

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Google BigQuery. No client-side scripts are loaded for this integration.
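
Datafly Signal's internal delivery pipeline is not exposed, but server-side delivery to BigQuery generally means shaping each event into the request body of BigQuery's streaming `tabledata.insertAll` API, where every row carries an `insertId` that BigQuery uses for best-effort deduplication. A minimal sketch with illustrative event fields:

```python
import json
import uuid

def to_insert_all_payload(events):
    # Shape a batch of event dicts into the tabledata.insertAll request body.
    # Each row's "json" object must match the target table's schema, and
    # "insertId" gives BigQuery a best-effort deduplication key per row.
    return {
        "kind": "bigquery#tableDataInsertAllRequest",
        "rows": [
            {"insertId": str(uuid.uuid4()), "json": event}
            for event in events
        ],
    }

batch = to_insert_all_payload([
    {"event_name": "page_view", "user_id": "u_123", "ts": "2024-01-01T00:00:00Z"},
])
print(json.dumps(batch, indent=2))
```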

See the Google BigQuery documentation for full API details and credential setup instructions.