# Google Bigtable

Load event data into Google Bigtable for high-throughput, low-latency NoSQL storage at massive scale.

## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| project_id | string | Yes | The Google Cloud project ID that contains the Bigtable instance. |
| instance_id | string | Yes | The Bigtable instance ID. |
| table_id | string | Yes | The Bigtable table ID to write rows into. |
| column_family | string | No | The column family to write data to. Defaults to `events`. |
| service_account_json | secret | Yes | The full JSON key-file content for a GCP service account with the Bigtable User role. |
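The `service_account_json` value is the key file's full JSON content, not a path to the file. A minimal pre-flight check might look like the sketch below; the `validate_service_account_json` helper is hypothetical, not part of the product, and the required keys listed are the ones present in every GCP service-account key file.

```python
import json

# Fields present in every GCP service-account key file (subset shown).
REQUIRED_KEYS = {"type", "project_id", "private_key", "client_email"}

def validate_service_account_json(raw: str) -> dict:
    """Parse key-file content and check it looks like a service-account key."""
    key = json.loads(raw)  # raises ValueError if the content is not valid JSON
    if key.get("type") != "service_account":
        raise ValueError("not a service-account key (type != 'service_account')")
    missing = REQUIRED_KEYS - key.keys()
    if missing:
        raise ValueError(f"key file is missing fields: {sorted(missing)}")
    return key
```

Running this locally before installing the integration catches a common mistake: pasting a user-credentials file (`"type": "authorized_user"`) instead of a service-account key.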
## Quick Setup
1. Navigate to Integrations in the sidebar.
2. Open the Integration Library tab.
3. Find Google Bigtable or filter by the Cloud Storage category.
4. Click Install, select a variant if available, and fill in the required fields.
5. Click Install Integration to create the integration with a ready-to-use default configuration.
## API Setup

```shell
curl -X POST http://localhost:8084/v1/admin/integration-catalog/google_bigtable/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Google Bigtable",
    "variant": "default",
    "config": {
      "project_id": "my-gcp-project",
      "instance_id": "my-bigtable-instance",
      "table_id": "events",
      "column_family": "events",
      "service_account_json": "<service-account-key-json>"
    },
    "delivery_mode": "server_side"
  }'
```

## Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Google Bigtable. No client-side scripts are loaded for this integration.
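Conceptually, a server-side writer turns each event into one Bigtable row, with event properties stored as cells in the configured column family. The sketch below illustrates one way that could look using the `google-cloud-bigtable` client; the row-key scheme (event type plus a reversed timestamp so newer events sort first) is an illustration, not the integration's documented layout.

```python
from datetime import datetime, timezone

# Upper bound for millisecond timestamps, used to reverse sort order
# (Bigtable stores rows lexicographically by row key).
MAX_MILLIS = 10**13

def event_row_key(event_type: str, occurred_at: datetime) -> bytes:
    """Build a row key like b'page_view#8295932800000' (newest first)."""
    millis = int(occurred_at.timestamp() * 1000)
    return f"{event_type}#{MAX_MILLIS - millis:013d}".encode()

def write_event(table, column_family: str, event: dict) -> None:
    """Write one event as a row; `table` is a google-cloud-bigtable Table."""
    row = table.direct_row(event_row_key(event["type"], event["occurred_at"]))
    for field, value in event["properties"].items():
        row.set_cell(column_family, field, str(value).encode())
    row.commit()
```

Prefixing the row key with the event type keeps each type's events contiguous, so a prefix scan retrieves the most recent events of one type without touching the rest of the table.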
See the Google Cloud Bigtable documentation for full API details and credential setup instructions.