# Google Cloud Storage
Store event data as objects in Google Cloud Storage buckets for archival, data lake, and batch processing use cases.
> ⚠️ This integration is currently in beta. Configuration and behaviour may change.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| `bucket_name` | string | Yes | The name of the GCS bucket where event files will be written. |
| `project_id` | string | Yes | The Google Cloud project ID that owns the bucket. |
| `service_account_json` | secret | Yes | The full JSON key file content for a GCP service account with the Storage Object Creator role on the bucket. |
| `prefix` | string | No | Optional prefix (folder path) prepended to all object names. Include a trailing slash. |
| `file_format` | select | Yes | The output file format for event data written to GCS. |
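The fields above can be sanity-checked before you install. The sketch below is illustrative, not part of the product: `validate_config` is a hypothetical helper, and the bucket-name pattern follows GCS's documented basics (3–63 characters; lowercase letters, digits, dots, dashes, underscores; must start and end with a letter or digit).

```python
import json
import re

# GCS bucket-name basics: 3-63 chars of lowercase letters, digits,
# dots, dashes, underscores; starts and ends with a letter or digit.
_BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9._-]{1,61}[a-z0-9]$")


def validate_config(config: dict) -> list[str]:
    """Return a list of problems with an integration config (empty means OK)."""
    problems = []
    # Required fields from the configuration table.
    for field in ("bucket_name", "project_id", "service_account_json", "file_format"):
        if not config.get(field):
            problems.append(f"missing required field: {field}")
    bucket = config.get("bucket_name", "")
    if bucket and not _BUCKET_RE.match(bucket):
        problems.append(f"invalid bucket name: {bucket!r}")
    # The prefix is documented to include a trailing slash.
    prefix = config.get("prefix")
    if prefix and not prefix.endswith("/"):
        problems.append("prefix should include a trailing slash")
    # The secret must be the full JSON key file content, not a path.
    key = config.get("service_account_json")
    if key:
        try:
            if json.loads(key).get("type") != "service_account":
                problems.append('service_account_json is not a "service_account" key')
        except ValueError:
            problems.append("service_account_json is not valid JSON")
    return problems
```

A config that passes returns an empty list; each problem string names the offending field.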
## Quick Setup
1. Navigate to Integrations in the sidebar.
2. Open the Integration Library tab.
3. Search for Google Cloud Storage, or filter by the Cloud Storage category.
4. Click Install, select a variant if available, and fill in the required fields.
5. Click Install Integration to create the integration with a ready-to-use default configuration.
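Step 4 asks for a service account JSON key with Storage Object Creator permission on the bucket. If you have not provisioned one yet, the commands below sketch one way to do it with `gcloud`/`gsutil`; the account, project, and bucket names are placeholders matching the example config.

```shell
# Create a dedicated service account (name is a placeholder).
gcloud iam service-accounts create datafly-gcs-writer \
  --project=my-gcp-project

# Grant Storage Object Creator on the target bucket only.
gsutil iam ch \
  "serviceAccount:datafly-gcs-writer@my-gcp-project.iam.gserviceaccount.com:roles/storage.objectCreator" \
  gs://my-datafly-events

# Download the JSON key; paste its full content into service_account_json.
gcloud iam service-accounts keys create key.json \
  --iam-account=datafly-gcs-writer@my-gcp-project.iam.gserviceaccount.com
```

Scoping the role to the bucket (rather than the project) keeps the key's blast radius small if it ever leaks.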
## API Setup
```shell
curl -X POST http://localhost:8084/v1/admin/integration-catalog/google_cloud_storage/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Google Cloud Storage",
    "variant": "default",
    "config": {
      "bucket_name": "my-datafly-events",
      "project_id": "my-gcp-project",
      "service_account_json": "YOUR_SERVICE_ACCOUNT_JSON"
    },
    "delivery_mode": "server_side"
  }'
```
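For scripted installs, the same request can be built with Python's standard library. This is a sketch mirroring the curl call above; `build_install_request` is a hypothetical helper, and the token and endpoint are placeholders.

```python
import json
from urllib import request


def build_install_request(token: str,
                          base_url: str = "http://localhost:8084") -> request.Request:
    """Build (but do not send) the catalog install request shown above."""
    payload = {
        "name": "Google Cloud Storage",
        "variant": "default",
        "config": {
            "bucket_name": "my-datafly-events",
            "project_id": "my-gcp-project",
            "service_account_json": "YOUR_SERVICE_ACCOUNT_JSON",
        },
        "delivery_mode": "server_side",
    }
    return request.Request(
        f"{base_url}/v1/admin/integration-catalog/google_cloud_storage/install",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# urllib.request.urlopen(build_install_request("YOUR_TOKEN")) would send it.
```

Keeping request construction separate from sending makes the payload easy to inspect or log before it reaches the admin API.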
## Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Google Cloud Storage. No client-side scripts are loaded for this integration.
See the Google Cloud Storage documentation for full API details and credential setup instructions.