Azure Blob Storage
Store event data as blobs in Azure Blob Storage containers for data lake, archival, and analytics workloads.
Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| storage_account | string | Yes | The name of the Azure Storage account. |
| container_name | string | Yes | The name of the blob container where event files will be written. |
| connection_string | secret | Yes | The Azure Storage connection string with write access to the container. Found under Access Keys in the Azure portal. |
| prefix | string | No | Optional prefix (virtual directory) prepended to all blob names. Include a trailing slash. |
| file_format | select | Yes | The output file format for event data written to Azure Blob Storage. |
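As a sketch, a filled-in configuration might look like the following. All values are placeholders, and `"jsonl"` is an assumed example value for `file_format` (the available options are not listed here). Note the trailing slash on `prefix`:

```json
{
  "storage_account": "mydataflystorage",
  "container_name": "datafly-events",
  "connection_string": "YOUR_CONNECTION_STRING",
  "prefix": "events/",
  "file_format": "jsonl"
}
```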
Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Azure Blob Storage or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default configuration.
API Setup
```shell
curl -X POST http://localhost:8084/v1/admin/integration-catalog/azure_blob/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Azure Blob Storage",
    "variant": "default",
    "config": {
      "storage_account": "mydataflystorage",
      "container_name": "datafly-events",
      "connection_string": "YOUR_CONNECTION_STRING"
    },
    "delivery_mode": "server_side"
  }'
```
Delivery
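If you script the install rather than calling curl by hand, the request body above can be built programmatically. The sketch below is a hypothetical helper, not part of any official SDK; the field names mirror the curl example, and the optional `prefix` and `file_format` values are illustrative assumptions.

```python
import json

def build_install_payload(storage_account, container_name, connection_string,
                          prefix=None, file_format=None):
    """Build the JSON body for the azure_blob install endpoint.

    Hypothetical helper: field names follow the curl example above;
    adjust to match your deployment before sending.
    """
    config = {
        "storage_account": storage_account,
        "container_name": container_name,
        "connection_string": connection_string,
    }
    # Optional fields are only included when explicitly set.
    if prefix is not None:
        config["prefix"] = prefix
    if file_format is not None:
        config["file_format"] = file_format
    return {
        "name": "Azure Blob Storage",
        "variant": "default",
        "config": config,
        "delivery_mode": "server_side",
    }

payload = build_install_payload("mydataflystorage", "datafly-events",
                                "YOUR_CONNECTION_STRING",
                                prefix="events/", file_format="jsonl")
body = json.dumps(payload)  # ready to POST with your HTTP client of choice
```

The payload can then be sent to `/v1/admin/integration-catalog/azure_blob/install` with any HTTP client, using the same `Authorization` and `Content-Type` headers shown in the curl example.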
Events are delivered server-side from your Datafly Signal infrastructure directly to Azure Blob Storage. No client-side scripts are loaded for this integration.
See the Azure Blob Storage documentation for full API details and credential setup instructions.