Amazon Data Firehose
Deliver event data through Amazon Data Firehose for automatic batching, transformation, and loading into S3, Redshift, or other destinations.
Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| delivery_stream_name | string | Yes | The name of the Firehose delivery stream to send records to. |
| region | select | Yes | The AWS region where your Firehose delivery stream is configured. |
| access_key_id | string | Yes | The AWS access key ID with firehose:PutRecord and firehose:PutRecordBatch permissions. |
| secret_access_key | secret | Yes | The AWS secret access key associated with the access key ID. |
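The access key pair should belong to an IAM identity that is allowed to write to the stream. A minimal policy sketch covering just the two actions named above; the account ID and stream name in the Resource ARN are placeholders, so scope it to your own delivery stream:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["firehose:PutRecord", "firehose:PutRecordBatch"],
      "Resource": "arn:aws:firehose:us-east-1:123456789012:deliverystream/datafly-firehose-stream"
    }
  ]
}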
Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Amazon Data Firehose or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default configuration.
API Setup
curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_firehose/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon Data Firehose",
    "variant": "default",
    "config": {
      "delivery_stream_name": "datafly-firehose-stream",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "YOUR_SECRET_ACCESS_KEY"
    },
    "delivery_mode": "server_side"
  }'
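To verify the stream name and credentials outside the platform, you can send a one-off test record with the AWS CLI, assuming it is configured with the same access key pair (the stream name and region below match the example request):

# Exercises the same firehose:PutRecord permission the integration uses.
# With AWS CLI v2, blob fields are passed base64-encoded:
# "eyJ0ZXN0Ijp0cnVlfQ==" decodes to {"test":true}.
aws firehose put-record \
  --region us-east-1 \
  --delivery-stream-name datafly-firehose-stream \
  --record '{"Data":"eyJ0ZXN0Ijp0cnVlfQ=="}'

A successful call prints a RecordId, confirming the key pair can write to the stream.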
Delivery
Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon Data Firehose. No client-side scripts are loaded for this integration.
See the Amazon Data Firehose documentation for credential setup instructions and full API details.