Amazon Data Firehose

Deliver event data through Amazon Data Firehose for automatic batching, transformation, and loading into S3, Redshift, or other destinations.

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `delivery_stream_name` | string | Yes | The name of the Firehose delivery stream to send records to. |
| `region` | select | Yes | The AWS region where your Firehose delivery stream is configured. |
| `access_key_id` | string | Yes | An AWS access key ID with `firehose:PutRecord` and `firehose:PutRecordBatch` permissions. |
| `secret_access_key` | secret | Yes | The AWS secret access key associated with the access key ID. |
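
Since the integration only needs the two `firehose:` actions listed above, the access key can be scoped tightly. A minimal IAM policy sketch granting just those actions (the account ID and stream ARN are placeholders for your own values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecord",
        "firehose:PutRecordBatch"
      ],
      "Resource": "arn:aws:firehose:us-east-1:123456789012:deliverystream/datafly-firehose-stream"
    }
  ]
}
```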

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Amazon Data Firehose or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_firehose/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon Data Firehose",
    "variant": "default",
    "config": {
      "delivery_stream_name": "datafly-firehose-stream",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "YOUR_SECRET_ACCESS_KEY"
    },
    "delivery_mode": "server_side"
  }'
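
To keep the secret access key out of shell history, the request body can be assembled from environment variables instead of being typed inline. A sketch, assuming the same endpoint and fields as the curl example above (the credential values here are AWS's documented example pair, not real keys):

```shell
# Hypothetical wrapper: pull credentials from the environment and build the
# install payload, so the secret never appears literally on the command line.
AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"

PAYLOAD=$(cat <<EOF
{
  "name": "Amazon Data Firehose",
  "variant": "default",
  "config": {
    "delivery_stream_name": "datafly-firehose-stream",
    "region": "us-east-1",
    "access_key_id": "${AWS_ACCESS_KEY_ID}",
    "secret_access_key": "${AWS_SECRET_ACCESS_KEY}"
  },
  "delivery_mode": "server_side"
}
EOF
)
echo "$PAYLOAD"
# Then POST it as before:
#   curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_firehose/install \
#     -H "Authorization: Bearer YOUR_TOKEN" -H "Content-Type: application/json" \
#     -d "$PAYLOAD"
```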

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon Data Firehose. No client-side scripts are loaded for this integration.
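
For context on that server-side path: Firehose's `PutRecord` and `PutRecordBatch` APIs carry each record as a base64-encoded data blob, and `PutRecordBatch` accepts up to 500 records per call. The delivery pipeline handles this for you; for reference, the encoding itself is just (event payload here is hypothetical):

```shell
# Firehose record data travels as a base64-encoded blob.
EVENT='{"event":"page_view","user_id":"u_123"}'   # hypothetical event payload
DATA=$(printf '%s' "$EVENT" | base64)
echo "$DATA"
```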

See the Amazon Data Firehose documentation for full API details and credential setup instructions.