# Amazon S3
Stream event data to Amazon S3 buckets for durable storage, data lake ingestion, and downstream analytics.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| bucket_name | string | Yes | The name of the S3 bucket where event files will be written. |
| region | select | Yes | The AWS region where your S3 bucket is located. |
| access_key_id | string | Yes | The AWS access key ID with write permissions to the target bucket. |
| secret_access_key | secret | Yes | The AWS secret access key associated with the access key ID. |
| prefix | string | No | Optional prefix (folder path) prepended to all object keys. Include a trailing slash. |
| file_format | select | Yes | The output file format for event data written to S3. |
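
For reference, a config object populated with every field above might look like the following; all values are illustrative placeholders (the file format in particular is an assumed option), not working credentials:

```json
{
  "bucket_name": "my-datafly-events",
  "region": "us-east-1",
  "access_key_id": "AKIAIOSFODNN7EXAMPLE",
  "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCyEXAMPLE",
  "prefix": "events/raw/",
  "file_format": "jsonl"
}
```

With this configuration, event files would be written under object keys beginning with events/raw/ in the my-datafly-events bucket.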
## Quick Setup
1. Navigate to Integrations in the sidebar.
2. Open the Integration Library tab.
3. Find Amazon S3 or filter by Cloud Storage.
4. Click Install, select a variant if available, and fill in the required fields.
5. Click Install Integration to create the integration with a ready-to-use default configuration.
## API Setup

Alternatively, install the integration through the admin API. The request below includes every required field from the configuration table; the credentials and file format shown are placeholders, so replace them with your own values.
```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_s3/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon S3",
    "variant": "default",
    "config": {
      "bucket_name": "my-datafly-events",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCyEXAMPLE",
      "file_format": "jsonl"
    },
    "delivery_mode": "server_side"
  }'
```
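
Before installing, it can help to confirm that the supplied keys can actually write to the target bucket. A minimal check with the standard AWS CLI (assumed to be installed; the bucket, region, and credentials are the placeholders from the example above):

```bash
# Use the same credentials the integration will use.
export AWS_ACCESS_KEY_ID=AKIAIOSFODNN7EXAMPLE
export AWS_SECRET_ACCESS_KEY=wJalrXUtnFEMI/K7MDENG/bPxRfiCyEXAMPLE

# Confirm the bucket exists and is reachable.
aws s3api head-bucket --bucket my-datafly-events --region us-east-1

# Upload and delete a throwaway object to verify write access.
echo "connectivity check" | aws s3 cp - s3://my-datafly-events/datafly-connectivity-check.txt
aws s3 rm s3://my-datafly-events/datafly-connectivity-check.txt
```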
## Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon S3. No client-side scripts are loaded for this integration.
See the Amazon S3 documentation for full API details and credential setup instructions.
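
As an illustration, a minimal IAM policy for the access key could grant only object writes to the target bucket (the bucket name is the placeholder used throughout this page). Note that the verification commands shown above additionally require s3:ListBucket on the bucket and s3:DeleteObject on its objects.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowDataflyEventWrites",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-datafly-events/*"
    }
  ]
}
```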