Amazon S3

Stream event data to Amazon S3 buckets for durable storage, data lake ingestion, and downstream analytics.

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `bucket_name` | string | Yes | The name of the S3 bucket where event files will be written. |
| `region` | select | Yes | The AWS region where your S3 bucket is located. |
| `access_key_id` | string | Yes | The AWS access key ID with write permissions to the target bucket. |
| `secret_access_key` | secret | Yes | The AWS secret access key associated with the access key ID. |
| `prefix` | string | No | Optional prefix (folder path) prepended to all object keys. Include a trailing slash. |
| `file_format` | select | Yes | The output file format for event data written to S3. |
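Before calling the install API, the required fields above can be sanity-checked locally. A minimal sketch in Python, assuming only the field names and rules from the table (the validation helper itself is illustrative, not part of Datafly Signal):

```python
# Illustrative pre-flight check for the S3 integration config.
# Field names and the trailing-slash rule come from the table above;
# everything else in this helper is an assumption.
REQUIRED = {"bucket_name", "region", "access_key_id", "secret_access_key", "file_format"}

def validate_s3_config(config: dict) -> list[str]:
    """Return a list of problems; an empty list means the config looks complete."""
    problems = [f"missing required field: {f}" for f in sorted(REQUIRED - config.keys())]
    prefix = config.get("prefix")
    if prefix and not prefix.endswith("/"):
        problems.append("prefix should include a trailing slash")
    return problems

print(validate_s3_config({
    "bucket_name": "my-datafly-events",
    "region": "us-east-1",
    "access_key_id": "AKIAIOSFODNN7EXAMPLE",
    "secret_access_key": "...",
    "file_format": "jsonl",
    "prefix": "events",  # missing trailing slash, so the check flags it
}))
# → ['prefix should include a trailing slash']
```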

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Amazon S3 or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

The example below includes every required field from the configuration table; replace the placeholder credentials with your own. The `file_format` value shown is a placeholder — use one of the values offered in the integration's configuration.

```shell
curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_s3/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon S3",
    "variant": "default",
    "config": {
      "bucket_name": "my-datafly-events",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRcfiCYEXAMPLE",
      "file_format": "jsonl"
    },
    "delivery_mode": "server_side"
  }'
```
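The same install call can be made from Python's standard library. A sketch assuming the endpoint and payload shape from the curl example above; the token, credentials, and `file_format` value are placeholders:

```python
import json
import urllib.request

# Build the same install request as the curl example above.
# Endpoint and payload shape are taken from that example; the token,
# AWS credentials, and file_format value are placeholders.
payload = {
    "name": "Amazon S3",
    "variant": "default",
    "config": {
        "bucket_name": "my-datafly-events",
        "region": "us-east-1",
        "access_key_id": "AKIAIOSFODNN7EXAMPLE",
        "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRcfiCYEXAMPLE",
        "file_format": "jsonl",
    },
    "delivery_mode": "server_side",
}

req = urllib.request.Request(
    "http://localhost:8084/v1/admin/integration-catalog/amazon_s3/install",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_TOKEN",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request; it is omitted here
# so the sketch stays side-effect free.
```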

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon S3. No client-side scripts are loaded for this integration.
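Delivered objects land under the configured `prefix`. A sketch of how a prefix and a batch timestamp might combine into an object key — the naming scheme below is a hypothetical illustration, not Datafly Signal's documented layout:

```python
from datetime import datetime, timezone

# Hypothetical object-key layout. Datafly Signal's actual naming scheme is
# not documented here; this only illustrates how the `prefix` field (with
# its trailing slash) is prepended to every object key.
def object_key(prefix: str, batch_time: datetime, file_format: str) -> str:
    stamp = batch_time.strftime("%Y/%m/%d/%H%M%S")
    return f"{prefix}{stamp}.{file_format}"

print(object_key("events/", datetime(2024, 5, 1, 12, 30, 0, tzinfo=timezone.utc), "jsonl"))
# → events/2024/05/01/123000.jsonl
```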

See the Amazon S3 documentation for full API details and credential setup instructions.