Amazon Kinesis

Stream event data in real time to Amazon Kinesis Data Streams for processing, analytics, and downstream consumption.

⚠️

This integration is currently in beta. Configuration and behaviour may change.

Configuration

| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `stream_name` | string | Yes | The name of the Kinesis data stream to write records to. |
| `region` | select | Yes | The AWS region where your Kinesis stream is provisioned. |
| `access_key_id` | string | Yes | The AWS access key ID with permissions to put records on the stream. |
| `secret_access_key` | secret | Yes | The AWS secret access key associated with the access key ID. |
| `partition_key` | string | No | Template expression for the Kinesis partition key; determines shard placement. Defaults to a random UUID if not set. |
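To illustrate why the partition key matters, the sketch below mimics how Kinesis assigns records to shards: the key is MD5-hashed into the 128-bit hash key space, and the record goes to the shard whose hash key range contains the result. The even split of ranges across shards is an assumption for this sketch (real streams can have uneven ranges after shard splits and merges); `shard_for_key` is an illustrative helper, not part of the integration.

```python
import hashlib
import uuid

def shard_for_key(partition_key: str, num_shards: int) -> int:
    """Map a partition key to a shard index, Kinesis-style:
    MD5-hash the key into the 128-bit hash key space, then pick the
    shard whose (evenly split, in this sketch) range contains it."""
    hash_key = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    shard_size = (2 ** 128) // num_shards
    return min(hash_key // shard_size, num_shards - 1)

# A fixed key always lands on the same shard, so related events stay ordered:
assert shard_for_key("user-42", 4) == shard_for_key("user-42", 4)

# The default (a random UUID per record) spreads load across all shards:
print(shard_for_key(str(uuid.uuid4()), 4))
```

Set `partition_key` to a stable identifier (for example, a user or session ID) when per-key ordering matters; leave it unset to favour even shard utilisation.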

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Amazon Kinesis or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_kinesis/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon Kinesis",
    "variant": "default",
    "config": {
      "stream_name": "datafly-events-stream",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "YOUR_SECRET_ACCESS_KEY"
    },
    "delivery_mode": "server_side"
  }'
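The same install call can be scripted. The sketch below builds the request with the Python standard library, assuming the endpoint, body shape, and placeholder credentials shown in the curl example above; `build_install_request` is a hypothetical helper name, and the send is left commented out so nothing is posted until your admin API is reachable.

```python
import json
import urllib.request

def build_install_request(base_url: str, token: str, config: dict) -> urllib.request.Request:
    """Build the POST request for the integration-catalog install endpoint."""
    body = {
        "name": "Amazon Kinesis",
        "variant": "default",
        "config": config,
        "delivery_mode": "server_side",
    }
    return urllib.request.Request(
        f"{base_url}/v1/admin/integration-catalog/amazon_kinesis/install",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_install_request(
    "http://localhost:8084",
    "YOUR_TOKEN",
    {
        "stream_name": "datafly-events-stream",
        "region": "us-east-1",
        "access_key_id": "AKIAIOSFODNN7EXAMPLE",
        "secret_access_key": "YOUR_SECRET_ACCESS_KEY",
    },
)
# Send with: urllib.request.urlopen(req)
```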

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon Kinesis. No client-side scripts are loaded for this integration.

See the Amazon Kinesis documentation for the full API reference and credential setup instructions.