# Amazon Kinesis

Stream event data in real time to Amazon Kinesis Data Streams for processing, analytics, and downstream consumption.
> ⚠️ This integration is currently in beta. Configuration and behaviour may change.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| `stream_name` | string | Yes | The name of the Kinesis data stream to write records to. |
| `region` | select | Yes | The AWS region where your Kinesis stream is provisioned. |
| `access_key_id` | string | Yes | An AWS access key ID with permission to put records on the stream. |
| `secret_access_key` | secret | Yes | The AWS secret access key associated with the access key ID. |
| `partition_key` | string | No | Template expression for the Kinesis partition key, which determines shard placement. Defaults to a random UUID if not set. |
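The partition key controls which shard a record lands on: Kinesis takes the 128-bit MD5 hash of the partition key and routes the record to the shard whose hash-key range contains it. The sketch below illustrates that mapping for a stream whose shards split the hash space evenly (the shard count and keys are illustrative, not from this document):

```python
import hashlib

def shard_for_key(partition_key, num_shards):
    # Kinesis hashes the partition key with MD5, yielding a 128-bit
    # integer, and routes the record to the shard whose hash-key range
    # contains that value. With evenly split shards, this reduces to
    # dividing the hash space into num_shards equal ranges.
    hash_int = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    range_size = 2 ** 128 // num_shards
    return min(hash_int // range_size, num_shards - 1)

# Records that share a partition key always hash to the same shard,
# which is what preserves per-key ordering.
print(shard_for_key("user-42", 4))
```

A consequence worth noting: a low-cardinality partition key (for example, a constant) sends all traffic to one shard, while the random-UUID default spreads records evenly but gives up per-key ordering.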
## Quick Setup

1. Navigate to Integrations in the sidebar.
2. Open the Integration Library tab.
3. Find Amazon Kinesis or filter by Cloud Storage.
4. Click Install, select a variant if available, and fill in the required fields.
5. Click Install Integration to create the integration with a ready-to-use default configuration.
## API Setup

```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_kinesis/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon Kinesis",
    "variant": "default",
    "config": {
      "stream_name": "datafly-events-stream",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "YOUR_SECRET_ACCESS_KEY"
    },
    "delivery_mode": "server_side"
  }'
```

## Delivery
Events are delivered server-side from your Datafly Signal infrastructure directly to Amazon Kinesis. No client-side scripts are loaded for this integration.
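As a rough illustration of what server-side delivery assembles per event, the sketch below builds a Kinesis-style record from an event payload, falling back to a random UUID when no partition key is configured, as the Configuration table describes. The exact syntax of the `partition_key` template expression is not documented here, so Python's `str.format` is used as an illustrative stand-in, and `build_record` is a hypothetical helper, not part of the Datafly Signal API:

```python
import json
import uuid

def build_record(event, partition_key_template=None):
    # Hypothetical helper: assembles the Data/PartitionKey pair that a
    # Kinesis put would carry. The real template engine behind the
    # partition_key config field is unspecified; str.format on the
    # event dict stands in for it here (an assumption).
    if partition_key_template:
        key = partition_key_template.format(**event)
    else:
        # Per the Configuration table, the partition key defaults to a
        # random UUID, spreading records evenly across shards.
        key = str(uuid.uuid4())
    return {
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": key,
    }

record = build_record({"user_id": "42", "type": "click"}, "{user_id}")
print(record["PartitionKey"])
```

With a template keyed on something like a user ID, all of that user's events share a shard and stay ordered; with the UUID default, throughput is balanced but ordering across a user's events is not guaranteed.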
See the Amazon Kinesis documentation for full API details and credential setup instructions.