Amazon Kinesis

Datafly Signal delivers events to Amazon Kinesis Data Streams for real-time stream processing, analytics, and event-driven architectures.

⚠️

This integration is currently in beta. Configuration and behaviour may change.

Prerequisites

Before configuring Amazon Kinesis in Signal, you need an AWS account with a Kinesis data stream and IAM credentials.

Create an AWS Account

If you don’t already have one, sign up at aws.amazon.com.

Create a Kinesis Data Stream

  1. Open the Kinesis console.
  2. Click Create data stream.
  3. Enter a Data stream name (e.g. datafly-events).
  4. Choose the Capacity mode:
    • On-demand — automatically scales to handle your throughput (recommended for variable workloads).
    • Provisioned — set a fixed number of shards. Each shard supports 1 MB/s write and 2 MB/s read.
  5. Click Create data stream.

On-demand mode automatically manages shard count and scales up to 200 MB/s write throughput. For predictable, high-volume workloads, provisioned mode gives more control over costs.
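The provisioned-mode sizing above can be sketched as a small helper. This is an illustrative calculation, not part of the integration; the per-shard write limits of 1 MiB/s and 1,000 records/s are standard Kinesis shard limits, and whichever limit is hit first drives the shard count:

```python
import math

def shards_needed(write_mib_per_sec: float, records_per_sec: float) -> int:
    """Estimate a provisioned shard count from peak write throughput.

    Each Kinesis shard accepts up to 1 MiB/s and 1,000 records/s of
    writes, so size for whichever limit is reached first.
    """
    by_bytes = math.ceil(write_mib_per_sec / 1.0)
    by_records = math.ceil(records_per_sec / 1000.0)
    return max(1, by_bytes, by_records)

# 5 MiB/s of small events at 8,000 records/s is record-bound: 8 shards.
print(shards_needed(5, 8000))  # 8
```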

Create an IAM User for Signal

  1. Open the IAM console.
  2. Go to Users > Create user.
  3. Enter a username (e.g. datafly-signal-kinesis).
  4. Attach a custom policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords"
      ],
      "Resource": "arn:aws:kinesis:us-east-1:123456789012:stream/datafly-events"
    }
  ]
}
  5. Replace the region, account ID, and stream name with your values.

Generate Access Keys

  1. On the IAM user detail page, go to Security credentials.
  2. Under Access keys, click Create access key.
  3. Select Application running outside AWS.
  4. Copy the Access Key ID and Secret Access Key immediately — the secret is only shown once.
⚠️

Store these credentials securely. If you lose the secret access key, you must create a new key pair.

Configuration

  • stream_name (string, required): The name of the Kinesis data stream.
  • region (select, required): The AWS region where the data stream is located.
  • access_key_id (secret, required): The AWS access key ID for authentication.
  • secret_access_key (secret, required): The AWS secret access key for authentication.
  • partition_key (string, optional): Template expression for the partition key, which determines shard placement. Defaults to a random UUID for even distribution.
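To see why a random UUID spreads records evenly, it helps to know how Kinesis places records: the partition key is MD5-hashed into a 128-bit hash key space, which is divided into contiguous ranges, one per shard. A minimal sketch of that mapping (assuming equal ranges and no resharding; the function name is illustrative):

```python
import hashlib
import uuid

def shard_for_key(partition_key: str, num_shards: int) -> int:
    """Map a partition key to a shard index the way Kinesis does:
    MD5-hash the key into the 128-bit hash key space, which is split
    into equal contiguous ranges, one per shard."""
    h = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    range_size = 2 ** 128 // num_shards
    return min(h // range_size, num_shards - 1)

# A constant key always lands on the same shard...
assert shard_for_key("tenant-42", 4) == shard_for_key("tenant-42", 4)
# ...while random UUIDs spread across all shards (very likely [0, 1, 2, 3]).
print(sorted({shard_for_key(str(uuid.uuid4()), 4) for _ in range(1000)}))
```

A low-cardinality partition key (e.g. a constant) funnels every record into one shard, capping throughput at that single shard's limits.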

Signal Setup

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Amazon Kinesis or filter by Cloud Storage.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default blueprint.

API Setup

curl -X POST http://localhost:8084/v1/admin/integration-catalog/amazon_kinesis/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Amazon Kinesis",
    "variant": "default",
    "config": {
      "stream_name": "datafly-events",
      "region": "us-east-1",
      "access_key_id": "AKIAIOSFODNN7EXAMPLE",
      "secret_access_key": "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY",
      "partition_key": "message_id"
    },
    "delivery_mode": "server_side"
  }'
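With server-side delivery, events ultimately reach the stream via kinesis:PutRecords calls. As a sketch of what that payload looks like (standard library only; the event fields and the build_put_records helper are illustrative, and a real sender would pass the result to an AWS SDK client such as boto3):

```python
import json
import uuid

def build_put_records(events, key_field="message_id"):
    """Shape events into the Records list expected by the Kinesis
    PutRecords API: each entry carries the serialized event as Data
    plus a PartitionKey, falling back to a random UUID to mirror the
    integration's default partition_key behaviour.

    PutRecords accepts at most 500 records per call.
    """
    assert len(events) <= 500, "split batches larger than 500 records"
    return [
        {
            "Data": json.dumps(event).encode("utf-8"),
            "PartitionKey": str(event.get(key_field) or uuid.uuid4()),
        }
        for event in events
    ]

records = build_put_records([{"message_id": "m-1", "type": "page_view"}])
print(records[0]["PartitionKey"])  # m-1
# An SDK client would then send:
#   client.put_records(StreamName="datafly-events", Records=records)
```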

Testing

  1. Enable the integration in Signal and trigger a test event on your website.
  2. Open the Kinesis console and select your data stream.
  3. Check the Monitoring tab for incoming data metrics such as IncomingRecords and IncomingBytes.
  4. Use the Data Viewer tab to inspect individual records in the stream.
  5. In Signal, check the Live Events view to confirm delivery status shows as successful.

Troubleshooting

  • Events not appearing in the stream: Verify the stream name and region are correct, and check that the stream status is Active.
  • AccessDeniedException: The IAM user lacks kinesis:PutRecord or kinesis:PutRecords permission. Update the IAM policy.
  • ResourceNotFoundException: The stream does not exist in the specified region. Check the stream name and region.
  • ProvisionedThroughputExceededException: The shard write capacity has been exceeded. Add more shards or switch to on-demand mode.
  • Records landing on the same shard: Choose a partition key with high cardinality (e.g. message_id) to distribute records evenly across shards.
  • High latency on consumers: Increase the number of shards to improve read parallelism. Each shard supports 2 MB/s read throughput.
  • Credential errors: Verify the access key ID and secret access key are correct and the IAM user has not been deactivated.
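Throttling errors such as ProvisionedThroughputExceededException are transient and retryable. A minimal jittered exponential-backoff sketch (the send_batch callable and the limits are illustrative; real code would catch the SDK's specific throttling exception, and SDKs such as boto3 also ship built-in retry logic):

```python
import random
import time

def send_with_backoff(send_batch, max_attempts=5, base_delay=0.1):
    """Retry a throttled Kinesis write with jittered exponential backoff.

    send_batch is any zero-argument callable that raises on throttling;
    real code would catch the SDK's throttling exception rather than
    this blanket Exception.
    """
    for attempt in range(max_attempts):
        try:
            return send_batch()
        except Exception:
            if attempt == max_attempts - 1:
                raise
            # Sleep base_delay * 2^attempt, plus jitter so that many
            # blocked producers do not retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))

calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("throttled")
    return "ok"

print(send_with_backoff(flaky, base_delay=0.01))  # ok
```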

Visit the Amazon Kinesis documentation for the full API reference and stream management guides.