Confluent Cloud
Datafly Signal delivers events to Confluent Cloud for fully managed Apache Kafka streaming with enterprise features like Schema Registry, ksqlDB, and connectors.
Prerequisites
Before configuring Confluent Cloud in Signal, you need a Confluent Cloud account with a Kafka cluster, a topic, and an API key pair.
Create a Confluent Cloud Account
- Sign up at confluent.io/confluent-cloud.
- Complete the onboarding wizard.
Create a Kafka Cluster
- In the Confluent Cloud console, go to Environments and select your environment (or create a new one).
- Click Add cluster.
- Choose a cluster type:
  - Basic — multi-tenant, pay-as-you-go (suitable for development and low-volume production).
  - Standard — single-tenant, higher throughput limits.
  - Dedicated — fully isolated, custom configurations.
- Select a Cloud provider (AWS, GCP, or Azure) and Region.
- Click Launch cluster.
- Note the Bootstrap server address from the cluster settings (e.g. pkc-abc123.us-east-1.aws.confluent.cloud:9092).
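The bootstrap address is a hostname and port in one string. A quick shell sketch (using the placeholder address from the example above) splits it apart and shows how you might probe reachability before going further:

```shell
# Placeholder bootstrap address from the example above.
BOOTSTRAP="pkc-abc123.us-east-1.aws.confluent.cloud:9092"
HOST="${BOOTSTRAP%:*}"   # strip the trailing :port
PORT="${BOOTSTRAP##*:}"  # keep only the port
# Requires outbound network access; uncomment to check TLS reachability:
# openssl s_client -connect "$HOST:$PORT" -brief </dev/null
echo "host=$HOST port=$PORT"
```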
Create a Topic
- In your cluster, go to Topics > Create topic.
- Enter a Topic name (e.g. datafly-events).
- Set the number of Partitions (6 is a good default for moderate throughput).
- Configure Retention settings (default 7 days).
- Click Create with defaults or customise further.
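If you prefer the command line, the same topic can be created with the Confluent CLI. This is a sketch that assumes you have already run `confluent login` and selected your environment and cluster:

```shell
# Matches the console defaults above: 6 partitions, 7-day retention
# (retention.ms = 7 * 24 * 3600 * 1000 = 604800000).
TOPIC="datafly-events"
if command -v confluent >/dev/null 2>&1; then
  confluent kafka topic create "$TOPIC" --partitions 6 --config retention.ms=604800000
else
  echo "confluent CLI not installed" >&2
fi
```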
Generate an API Key Pair
- In your cluster, go to API Keys (under Cluster settings or Data integration).
- Click Create key.
- Choose Global access or specify a service account.
- Click Create.
- Copy the API Key and API Secret immediately — the secret is only shown once.
⚠️
Store the API secret securely. If you lose it, you must create a new key pair.
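To confirm the key pair works before wiring it into Signal, you can produce a test message with a standalone client such as kcat. Confluent Cloud authenticates with SASL/PLAIN over TLS, using the API key as the username and the secret as the password; the values below are placeholders:

```shell
BOOTSTRAP="pkc-abc123.us-east-1.aws.confluent.cloud:9092"  # placeholder
API_KEY="YOUR_API_KEY"          # placeholder
API_SECRET="YOUR_API_SECRET"    # placeholder
if command -v kcat >/dev/null 2>&1; then
  # -P produce, -b brokers, -t topic; -X passes client config through
  echo '{"event":"smoke_test"}' | kcat -P -b "$BOOTSTRAP" -t datafly-events \
    -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
    -X "sasl.username=$API_KEY" -X "sasl.password=$API_SECRET"
fi
```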
Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| `bootstrap_servers` | string | Yes | The Confluent Cloud bootstrap server address (e.g. pkc-abc123.us-east-1.aws.confluent.cloud:9092). |
| `topic` | string | Yes | The Kafka topic to produce messages to. |
| `api_key` | secret | Yes | The Confluent Cloud API key for authentication. |
| `api_secret` | secret | Yes | The Confluent Cloud API secret for authentication. |
Signal Setup
Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Confluent Cloud or filter by Cloud Storage.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default blueprint.
API Setup
```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/confluent_cloud/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Confluent Cloud",
    "variant": "default",
    "config": {
      "bootstrap_servers": "pkc-abc123.us-east-1.aws.confluent.cloud:9092",
      "topic": "datafly-events",
      "api_key": "YOUR_API_KEY",
      "api_secret": "YOUR_API_SECRET"
    },
    "delivery_mode": "server_side"
  }'
```
Testing
- Enable the integration in Signal and trigger a test event on your website.
- In the Confluent Cloud console, go to your cluster > Topics > your topic.
- Click Messages to view recently produced messages.
- Inspect the message value to verify the event data.
- In Signal, check the Live Events view to confirm delivery status shows as successful.
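The console check above can also be done from a terminal by consuming the topic from the beginning with kcat; a sketch with the same placeholder credentials as earlier:

```shell
BOOTSTRAP="pkc-abc123.us-east-1.aws.confluent.cloud:9092"  # placeholder
API_KEY="YOUR_API_KEY"          # placeholder
API_SECRET="YOUR_API_SECRET"    # placeholder
if command -v kcat >/dev/null 2>&1; then
  # -C consume, -o beginning start at the earliest offset, -e exit at end of topic
  kcat -C -b "$BOOTSTRAP" -t datafly-events -o beginning -e \
    -X security.protocol=SASL_SSL -X sasl.mechanisms=PLAIN \
    -X "sasl.username=$API_KEY" -X "sasl.password=$API_SECRET"
fi
```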
Troubleshooting
| Problem | Solution |
|---|---|
| Events not appearing in the topic | Verify the bootstrap server address and topic name are correct. |
| Authentication failure | The API key or secret is incorrect. Verify the credentials. Generate a new key pair if needed. |
| `TopicAuthorizationException` | The API key lacks produce permission on the topic. Create a new key with appropriate ACLs or use a global access key. |
| `UnknownTopicOrPartition` | The topic does not exist. Verify the topic name in the Confluent Cloud console. |
| Connection timeout | Ensure Signal can reach the bootstrap server on port 9092. Check any firewall or network policies. |
| `MessageSizeTooLarge` | Individual messages exceed the topic’s max message size (default 1 MB). Check event payload size. |
| API key expired or revoked | Generate a new API key pair in the Confluent Cloud console. |
Visit the Confluent Cloud documentation for the full API reference, Schema Registry setup, and connector configuration.