# Apache Kafka
Publish event data to Apache Kafka topics for real-time stream processing, event sourcing, and integration with downstream consumers.
## Configuration
| Field | Type | Required | Description |
|---|---|---|---|
| bootstrap_servers | string | Yes | Comma-separated list of Kafka broker addresses in `host:port` format. |
| topic | string | Yes | The Kafka topic to produce messages to. The topic must already exist, or broker-side auto-creation must be enabled. |
| security_protocol | select | Yes | The protocol used to communicate with Kafka brokers: PLAINTEXT, SSL, SASL_PLAINTEXT, or SASL_SSL. |
| sasl_mechanism | select | No | The SASL mechanism for authentication. Required when the security protocol is SASL_PLAINTEXT or SASL_SSL. |
| sasl_username | string | No | The SASL username for authentication. Required when a SASL mechanism is selected. |
| sasl_password | secret | No | The SASL password for authentication. Required when a SASL mechanism is selected. |
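The fields above correspond closely to standard Kafka client properties. A minimal sketch of how a consumer of this configuration might map them onto librdkafka-style producer properties (the `build_producer_config` helper and the field names passed to it are illustrative, not part of the product API):

```python
# Hypothetical helper: translate the integration's config fields into
# librdkafka-style producer properties (confluent-kafka naming).
def build_producer_config(fields: dict) -> dict:
    conf = {
        "bootstrap.servers": fields["bootstrap_servers"],
        "security.protocol": fields["security_protocol"],
    }
    # SASL settings are only required for the SASL_* security protocols.
    if fields["security_protocol"] in ("SASL_PLAINTEXT", "SASL_SSL"):
        for key in ("sasl_mechanism", "sasl_username", "sasl_password"):
            if not fields.get(key):
                raise ValueError(f"{key} is required when using SASL")
        conf["sasl.mechanism"] = fields["sasl_mechanism"]
        conf["sasl.username"] = fields["sasl_username"]
        conf["sasl.password"] = fields["sasl_password"]
    return conf

config = build_producer_config({
    "bootstrap_servers": "broker1:9092,broker2:9092",
    "security_protocol": "SASL_SSL",
    "sasl_mechanism": "PLAIN",
    "sasl_username": "alice",
    "sasl_password": "secret",
})
```

Note that the required/optional split in the table is conditional: the three SASL fields are mandatory together whenever a SASL security protocol is chosen.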
## Quick Setup
- Navigate to Integrations in the sidebar.
- Open the Integration Library tab.
- Find Apache Kafka, or filter by category.
- Click Install, select a variant if available, and fill in the required fields.
- Click Install Integration to create the integration with a ready-to-use default configuration.
## API Setup
```shell
curl -X POST http://localhost:8084/v1/admin/integration-catalog/kafka/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Apache Kafka",
    "variant": "default",
    "config": {
      "bootstrap_servers": "broker1:9092,broker2:9092",
      "topic": "datafly-events",
      "security_protocol": "SASL_SSL"
    },
    "delivery_mode": "server_side"
  }'
```

## Delivery
Events are delivered server-side from your Datafly Signal infrastructure directly to Apache Kafka. No client-side scripts are loaded for this integration.
Visit the Apache Kafka documentation for the full API reference and credential setup instructions.
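The API Setup call above can also be scripted instead of typed as raw curl. A minimal Python sketch using only the standard library, with the same placeholder endpoint and token as the curl example:

```python
import json
import urllib.request

# Assemble the same payload the curl example sends.
payload = {
    "name": "Apache Kafka",
    "variant": "default",
    "config": {
        "bootstrap_servers": "broker1:9092,broker2:9092",
        "topic": "datafly-events",
        "security_protocol": "SASL_SSL",
    },
    "delivery_mode": "server_side",
}

request = urllib.request.Request(
    "http://localhost:8084/v1/admin/integration-catalog/kafka/install",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_TOKEN",  # placeholder token
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would perform the install;
# it is omitted here since it requires a running server.
```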