Apache Kafka

Publish event data to Apache Kafka topics for real-time stream processing, event sourcing, and integration with downstream consumers.

Configuration

| Field | Type | Required | Description |
|---|---|---|---|
| `bootstrap_servers` | string | Yes | Comma-separated list of Kafka broker addresses in `host:port` format. |
| `topic` | string | Yes | The Kafka topic to produce messages to. The topic must already exist, or topic auto-creation must be enabled on the brokers. |
| `security_protocol` | select | Yes | The protocol used to communicate with Kafka brokers. |
| `sasl_mechanism` | select | No | The SASL mechanism for authentication. Required when the security protocol is SASL_PLAINTEXT or SASL_SSL. |
| `sasl_username` | string | No | The SASL username for authentication. Required when a SASL mechanism is selected. |
| `sasl_password` | secret | No | The SASL password for authentication. Required when a SASL mechanism is selected. |
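The cross-field rules in the table (SASL fields become mandatory once a SASL security protocol is chosen) can be sketched as a small validation function. This is an illustration only, not part of the product; the field names match the table, and the example values (mechanism, credentials) are placeholders:

```python
# Sketch: check a Kafka integration config against the rules in the table above.
REQUIRED = {"bootstrap_servers", "topic", "security_protocol"}
SASL_PROTOCOLS = {"SASL_PLAINTEXT", "SASL_SSL"}

def validate(config: dict) -> list:
    """Return a list of problems; an empty list means the config is acceptable."""
    errors = [f"missing required field: {f}" for f in sorted(REQUIRED - config.keys())]
    # SASL fields are only required when a SASL security protocol is selected.
    if config.get("security_protocol") in SASL_PROTOCOLS:
        for f in ("sasl_mechanism", "sasl_username", "sasl_password"):
            if not config.get(f):
                errors.append(f"{f} is required with {config['security_protocol']}")
    return errors

config = {
    "bootstrap_servers": "broker1:9092,broker2:9092",
    "topic": "datafly-events",
    "security_protocol": "SASL_SSL",
    "sasl_mechanism": "SCRAM-SHA-256",   # placeholder mechanism
    "sasl_username": "datafly",          # placeholder credentials
    "sasl_password": "s3cr3t",
}
print(validate(config))  # → []
```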

Quick Setup

  1. Navigate to Integrations in the sidebar.
  2. Open the Integration Library tab.
  3. Find Apache Kafka in the list, or use the catalog search.
  4. Click Install, select a variant if available, and fill in the required fields.
  5. Click Install Integration to create the integration with a ready-to-use default configuration.

API Setup

```bash
curl -X POST http://localhost:8084/v1/admin/integration-catalog/kafka/install \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Apache Kafka",
    "variant": "default",
    "config": {
      "bootstrap_servers": "broker1:9092,broker2:9092",
      "topic": "datafly-events",
      "security_protocol": "SASL_SSL",
      "sasl_mechanism": "SCRAM-SHA-256",
      "sasl_username": "YOUR_SASL_USERNAME",
      "sasl_password": "YOUR_SASL_PASSWORD"
    },
    "delivery_mode": "server_side"
  }'
```

Because this example selects SASL_SSL, the SASL fields are included; SCRAM-SHA-256 is shown as an example mechanism, and the username and password are placeholders for your own credentials.
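The same install call can also be scripted. Below is a minimal Python sketch using only the standard library; the endpoint, token placeholder, and payload mirror the curl example above, and nothing is actually sent until `urlopen` is called:

```python
import json
import urllib.request

# Payload mirrors the curl example above; YOUR_TOKEN is a placeholder.
payload = {
    "name": "Apache Kafka",
    "variant": "default",
    "config": {
        "bootstrap_servers": "broker1:9092,broker2:9092",
        "topic": "datafly-events",
        "security_protocol": "SASL_SSL",
    },
    "delivery_mode": "server_side",
}

req = urllib.request.Request(
    "http://localhost:8084/v1/admin/integration-catalog/kafka/install",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer YOUR_TOKEN",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would send the request; here we only inspect it.
print(req.get_method(), req.full_url)
# → POST http://localhost:8084/v1/admin/integration-catalog/kafka/install
```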

Delivery

Events are delivered server-side from your Datafly Signal infrastructure directly to Apache Kafka. No client-side scripts are loaded for this integration.

See the Apache Kafka documentation for full details on broker configuration and credential setup.