Kafka — Event Streaming Backbone


SDL runs and internally manages an event streaming broker that serves as the backbone for data ingestion and processing pipelines, streaming analytics, data integration, and mission-critical applications.

Users are managed by the Strimzi Kubernetes Operator: for each user an administrator grants access to Kafka, the operator creates a KafkaUser custom resource. The KafkaUser name matches the user’s principal ID in Keycloak, which lets Kafka identify the user by that principal ID (instead of a separate access key and secret key) when authorizing access to specific resources.
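As a rough illustration of the mapping above, a Strimzi KafkaUser resource might look like the following. The cluster label, namespace, and authentication type are assumptions for the sketch; only the pattern of naming the resource after the Keycloak principal ID comes from this page.

```yaml
# Hypothetical KafkaUser manifest; names and authentication type are illustrative.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaUser
metadata:
  # The resource name matches the user's principal ID in Keycloak.
  name: example-principal-id
  labels:
    # Strimzi requires this label to associate the user with a Kafka cluster.
    strimzi.io/cluster: sdl-kafka
spec:
  authentication:
    type: tls
# No spec.authorization block: authorization is delegated to OPA (see below).
```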

For authorization decisions, Kafka defers to Open Policy Agent (OPA) on a per-topic basis. This is where security policies, such as classification access controls, are enforced.
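A per-topic OPA policy could be sketched roughly as below. This assumes the input document shape used by the common OPA Kafka authorizer plugin (principal and resource pattern fields) and an invented `data.grants` mapping of principal IDs to allowed topics; the actual policy and data layout in SDL may differ.

```rego
package kafka.authz

import rego.v1

# Deny by default; every topic access must be explicitly granted.
default allow := false

allow if {
    # Only evaluate topic-level requests here.
    input.action.resourcePattern.resourceType == "TOPIC"

    # Hypothetical grants data: principal ID -> list of permitted topics.
    some topic in data.grants[input.requestContext.principal.name]
    topic == input.action.resourcePattern.name
}
```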

Kafka Overview

See Clients for getting connected.
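For orientation, a client configuration consistent with Keycloak-based identity might resemble the following Kafka client properties. The broker address, port, and token endpoint URL are placeholders, and the use of SASL/OAUTHBEARER is an assumption inferred from the Keycloak integration described above; consult the Clients page for the authoritative settings.

```properties
# Hypothetical client properties; hostnames and URLs are placeholders.
bootstrap.servers=kafka.example.internal:9093
security.protocol=SASL_SSL
sasl.mechanism=OAUTHBEARER
# Token endpoint of the Keycloak realm (placeholder URL).
sasl.oauthbearer.token.endpoint.url=https://keycloak.example/realms/sdl/protocol/openid-connect/token
sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required;
```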