Kafka
Data Fabric runs and internally manages a Kafka broker for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. Kafka serves as the backbone for data ingestion and processing pipelines.
Users are managed by the Strimzi Kubernetes Operator: when an administrator grants a user access to Kafka, the operator creates a KafkaUser custom resource for that user. The KafkaUser name matches the user's principal ID in Keycloak, so Kafka can identify the user by their Keycloak principal ID (rather than by a separate access key and secret key) when authorizing access to specific objects.
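A KafkaUser resource created by this process might look like the following. This is a hedged sketch, not the exact manifest Data Fabric generates: the resource name, namespace, and cluster label are placeholders, and the empty spec assumes authentication is handled through Keycloak and authorization through OPA rather than through Strimzi's built-in mechanisms.

```yaml
# Hypothetical KafkaUser manifest; names are placeholders.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaUser
metadata:
  # The resource name matches the user's principal ID in Keycloak.
  name: <keycloak-principal-id>
  namespace: kafka                  # namespace of the Kafka cluster
  labels:
    strimzi.io/cluster: my-cluster  # must match the Kafka cluster name
spec: {}  # authentication via Keycloak, authorization via OPA
```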
For authorization decisions, Kafka defers to Open Policy Agent (OPA) on a per-topic basis. Security policies, such as classification access controls, are applied at this layer.
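To illustrate the per-topic model, here is a hypothetical OPA policy in Rego. The input shape (`input.action.resourcePattern`, `input.requestContext.principal`) follows the convention of common Kafka OPA authorizer plugins, and the `data.users`/`data.topics` lookup tables are invented for illustration; the actual policies and data shipped with Data Fabric may differ.

```rego
# Hypothetical per-topic authorization policy for Kafka.
package kafka.authz

default allow := false

# Allow a READ on a topic when the user's clearance level
# covers the topic's classification level (illustrative data).
allow if {
    input.action.resourcePattern.resourceType == "TOPIC"
    input.action.operation == "READ"
    topic := input.action.resourcePattern.name
    principal := input.requestContext.principal.name  # Keycloak principal ID
    data.topics[topic].classification <= data.users[principal].clearance
}
```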
See Clients for details on connecting.