Kafka binding spec

Detailed documentation on the Kafka binding component

Configuration

To set up a Kafka binding, create a component of type bindings.kafka. See this guide on how to create and apply a binding configuration.

  apiVersion: dapr.io/v1alpha1
  kind: Component
  metadata:
    name: <NAME>
    namespace: <NAMESPACE>
  spec:
    type: bindings.kafka
    version: v1
    metadata:
    - name: topics # Optional. Used for input bindings.
      value: topic1,topic2
    - name: brokers
      value: localhost:9092,localhost:9093
    - name: consumerGroup
      value: group1
    - name: publishTopic # Optional. Used for output bindings.
      value: topic3
    - name: authRequired # Required. Default: "true"
      value: "false"
    - name: saslUsername # Optional.
      value: "user"
    - name: saslPassword # Optional.
      value: "password"
    - name: maxMessageBytes # Optional.
      value: 1024

Warning

The above example uses secrets as plain strings. It is recommended to use a secret store for the secrets as described here.
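The component manifest above can also be assembled programmatically, which is handy for generating per-environment variants. The sketch below builds the same definition as a Python dict and dumps it as JSON (JSON is a subset of YAML, so the output is still a valid manifest); the helper name `kafka_binding_component` and the chosen broker/topic values are illustrative, not part of Dapr.

```python
import json

def kafka_binding_component(name: str, namespace: str) -> dict:
    """Assemble a bindings.kafka component manifest as a plain dict.

    Hypothetical helper mirroring the YAML example above; secrets are
    omitted here — use a secret store for those in real deployments.
    """
    return {
        "apiVersion": "dapr.io/v1alpha1",
        "kind": "Component",
        "metadata": {"name": name, "namespace": namespace},
        "spec": {
            "type": "bindings.kafka",
            "version": "v1",
            "metadata": [
                {"name": "topics", "value": "topic1,topic2"},
                {"name": "brokers", "value": "localhost:9092,localhost:9093"},
                {"name": "consumerGroup", "value": "group1"},
                {"name": "publishTopic", "value": "topic3"},
                {"name": "authRequired", "value": "false"},
            ],
        },
    }

print(json.dumps(kafka_binding_component("kafka-binding", "default"), indent=2))
```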

Spec metadata fields

| Field | Required | Binding support | Details | Example |
|-------|----------|-----------------|---------|---------|
| topics | N | Input | A comma-separated string of topics | "mytopic1,topic2" |
| brokers | Y | Input/Output | A comma-separated string of Kafka brokers | "localhost:9092,localhost:9093" |
| consumerGroup | N | Input | A Kafka consumer group to listen on | "group1" |
| publishTopic | Y | Output | The topic to publish to | "mytopic" |
| authRequired | Y | Input/Output | Determines whether to use SASL authentication. Defaults to "true" | "true", "false" |
| saslUsername | N | Input/Output | The SASL username for authentication. Only used if authRequired is set to "true" | "user" |
| saslPassword | N | Input/Output | The SASL password for authentication. Only used if authRequired is set to "true" | "password" |
| maxMessageBytes | N | Input/Output | The maximum size allowed for a single Kafka message. Defaults to 1024 | 2048 |
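Messages whose serialized size exceeds maxMessageBytes are rejected by the producer, so a client-side guard can fail fast before invoking the binding. The check below is a hypothetical pre-flight helper, not part of Dapr; `MAX_MESSAGE_BYTES` must mirror the value configured in the component.

```python
import json

# Hypothetical client-side guard; keep in sync with the component's maxMessageBytes.
MAX_MESSAGE_BYTES = 1024

def fits_broker_limit(payload: dict) -> bool:
    """Return True if the JSON-serialized payload fits within MAX_MESSAGE_BYTES."""
    encoded = json.dumps(payload).encode("utf-8")
    return len(encoded) <= MAX_MESSAGE_BYTES

print(fits_broker_limit({"message": "Hi"}))     # small payload, within the limit
print(fits_broker_limit({"blob": "x" * 2048}))  # exceeds the 1024-byte limit
```

Note this only approximates the broker-side check: Kafka counts the final record size including headers and key, so leave some headroom.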

Binding support

This component supports both input and output binding interfaces.

This component supports output binding with the following operations:

  • create

Specifying a partition key

When invoking the Kafka binding, it's possible to provide an optional partition key by using the metadata section in the request body.

The field name is partitionKey.

You can run Kafka locally using this Docker image. To run without Docker, see the getting started guide here.

  curl -X POST http://localhost:3500/v1.0/bindings/myKafka \
    -H "Content-Type: application/json" \
    -d '{
          "data": {
            "message": "Hi"
          },
          "metadata": {
            "partitionKey": "key1"
          },
          "operation": "create"
        }'
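The same invocation can be made from application code by POSTing to the sidecar's bindings endpoint. The sketch below assumes a Dapr sidecar listening on the default HTTP port 3500 and a component named myKafka, matching the curl example; `build_binding_request` and `invoke_binding` are illustrative helper names.

```python
import json
import urllib.request

DAPR_PORT = 3500          # default Dapr HTTP port; adjust if yours differs
BINDING_NAME = "myKafka"  # must match the component's metadata.name

def build_binding_request(message: str, partition_key: str) -> dict:
    """Build the request body for Dapr's invoke-binding endpoint."""
    return {
        "data": {"message": message},
        "metadata": {"partitionKey": partition_key},
        "operation": "create",
    }

def invoke_binding(body: dict) -> None:
    # Requires a running Dapr sidecar; shown as a sketch only.
    url = f"http://localhost:{DAPR_PORT}/v1.0/bindings/{BINDING_NAME}"
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)

body = build_binding_request("Hi", "key1")
print(json.dumps(body))
```

Records sharing the same partitionKey land on the same Kafka partition, which preserves their relative ordering for consumers.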

Related links