Kafka sink
The Kafka sink produces data to a Kafka topic.
Entities sent to this sink use the "key", "value", "headers", "key_schema" and "value_schema" properties to construct the messages sent to the Kafka topic. The latter two properties are only relevant if the "confluent_schema_json" serializer is used. The "key" and "value" properties are mandatory. The "headers" property is optional, but if present it must be an object with string keys and string or bytes values.
The properties used match the properties emitted by the Kafka source. This means that it should be possible to consume a topic and produce to a new topic in a pipe with no DTL.
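As an illustration, an entity sent to this sink might look like the following sketch (the key, value and header contents are made up for the example):

```json
{
  "_id": "1",
  "key": "user-1",
  "value": {"name": "Jane Doe", "age": 42},
  "headers": {"trace-id": "abc-123"}
}
```

Here "key" becomes the message key, "value" becomes the message payload, and each entry in "headers" becomes a Kafka message header.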
The sink will flush to Kafka after every batch.
Note
The sink does not do client-side schema validation at this time. We might add this in a future release.
Prototype
{
"type": "kafka",
"system": "kafka-system-id",
"topic": "some-topic"
}
Properties
| Property | Type | Description | Default | Req |
|---|---|---|---|---|
| `"system"` | String | The id of the Kafka System component to use. | | Yes |
| `"topic"` | String | The topic to send to. | | Yes |
| `"key_serializer"` | String | Name of the serializer to use for the key. Allowed values include `"confluent_schema_json"`. | | |
| `"value_serializer"` | String | Name of the serializer to use for the value. Allowed values include `"confluent_schema_json"`. | | |
| | Integer | The time in milliseconds to wait for acknowledgement from the broker, including the time allowed for retriable send failures. | | |
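Since the sink accepts the same properties the Kafka source emits, a pipe can copy one topic to another with no DTL. A minimal sketch, assuming a Kafka System component with the hypothetical id "my-kafka" and made-up topic names:

```json
{
  "_id": "copy-topic",
  "type": "pipe",
  "source": {
    "type": "kafka",
    "system": "my-kafka",
    "topic": "source-topic"
  },
  "sink": {
    "type": "kafka",
    "system": "my-kafka",
    "topic": "target-topic"
  }
}
```

Entities read from "source-topic" flow through unchanged and are produced to "target-topic", with the sink flushing to Kafka after every batch.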