docs/index.asciidoc in logstash-output-kafka-6.2.2 vs docs/index.asciidoc in logstash-output-kafka-6.2.4

- old
+ new

@@ -18,31 +18,16 @@
 include::{include_path}/plugin_header.asciidoc[]
 
 ==== Description
 
-Write events to a Kafka topic. This uses the Kafka Producer API to write messages to a topic on
-the broker.
+Write events to a Kafka topic.
 
-Here's a compatibility matrix that shows the Kafka client versions that are compatible with each combination
-of Logstash and the Kafka output plugin:
+This plugin uses Kafka Client 0.11.0.0. For broker compatibility, see the official https://cwiki.apache.org/confluence/display/KAFKA/Compatibility+Matrix[Kafka compatibility reference].
 
-[options="header"]
-|==========================================================
-|Kafka Client Version |Logstash Version |Plugin Version |Why?
-|0.8 |2.0.0 - 2.x.x |<3.0.0 |Legacy, 0.8 is still popular
-|0.9 |2.0.0 - 2.3.x | 3.x.x |Works with the old Ruby Event API (`event['product']['price'] = 10`)
-|0.9 |2.4.x - 5.x.x | 4.x.x |Works with the new getter/setter APIs (`event.set('[product][price]', 10)`)
-|0.10.0.x |2.4.x - 5.x.x | 5.x.x |Not compatible with the <= 0.9 broker
-|0.10.1.x |2.4.x - 5.x.x | 6.x.x |
-|0.11.0.0 |2.4.x - 5.x.x | 6.2.2 |Not compatible with the <= 0.9 broker
-|==========================================================
+If you're using a plugin version that was released after {version}, see the https://www.elastic.co/guide/en/logstash/master/plugins-inputs-kafka.html[latest plugin documentation] for updated information about Kafka compatibility. If you require features not yet available in this plugin (including client version upgrades), please file an issue with details about what you need.
 
-NOTE: We recommended that you use matching Kafka client and broker versions. During upgrades, you should
-upgrade brokers before clients because brokers target backwards compatibility. For example, the 0.9 broker
-is compatible with both the 0.8 consumer and 0.9 consumer APIs, but not the other way around.
-
 This output supports connecting to Kafka over:
 
 * SSL (requires plugin version 3.0.0 or later)
 * Kerberos SASL (requires plugin version 5.1.0 or later)
@@ -300,13 +285,20 @@
 [id="plugins-{type}s-{plugin}-retries"]
 ===== `retries`
 
 * Value type is <<number,number>>
-* Default value is `0`
+* There is no default value for this setting.
 
-Setting a value greater than zero will cause the client to
-resend any record whose send fails with a potentially transient error.
+The default retry behavior is to retry until successful. To prevent data loss,
+the use of this setting is discouraged.
+
+If you choose to set `retries`, a value greater than zero will cause the
+client to only retry a fixed number of times. This will result in data loss
+if a transport fault exists for longer than your retry count (network outage,
+Kafka down, etc).
+
+A value less than zero is a configuration error.
 
 [id="plugins-{type}s-{plugin}-retry_backoff_ms"]
 ===== `retry_backoff_ms`
 
 * Value type is <<number,number>>
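Editor's note on the `retries` hunk above: a minimal Logstash pipeline sketch showing the setting in context. This snippet appears in neither doc version; the broker address and topic name are hypothetical placeholders.

    output {
      kafka {
        bootstrap_servers => "localhost:9092"   # hypothetical broker address
        topic_id => "my_topic"                  # hypothetical topic name
        # Discouraged per the 6.2.4 docs: this caps the client at 3 retry
        # attempts, so a transport fault outlasting those attempts loses data.
        retries => 3
      }
    }

Omitting `retries` entirely leaves the default behavior described in the new text: retry until the send succeeds.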