:plugin: csv
:type: output

///////////////////////////////////////////
START - GENERATED VARIABLES, DO NOT EDIT!
///////////////////////////////////////////
:version: %VERSION%
:release_date: %RELEASE_DATE%
:changelog_url: %CHANGELOG_URL%
:include_path: ../../../../logstash/docs/include
///////////////////////////////////////////
END - GENERATED VARIABLES, DO NOT EDIT!
///////////////////////////////////////////

[id="plugins-{type}s-{plugin}"]
=== Csv output plugin

include::{include_path}/plugin_header.asciidoc[]

==== Description

CSV output. Writes events to disk in CSV or another delimited format.
Based on the file output, so many config values are shared.
Uses the Ruby csv library internally.

[id="plugins-{type}s-{plugin}-options"]
==== Csv Output Configuration Options

This plugin supports the following configuration options plus the <<plugins-{type}s-{plugin}-common-options>> described later.

[cols="<,<,<",options="header",]
|=======================================================================
|Setting |Input type|Required
| <<plugins-{type}s-{plugin}-create_if_deleted>> |<<boolean,boolean>>|No
| <<plugins-{type}s-{plugin}-csv_options>> |<<hash,hash>>|No
| <<plugins-{type}s-{plugin}-dir_mode>> |<<number,number>>|No
| <<plugins-{type}s-{plugin}-fields>> |<<array,array>>|Yes
| <<plugins-{type}s-{plugin}-file_mode>> |<<number,number>>|No
| <<plugins-{type}s-{plugin}-filename_failure>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-flush_interval>> |<<number,number>>|No
| <<plugins-{type}s-{plugin}-gzip>> |<<boolean,boolean>>|No
| <<plugins-{type}s-{plugin}-path>> |<<string,string>>|Yes
| <<plugins-{type}s-{plugin}-spreadsheet_safe>> |<<boolean,boolean>>|No
|=======================================================================

Also see <<plugins-{type}s-{plugin}-common-options>> for a list of options supported by all
output plugins.

&nbsp;

[id="plugins-{type}s-{plugin}-create_if_deleted"]
===== `create_if_deleted`

* Value type is <<boolean,boolean>>
* Default value is `true`

If the configured file is deleted, but an event is handled by the plugin,
the plugin will recreate the file.

[id="plugins-{type}s-{plugin}-csv_options"]
===== `csv_options`

* Value type is <<hash,hash>>
* Default value is `{}`

Options for CSV output. This is passed directly to the Ruby stdlib `to_csv` function.
Full documentation is available on the
http://ruby-doc.org/stdlib-2.0.0/libdoc/csv/rdoc/index.html[Ruby CSV documentation page].
A typical use case would be to use alternative column or row separators, e.g.
`csv_options => {"col_sep" => "\t" "row_sep" => "\r\n"}` gives tab-separated
data with Windows line endings.

[id="plugins-{type}s-{plugin}-dir_mode"]
===== `dir_mode`

* Value type is <<number,number>>
* Default value is `-1`

Dir access mode to use. Note that, due to a bug in JRuby, the system umask
is ignored on Linux: https://github.com/jruby/jruby/issues/3426
Setting it to `-1` uses the default OS value.
Example: `"dir_mode" => 0750`

[id="plugins-{type}s-{plugin}-fields"]
===== `fields`

* This is a required setting.
* Value type is <<array,array>>
* There is no default value for this setting.

The field names from the event that should be written to the CSV file.
Fields are written to the CSV in the same order as the array.
If a field does not exist on the event, an empty string will be written.
Supports field reference syntax, e.g. `fields => ["field1", "[nested][field]"]`.

[id="plugins-{type}s-{plugin}-file_mode"]
===== `file_mode`

* Value type is <<number,number>>
* Default value is `-1`

File access mode to use. Note that, due to a bug in JRuby, the system umask
is ignored on Linux: https://github.com/jruby/jruby/issues/3426
Setting it to `-1` uses the default OS value.
Example: `"file_mode" => 0640`

[id="plugins-{type}s-{plugin}-filename_failure"]
===== `filename_failure`

* Value type is <<string,string>>
* Default value is `"_filepath_failures"`

If the generated path is invalid, the events will be saved
into this file and inside the defined path.

[id="plugins-{type}s-{plugin}-flush_interval"]
===== `flush_interval`

* Value type is <<number,number>>
* Default value is `2`

Flush interval (in seconds) for flushing writes to log files.
`0` will flush on every message.
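
For illustration, the options described so far can be combined in a single `csv` output.
This is only a sketch: the path and field names below are hypothetical and should be
adapted to your own events.

[source,ruby]
----
output {
  csv {
    # Hypothetical destination; event fields and date patterns may be used in the path
    path => "/var/log/logstash/export-%{+YYYY-MM-dd}.csv"
    # Hypothetical event fields, written in this order; missing fields become empty strings
    fields => ["@timestamp", "host", "message", "[nested][field]"]
    # Tab-separated columns with Windows line endings, passed to the Ruby CSV library
    csv_options => { "col_sep" => "\t" "row_sep" => "\r\n" }
    # Flush after every event instead of the default two-second interval
    flush_interval => 0
  }
}
----
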
[id="plugins-{type}s-{plugin}-gzip"] ===== `gzip` * Value type is <> * Default value is `false` Gzip the output stream before writing to disk. [id="plugins-{type}s-{plugin}-path"] ===== `path` * This is a required setting. * Value type is <> * There is no default value for this setting. This output writes events to files on disk. You can use fields from the event as parts of the filename and/or path. By default, this output writes one event per line in **json** format. You can customise the line format using the `line` codec like [source,ruby] output { file { path => ... codec => line { format => "custom format: %{message}"} } } The path to the file to write. Event fields can be used here, like `/var/log/logstash/%{host}/%{application}` One may also utilize the path option for date-based log rotation via the joda time format. This will use the event timestamp. E.g.: `path => "./test-%{+YYYY-MM-dd}.txt"` to create `./test-2013-05-29.txt` If you use an absolute path you cannot start with a dynamic string. E.g: `/%{myfield}/`, `/test-%{myfield}/` are not valid paths [id="plugins-{type}s-{plugin}-spreadsheet_safe"] ===== `spreadsheet_safe` * Value type is <> * Default value is `true` Option to not escape/munge string values. Please note turning off this option may not make the values safe in your spreadsheet application [id="plugins-{type}s-{plugin}-common-options"] include::{include_path}/{type}.asciidoc[]