:plugin: s3
:type: input

///////////////////////////////////////////
START - GENERATED VARIABLES, DO NOT EDIT!
///////////////////////////////////////////
:version: %VERSION%
:release_date: %RELEASE_DATE%
:changelog_url: %CHANGELOG_URL%
:include_path: ../../../../logstash/docs/include
///////////////////////////////////////////
END - GENERATED VARIABLES, DO NOT EDIT!
///////////////////////////////////////////

[id="plugins-{type}s-{plugin}"]

=== S3 input plugin

include::{include_path}/plugin_header.asciidoc[]

==== Description

Stream events from files in an S3 bucket.

Each line from each file generates an event.
Files ending in `.gz` are handled as gzipped files.

[id="plugins-{type}s-{plugin}-options"]
==== S3 Input Configuration Options

This plugin supports the following configuration options plus the <<plugins-{type}s-{plugin}-common-options>> described later.

[cols="<,<,<",options="header",]
|=======================================================================
|Setting |Input type|Required
| <<plugins-{type}s-{plugin}-access_key_id>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-aws_credentials_file>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-backup_add_prefix>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-backup_to_bucket>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-backup_to_dir>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-bucket>> |<<string,string>>|Yes
| <<plugins-{type}s-{plugin}-delete>> |<<boolean,boolean>>|No
| <<plugins-{type}s-{plugin}-exclude_pattern>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-interval>> |<<number,number>>|No
| <<plugins-{type}s-{plugin}-prefix>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-proxy_uri>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-region>> |<<string,string>>, one of `["us-east-1", "us-east-2", "us-west-1", "us-west-2", "eu-central-1", "eu-west-1", "eu-west-2", "ap-southeast-1", "ap-southeast-2", "ap-northeast-1", "ap-northeast-2", "sa-east-1", "us-gov-west-1", "cn-north-1", "ap-south-1", "ca-central-1"]`|No
| <<plugins-{type}s-{plugin}-secret_access_key>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-session_token>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-sincedb_path>> |<<string,string>>|No
| <<plugins-{type}s-{plugin}-temporary_directory>> |<<string,string>>|No
|=======================================================================

Also see <<plugins-{type}s-{plugin}-common-options>> for a list of options supported by all
input plugins.

&nbsp;

[id="plugins-{type}s-{plugin}-access_key_id"]
===== `access_key_id`

* Value type is <<string,string>>
* There is no default value for this setting.

This plugin uses the AWS SDK and supports several ways to get credentials, which will be tried in this order:

1. Static configuration, using `access_key_id` and `secret_access_key` params in the Logstash plugin config
2. External credentials file specified by `aws_credentials_file`
3. Environment variables `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`
4. Environment variables `AMAZON_ACCESS_KEY_ID` and `AMAZON_SECRET_ACCESS_KEY`
5. IAM Instance Profile (available when running inside EC2)

[id="plugins-{type}s-{plugin}-aws_credentials_file"]
===== `aws_credentials_file`

* Value type is <<string,string>>
* There is no default value for this setting.

Path to a YAML file containing a hash of AWS credentials.
This file will only be loaded if `access_key_id` and
`secret_access_key` aren't set. The contents of the
file should look like this:

[source,ruby]
----------------------------------
:access_key_id: "12345"
:secret_access_key: "54321"
----------------------------------

[id="plugins-{type}s-{plugin}-backup_add_prefix"]
===== `backup_add_prefix`

* Value type is <<string,string>>
* Default value is `nil`

Append a prefix to the key (the full path, including file name, in S3) after processing.
If backing up to another (or the same) bucket, this effectively lets you
choose a new 'folder' to place the files in.

[id="plugins-{type}s-{plugin}-backup_to_bucket"]
===== `backup_to_bucket`

* Value type is <<string,string>>
* Default value is `nil`

Name of an S3 bucket to backup processed files to.

[id="plugins-{type}s-{plugin}-backup_to_dir"]
===== `backup_to_dir`

* Value type is <<string,string>>
* Default value is `nil`

Path of a local directory to backup processed files to.

[id="plugins-{type}s-{plugin}-bucket"]
===== `bucket`

* This is a required setting.
* Value type is <<string,string>>
* There is no default value for this setting.

The name of the S3 bucket.
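Since `bucket` is the only required option, a minimal pipeline configuration
can be quite small. The sketch below is illustrative; the bucket name, region,
and prefix are placeholders, and credentials are resolved through the lookup
chain described under <<plugins-{type}s-{plugin}-access_key_id>> when the
static keys are omitted:

[source,ruby]
----------------------------------
input {
  s3 {
    # "my-logs-bucket" is a placeholder; use your own bucket name.
    bucket => "my-logs-bucket"
    region => "us-east-1"
    # Optional: only read keys whose names start with this prefix.
    prefix => "logs/"
  }
}
----------------------------------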
[id="plugins-{type}s-{plugin}-delete"] ===== `delete` * Value type is <> * Default value is `false` Whether to delete processed files from the original bucket. [id="plugins-{type}s-{plugin}-exclude_pattern"] ===== `exclude_pattern` * Value type is <> * Default value is `nil` Ruby style regexp of keys to exclude from the bucket [id="plugins-{type}s-{plugin}-interval"] ===== `interval` * Value type is <> * Default value is `60` Interval to wait between to check the file list again after a run is finished. Value is in seconds. [id="plugins-{type}s-{plugin}-prefix"] ===== `prefix` * Value type is <> * Default value is `nil` If specified, the prefix of filenames in the bucket must match (not a regexp) [id="plugins-{type}s-{plugin}-proxy_uri"] ===== `proxy_uri` * Value type is <> * There is no default value for this setting. URI to proxy server if required [id="plugins-{type}s-{plugin}-region"] ===== `region` * Value can be any of: `us-east-1`, `us-east-2`, `us-west-1`, `us-west-2`, `eu-central-1`, `eu-west-1`, `eu-west-2`, `ap-southeast-1`, `ap-southeast-2`, `ap-northeast-1`, `ap-northeast-2`, `sa-east-1`, `us-gov-west-1`, `cn-north-1`, `ap-south-1`, `ca-central-1` * Default value is `"us-east-1"` The AWS Region [id="plugins-{type}s-{plugin}-secret_access_key"] ===== `secret_access_key` * Value type is <> * There is no default value for this setting. The AWS Secret Access Key [id="plugins-{type}s-{plugin}-session_token"] ===== `session_token` * Value type is <> * There is no default value for this setting. The AWS Session token for temporary credential [id="plugins-{type}s-{plugin}-sincedb_path"] ===== `sincedb_path` * Value type is <> * Default value is `nil` Where to write the since database (keeps track of the date the last handled file was added to S3). The default will write sincedb files to some path matching "$HOME/.sincedb*" Should be a path with filename not just a directory. [id="plugins-{type}s-{plugin}-temporary_directory"] ===== `temporary_directory` * Value type is <> * Default value is `"/tmp/logstash"` Set the directory where logstash will store the tmp files before processing them. default to the current OS temporary directory in linux /tmp/logstash [id="plugins-{type}s-{plugin}-common-options"] include::{include_path}/{type}.asciidoc[]