# Fluent::Plugin::Elasticsearch

I wrote this so you can search logs routed through Fluentd.

## Installation

```
$ gem install fluent-plugin-elasticsearch
```

## Usage

In your Fluentd configuration, use `type elasticsearch`. Additional configuration is optional; the default values look like this:

```
host localhost
port 9200
index_name fluentd
type_name fluentd
```

**More options:**

```
logstash_format true # defaults to false
```

This makes the data written into Elasticsearch compatible with what Logstash writes. By doing this, one could take advantage of [Kibana](http://kibana.org/).

---

```
include_tag_key true # defaults to false
tag_key tag # defaults to tag
```

This will add the Fluentd tag to the JSON record. For instance, if you have a config like this:

```
<match my.logs>
  type elasticsearch
  include_tag_key true
  tag_key _key
</match>
```

the record inserted into Elasticsearch would be:

```
{"_key":"my.logs", "name":"Johnny Doeie"}
```

---

fluent-plugin-elasticsearch is a buffered output that uses Elasticsearch's bulk API, so the additional buffer configuration options (shown with their default values) are:

```
buffer_type memory
flush_interval 60
retry_limit 17
retry_wait 1.0
num_threads 1
```

## Contributing

1. Fork it
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request
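Putting the options above together, a complete `<match>` block might look like the sketch below. The tag pattern `my.logs` is illustrative, and the buffer values shown are just the defaults restated; only `type elasticsearch` is required:

```
<match my.logs>
  type elasticsearch
  host localhost
  port 9200
  index_name fluentd
  type_name fluentd
  logstash_format true
  include_tag_key true
  tag_key _key

  buffer_type memory
  flush_interval 60
  retry_limit 17
  retry_wait 1.0
  num_threads 1
</match>
```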
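To make the `include_tag_key` behavior concrete, here is a small Ruby sketch of the record transformation described above. This is an illustration of the idea, not the plugin's actual source code:

```ruby
require "json"

# With include_tag_key true and tag_key _key, the Fluentd tag is merged
# into each record under the configured key before it is written to
# Elasticsearch.
tag = "my.logs"                          # the tag matched by <match my.logs>
record = { "name" => "Johnny Doeie" }    # the original log record
record["_key"] = tag                     # tag_key _key

puts JSON.generate(record) # => {"name":"Johnny Doeie","_key":"my.logs"}
```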
Version | Path
---|---
fluent-plugin-elasticsearch-0.1.1 | README.md