README.md in fluent-plugin-bigquery-2.2.0 vs README.md in fluent-plugin-bigquery-2.3.0
- old
+ new
@@ -1,7 +1,15 @@
# fluent-plugin-bigquery
+## Notice
+
+We will transfer the fluent-plugin-bigquery repository to the [fluent-plugins-nursery](https://github.com/fluent-plugins-nursery) organization.
+This does not change the maintenance plan.
+The main purpose is to resolve the mismatch between the maintainers and the current organization.
+
+---
+
[Fluentd](http://fluentd.org) output plugin to load/insert data into Google BigQuery.
- **Plugin type**: Output
* insert data over streaming inserts
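
For orientation, a minimal streaming-insert configuration might look like the sketch below. The `auth_method`/`json_key` settings and the project/dataset/table values are placeholder assumptions for illustration; the available options are listed in the tables that follow.

```apache
<match dummy>
  @type bigquery_insert

  # Assumed auth settings for this sketch; other auth methods exist
  auth_method json_key
  json_key /path/to/your-service-account-key.json

  project yourproject_id
  dataset yourdataset_id
  table   access_log

  # Inline schema definition, formatted as JSON
  schema [
    {"name": "time",   "type": "INTEGER"},
    {"name": "status", "type": "INTEGER"},
    {"name": "path",   "type": "STRING"}
  ]
</match>
```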
@@ -50,11 +58,11 @@
| table | string | yes (either `tables`) | yes | nil | |
| tables | array(string) | yes (either `table`) | yes | nil | can set multiple table names split by `,` |
| auto_create_table | bool | no | no | false | If true, creates table automatically |
| ignore_unknown_values | bool | no | no | false | Accept rows that contain values that do not match the schema. The unknown values are ignored. |
| schema | array | yes (either `fetch_schema` or `schema_path`) | no | nil | Schema definition. It is formatted as JSON. |
-| schema_path | string | yes (either `fetch_schema`) | no | nil | Schema definition file path. It is formatted as JSON. |
+| schema_path | string | yes (either `fetch_schema`) | yes | nil | Schema definition file path. It is formatted as JSON. |
| fetch_schema | bool | yes (either `schema_path`) | no | false | If true, fetch the table schema definition from the BigQuery table automatically. |
| fetch_schema_table | string | no | yes | nil | If set, fetch the table schema definition from this table. If fetch_schema is false, this param is ignored. |
| schema_cache_expire | integer | no | no | 600 | Value is in seconds. If the current time is after the expiration interval, re-fetch the table schema definition. |
| request_timeout_sec | integer | no | no | nil | BigQuery API response timeout |
| request_open_timeout_sec | integer | no | no | 60 | BigQuery API connection and request timeout. If you send big data to BigQuery, set a large value. |
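
As a sketch of the schema-fetching options above (assuming the same placeholder auth and project settings as before), `fetch_schema` can take the place of an inline `schema`:

```apache
<match dummy>
  @type bigquery_insert

  auth_method json_key
  json_key /path/to/your-service-account-key.json

  project yourproject_id
  dataset yourdataset_id
  table   access_log

  # Fetch the schema from the target table instead of defining it inline
  fetch_schema true
  # fetch_schema_table other_table_for_schema   # optionally fetch another table's schema
  schema_cache_expire 600                       # re-fetch the cached schema after 600 seconds
</match>
```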
@@ -70,9 +78,10 @@
| template_suffix | string | no | yes | nil | can use `%{time_slice}` placeholder replaced by `time_slice_format` |
| skip_invalid_rows | bool | no | no | false | |
| insert_id_field | string | no | no | nil | Use this key as the `insert_id` parameter of the Streaming Insert API. See https://docs.fluentd.org/v1.0/articles/api-plugin-helper-record_accessor |
| add_insert_timestamp | string | no | no | nil | Adds a timestamp column just before sending the rows to BigQuery, so that buffering time is not taken into account. Gives a field in BigQuery which represents the insert time of the row. |
| allow_retry_insert_errors | bool | no | no | false | Retry inserting rows when an insertErrors response occurs. Rows may be inserted in duplicate. |
+| require_partition_filter | bool | no | no | false | If true, queries over this table must specify a partition filter that can be used for partition elimination. |
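
Below is a sketch of how the new `require_partition_filter` option might be combined with `auto_create_table`. The `time_partitioning_*` options are assumptions not covered by the table above, included only because a partition filter is meaningful on a partitioned table.

```apache
<match dummy>
  @type bigquery_insert

  auth_method json_key
  json_key /path/to/your-service-account-key.json

  project yourproject_id
  dataset yourdataset_id
  table   access_log

  auto_create_table true
  # Assumed partitioning options (not listed in the table above)
  time_partitioning_type  day
  time_partitioning_field time
  # Require queries over the created table to specify a partition filter
  require_partition_filter true

  fetch_schema true
</match>
```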
#### bigquery_load
| name | type | required? | placeholder? | default | description |
| :------------------------------------- | :------------ | :----------- | :---------- | :------------------------- | :----------------------- |