README.md in fluent-plugin-bigquery-0.2.11 vs README.md in fluent-plugin-bigquery-0.2.12
- old
+ new
@@ -108,19 +108,61 @@
### Authentication
There are four methods supported to fetch an access token for the service account.
-1. Public-Private key pair
-2. Predefined access token (Compute Engine only)
+1. Public-Private key pair of a GCP (Google Cloud Platform) service account
+2. JSON key of a GCP (Google Cloud Platform) service account
+3. Predefined access token (Compute Engine only)
+4. Google application default credentials (http://goo.gl/IUuyuX)
-The examples above use the first one. You first need to create a service account (client ID),
+#### Public-Private key pair of GCP's service account
+
+The examples above use the first one. You first need to create a service account (client ID),
download its private key and deploy the key with fluentd.
-On the other hand, you don't need to explicitly create a service account for fluentd when you
-run fluentd in Google Compute Engine. In this second authentication method, you need to
-add the API scope "https://www.googleapis.com/auth/bigquery" to the scope list of your
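+
+A minimal configuration sketch for this first method, assuming the `private_key` auth_method and the `email` / `private_key_path` options used in the examples above (the account email, key path, and passphrase are placeholders):
+
+```apache
+<match dummy>
+  type bigquery
+
+  auth_method private_key   # default
+  email xxxxxxxxxxxx-xxxxxxxxxxxxxxxxxxxxxx@developer.gserviceaccount.com
+  private_key_path /home/username/.keys/00000000000000000000000000000000-privatekey.p12
+  # private_key_passphrase notasecret # default
+
+  project yourproject_id
+  dataset yourdataset_id
+  table tablename
+  ...
+</match>
+```
+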
+#### JSON key of a GCP (Google Cloud Platform) service account
+
+You first need to create a service account (client ID),
+download its JSON key and deploy the key with fluentd.
+
+```apache
+<match dummy>
+ type bigquery
+
+ auth_method json_key
+ json_key /home/username/.keys/00000000000000000000000000000000-jsonkey.json
+
+ project yourproject_id
+ dataset yourdataset_id
+ table tablename
+ ...
+</match>
+```
+
+You can also provide `json_key` as an embedded JSON string like this.
+You only need to include the `private_key` and `client_email` keys from the JSON key file.
+
+```apache
+<match dummy>
+ type bigquery
+
+ auth_method json_key
+ json_key {"private_key": "-----BEGIN PRIVATE KEY-----\n...", "client_email": "xxx@developer.gserviceaccount.com"}
+
+ project yourproject_id
+ dataset yourdataset_id
+ table tablename
+ ...
+</match>
+```
+
+#### Predefined access token (Compute Engine only)
+
+When you run fluentd on a Google Compute Engine instance,
+you don't need to explicitly create a service account for fluentd.
+In this authentication method, you need to add the API scope "https://www.googleapis.com/auth/bigquery" to the scope list of your
Compute Engine instance, then you can configure fluentd like this.
```apache
<match dummy>
type bigquery
@@ -139,10 +181,23 @@
field_float requesttime
field_boolean bot_access,loginsession
</match>
```
+#### Application default credentials
+
+Application Default Credentials provide a simple way to get authorization credentials for calling Google APIs; they are described in detail at http://goo.gl/IUuyuX.
+
+In this authentication method, the credentials returned are determined by the environment the code is running in. Conditions are checked in the following order (a configuration example follows the list):
+
+1. The environment variable `GOOGLE_APPLICATION_CREDENTIALS` is checked. If this variable is specified, it should point to a JSON key file that defines the credentials.
+2. The environment variables `GOOGLE_PRIVATE_KEY` and `GOOGLE_CLIENT_EMAIL` are checked. If these variables are specified, `GOOGLE_PRIVATE_KEY` should hold the `private_key` value and `GOOGLE_CLIENT_EMAIL` the `client_email` value from a JSON key.
+3. The well-known path `$HOME/.config/gcloud/application_default_credentials.json` is checked. If the file exists, it is used as a JSON key file.
+4. The system default path `/etc/google/auth/application_default_credentials.json` is checked. If the file exists, it is used as a JSON key file.
+5. If you are running in Google Compute Engine production, the built-in service account associated with the virtual machine instance will be used.
+6. If none of these conditions is true, an error will occur.
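+
+A configuration sketch for this method; the `auth_method` value `application_default` is assumed here, and no key-related options are given because the credentials are resolved from the environment as described above:
+
+```apache
+<match dummy>
+  type bigquery
+
+  auth_method application_default
+
+  project yourproject_id
+  dataset yourdataset_id
+  table tablename
+  ...
+</match>
+```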
+
### Table id formatting
The `table` and `tables` options accept [Time#strftime](http://ruby-doc.org/core-1.9.3/Time.html#method-i-strftime)
format to construct table ids.
Table ids are formatted at runtime
@@ -280,12 +335,12 @@
You can set the `insert_id_field` option to specify the field to use as the `insertId` property.
```apache
<match dummy>
type bigquery
-
+
...
-
+
insert_id_field uuid
field_string uuid
</match>
```