README.md in human_power-0.0.2 vs README.md in human_power-0.0.3
- old
+ new
@@ -22,11 +22,11 @@
If you are using Rails, you can add a sample *config/robots.rb* configuration file and route for `/robots.txt`:
$ rails g human_power:install
-It will allow crawlers to access to the whole site by default.
+It will allow crawlers access to the whole site by default.
Now you can restart your server and visit `/robots.txt` to see what's generated from the new configuration file.
## Usage
@@ -67,9 +67,16 @@
# Add one or more sitemaps
sitemap sitemap_url
sitemap one_url, two_url
```
+
+Then visit `/robots.txt` in your browser.
+
+## Crawlers
+
Please see [user_agents.yml](https://github.com/lassebunk/human_power/blob/master/user_agents.yml) for a list of 170+ built-in user agents/crawlers that you can use as shown above.
+The list is from [UserAgentString.com](http://www.useragentstring.com/pages/Crawlerlist/).
## Caveats
Human Power is great for adding rules to your robots.txt.
You should note, however, that the user agents are sorted alphabetically upon rendering.
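
The alphabetical-sorting caveat is easier to see with a concrete configuration. Below is a minimal sketch of what a *config/robots.rb* file might look like; it assumes the DSL exposes `user_agent`, `allow`, and `disallow` helpers alongside the `sitemap` helper shown earlier in this diff (those method names are assumptions, not confirmed here):

```ruby
# config/robots.rb -- hypothetical sketch; `user_agent`, `allow`, and
# `disallow` are assumed DSL methods, only `sitemap` appears in this diff.

# Rules for one specific crawler. Because user agents are sorted
# alphabetically when robots.txt is rendered, this block may not
# appear in the output in the order it is written here.
user_agent :googlebot do
  disallow "/admin"
end

# Rules for all crawlers.
user_agent :all do
  allow "/"
end

# One or more sitemaps, as shown above.
sitemap "https://example.com/sitemap.xml"
```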