README.md in roboto-0.0.2 vs README.md in roboto-0.1.0
- old
+ new
@@ -5,39 +5,24 @@
Roboto is a Rails Engine that gives you the ability to specify environment-specific robots in your Rails 3.1+ application.
Don't let crawlers access your staging environment. This is [bad for SEO](http://www.seomoz.org/learn-seo/duplicate-content).
## Installing
+You can add it to your Gemfile with:
-First, remove the default, generate robots.txt in your Rails App
-
```
- #> rm public/robots.txt
+gem 'roboto'
```
-Next, add roboto to your gemfile:
-
+Next, run the install generator:
```
- gem 'roboto'
+#> rails generate roboto:install
```
-Then, add robot to your routes (config/routes.rb):
+If you already have a robots.txt, it will be kept for your production environment as config/robots/production.txt
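For example, after running the generator your config directory might look like this (staging.txt shown as a hypothetical extra environment):

```
config/robots/
  production.txt   # your previous public/robots.txt, if one existed
  staging.txt      # add one file per environment you want to customize
```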
-```
- Rails.application.routes.draw do
- mount_roboto
- end
-```
-
-You can now specify environment specific robots.txt files in config/robots.
-
-It's recommended for staging that you do disallow crawlers from accessing your site. Once you've created a separate Rails environment for staging, define a config/robots/staging.txt file like so:
-
-```
- #place this in config/robots/staging.txt
- User-Agent: *
- Disallow: /
-```
+You can now specify environment specific robots.txt files in config/robots/.
+By default, crawlers are disallowed from accessing your site in all your environments.
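A disallow-all robots file, like the one recommended for staging, looks like this (a sketch; check your generated files for the exact contents):

```
# config/robots/staging.txt
User-Agent: *
Disallow: /
```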
## Contributing
1. Fork it
2. Create your feature branch (`git checkout -b my-new-feature`)