SHA-256: be713a853dfc1c1bec50a8edfb5c0f32cfad1569055abc2a8ae2599d71a6f82d

Has contents: true

Size: 200 bytes

Versions: 124

Compression: none

Stored size: 200 bytes
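The digest above can be checked against a local copy of the file. A minimal sketch, assuming the stored bytes are the 200-byte robots.txt shown under Contents, encoded as UTF-8 with a trailing newline (the exact byte-level representation in the gem is an assumption here):

```python
import hashlib

# Hypothetical local copy of the robots.txt shown under Contents;
# the trailing newline is an assumption about the stored bytes.
ROBOTS_TXT = (
    "# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file\n"
    "#\n"
    "# To ban all spiders from the entire site uncomment the next two lines:\n"
    "User-Agent: *\n"
    "Disallow: /\n"
)

def sha256_hex(data: bytes) -> str:
    """Return the lowercase hex SHA-256 digest of data."""
    return hashlib.sha256(data).hexdigest()

payload = ROBOTS_TXT.encode("utf-8")
print(len(payload))          # 200 -- matches the Size field
print(sha256_hex(payload))   # compare against the SHA-256 digest listed above
```

With the trailing newline, the content is exactly 200 bytes, which agrees with the Size and Stored size fields (and with Compression being a no-op).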

Contents

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
User-Agent: *
Disallow: /
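Despite the comment saying "uncomment the next two lines", the two directives are active in this stored file, so it bans all crawlers from the entire site. That effect can be confirmed with Python's standard-library parser (an illustrative copy of the rules, not read from a gem):

```python
from urllib.robotparser import RobotFileParser

# The active directives from the robots.txt body shown above.
ROBOTS_LINES = [
    "User-Agent: *",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(ROBOTS_LINES)

# "Disallow: /" under "User-Agent: *" blocks every path for every crawler.
print(rp.can_fetch("Googlebot", "/"))        # False
print(rp.can_fetch("*", "/some/page.html"))  # False
```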

Version data entries

124 entries across 124 versions and 4 gems (20 shown below)

Version                     Path
houston-core-0.9.2 templates/new-instance/public/robots.txt
houston-core-0.9.1 templates/new-instance/public/robots.txt
houston-core-0.9.0 templates/new-instance/public/robots.txt
houston-core-0.9.0.rc1 templates/new-instance/public/robots.txt
houston-core-0.8.4 templates/new-instance/public/robots.txt
houston-core-0.8.3 templates/new-instance/public/robots.txt
houston-core-0.8.2 templates/new-instance/public/robots.txt
houston-core-0.8.1 templates/new-instance/public/robots.txt
houston-core-0.8.0 templates/new-instance/public/robots.txt
houston-core-0.8.0.pre2 templates/new-instance/public/robots.txt
houston-core-0.8.0.pre templates/new-instance/public/robots.txt
houston-core-0.7.0 templates/new-instance/public/robots.txt
houston-core-0.7.0.beta4 templates/new-instance/public/robots.txt
houston-core-0.7.0.beta3 templates/new-instance/public/robots.txt
houston-core-0.7.0.beta2 templates/new-instance/public/robots.txt
houston-core-0.7.0.beta templates/new-instance/public/robots.txt
bonethug-0.0.99 skel/base/public/robots.txt
houston-core-0.6.3 templates/new-instance/public/robots.txt
houston-core-0.6.2 templates/new-instance/public/robots.txt
houston-core-0.6.1 templates/new-instance/public/robots.txt