Sha256: be713a853dfc1c1bec50a8edfb5c0f32cfad1569055abc2a8ae2599d71a6f82d

Contents?: true

Size: 200 Bytes

Versions: 124

Compression:

Stored size: 200 Bytes

Contents

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
User-Agent: *
Disallow: /
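As a quick illustration of what these two directives mean in practice, the sketch below uses Python's standard-library `urllib.robotparser` to parse the exact rules shown above and check whether a crawler may fetch a URL (the `example.com` URL is a hypothetical placeholder):

```python
from urllib.robotparser import RobotFileParser

# The rules from the robots.txt above: all user agents, entire site disallowed.
rules = """User-Agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /" applied to "User-Agent: *", no path may be fetched
# by any well-behaved crawler.
allowed = parser.can_fetch("*", "https://example.com/any/page")
print(allowed)  # False
```

Note that the comment in the file says to "uncomment the next two lines", yet the `User-Agent` and `Disallow` lines are already uncommented, so as shipped this file bans all compliant spiders from the entire site.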

Version data entries

124 entries across 124 versions & 4 rubygems

Version          Path
bonethug-0.0.85  skel/base/public/robots.txt
bonethug-0.0.84  skel/base/public/robots.txt
bonethug-0.0.83  skel/base/public/robots.txt
bonethug-0.0.82  skel/base/public/robots.txt
bonethug-0.0.81  skel/base/public/robots.txt
bonethug-0.0.80  skel/base/public/robots.txt
bonethug-0.0.79  skel/base/public/robots.txt
bonethug-0.0.78  skel/base/public/robots.txt
bonethug-0.0.77  skel/base/public/robots.txt
bonethug-0.0.76  skel/base/public/robots.txt
bonethug-0.0.75  skel/base/public/robots.txt
bonethug-0.0.73  skel/base/public/robots.txt
bonethug-0.0.72  skel/base/public/robots.txt
bonethug-0.0.71  skel/base/public/robots.txt
bonethug-0.0.70  skel/base/public/robots.txt
bonethug-0.0.69  skel/base/public/robots.txt
bonethug-0.0.68  skel/base/public/robots.txt
bonethug-0.0.67  skel/base/public/robots.txt
bonethug-0.0.66  skel/base/public/robots.txt
bonethug-0.0.63  skel/base/public/robots.txt