SHA256: 7211220c8604056197040505b0d69b95bf77c69c6cd93884dfdd16896041afb8

Contents?: true

Size: 303 Bytes

Versions: 10

Compression: none

Stored size: 303 Bytes
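
The checksum and size above can be reproduced from a local copy of the gem. A minimal sketch in Python (the relative path is an assumption about where the gem is unpacked):

import hashlib

# Hash the file as shipped in the gem; the path assumes the gem's root
# directory is the current working directory.
with open("config/samples/robots.txt", "rb") as f:
    data = f.read()

print(len(data), "bytes")                 # expected: 303
print(hashlib.sha256(data).hexdigest())   # expected to match the SHA256 above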

Contents

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site uncomment the next two lines:
# User-Agent: *
# Disallow: /

User-agent: *
Crawl-delay: 3
Disallow: /card/
Disallow: /admin/
Disallow: /account/
Disallow: *?*
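
The rules above ask compliant crawlers to wait 3 seconds between requests and to skip /card/, /admin/, /account/ and, via a non-standard wildcard, any URL containing a query string. As an illustration only (not part of the gem), here is a minimal Python sketch using the standard library's urllib.robotparser to show how a crawler would interpret the prefix rules; example.org is a placeholder host, and the stdlib parser follows the original robots.txt spec, so it does not apply wildcard patterns such as "*?*":

from urllib.robotparser import RobotFileParser

# The directives from the file above, fed to the parser line by line.
RULES = """\
User-agent: *
Crawl-delay: 3
Disallow: /card/
Disallow: /admin/
Disallow: /account/
Disallow: *?*
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Prefix rules are honored: /admin/ is blocked, the front page is not.
print(rp.can_fetch("*", "https://example.org/admin/users"))  # False
print(rp.can_fetch("*", "https://example.org/"))             # True

# Crawl-delay is exposed on Python 3.6+.
print(rp.crawl_delay("*"))  # 3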

Version data entries

10 entries across 10 versions & 1 rubygem

Version Path
wagn-1.13.0.pre2 config/samples/robots.txt
wagn-1.13.0.pre1 config/samples/robots.txt
wagn-1.13.0.pre config/samples/robots.txt
wagn-1.12.13 config/samples/robots.txt
wagn-1.12.12 config/samples/robots.txt
wagn-1.12.11 config/samples/robots.txt
wagn-1.12.10 config/samples/robots.txt
wagn-1.12.9 config/samples/robots.txt
wagn-1.12.8 config/samples/robots.txt
wagn-1.12.7 config/samples/robots.txt