journalduhacker/public/robots.txt


# block all spiders by default
User-agent: *
Disallow: /

# but allow major ones
User-agent: Googlebot
Allow: /

User-agent: Slurp
Allow: /

User-agent: msnbot
Allow: /

# keep blocking Baidu
User-agent: Baiduspider
Disallow: /