# journalduhacker/public/robots.txt

# block all spiders by default
User-agent: *
Disallow: /

# but allow major ones
User-agent: Googlebot
Allow: /

User-agent: Slurp
Allow: /

User-agent: msnbot
Disallow:

User-agent: Baiduspider
Disallow: /

# keep all other crawlers out of search result pages
User-agent: *
Disallow: /search