author     ng0 <ng0@taler.net>  2019-12-10 00:47:36 +0000
committer  ng0 <ng0@taler.net>  2019-12-10 00:47:36 +0000
commit     ca901e4b99a037553a5afe1beeb50255aeecb27c (patch)
tree       0be1261a10486902002a73215dee1572912ebda7 /static/robots.txt
parent     7bf827948000a632667a7664f7e4cc9953acdd7f (diff)
merge new website generation.
Diffstat (limited to 'static/robots.txt')
 static/robots.txt (new, -rw-r--r--) | 20 ++++++++++++++++++++
 1 file changed, 20 insertions(+), 0 deletions(-)
diff --git a/static/robots.txt b/static/robots.txt
new file mode 100644
index 00000000..0a639917
--- /dev/null
+++ b/static/robots.txt
@@ -0,0 +1,20 @@
+#
+# robots.txt
+#
+# This file is to prevent the crawling and indexing of certain parts
+# of your site by web crawlers and spiders run by sites like Yahoo!
+# and Google. By telling these "robots" where not to go on your site,
+# you save bandwidth and server resources.
+#
+# This file will be ignored unless it is at the root of your host:
+# Used:    http://example.com/robots.txt
+# Ignored: http://example.com/site/robots.txt
+#
+# For more information about the robots.txt standard, see:
+# http://www.robotstxt.org/robotstxt.html
+#
+# For syntax checking, see:
+# http://www.frobee.com/robots-txt-check
+
+User-agent: *
+Crawl-delay: 10
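The effect of the added file can be checked with Python's standard-library robots.txt parser. A minimal sketch (the example.com URL is illustrative; the directives mirror the patch above — a wildcard user agent, a crawl delay of 10, and no Disallow rules):

```python
from urllib import robotparser

# Contents of the robots.txt added by this commit (directives only,
# comment lines omitted for brevity).
ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# With no Disallow rules, every path remains fetchable...
print(rp.can_fetch("*", "http://example.com/any/path"))  # True
# ...but compliant crawlers should wait 10 seconds between requests.
print(rp.crawl_delay("*"))  # 10
```

Note that Crawl-delay is not part of the original robots.txt standard; support varies by crawler (Google, for instance, ignores it), so it is a politeness hint rather than a guarantee.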