
Bug 6411 add another example to README.robots

Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
JAMES Mason authored 10 years ago, committed by Chris Cormack
commit 2e8fe364cb
tags/v3.06.00
1 changed file with 10 additions and 0 deletions

README.robots (+10, -0)

@@ -5,13 +5,23 @@ look for the file /robots.txt. If this file is found and has lines that apply
to them they will do as instructed. A very basic robots.txt follows as an
example:

-------------------------------------------
# go away
User-agent: *
Disallow: /
-------------------------------------------

This tells every search engine that cares (User-agent: *) not to index the site
(Disallow everything under /).
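
A quick way to confirm how crawlers interpret this policy is Python's standard-library robots.txt parser (the host name below is just a placeholder):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "# go away",
    "User-agent: *",
    "Disallow: /",
])

# Every path is disallowed for every user agent.
print(rp.can_fetch("Googlebot", "http://opac.example/cgi-bin/koha/opac-main.pl"))  # False
```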

Another, slightly more intelligent robots.txt file allows some bot indexing (good for your site's visibility in Google, etc.), but also stops your Koha from getting thrashed, by excluding the URLs that cause heavy search load:

-------------------------------------------
# do some indexing, but don't index search URLs
User-agent: *
Disallow: /cgi-bin/koha/opac-search.pl
-------------------------------------------
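
The same stdlib parser can verify this more selective policy: search URLs are blocked while other OPAC pages stay crawlable (the host name and the opac-detail.pl URL below are illustrative):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /cgi-bin/koha/opac-search.pl",
])

# Search URLs are blocked for all crawlers...
print(rp.can_fetch("Googlebot", "http://opac.example/cgi-bin/koha/opac-search.pl"))  # False
# ...but record detail pages remain crawlable.
print(rp.can_fetch("Googlebot", "http://opac.example/cgi-bin/koha/opac-detail.pl"))  # True
```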

If you have installed Koha to /usr/local/koha3 then this file would be placed
in the directory /usr/local/koha3/opac/htdocs/. This should prevent search
engines from browsing every biblio record, and every view of each record, on

