
Add instructions to INSTALL and README.robots about robots.txt

Instructions are given in INSTALL and README.robots about adding a robots.txt
file to the opac to prevent search engines from indexing Koha.

Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Branch: 3.2.x
Author: Michael Hafen, 14 years ago; committed by Galen Charlton
Commit: 62f43c04e9

2 changed files with 19 additions:
  INSTALL (+1)
  README.robots (+18)

INSTALL (+1)

@@ -56,6 +56,7 @@ Default installation instructions:
(note that you will want to run Zebra in daemon mode for a production
system)
9. Browse to http://servername:8080/ and answer the questions
10. Optionally add a robots.txt file. See README.robots for details
OR if you want to install all dependencies from CPAN and are root, you can
replace steps 1-3 with "perl install-CPAN.pl" but this is non-standard and

README.robots (+18)

@@ -0,0 +1,18 @@
The robots.txt file.
Search engines, when looking for sites to show in search results, will first
look for the file /robots.txt. If this file is found and has lines that apply
to them, they will do as instructed. A very basic robots.txt follows as an
example:
# go away
User-agent: *
Disallow: /
This tells every search engine that cares (User-agent: *) not to index the
site (Disallow everything under /).
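The Disallow value does not have to be the site root; it can name a single
path prefix instead. As a sketch, assuming the default Koha OPAC URL layout
(where searches run through /cgi-bin/koha/opac-search.pl), the following would
keep crawlers out of catalogue searches while leaving other pages crawlable:
# keep crawlers out of catalogue searches only
User-agent: *
Disallow: /cgi-bin/koha/opac-search.pl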
If you have installed Koha to /usr/local/koha3, then this file would be placed
in the directory /usr/local/koha3/opac/htdocs/. This should prevent search
engines from periodically crawling every biblio record, and every view of each
record, on your Koha install.
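Putting the pieces together, the complete file at that location would read as
follows (a minimal sketch, assuming the /usr/local/koha3 prefix used above;
adjust the path to match your own installation):
# /usr/local/koha3/opac/htdocs/robots.txt
# go away
User-agent: *
Disallow: /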