To test:
1 - Find or add a record by author Slavoj Žižek
2 - Search for 'Zizek'
3 - No results
4 - Apply patch
5 - Reindex
6 - Search again
7 - Success!
https://bugs.koha-community.org/show_bug.cgi?id=22073
Signed-off-by: Eric Phetteplace <ephetteplace@cca.edu>
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
To test:
1 - Apply first patch
2 - Attempt searching by arp, no results
3 - Apply this patch
4 - Copy bib1.att and ccl.properties to the correct locations
5 - Restart zebra
6 - Rebuild indexes
7 - Search again, success!
Signed-off-by: Margie Sheppard - Central Kansas Library System CKLS <msheppard@ckls.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Since we now require the <branch> block, we should add it to the config
templates
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Remove:
- BIB_INDEX_MODE and AUTH_INDEX_MODE env var
- bib_index_mode and auth_index_mode options from scripts
- Warnings from the about page; just one warning is kept, shown if zebra_bib_index_mode or
zebra_auth_index_mode still exists in the config and is set to grs1
Test plan:
- Install Koha from src
- Install Koha from pkg
- Read the code, carefully!
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Rebased
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
The Rewrite rules for Apache don't work unless you're using
debian/templates/apache-shared-opac-plack.conf or
debian/templates/apache-shared-intranet-plack.conf.
This patch fixes the Rewrite rules for the non-Plack Debian
Apache configuration templates as well as the standard
Apache configuration file that comes with Koha.
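For illustration only, the kind of rule needed is one that forwards bare
/api/v1/ paths on to the CGI script; one plausible shape (a sketch, not
necessarily the exact rules from the patch) would be:
  RewriteCond %{REQUEST_URI} !^/api/v1/app\.pl
  RewriteRule ^/api/v1/(.*)$ /api/v1/app.pl/api/v1/$1 [PT]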
__BEFORE APPLYING__
1. Visit /api/v1/app.pl/api/v1/spec on your git dev install
2. This should display a large page of JSON
3. Visit /api/v1/spec on your git dev install
4. This should generate a 404 error
__APPLY PATCH__
__AFTER APPLYING__
5. Visit /api/v1/app.pl/api/v1/spec on your git dev install
6. This should display a large page of JSON
7. Visit /api/v1/spec on your git dev install
8. This should display a large page of JSON (identical to
the one from earlier steps)
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Passed QA with few notes posted separately to Bugzilla.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch adds the index definitions for zebra faceting of ccode in
koha for marc21, normarc and unimarc.
We also add lines to the templates to expose the new facet and enable
non-zebra faceting for ccode too.
Signed-off-by: David Cook <dcook@prosentient.com.au>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Improvements:
1) Index settings moved from code to etc/searchengine/elasticsearch/index_config.yaml. An alternative can be specified in koha-conf.xml.
2) Field settings moved from code to etc/searchengine/elasticsearch/field_config.yaml. An alternative can be specified in koha-conf.xml.
3) mappings.yaml has been moved from admin/searchengine/elasticsearch to etc/searchengine/elasticsearch. An alternative can be specified in koha-conf.xml.
4) Default settings have been improved to remove punctuation from phrases used for sorting etc.
5) State variables are used for storing configuration to avoid parsing it multiple times.
6) A possibility to reset the fields too has been added to the reset operation of mappings administration.
7) mappings.yaml has been moved from admin/searchengine/elasticsearch to etc/searchengine/elasticsearch.
8) A stdno field type has been added for standard identifiers.
To test:
1) Run tests in t/Koha/SearchEngine/Elasticsearch.t
2) Clear tables search_fields and search_marc_map
3) Go to admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1
4) Verify that admin/searchengine/elasticsearch/mappings.pl displays the mappings properly, including ISBN and other standard number fields.
5) Index some records using the -d parameter with misc/search_tools/rebuild_elastic_search.pl to recreate the index
6) Verify that you can find the records
7) Put <elasticsearch_index_mappings>non_existent</elasticsearch_index_mappings> into koha-conf.xml
8) Verify that admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1 fails because it can't find non_existent.
9) Copy etc/searchengine/elasticsearch/mappings.yaml to a new location and make elasticsearch_index_mappings setting in koha-conf.xml point to it.
10) Make a change in the new mappings.yaml.
11) Clear table search_fields (mappings reset doesn't do it yet, see bug 20248)
12) Go to admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1
13) Verify that the changes you made are now visible in the mappings UI
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Bug 20073: Move Elasticsearch yaml files back to admin directory
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 20187 has changed the JS and CSS rewrite rules to :
RewriteRule ^(.*)_[0-9][0-9]\.[0-9][0-9][0-9][0-9][0-9][0-9][0-9].js$ $1.js [L]
RewriteRule ^(.*)_[0-9][0-9]\.[0-9][0-9][0-9][0-9][0-9][0-9][0-9].css$ $1.css [L]
This patch changes these rules to use [0-9]{N}, merges them into a single rule,
and escapes the dot before the js and css extensions.
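For illustration, a merged rule along those lines might look like this
(a sketch only; the exact rule is in the patch):
  RewriteRule ^(.*)_[0-9]{2}\.[0-9]{7}\.(js|css)$ $1.$2 [L]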
Test plan :
1) Go to intranet and opac
2) Check CSS and JS are doing well
3) Apply patch changes on our Apache configuration
4) Reload intranet and opac pages (Ctrl + F5)
5) Check CSS and JS are doing well
Signed-off-by: Charles Farmer <charles.farmer@inLibro.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This reverts commit f489d2034b.
This commit breaks the install process when using debian packages.
Reverting as we are very close to the 18.05.00 release
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch adds an option to the koha-conf.xml file for specifying
a temporary uploaded files directory.
The koha-create script is adjusted to handle it and a convenient option
switch is added. If omitted, it will default to
/var/lib/koha/<instance>/uploads_tmp.
koha-create-dirs is patched to create the required directory with the
right permissions.
The docs get the new parameter documented.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Benjamin Rokseth <benjamin.rokseth@deichman.no>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
It implements only the "client credentials" flow with no scopes
support. API clients are tied to an existing patron and have the same
permissions as the patron they are tied to.
API Clients are defined in $KOHA_CONF.
Test plan:
0. Install Net::OAuth2::AuthorizationServer 0.16
1. In $KOHA_CONF, add an <api_client> element under <config>:
<api_client>
<client_id>$CLIENT_ID</client_id>
<client_secret>$CLIENT_SECRET</client_secret>
<patron_id>X</patron_id> <!-- X is an existing borrowernumber -->
</api_client>
2. Apply patch, run updatedatabase.pl and reload starman
3. Install Firefox extension RESTer [1]
4. In RESTer, go to "Authorization" tab and create a new OAuth2
configuration:
- OAuth flow: Client credentials
- Access Token Request Method: POST
- Access Token Request Endpoint: http://$KOHA_URL/api/v1/oauth/token
- Access Token Request Client Authentication: Credentials in request
body
- Client ID: $CLIENT_ID
- Client Secret: $CLIENT_SECRET
5. Click on the newly created configuration to generate a new token
(which will be valid only for an hour)
6. In RESTer, set HTTP method to GET and url to
http://$KOHA_URL/api/v1/patrons then click on SEND
If patron X has permission 'borrowers', it should return 200 OK
with the list of patrons
Otherwise it should return 403 with the list of required permissions
(Please test both cases)
7. Wait an hour (or run the following SQL query:
UPDATE oauth_access_tokens SET expires = 0) and repeat step 6.
You should have a 403 Forbidden status, and the token must have been
removed from the database.
8. Create a bunch of tokens using RESTer, make some of them expire
using the previous SQL query, and run the following command:
misc/cronjobs/cleanup_database.pl --oauth-tokens
Verify that expired tokens were removed, and that the others are
still there
9. prove t/db_dependent/api/v1/oauth.t
[1] https://addons.mozilla.org/en-US/firefox/addon/rester/
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
- Removed merge marker
- Changed include path in favor of using the Asset tt plugin (bug 20538)
- Changed access_dir to a two-level entry for clarity
Test plans stay the same; just make sure that the two-level configuration entry
works properly and everything passes QA.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This squash contains all of these commits:
- Adds a page to access log files on the server from the intranet
- Update ID to allow for permalinking
- Rename config to 'accessdir' and fix qa
- Allows for multiple directories to be accessible
- Update the link under reports
- (Follow-up) Fixing merge error and cosmetic changes
- (Follow-up) Fix tab chars and move javascript to the footer
- (QA Follow-up) Fix datatable
- Make filename unicode-proof, renamed accessdir to access_dir and fix update
Test plans:
- Apply patch, update database
- Add to koha-conf:
<access_dir>/tmp/koha-public/one</access_dir>
<access_dir>/tmp/koha-public/two</access_dir>
<access_dir>/tmp/koha-public</access_dir>
- Create these directories ( mkdir /tmp/koha-public , etc...)
- Create these files:
echo "hello world!" > /tmp/koha-public/❤
echo "test" > /tmp/koha-public/one/samename.txt
echo "this is not the same" > /tmp/koha-public/two/samename.txt
- Login as Superadmin, go to tools > reports files
- Click on ❤, make sure it's downloadable and readable
- Click on both samename.txt, look inside and make sure the file is different
- Log in as a NON-superadmin. Go under tools, see no Report/Log entry under the third column
- Add the tools/access_file permission to the user
- See the new entry under the tools third column.
- Validate the link is OK.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Improvements:
1) Index settings moved from code to etc/searchengine/elasticsearch/index_config.yaml. An alternative can be specified in koha-conf.xml.
2) Field settings moved from code to etc/searchengine/elasticsearch/field_config.yaml. An alternative can be specified in koha-conf.xml.
3) mappings.yaml has been moved from admin/searchengine/elasticsearch to etc/searchengine/elasticsearch. An alternative can be specified in koha-conf.xml.
4) Default settings have been improved to remove punctuation from phrases used for sorting etc.
5) State variables are used for storing configuration to avoid parsing it multiple times.
6) A possibility to reset the fields too has been added to the reset operation of mappings administration.
7) mappings.yaml has been moved from admin/searchengine/elasticsearch to etc/searchengine/elasticsearch.
8) A stdno field type has been added for standard identifiers.
To test:
1) Run tests in t/Koha/SearchEngine/Elasticsearch.t
2) Clear tables search_fields and search_marc_map
3) Go to admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1
4) Verify that admin/searchengine/elasticsearch/mappings.pl displays the mappings properly, including ISBN and other standard number fields.
5) Index some records using the -d parameter with misc/search_tools/rebuild_elastic_search.pl to recreate the index
6) Verify that you can find the records
7) Put <elasticsearch_index_mappings>non_existent</elasticsearch_index_mappings> into koha-conf.xml
8) Verify that admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1 fails because it can't find non_existent.
9) Copy etc/searchengine/elasticsearch/mappings.yaml to a new location and make elasticsearch_index_mappings setting in koha-conf.xml point to it.
10) Make a change in the new mappings.yaml.
11) Clear table search_fields (mappings reset doesn't do it yet, see bug 20248)
12) Go to admin/searchengine/elasticsearch/mappings.pl?op=reset&i_know_what_i_am_doing=1
13) Verify that the changes you made are now visible in the mappings UI
Signed-off-by: Nicolas Legrand <nicolas.legrand@bulac.fr>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Bug 18571 added default ES configuration for packaging config.
This patch adds it to dev install in koha-conf.xml.
Database name is used in index_name to allow multiple installs.
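For reference, the dev-install koha-conf.xml block being added looks roughly
like this (values are illustrative, not copied from the patch):
  <elasticsearch>
      <server>localhost:9200</server>
      <index_name>koha_kohadev</index_name>
  </elasticsearch>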
Test plan :
- Run dev install
- Install ElasticSearch server and Koha deps
- Enable ElasticSearch in Koha
- Check indexing and searching works directly
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Some libraries need to be able to send additional patron data from the
extended patron attributes in made up SIP2 fields for the patron
information and patron status responses.
Test Plan:
1) Apply this patch
2) Create 3 new patron attributes with the codes CODE1, CODE2, CODE3.
Make at least one repeatable.
3) Create a patron, add those attributes for the patron, make sure there
are at least two instances of the repeatable code
4) Edit your SIP2 config file, add the following within the login stanza:
<patron_attribute field="XX" code="CODE1" />
<patron_attribute field="XY" code="CODE2" />
<patron_attribute field="XZ" code="CODE3" />
5) Using the sip cli emulator, run patron_information and
patron_status_request messages for the patron
6) Note the values you set for the patron attributes are sent in the
corresponding fields!
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Daniel Mauchley <dmauchley@duchesne.utah.gov>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: added parentheses on line 488 when assigning hashref to array.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
The flags [N,L] make no sense: next and last combined.
Choosing L here to stop the rewriting process.
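As a sketch (the rule below is illustrative; only the flag change matters),
a rule that previously ended in [N,L] would now end in just [L]:
  RewriteRule ^/bib/([^\/]*)$ /cgi-bin/koha/opac-detail.pl?biblionumber=$1 [L]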
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Victor Grousset <victor.grousset@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Koha has the ability to include custom css in the apache configuration.
If a library has any custom css ( or adds a custom js file in some way ),
and that file has an underscore in it ( e.g. my_custom.css ), the
apache rewrite rule will convert it to my.css and thus it will 404.
We should make the rewrite rules as specific as possible for the
format we are using.
Test Plan:
1) Set OPAC_CSS_OVERRIDE to a file with an underscore in it
2) Note it does not work
3) Apply this patch
4) Update the apache rewrite rules to match those in the patch
For kohadevbox, just run /home/vagrant/misc4dev/cp_debian_files.pl
5) Restart apache
6) Reload the page, your custom css should load now!
Signed-off-by: Victor Grousset <victor.grousset@biblibre.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Olli-Antti Kivilahti <olli-antti.kivilahti@jns.fi>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
This patch adds to Makefile.PL the capability of handling the template_cache_dir configuration entry.
To do so, it:
- Adds the --template-cache-dir option switch (consistency with koha-create)
- Sets a default value for template_cache_dir to '/tmp/koha'
- Adds a dialog requesting the path for the template cache dir to Makefile.PL
- It tweaks etc/koha-conf.xml so it is correctly changed by rewrite-config.PL
To test:
- Apply this patch
- Run:
$ perl Makefile.PL --template-cache-dir your/favourite/dir
=> SUCCESS: The dialogs don't ask for template cache dir
=> SUCCESS: The resulting Makefile contains an entry for TEMPLATE_CACHE_DIR which value
matches what we passed to --template-cache-dir
- Run:
$ perl Makefile.PL
- When prompted for a template cache dir, enter whatever you want
=> SUCCESS: The default you are offered is /tmp/koha
=> SUCCESS: At the end of the process, Makefile contains what we put in there
- Run:
$ sudo make install
=> SUCCESS: The resulting koha-conf.xml contains a <template_cache_dir> entry containing
whatever you picked for that purpose.
- Sign off :-D
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This Commit is at the heart of adding an interlibrary loans framework
for Koha. The framework does not prescribe a particular workflow.
Instead it provides a general framework that can be extended &
implemented by individual backends whose responsibility it is to
implement a specific workflow.
The module is largely self-sufficient: it adds new tables to the Koha
database and touches only a few files in the Koha source tree.
Primarily, we add our files to the Makefile and the koha-conf.xml,
define ill paths for the REST API, and introduce links from the main
intranet, opac pages & user permissions.
Outside of this we simply add new files & functionality.
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Benjamin Rokseth <benjamin.rokseth@kul.oslo.kommune.no>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Some SIP services ( such as Comprise ) require that an attempt at
over-paying a patron's account via SIP2 should fail, rather than create
a credit on the account. We should make this a configurable option on a
per-login basis in the SIP2 config file.
Test Plan:
1) Apply this patch
2) Enable the new parameter
disallow_overpayment="1"
for the login to be used in this test.
3) Restart your SIP server
4) Create or find a patron with fines
5) Attempt to send a payment via SIP for more than what the
patron's balance is
6) Note the response indicates a payment failure
7) Attempt to send a payment via SIP for the account balance or
less
8) Note the response indicates the payment has succeeded
9) Verify in Koha that the payment was processed
Signed-off-by: Rhonda Kuiper <kuiper@roundrocktexas.gov>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Koha's SIP2 server sends the patron's name in the format "Firstname
Surname" which is not very good for machine reading. We need to allow
the format of the patron name to be customized in a manner similar to
what is done with the DA field on bug 16755.
Test Plan:
1) Apply this patch, start or restart your SIP server
2) Find a patron with a first and last name
3) Send a patron information request via the sip2 cli tool
4) Note the AE field has the format "<firstname> <surname>" ( i.e. the current behavior )
5) Add this parameter to the login stanza you are using:
ae_field_template="[% patron.surname %][% IF patron.firstname %], [% patron.firstname %][% END %]"
6) Restart your SIP server
7) Repeat step 3
8) Note the AE field now has the format "<surname>, <firstname>"
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Benjamin Daeuber <BDaeuber@cityoffargo.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
The SIP2 DA field that Koha transmits is an odd and arbitrary format
that some SIP2 clients cannot handle. It would be best if this
format were customizable on a per-login basis in the same manner as
the AV field.
Test Plan:
1) Find an item that is checked out with holds
2) Return the item via SIP2 ( using the SIP2 cli emulator )
3) Note the value of the DA field
4) Apply this patch, restart your SIP2 server
5) Repeat step 2
6) Note the DA field value has not changed
7) Add this parameter to the login stanza you are using:
da_field_template="[% patron.surname %][% IF patron.firstname %], [% patron.firstname %][% END %]"
8) Restart the SIP2 server again
9) Repeat step 2
10) Note the DA field returned is now in the format "$surname, $firstname"
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Benjamin Daeuber <BDaeuber@cityoffargo.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
One of the RewriteCond directives in koha-httpd.conf was affecting
the wrong RewriteRule after its original RewriteRule was commented out
years ago.
_TEST PLAN_
0) Before applying patch, build Koha from source
*) make
*) make install (or make upgrade)
*) Copy or symlink etc/koha-httpd.conf to your Apache vhost directory
(and enable if you're on a Debian based system)
*) Restart Apache
1) Make sure that you have at least 1 bibliographic record in Koha
(URL like this http://server:port/cgi-bin/koha/opac-detail.pl?biblionumber=1)
2) Go to http://server:port/bib/1
3) Note that you get a 404 error
4) Apply the patch
5) Rebuild Koha from source as per step 0
6) Go to http://server:port/bib/1
7) Note that you now see the same page as you would if you went to
http://server:port/cgi-bin/koha/opac-detail.pl?biblionumber=1
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch adds a numeric index 'not-onloan-count' containing the value
of 999$x. This subfield is filled by 'rebuild_zebra.pl' by making use of
(bug's 18208) 'EmbedItemsAvailability' filter.
bib1.att and indexes definitions are updated accordingly.
To test:
- Apply the patch
- Pick the right biblio-zebra-indexdefs.xsl file for your setup and
replace the one your Zebra uses [1]
- Replace your bib1.att
- Replace your ccl.properties
- Have at least one record with more than one item, checkout some
item(s) from that record(s).
- Rebuild zebra's indexes:
$ sudo koha-shell kohadev
k$ cd kohaclone
k$ misc/migration_tools/rebuild_zebra.pl -r -b -v -k
(notice the dump directory is kept, you can try the XSLT yourself
running:
$ xsltproc \
etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl \
/tmp/the_dump_dir/biblios/exported_records | less
=> SUCCESS: There are records with the not-onloan-count index, and the
value is correct!
- Check Zebra yourself:
$ yaz-client unix:/var/run/koha/kohadev/bibliosocket
Z> base biblios
Z> find @attr 1=9013 @attr 2=5 @attr 4=109 0
=> SUCCESS: The search matches the number of records with not-onloan
items.
Z> s 1+1
=> SUCCESS: Records with 999$x having a value higher than 0 are rendered
- Sign off :-D
Note: While this work is complete for its purpose, it is part of an
attempt to create a better way of filtering by availability.
Sponsored-by: ByWater Solutions
[1] In kohadevbox this would be
/etc/koha/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
Edit: Added the missing XSLT changes for UNIMARC and NORMARC
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Many SIP2 services such as those by Comprise Technologies are able to or
require that an ILS be able to accept writeoffs via SIP2. The SIP2
protocol specifies that payment type be a two digit number, but does not
specify a code for writeoffs. To this end we should allow the write-off
code to be specified in the SIP2 config on a per-account basis so that
if different vendors use different fixed codes for write-offs we can
handle that gracefully.
Test Plan:
1) Apply this patch
2) Modify your SIP2 config to include
payment_type_writeoff="06"
in the login portion of the account you will be using for the test.
3) Restart your SIP2 server
4) Create a fee for a patron
5) Send a SIP2 fee paid message specifying the payment type code we
defined earlier, with a payment amount that is *not* equal to the
amount outstanding for the fee.
6) Note the fee paid response indicates the payment failed
7) Repeat step 5, but this time send the amount outstanding as the
payment amount
8) Note that the fee paid response indicates a successful payment
9) Note in Koha that the fee has been written off!
Signed-off-by: Rhonda Kuiper <kuiper@roundrocktexas.gov>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
In summary, changes are:
1) If you have chosen MySQL, Makefile.PL will ask you if you want TLS (default:
"no"), and then the locations for CA cert, client cert and client key
(reasonable defaults are provided). Settings <tls>, <ca>, <cert> and <key> are
added in koha-conf.xml
2) If <tls>yes</tls> in koha-conf.xml, the installer and database connection
scripts add the TLS options in both DBI connection strings and mysql command
line
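As an illustration (paths are placeholders, not from the patch), the resulting
koha-conf.xml fragment would look something like:
  <tls>yes</tls>
  <ca>/etc/mysql/ssl/ca-cert.pem</ca>
  <cert>/etc/mysql/ssl/client-cert.pem</cert>
  <key>/etc/mysql/ssl/client-key.pem</key>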
To test
1/ Apply patch
2/ Check everything still works and db connections are the same as before
3/ Either run Makefile.PL and step through the options or edit your koha-conf.xml to
enable TLS
4/ Check db connections are still working
Patch provided to me by Dimitris Kamenopoulos and I reformatted it into a git patch,
any errors are probably mine
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
TEST PLAN
1. Make sure you have no items checked out.
2. Run sudo koha-rebuild-zebra -f -v kohadev.
3. Go to search the catalog and search.
4. Check items availability and then click on limit to currently
available items.
5. This should return no results.
6. Apply patch and reload.
7. Results should show.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Attribute 14: " Specifies whether un-indexed fields should be ignored. A
zero value (default) throws a diagnostic when an un-indexed field is
specified. A non-zero value makes it return 0 hits."
From http://www.indexdata.com/zebra/doc/querymodel-zebra.html
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Looking at the default framework's fields that are linked to authority
records, there's a divergence with the Zebra index definitions.
This leads to the authority usage count being incorrect for users searching
for authority records.
MariaDB [koha_kohadev]> SELECT tagfield,tagsubfield,authtypecode FROM
marc_subfield_structure WHERE authtypecode IS NOT NULL AND
authtypecode<>'' AND frameworkcode='' GROUP BY
tagfield,tagsubfield,authtypecode ;
+----------+-------------+--------------+
| tagfield | tagsubfield | authtypecode |
+----------+-------------+--------------+
| 100 | a | PERSO_NAME |
| 110 | a | CORPO_NAME |
| 111 | a | MEETI_NAME |
| 130 | a | UNIF_TITLE |
| 440 | a | UNIF_TITLE |
| 600 | a | PERSO_NAME |
| 610 | a | CORPO_NAME |
| 611 | a | MEETI_NAME |
| 630 | a | UNIF_TITLE |
| 648 | a | CHRON_TERM |
| 650 | a | TOPIC_TERM |
| 651 | a | GEOGR_NAME |
| 654 | a | TOPIC_TERM |
| 655 | a | GENRE/FORM |
| 656 | a | TOPIC_TERM |
| 657 | a | TOPIC_TERM |
| 658 | a | TOPIC_TERM |
| 662 | a | GEOGR_NAME |
| 690 | a | TOPIC_TERM |
| 691 | a | GEOGR_NAME |
| 696 | a | PERSO_NAME |
| 697 | a | CORPO_NAME |
| 698 | a | MEETI_NAME |
| 699 | a | UNIF_TITLE |
| 700 | a | PERSO_NAME |
| 710 | a | CORPO_NAME |
| 711 | a | MEETI_NAME |
| 730 | a | UNIF_TITLE |
| 796 | a | PERSO_NAME |
| 797 | a | CORPO_NAME |
| 798 | a | MEETI_NAME |
| 799 | a | UNIF_TITLE |
| 800 | a | PERSO_NAME |
| 810 | a | CORPO_NAME |
| 811 | a | MEETI_NAME |
| 830 | a | UNIF_TITLE |
| 896 | a | PERSO_NAME |
| 897 | a | CORPO_NAME |
| 898 | a | MEETI_NAME |
| 899 | a | UNIF_TITLE |
+----------+-------------+--------------+
This patch adds the missing ones to the authority number index as it is
done for the rest of the fields.
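For illustration, an entry of the kind being added, shown here for tag 648
(the element names follow biblio-koha-indexdefs.xml, but treat this as a
sketch rather than the exact patch content):
  <kohaidx:index_subfields tag="648" subfields="9">
      <kohaidx:target_index>Koha-Auth-Number:w</kohaidx:target_index>
  </kohaidx:index_subfields>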
To test:
- Verify that
etc/zebradb/marc_defs/marc21/biblios/biblio-koha-indexdefs.xml
contains entries pointing the $9 subfield of all the fields in the
'tagfield' column above, to the Koha-Auth-Number:w index.
- Sign off :-D
Signed-off-by: Hugo Agud <hagud@orex.es>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch restores access to zebra facets (or zebra::snippet) with YAZ 5.8.1 or higher.
It was failing due to the <retrieval syntax="xml" name="zebra::*" /> entry in
retrieval-info-bib-dom.xml, which IndexData said wasn't even needed to
get that access.
Edit: I amended the commit message (tcohen)
Signed-off-by: Colin Campbell <colin.campbell@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
I tested on kohadevbox and found no regression or behaviour change. I
will provide a followup for the packages.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
It would be nice if we could control the number of workers and max
requests on a per instance basis, rather than the numbers being
hardcoded in the plack startup script.
Test Plan:
1) Build a new package of Koha with this patch applied ; )
2) Verify koha-plack still works
3) Add the following to the config section of your koha-conf.xml:
<plack_max_requests>75</plack_max_requests>
<plack_workers>4</plack_workers>
4) Stop plack
5) Start plack
6) Verify the number of workers and max requests settings took effect!
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Larry Baerveldt <larry@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Rebased against master and added a description for the new configuration
entries
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
And comment it out, as we don't know what the sysop's preferences are
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Adding POD, changing the config files to live in a path pointed to by
koha-conf.xml
This means multiple instances can have their own config
Please test the 2 patches together
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Works as advertised. Arbitrary arguments can be passed to SMS:Send
drivers. If an argument is provided that has already been set by
SMS::Send or the driver, it will be overwritten by the value from
the YAML file. My only suggestion for an improvement would be an
example of what the YAML should look like, but that is a minor thing.
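For what it's worth, a minimal sketch of such a YAML file (the argument names
here are hypothetical; use whatever your SMS::Send driver expects) could be:
  ---
  login: 'my_account'        # hypothetical driver argument
  password: 'my_secret'      # hypothetical driver argument
  _sender: 'MyLibrary'       # hypothetical driver-specific argument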
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch makes Zebra index the 648$9 link for chronological terms on
bibliographic records. This way an authority search on chronological terms
will show the right number in 'Used in X records' message.
To test:
- Have a record with a 648 field, linked to an authority record (i.e. with an authid on 648$9).
- Search for the record, notice it is indexed.
- Perform an authority search for the chronological term
=> FAIL: the term is linked to our record, but koha shows '0' count.
- Apply the patch
- Run:
$ cd kohaclone
$ xsltproc etc/zebra/xsl/koha-indexdefs-to-zebra.xsl \
etc/zebradb/marc_defs/marc21/biblios/biblio-koha-indexdefs.xml \
> etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
$ git diff
=> SUCCESS: Notice the shipped etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
is up-to-date
- Run:
$ sudo cp etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl \
/etc/koha/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
$ sudo koha-restart-zebra kohadev
$ sudo koha-rebuild-zebra -f -b -v kohadev
- Search for the record, notice it is indexed.
- Perform an authority search for the chronological term
=> SUCCESS: the term is linked to our record, usage count is 1
- Sign off :-D
I assume NORMARC is similar in this regard. Feel free to fail it if the NORMARC part of the
patch is wrong.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Hugo Agud <hagud@orex.es>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Test plan:
1) Apply patch
2) Create new instance with parameter --zebralang cs
3) Insert some records with basic Latin characters and some with Czech characters (for example: "č" - should be sorted after "c", "š" - should be sorted after "s")
4) Try to search the catalog (staff and OPAC) and sort by a field other than relevance - title or author for instance
5) Records should be sorted correctly by Czech rules
6) Look at code and confirm it is ok
Signed-off-by: radiuscz <radek.siman@centrum.cz>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
I did not test this patch, but trust the author and the sign-off
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
to test bug...
1/ make a random user
2/ change to random user
3/ access any zebra database with random user and no authentication
4/ read zebra database
here is a transcript of the bug...
---------------------------
root@xen1:~# adduser bob
root@xen1:~# su -l bob
bob@xen1:~$ cd /var/lib/koha
bob@xen1:/var/lib/koha$ ls
topsecret
bob@xen1:/var/lib/koha$ yaz-client unix:/var/run/koha/topsecret/bibliosocket
Connecting...OK.
Sent initrequest.
Connection accepted by v3 target.
ID : 81
Name : Zebra Information Server/GFS/YAZ
Version: 4.2.30 98864b44c654645bc16b2c54f822dc2e45a93031
Options: search present delSet triggerResourceCtrl scan sort extendedServices namedResultSets
Elapsed: 0.001002
Z> base biblios;
Z> find the
Sent searchRequest.
Received SearchResponse.
Search was a success.
Number of hits: 1130, setno 2
SearchResult-1: term=the cnt=1130
records returned: 0
Elapsed: 0.005518
Z> show
Sent presentRequest (1+1).
Records: 1
[biblios]Record type: USmarc
01824cam a2200397 a 4500
001 000045782309
003 AuCNLKIN
005 20111013213222.0
008 100707s2011 maua 001 0 e
...
---------------------------
5/ apply the changes to the config files of the Koha instance that you plan to test
6/ restart zebra for instance
# sudo koha-restart-zebra topsecret
7/ repeat steps 2 and 3, but receive a 'bad user/passwd ' error from zebra
bob@xen1:~$ yaz-client unix:/var/run/koha/topsecret/bibliosocket
Connecting...OK.
Sent initrequest.
Connection rejected by v3 target.
1: code=1011 (Init/AC: Bad Userid and/or Password),
NOTE: this patch currently only fixes newly created instances; it won't fix existing instances
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Good catch Mason
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
This patch removes Memcached configurations from the shipped apache files.
Note: testing is not actually needed for this patch, as it is really trivial. But I
include testing steps, just in case QA members require it.
To test:
- Apply the patch
- Do a (standard/dev/single) Koha install
=> SUCCESS: Verify the resulting koha-httpd.conf file doesn't include memcached data
- Have a packages install
- Replace
* /etc/koha/apache-site-https.conf.in
* /etc/koha/apache-site.conf.in
with the ones from this patch
- Create an instance
=> SUCCESS: The apache configuration doesn't include memcached configurations
- Sign off :-D
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch introduces the memcached_servers and memcached_namespace
configuration entries as expected by 11921.
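For reference, the entries look like this (values illustrative):
  <memcached_servers>127.0.0.1:11211</memcached_servers>
  <memcached_namespace>koha_kohadev</memcached_namespace>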
Note: better test this one and the followup together to ease the process.
To test:
- Do a source Koha install (dev, standard, single)
=> SUCCESS: The resulting koha-conf.xml file includes the memcached_* entries
which are filled with the right values.
- In kohadevbox (packages setup):
- Replace /etc/koha/koha-conf-site.xml.in with the one from this patch
- Create a new koha instance
=> SUCCESS: The instance's koha-conf.xml includes the relevant entries
- Sign off
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Remove trailing whitespace and replace tabs with 4 spaces.
Signed-off-by: Claire Gravely <claire_gravely@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Removes commented line from bib1.att.
Adjust OCLC-number to Other-control-number in comment of ccl properties.
No need to explicitly add 035$a and $z if you index 035 completely in
record.abs as well as biblio-koha-indexdefs.xml.
Rerun koha-indexdefs-to-zebra.xsl on index defs.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
1) Apply patch
2) Make sure that you have a bib that has MARC21 035$a (and possibly also 035$z) populated.
(before step 3) Replace all modified Zebra files and restart the Zebra server
3) Rebuild zebra: misc/migration_tools/rebuild_zebra.pl -x -b -z
4) Add the following to the intranetuserjs syspref:
$(document).ready(function(){
// Add Other Control Number to advanced search
if (window.location.href.indexOf("catalogue/search.pl") > -1) {
$(".advsearch").append('<option value="Other-control-number">Other Control Number</option>');
}
});
5) Do an advanced search, select "Other Control Number" from the search menu, then enter the Other Control Number from 035$a of the bib specified in step 2.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works, no koha-qa errors
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
A long-standing typo in our apache config files:
[intranet]/search refers to search.pl (which does not exist).
This patch points it to catalogue/search.pl instead.
Test plan:
Run an install or copy the change from apache-shared-intranet.conf or
koha-httpd.conf to your apache config. Restart Apache and check
if http://[your staff client]/search works.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Tested by making manual changes according to the patch. Did not test a
new installation.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Most selfchecks have persistent connections and send a
periodic status request at intervals (approximately every 5 minutes appears
to be the norm). The timeout was dropping connections by default every
30 seconds, which for the client appears as a very flaky network.
This patch adds a separate parameter client_timeout that can be
used if you do want to force a disconnect if the client sends
no requests for a period. The sample config sets it to 600, but you
can also define a 0 value meaning no timeout. If the parameter is not
defined, it will fall back to the service timeout.
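As an illustration (the attribute values and the surrounding service line are
a sketch of what the sample SIP config contains, not a verbatim copy):
  <service port="8023/tcp@localhost" transport="RAW" protocol="SIP/2.00" timeout="60" client_timeout="600" />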
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Restored this patch from Colin in order to separate it from the
get_timeout patch. Adjusted the commit message slightly.
The original value of 600 from Colin's earlier patch may give less
discussion than setting to 0 (no timeout) in a later proposal.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch indexes 024$a into the "phrase" index type, and the "url" index type,
if the 024$2 equals "uri".
TEST PLAN
1) Apply the patch.
1b) If you're using a gitified Koha or a git install,
you'll need to upgrade your instance or copy your zebradb files
over to /etc/koha/zebradb or your "kohadev" directory.
2) Add a 024$a with a URL like http://libris.kb.se/resource/bib/219553
to a bibliographic record
3) Re-index Zebra
4) Type "id-other,st-urx,fuzzy=http://libris.kb.se/resource/bib/219553"
into the "Search the catalog" box in the Staff Client and search
5) Note that you retrieve your record
NOTE: The fuzzy is required because Koha's query "parsing" functions change
http:// to http=// which won't correctly match the value in the "Identifier-other:u" index.
NOTE: Alternatively, you could do the following search instead:
"id-other,phr=http libris kb se resource bib 219553".
It would work as well by using the "Identifier-other:p" index.
Advanced tester version:
4) In a terminal window, find the "koha-conf.xml" file in your "etc" directory.
5) Open "koha-conf.xml" and find <listen id="biblioserver">.
Copy the URI you find there. (e.g. unix:/home/dcook/koha-dev/var/run/zebradb/bibliosocket).
6) Type "yaz-client unix:/home/dcook/koha-dev/var/run/zebradb/bibliosocket"
7) After it connects, type "base biblios" and press enter
8) Type "format xml" and press enter
9) Type "elements zebra::index" and press enter
10) Type "f id-other,st-urx=http://libris.kb.se/resource/bib/219553" and press enter
11) Note that you should have at least one result
12) Type "show 1"
13) If you scroll through the results, you should find something like the following:
<index name="Identifier-other" type="w" seq="28">@^</index>
<index name="Identifier-other" type="w" seq="1"></index>
<index name="Identifier-other" type="w" seq="29">http</index>
<index name="Identifier-other" type="w" seq="30">libris</index>
<index name="Identifier-other" type="w" seq="31">kb</index>
<index name="Identifier-other" type="w" seq="32">se</index>
<index name="Identifier-other" type="w" seq="33">resource</index>
<index name="Identifier-other" type="w" seq="34">bib</index>
<index name="Identifier-other" type="w" seq="35">219553</index>
<index name="Identifier-other" type="p" seq="28">http libris kb se resource bib 219553</index>
<index name="Identifier-other" type="u" seq="36">http://libris.kb.se/resource/bib/219553</index>
Signed-off-by: Hector Castro <hector.hecaxmmx@gmail.com>
Works as advertised the record is retrieved
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Some of the statements in the commit message do not work for me.
A search like "id-other,phr=http libris kb se resource bib 219553" does not
have results. Searching for "id-other,phr=libris.kb.se resource" does.
The steps in the advanced tester version do not work for me too.
I verified the following in yaz-client:
[1] Z> f @attr 1=9012 @attr 4=104 http://libris.kb.se/resource/bib/219553
Sent searchRequest.
Received SearchResponse.
Search was a success.
Number of hits: 1, setno 16
[2] First removed $2 and reindexed. Then searched again:
Z> f @attr 1=9012 @attr 4=104 http://libris.kb.se/resource/bib/219553
Sent searchRequest.
Received SearchResponse.
Search was a success.
Number of hits: 0, setno 1
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Changed occurrences of 'lex' to 'lexile-number' in record.abs.
Edits were made to the deprecated file record.abs *solely* to quiet
warnings in tests -- this makes sense until GRS-1 code is removed
from Koha.
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Jesse Weaver <jweaver@bywatersolutions.com>
Added the following indexes:
Interest-age-level | 591$a ind1=1
Interest-grade-level | 591$a ind1=2
lexile-number | 591$a ind1=8
Reading-grade-level | 591$a ind1=0
Moved 'lex' from a zebra index to a ccl alias to lexile-number.
Changed the handling of st-numeric in C4/Search.pm to allow for search ranges.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Hector Castro <hector.hecaxmmx@gmail.com>
Works as advertised
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Jesse Weaver <jweaver@bywatersolutions.com>
Koha's SIP2 server should have support for the AV field ( field items ).
The biggest problem with this field is that its contents are not really
defined in SIP2 protocol specification. All it says is "this field
should be sent for each fine item". Due to this, I think the contents of
the field need to be configurable at the login level, so that the
contents can be defined based on the SIP2 device's requirements for the
AV field.
Test Plan:
1) Apply this patch
2) Find a patron with outstanding fines
3) Run a patron information request using misc/sip_cli_emulator.pl using the new -s option with the value " Y "
4) Note there is an AV field for each fee containing the description and amount
5) Edit your sip config, add an av_field_template parameter to the login you are using such as
av_field_template="TEST [% accountline.description %] [% accountline.amountoutstanding | format('%.2f') %]"
6) Restart your SIP server
7) Repeat the patron information request
8) Note your custom AV field is being used!
Signed-off-by: Chris Davis <cgdavis@uintah.utah.gov>
Signed-off-by: Jesse Weaver <jweaver@bywatersolutions.com>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
This patch will add indexes for Date/time-last-modified.
To test:
1. apply patch
2. reindex
3. search for dtlm:DATE and date-time-last-modified:DATE
4. confirm that you get results
Signed-off-by: Hector Castro <hector.hecaxmmx@gmail.com>
Works as advertised
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
I confirm Hector's sign-off. A simple Zebra server restart suffices to get
the searches on date-time-last-modified and dtlm working.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
Allows for selection of DejaVu font path when installing from the command line. This
is useful for non-debian distributions that don't store the fonts in the same place.
Adds a new configuration variable to Makefile.PL: FONT_DIR
Defaults to the Debian install location for the fonts.
Test plan:
1. Run a CLI install, accepting the defaults.
2. Compare the generated koha-conf.xml to a
previous install - the font path for DejaVu fonts should be the same.
3. Run another CLI install, this time choosing a custom path for the fonts
4. Check that the path selected is reflected in the koha-conf.xml file.
NOTE: 'perl Makefile.PL' and 'make' generate blib/KOHA_CONF_DIR/koha-conf.xml
ran with a weird string for the font dir
copied that koha-conf.xml to my home dir
reran with all defaults
compared the two, and only the font paths differed.
Also, I cleaned up the tabs that snuck in. :)
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan Gallagher brendan@bywatersolutions.com
Only in MARC21 is it possible to use ind2 of tag 245 to skip articles.
This patch is based on inserting a special template in
koha-indexdefs-to-zebra.xsl. With this patch you must not insert the index
Title:s in biblio-koha-indexdefs.xml; it is defined in
koha-indexdefs-to-zebra.xsl. It is not the best setup, but I find it very
difficult to use biblio-koha-indexdefs.xml.
To test it in an English MARC21 setup:
Insert some records with titles and correct values in ind2 of 245.
If you have articles not in the skipping list of sort-string-utf.chr (The|the|a|A|an|An)
you can see that sorting by title also takes those articles into account.
Apply the patch.
Rebuild indexes from scratch.
Now all leading articles of titles are skipped.
TO TEST WITHOUT INDEXING:
1. Go to etc/zebradb/marc_defs/marc21/biblios directory.
2. Put the sample MARCXML file in this directory.
3. Transform the file into Zebra indexes:
xsltproc biblio-zebra-indexdefs.xsl record.xml
Observe that the Title:s index contains:
01 Business and Technologies
4. Apply the patch.
5. Repeat:
xsltproc biblio-zebra-indexdefs.xsl record.xml
Observe that the Title:s index contains:
Business and Technologies
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Jesse Weaver <jweaver@bywatersolutions.com>
Verified working using yaz-client (as in
http://wiki.koha-community.org/wiki/Understanding_Zebra_indexing#Examine_Zebra_index,
though note that the `elem zebra::index` seems to be unneeded).
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
Test plan:
Without this patch, doing a zebra reindex on a fedora-based install will cause errors like this:
15:10:47-01/05 zebraidx(16108) [warn] No such record type: dom./etc/koha/zebradb/biblios/etc/dom-config.xml
With this patch, reindexing should just work.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
I have tested this doesn't break on debian/ubuntu systems, someone with a non
debian system will need to test it on that
Signed-off-by: Bob Ewart bob-ewart@bobsown.com
It works on openSUSE Leap 42.1
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Just noting that the Debian Zebra files already contain many more paths here.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
This patch adds the Zebra special attribute 14 to ccl.properties and
opac-search.pl, so that we can turn on OpacSuppression and still return
results even if there are no records in Zebra for the Suppress index.
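As a sketch of what this looks like in ccl.properties (the index name and
attribute number shown here are assumptions, not copied from the patch):
  Suppress 1=9011 14=1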
_TEST PLAN_
Before applying:
1) Make sure that you have no suppressed records indexed in Zebra
2) Turn on OpacSuppression system preference
3) Search using a keyword which should bring up records
4) Note that no records are returned in the results
5) Change UseQueryParser system preference to "Try"
6) Repeat steps 3-4
Apply the patch.
After applying:
7) Repeat step 3 (ie search using a keyword which should bring up records)
8) Confirm that records are appearing in the results!
9) Change UseQueryParser system preference to "Do not try"
10) Repeat step 3
11) Confirm that records are appearing in the results!
Signed-off-by: Hector Castro <hector.hecaxmmx@gmail.com>
Works as advertised. You no longer need to have at least one record with the
value "1" in the field mapped to this index. All records are returned in the OPAC.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If an item is not checked out when a checkin via SIP2 is attempted,
Koha's SIP server sends back an "ok" of 0, and the AF message "Item
not checked out". I am not entirely sure this is good and correct
behavior by the SIP2 protocol.
In particular, this will cause SIP2 book sorting devices to fail on
all items that are not checked out, causing them all to be put into
the "special handling" been that should be reserved for things like
items checked in at the wrong library and items on hold.
Test Plan:
1) Apply the patch for bug 13159 so you can use the new enhanced
SIP2 command line emulator
2) Use a command similar to the following to check in an item:
sip_cli_emulator.pl -a localhost -su <sip user> -sp <sip password> -l <instituation id> --item <barcode> -m checkin
3) Note the 3rd character is 0, and there is an AF field saying the item is not checked out
4) Apply this patch
5) Restart the SIP server
6) Repeat steps 2-3, note that nothing has changed
7) In the SIP config file, Add the parameter checked_in_ok="1" to the SIP account you are using.
8) Restart the SIP server
9) Repeat steps 2-3, note that this time the 3rd character is 1, and you do not receive the item not checked out message.
Signed-off-by: Benjamin Rokseth <benjamin.rokseth@kul.oslo.kommune.no>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Single quotes in common language (not in programming) are usually ', but
there is also the form known as ’ in HTML. See
https://fr.wikipedia.org/wiki/Apostrophe_%28typographie%29
This bug proposes to transliterate all forms into a space.
Test plan :
(I'll use the code ’ instead of the unicode character)
- Without the patch
- Create a record with title : L’avion d’argile
- Index this record
- Search for "L’avion d’argile" => You find the record
- Search for "L'avion d'argile" => You do not find the record
- Apply patch
- Search for "L’avion d’argile" => You find the record
- Search for "L'avion d'argile" => You find the record
- Search for "L avion d argile" => You find the record
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This avoids getting a debug page when the URL is wrong.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Actual routes are:
/borrowers
Return a list of all borrowers in Koha
/borrowers/{borrowernumber}
Return the borrower identified by {borrowernumber}
(eg. /borrowers/1)
There is a test file you can run with:
$ prove t/db_dependent/rest/borrowers.t
All API stuff is in /api/v1 (except Perl modules)
So we have:
/api/v1/script.cgi CGI script
/api/v1/swagger.json Swagger specification
Change both OPAC and Intranet VirtualHosts to access the API,
so we have:
http://OPAC/api/v1/swagger.json Swagger specification
http://OPAC/api/v1/{path} API endpoint
http://INTRANET/api/v1/swagger.json Swagger specification
http://INTRANET/api/v1/{path} API endpoint
Add a (disabled) virtual host in Apache configuration api.HOSTNAME,
so we have:
http://api.HOSTNAME/api/v1/swagger.json Swagger specification
http://api.HOSTNAME/api/v1/{path} API endpoint
Add 'unblessed' subroutines to both Koha::Objects and Koha::Object to be
able to pass them to Mojolicious
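A minimal sketch of the idea behind such a method on a single object
(not the actual implementation from the patch) might be:
  # return a plain hashref of the underlying DBIC row's columns,
  # suitable for serialization by Mojolicious
  sub unblessed {
      my ($self) = @_;
      return { $self->_result->get_columns };
  }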
Test plan:
1/ Install Perl modules Mojolicious and Swagger2
2/ perl Makefile.PL
3/ make && make install
4/ Change etc/koha-httpd.conf and copy it to the right place if needed
5/ Reload Apache
6/ Check that http://(OPAC|INTRANET)/api/v1/borrowers and
http://(OPAC|INTRANET)/api/v1/borrowers/{borrowernumber} works
Optionally, you could verify that http://(OPAC|INTRANET)/vX/borrowers
(where X is an integer greater than 1) returns a 404 error
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch adds Zebra indexes for the RDA 264 field.
The new Provider index is added too.
QA comments corrected.
To test:
1) Download RDA records with 264 fields from this attachment <http://bugs.koha-community.org/bugzilla3/attachment.cgi?id=36825>. Import the file and re-index/rebuild zebra. These records contain 260 and 264 fields per record.
2) Do a search with pb:Bethany two records will appear with title The guardian. Search with pl:Minneapolis too, the two records will appear.
3) Select one record of both records and delete the 260 field keeping the 264 field and save, rebuild your zebra.
4) Search again with pb:Bethany and just one record will appear. That means 264 is not indexed.
5) Apply patches.
6) Rebuild your zebra but this time all biblio records.
7) Search again with pv:Bethany or Provider:Bethany; this time the two records will appear, since 264 is indexed. Note that if you search again with pb only one record appears. This follows the LOC suggestion.
10) Search with copydate:2013: only records with a 260 field are returned, while pv:2013 returns records with either field, i.e., 260 or 264.
11) Apply QA Test Tools
Sponsored-by: Universidad de El Salvador
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Currently, Norwegian vowels are not sorted correctly, even when
you have chosen "nb" as the ZEBRA_LANGUAGE during installation.
To test:
- Make sure you have three records with titles that begin with ÆØÅ
respectively
- Do a search that turns up those three records and some others, and
sort the results by title, both ascending and descending.
- Verify that ÆØÅ is shown in some weird order.
- Edit your sort-string-utf.chr* so it is in line with the current
patch. It should include these two lines:
lowercase {0-9}{a-z}æøå
uppercase {0-9}{A-Z}ÆØÅ
- Restart Zebra and reindex all the records, e.g.:
$ sudo koha-restart-zebra <instancename>
$ sudo koha-rebuild-zebra -f -v <instancename>
- Do the search again, make sure you order by title and check that
ÆØÅ are sorted in the order of 1. Æ 2. Ø 3. Å, and after all other
characters
* = If you are on a gitified install, you need to edit this file:
/etc/koha/zebradb/lang_defs/<your ZEBRA_LANGUAGE>/sort-string-utf.chr
NOT the file in your git clone (yeah, I wasted some time there...)
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
In the DOM config file etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml, 608$9 is
defined a second time instead of 610$9. Just a typo, I think.
Test plan :
- Apply patch
- Install a UNIMARC + DOM instance
- Define in a framework 610 using a thesaurus
- Create a new biblio
- Create a new authority (same type as the thesaurus defined above)
- Index : rebuild_zebra.pl -a -b -x -z
- Link the field 610 to the new authority
- Index : rebuild_zebra.pl -a -b -x -z
- In authorities search, search for the new authority
=> You see Use in 1 Records(s)
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
I confirm the typo.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
_Test plan_
Prerequisites: Make sure that you have an item with a valid dateaccessioned,
and that the bib is indexed in zebra.
For the purposes of explanation, I'm going to use the date '2011-09-07'
1) Go to advanced search in the staff client and choose 'Acquisition date (yyyy-mm-dd)'
2) Enter 2011-09-07 (or the date of your choice).
3) Click the search button - you should get your item in the search results.
4) Return to the advanced search screen and select Acquisition date again.
5) Enter a start and end date in the text field separated by ' - '.
For example:
2011-09-01 - 2011-09-30
6) Click the search button -- this will return no results.
7) Apply the patch and copy etc/zebradb/ccl.properties to whatever directory is specified
by the koha-conf.xml referenced by $KOHA_CONF.
8) Try the search again -- this will return the expected results
Signed-off-by: Barton Chittenden <barton@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
1) Import MARC21 bibs containing
- ISBN in 020$z
- ISSN in 022$y
- ISSN in 022$z
2) Make sure that bibs are indexed
3) Search by ISBN and ISSN above -- bibs should not show in search.
4) Apply patch, re-index
5) Search again; ISBN in 020$z and ISSN in 022$y and 022$z should return
results.
Signed-off-by: kholten@switchinc.org
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch introduces an extension to the current syntax for DOM index definition.
Specifically, it extends the 'index_subfields' tag to allow adding a 'condition'
attribute that is used as a condition for applying the specified index.
This (exotic) example is self-explanatory:
The previous syntax (which is kept by this patch) took this snippet from biblio-koha-indexdefs.xml
<index_subfields tag="100" subfields="acbd">
<target_index>Encuadernador:w</target_index>
</index_subfields>
and generated an XSLT snippet in the DOM indexing XSLT that looks like this:
<xslo:for-each select="marc:subfield">
<xslo:if test="contains('acbd', @code)">
<z:index name="Encuadernador:w">
<xslo:value-of select="."/>
</z:index>
</xslo:if>
</xslo:for-each>
This patch introduces this syntax change (note the 'condition' attribute):
<index_subfields tag="100" subfields="acbd" condition="@ind2='7'">
<target_index>Encuadernador:w</target_index>
</index_subfields>
which yields this XSLT snippet in the DOM indexing XSLT:
<xslo:if test="@ind2='7'">
<xslo:for-each select="marc:subfield">
<xslo:if test="contains('acbd', @code)">
<z:index name="Encuadernador:w">
<xslo:value-of select="."/>
</z:index>
</xslo:if>
</xslo:for-each>
</xslo:if>
To test:
- Verify that the shipped XSLT files are current regarding the shipped index definitions:
$ for i in marc21 normarc unimarc; do
xsltproc etc/zebradb/xsl/koha-indexdefs-to-zebra.xsl \
etc/zebradb/marc_defs/$i/biblios/biblio-koha-indexdefs.xml \
> etc/zebradb/marc_defs/$i/biblios/biblio-zebra-indexdefs.xsl
done
$ git status
(repeat for authorities, skip normarc which doesn't have authorities)
- Apply the patch
- Re-run the previous commands
=> SUCCESS: no changes
- Add a condition to an index_subfields tag (for example, condition="@ind2='7'" in the Author's index)
- Regenerate the specific XSLT
=> SUCCESS: doing a diff shows the only change is the code has been wrapped inside an xslo:if using the condition for the test
- Apply the generated xsl to a MARCXML record that has a field matching the condition like this:
$ xsltproc .../biblio-zebra-indexdefs.xsl sample_record.xml
=> SUCCESS: There's an index on the result, containing the configured field/subfields, that matches the criteria.
- Sign off and feel really happy :-D
Note: the attached sample record includes a 100 field, with ind2=7 and $a=Tomasito
Edit: This patch was squashed once I realized it had gotten too complex and Jonathan requested a follow-up
to avoid code duplication.
This avoids code duplication, with the same results.
Sponsored-by: Orex Digital
Signed-off-by: Barton Chittenden <barton@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch changes the "itemnumber" alias so that it acts like
"itemnumber,st-numeric". That is, it always does a numeric search.
_TEST PLAN_
The best way to test this patch is probably to apply it and then run
"make upgrade", as this will refresh your "ccl.properties".
However, the change is actually really small, so you can just apply
it manually to an existing "ccl.properties" if you'd rather save time.
Basically, you just need to do the following steps:
0) Do a search for "itemnumber:<insert real indexed itemnumber here>"
1) Note that you can't retrieve any results
2) Change your ccl.properties to say "itemnumber 1=8010 4=109"
3) Repeat the search for "itemnumber:<X>"
4) Note that you now retrieve your result
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Tested on a gitified package install. Made the change to
/etc/koha/zebradb/ccl.properties manually. After this change
I can successfully search for "itemnumber:1".
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
* Renames uploadPath to upload_path to follow the standard naming
conventions in koha-conf, which use underscores rather than camel case
* Removes the reference to intranet-tmpl and replaces it with [% interface %],
as required to pass QA
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Koha needs a better logger, and it seems like the best solution would be
to take advantage of Log4perl which is already a fully featured logger.
We use Log4perl to selectively decide what statements should be logged,
and where they should go!
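As a rough sketch of the shape such a log4perl.conf fragment takes; the logger category, appender name and path here are illustrative placeholders rather than the shipped defaults:
log4perl.logger.intranet = WARN, INTRANET
log4perl.appender.INTRANET = Log::Log4perl::Appender::File
log4perl.appender.INTRANET.filename = /var/log/koha/intranet-error.log
log4perl.appender.INTRANET.mode = append
log4perl.appender.INTRANET.layout = Log::Log4perl::Layout::PatternLayout
log4perl.appender.INTRANET.layout.ConversionPattern = [%d] [%p] %m %l %n
Raising WARN to TRACE on such a category is what makes the renewal lines in the test plan below appear in the log.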
Test plan:
0) Install Log::Log4perl via packages or cpan
1) Apply this patch and the example renewal patch
2) Copy etc/log4perl.conf to your koha conf directory, edit the paths
to match your current error logs
3) Edit your koha-conf file and add the
<log4perl_conf>/path/to/log4perl.conf</log4perl_conf> line
4) Watch your intranet and opac error logs
5) Perform a renewal via the staff interface, note there is nothing new
in the log file
7) Update the log4perl.conf, change the log level from WARN to TRACE
for both the staff and opac sides
8) Perform a renewal via the staff interface, note the logged lines
9) Perform a renewal via the opac, note the logged lines
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended this patch: Moved the renewal stuff to a separate example patch.
And upgraded the DEBUG level to WARN in the log4perl config file.
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Bug 11202 introduced a new index 'dissertation-information' for
UNIMARC. This patch adds the index also for MARC21 installations.
http://www.loc.gov/marc/bibliographic/bd502.html
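For MARC21 DOM indexing, the added definition would take roughly this shape in biblio-koha-indexdefs.xml; the exact subfield list and registers here are assumptions:
<index_subfields tag="502" subfields="abcdgo">
  <target_index>Dissertation-information:w</target_index>
  <target_index>Dissertation-information:p</target_index>
</index_subfields>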
To test:
- Apply patch
- Copy files in etc/zebradb changed by this patch to your
corresponding directory (koha-dev..)
- Make sure you have records with 502
- Reindex
- Verify you can search the field contents with
dissertation-information= and
diss=
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Can find by dissertation-information,
No errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Make the shipped XSLTs for authorities (MARC21 and UNIMARC) the same as the generated version
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
In authority-koha-indexdefs.xml, all tags use the namespace "kohaidx" except the tag "id".
When re-generating authority-zebra-indexdefs.xsl, the line :
<xslo:variable name="idfield" select="normalize-space(marc:controlfield[@tag='001'])"/>
is changed to:
<xslo:variable name="idfield" select="normalize-space()"/>
This is an error.
This patch adds the kohaidx namespace to correct this.
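In other words, the definition changes roughly as follows (assuming the id element points at the 001 control field, as the generated variable suggests; the exact element content is an assumption):
Before : <id>001</id>
After  : <kohaidx:id>001</kohaidx:id>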
Test plan :
- Without patch
- go to etc/zebradb/marc_defs/marc21/authorities/
- run : xsltproc ../../../xsl/koha-indexdefs-to-zebra.xsl authority-koha-indexdefs.xml > authority-zebra-indexdefs.xsl
- read authority-zebra-indexdefs.xsl
=> the line has changed : <xslo:variable name="idfield" select="normalize-space()"/>
- Apply patch
- go to etc/zebradb/marc_defs/marc21/authorities/
- run : xsltproc ../../../xsl/koha-indexdefs-to-zebra.xsl authority-koha-indexdefs.xml > authority-zebra-indexdefs.xsl
- read authority-zebra-indexdefs.xsl
=> the line has not changed
(same for unimarc flavor)
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
As Mirko mentioned, the XSLTs now generate the facet-processing templates in
the authority XSLTs too. They are harmless because we don't define facets
for authority records; if we did, it would be harmless too.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
All of them were found and fixed using codespell.
Signed-off-by: Stefan Weil <sw@weilnetz.de>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
2 lines in the Zebra configuration files prevent an exact search for C.,
while all other [A-Z]. searches work correctly.
After taking a look at the /etc/zebradb/etc/word-phrase-utf.chr
those 2 lines cause the problem:
map (^c\.) @
map (^C\.) @
I propose to remove them.
To test:
- Catalog a record with an item with callnumber: C.
- Catalog a record with an item with callnumber: B.
- Try seaching for the second using callnum,ext:B. (exact field search)
- Verify search works.
- Try searching for the other with callnum,ext:C.
- Verify no result.
- Apply the patch - copy the zebra config file if necessary into the right spot
- Reindex
- Repeat the searches - both should now bring up the correct record.
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
NOTE : I use HTML codes for special characters to avoid encoding issues in the patch file.
In ICU configuration, add a transliterate rule for
œ = oe
æ = ae
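A plausible sketch of such a rule in the ICU chain files, assuming the custom-rule element (transliterate) rather than the transform element shown elsewhere in these notes; whether uppercase Œ/Æ are also covered may differ in the actual patch:
<transliterate rule="œ > oe ; æ > ae ;"/>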
Test plan :
- Without patch
- Create a record R1 with title containing for example "cœur"
- Create a record R2 with title containing for example "coeur"
- Index those records
- Search for "cœur"
=> You only find R1
- Search for "coeur"
=> You only find R2
- Apply patch
- Restart zebra
- Index R1 and R2
- Search for "cœur"
=> You find R1 and R2
- Search for "coeur"
=> You find R1 and R2
(Same test plan for ae)
------
Tested with all variants of Ae ae Oe oe. Search worked as expected.
Note: The words with special characters were not highlighted, but I think this can be done in another bug.
Signed-off-by: Marc Veron <veron@veron.ch>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds a mapping for the lower case ð character to word-phrase-utf.chr (Ð was already mapped to d)
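A plausible form of the added line, following the map syntax this file already uses (the actual patch may spell it differently):
map (ð) d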
To test:
1. Add a record with the ð character (Arnaldur Indriðason is an example author)
2. Rebuild zebra
3. Search for your record using d instead of ð and verify it is not found
4. Apply patch and copy word-phrase-utf.chr to the appropriate folder
5. Restart and rebuild zebra
6. Search for your record using d instead of ð and verify it is found
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
works as expected
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch is for MARC21. To test:
1) Set up a site with MARC21
2) Insert 2 records: one with lang A in 041 and 008 pos 35-37, another with lang A in 041 and lang B in 008 pos 35-37
3) Index them
4) Search in advanced search with the 'language' filter for lang A. You will see 2 records
5) Search in advanced search with the 'language' filter for lang B. You will see 0 records
6) Apply the patch
7) Full reindex
8) Search in advanced search with the 'language' filter for lang B. You will see 1 record
http://bugs.koha-community.org/show_bug.cgi?id=12948
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
I have *not* actually tested this, but the changes are identical to the ones
done for NORMARC, which I have tested, so I think it is safe to sign off. If
anyone disagrees, please reset the bug to "Needs signoff".
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch is for NORMARC.
Same test plan as the patch for MARC21, except you need a setup with NORMARC.
http://bugs.koha-community.org/show_bug.cgi?id=12948
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
- Added a record with "bul" in 008pos35-37
- Verified that this did not turn up in an advanced search with language =
Bulgarian
- Applied the patch
- I was testing on a gitified install, so I had to copy the patched index file
to the right location with this command:
sudo cp etc/zebradb/marc_defs/normarc/biblios/biblio-zebra-indexdefs.xsl \
/etc/koha/zebradb/marc_defs/normarc/biblios/biblio-zebra-indexdefs.xsl
- Did a full reindex
- Verified that the record *did* turn up in an advanced search with language =
Bulgarian
- Signing off! Thanks Zeno!
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Small error in word-phrase-utf.chr.
It generates these logs :
17:03:25-21/01 zebraidx(10636) [warn] Map: 'ς' has no mapping
17:03:25-21/01 zebraidx(10636) [warn] duplicate entry for charmap from 'Σ'
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Small fixes :
- more space characters : ¡¿
- uppercase AE missing in equivalent
- some trailing spaces
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Add Greek support in word-phrase-utf.chr for searching in a Greek catalog (it can also contain Latin records).
Developed in collaboration with Giannis Kourmoulis <ikourmou@lib.auth.gr>
Test plan :
- Install using CHR zebra indexing
- Index a greek catalog
- Look for results with mixed uppercase, lowercase and diacritics in title
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Add the sort-string-utf.chr for sorting a Greek catalog (it can also contain Latin records).
Developed in collaboration with Giannis Kourmoulis <ikourmou@lib.auth.gr>
Test plan :
- Install using "gr" in "Primary language for Zebra indexing"
- Index a greek catalog
- Sort by title and check sorting is correct
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The issue existed in the koha-conf.xml file under etc/.
The following lines were removed from the file:
<!-- uncomment these lines and comment out the above if running on MSWin32 -->
<!--
<listen id="biblioserver" >tcp:localhost:9998/bibliosocket</listen>
<listen id="authorityserver" >tcp:localhost:9999/authoritysocket</listen>
-->
This section was located on lines 9, 10, 11, 12 and 13 of the koha-conf.xml file.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Fix a typo. No test plan required, just take a look at the default UNIMARC framework.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
The ICU configuration files contain a rule to remove control characters :
<transform rule="[:Control:] Any-Remove"/>
This rule is applied before tokenization.
The problem is that the "[:Control:]" class contains line feed, carriage return and tab. See http://www.regular-expressions.info/posixbrackets.html.
So when a field contains several lines, the last word of a line is joined with the first word of the next line. Those words are then not searchable.
For example :
First line
Second line
This will become "First lineSecond line", tokenized as "First", "lineSecond" and "line".
Test plan :
- Use ICU in Zebra configuration
- Choose an indexed field, like 300$a
- Create a new record
- Enter several lines in the chosen field, like :
First line
Second line
- Index this record
=> Without patch the search on "Second" does not return the record
=> With patch the search on "Second" returns the record
- Same tests with tab and carriage return instead of line feed
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch fixes the biblio-koha-indexdefs.xml for NORMARC, so
it includes the <id> element.
Because of how our DOM files work, the resulting biblio-zebra-indexdefs.xsl
for NORMARC picked the whole MARC record as ID, so every time the record
was edited, the id wouldn't match and a new record was created.
To test:
- Have a MARCXML record
- run:
$ xsltproc etc/zebradb/marc_defs/normarc/biblios/biblio-zebra-indexdefs.xsl the_record | less
=> FAIL: verify the z:id property on the <z:record> line contains all subfields concatenated
- Apply the patch
- re-run the xsltproc line
=> SUCCESS: z:id contains the 999$c number
- Sign off :-D
Regards
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Known bug with DOM: without <z:id> indexing the biblionumber, Zebra doesn't have its record
unique ID, and so fails to identify existing records. Works as described. 999$c
is linked to biblionumber in the default NORMARC framework.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
I have applied the patch to my production server, and at least one customer has
confirmed that it fixes the problem with multiple copies of records in search
results.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes tests and QA script, fix matches what we have for the other MARC flavours.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
We should add the ability to apply a regular expression to screen
messages for the SIP2 server. This would allow libraries not only to
customize the screen messages the patron sees, but also to have screen
messages translated.
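Purely as an illustration of where the tags sit, a nested entry might look like the following; the attribute names used here (find/replace) are guesses, not taken from the patch:
<login id="term1" password="term1" institution="CPL">
  <screen_msg_regex find="Greetings from Koha." replace="Welcome to your library!" />
</login>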
Test Plan:
1) Apply this patch
2) Inspect etc/SIPconfig.xml, note the new screen_msg_regex tags
that can be nested inside a given login tag.
3) Add one or more screen_msg_regex tags to your own SIP config
Recommendation: s/Greetings from Koha./Welcome to your library!/g
4) Restart your SIP2 server
5) Test with a SIP2 machine, or use /misc/sip_cli_emulator.pl
6) Note your new AF fields!
Signed-off-by: Jason Burds <jburds@dubuque.lib.ia.us>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This follow-up:
- changes some indexes in the QueryParser configuration file
- suppresses some clearly useless 6XX$9 entries in biblio-koha-indexdefs.xml and adds 2 new ones, probably useless (not sure of that)
- changes the name of the index Subject-geographical to Subject-name-geographical in ccl.properties (to match bib1.att)
The xsl file zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl was generated with the following command:
xsltproc zebradb/xsl/koha-indexdefs-to-zebra.xsl zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml > zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
To test :
1) Apply the 3 patches
2) copy the modified files from the source directory to the directory where you store the config files for Zebra and Queryparser
The files modified by the 3 patches and that need to be copied are:
etc/zebradb/biblios/etc/bib1.att
etc/zebradb/ccl.properties
etc/searchengine/queryparser.yaml
.../unimarc/biblios/biblio-koha-indexdefs.xml
.../unimarc/biblios/biblio-zebra-indexdefs.xsl
3) Rebuild Zebra
4) Create a record A with some values in critical fields, for example:
- the string "test9828" in 600$c 600$f 600$p, 602$f, 616$c, 616$f, 606$2,600$2
- the string "subform" in 600$j
4) Create a record B with the string "subgeo" in 606$y
5) Create a record C with the string "subdate" in 606$z
WITHOUT QP activated in sysprefs ("Don't try to use QP"):
6) try to search "su:test9828". You should have no results
7) try to search "su-genre:subform". You should have 1 result : record A
8) try to search "su-geo:subgeo". You should have 1 result : record B
9) try to search "su-chrono:subdate". You should have 1 result : record C
10) on existing records, try the su-ut, su-to, su-na, su-form, su-corp, su-geo indexes, and see if the results are relevant
WITH QP activated in sysprefs:
Same tests
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Only cosmetic :
- the references to record.abs lines are now useless and outdated
- some comments added in record.abs could be useful in biblio-koha-indexdefs.xml
No change expected, only comments
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
[New commit on 18 Aug 2014 : rebased, and DOM indexing only]
Issues to fix :
Most 6XX fields may contain a $2 that identifies the system used for indexing. It should not be indexed.
In French libraries, $2 contains "rameau", so searching for books about the composer "Rameau" retrieves thousands of records!
For some 6XX fields, other subfields should not be indexed, for example dates of persons and families, or addresses.
In the UNIMARC guide, 600$t, 601$t and 602$t are said to exist but to be "not used". I keep them indexed.
Additionally, subject indexing could be improved by using specific indexes for each 6XX where possible :
In ccl.properties :
- su-to, su-geo and su-ut are defined as aliases of Subject.
- a specific index is defined, but not used in record.abs : Subject-name-personal, alias su-na
We can use these indexes and create new specific indexes by using existing bib1 attributes.
We could also index the $j, $x, $y, $z subdivisions in specific indexes.
This patch does the following changes :
1) For all 6XX : Not indexing $2 (LCSH, Rameau...), $3 and $5
2) Suppressing the indexing of some specific subfields, depending on the field:
600 : Personal name used as a subject // see Marc21 600
not indexing c (additional elements),f (dates),p (address/affiliation)
602 : Family name used as a subject // see Marc21 600 3X
not indexing f (dates)
616 : Trademark
not indexing c,f
3) For all 6XX : index $j, $x, $y, $z in several indexes in addition to the specific index for their 6XX field
4) Define in ccl.properties some specific indexes :
Subject-name-conference 1=1073 => alias su-conf
Subject-name-corporate 1=1074 => alias su-corp
Subject-genre-form 1=1075 => alias su-genre and su-form
Subject-geographical 1=1076 => alias su-geo
Subject-chronological 1=1077 => alias su-chrono
Subject-title 1=1078 => alias su-ut and su-ti
Subject-topical 1=1079 => alias su-to
5) Adding new aliases in Search.pm :
su-chrono, su-form, su-genre, su-corp, su-conf, su-ti
6) Using these new indexes for:
600 : Subject and Subject-Personal-Name ; all subfields except subdivisions in Personal-name
601 : Subject, Subject-name-conference and Subject-name-corporate and Subject-name-conf ; all subfields except subdivisions in Corporate-name and Conference-name
602 : same as 600 but could be improved later
604 : Subject and Subject-title ; $a in Subject-Personal-Name ; all subfields except subdivisions in Name-and-Title
605 : Subject and Subject-title
606 : Subject and Subject-topical
607 : Subject and Subject-geographical ; all subfields except subdivisions in Name-geographic
608 : Subject and Subject-genre-form
To test :
A. In a UNIMARC-DOM indexing environment
1) Apply the patch
2) Rebuild zebra
3) Create a record A with some values in critical fields, for example:
- the string "test9828" in 600$c 600$f 600$p, 602$f, 616$c, 616$f, 606$2,600$2
- the string "subform" in 600$j
4) Create a record B with the string "subgeo" in 606$y
5) Create a record C with the string "subdate" in 606$z
6) try to search "su:test9828". You should have no results
7) try to search "su-genre:subform". You should have 1 result : record A
8) try to search "su-geo:subgeo". You should have 1 result : record B
9) try to search "su-chrono:subdate". You should have 1 result : record C
10) on existing records, try the su-ut, su-to, su-na, su-form, su-corp, su-geo indexes, and see if the results are relevant
Indexing of subjects could maybe be improved later
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
All seems to work as expected, I am not super-familiar with UNIMARC but I wonder if in su-corp and su-conf the subdivisions might be useful (e.g. France-Gendarmie / Staatsbibliothek-Berlin)
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
[1] Routine add_cron_job was added in 2007 but has not been defined.
Parameter Recurring is not used.
[2] Made some changes to koha-conf.xml. Instead of an example to edit,
I replaced it with the SCRIPT_NONDEV_DIR install variable.
[3] SCRIPT_NONDEV_DIR had to be included in rewrite-config.pl and the
path had to be corrected for dev installs in Makefile.PL
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested single and dev install for supportdir change.
Compared installations with and without the patches for this report.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Adding a supportdir allows configuring its use in a
non-package install environment, such as git.
Seeing as I only tested git, I clearly had this defined.
Further testing should include packaging up an installation, and
installing a package version without setting the supportdir
configuration value.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This file should have been removed as part of Bug 12538.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
To test...
- apply patch
- build and install a new Koha .deb from patched codebase
- create a new Koha instance
- add some authority records to instance
- do a full zebra reindex
- do an authorities search, and get some results
note: this patch does not fix existing Koha instances, just new ones
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch updates the Zebra configuration for unimarc.
995$d and 995$j should not be indexed.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
- Local fallback was not very well implemented; this patch adds
better handling for such cases, allowing clearer failure messages
- This patch also adds the ability to use single sign on via the
top bar menu in the bootstrap theme.
BUG8446, Follow up: Adds perldoc documentation
- Add some documentation to the Auth_with_Shibboleth module
including some guidance as to configuration.
BUG8446, Follow up: Correct filenames to match guidelines
- Moved Auth_with_Shibboleth.pm to Auth_with_shibboleth.pm to match
other files present on the system.
BUG8446, Follow up: Correct paths after file rename
BUG8446, Follow up: Implemented single sign out
- This follow up rebases the code against 3.16+ which managed to break
some of the original logic.
- As a side effect of the rebasing, we've also implemented the single
sign out element. Upon logout, koha will request that the shibboleth
session is destroyed, and then clear the local koha session upon
return to koha. Due to the nature of shibboleth however, you will
only truly be signed out of the IdP if they properly support Single
Sign Out (which many do not). As a consequence, although you may
appear to be logged out in koha, you might find that upon clicking
'login' the IdP does NOT request your login details again, but instead
logs you silently back into your koha session. This is NOT a koha bug,
but a shibboleth implementation issue that is well known.
BUG8446, Follow up: Fixed bootstrap login via modal
- The bootstrap theme enables login from any OPAC page via a modal. To
enable this with shibboleth we had to make some template parameters
globally accessible when shibboleth is enabled.
BUG8446, Follow up: Add template rules for Shibboleth and CAS
- Add template rules so that CAS and Shibboleth can coexist.
BUG8446, Follow up: Added default config to config file
BUG8446, Follow up: Embellished perldoc documentation
- Updated perldoc to correct detail about configuring shibboleth
authentication.
- Updated perldoc to include subroutines and their respective functions.
BUG8446, Follow up: Enable configuration of match field
- Added clearer, more flexible configuration of the matching between a
shibboleth attribute and a koha borrower field for authentication
- Corrected the documentation so it better reflects the current
implementation
- Minor refactoring of code to reduce some code duplication
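For illustration, the kind of koha-conf.xml block such matching configuration implies might look like this; the element names and the attribute chosen are assumptions here, not taken from the patch:
<shibboleth>
  <matchpoint>userid</matchpoint>
  <mapping>
    <userid is="eduPersonPrincipalName" />
  </mapping>
</shibboleth>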
Signed-off-by: Matthias Meusburger <matthias.meusburger@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Note: NORMARC is missing the id field.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
This patch makes t/db_dependent/Search.t pass again.
NORMARC is currently not tested.
I checked the results before and after applying the patch
and the facets are now looking the same as before.
Passes all tests and QA script.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The itype facet was missing 952$y for both MARC21 and NORMARC.
This patch adds that. It also modifies the zebra-biblios-dom.cfg file
(and the debian/ version) so facetNumRecs is set to 1000 for Zebra.
This is the number of records taken into account: the more records,
the more exact the facets for the result set. 1000 was chosen as it only changed
the time to reindex 1000 records from 18s to 19s.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds a variable to koha-conf.xml controlling the use of Zebra facets.
Usage:
- use_zebra_facets = 1 | 0
Zebra facets work only on DOM.
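In koha-conf.xml this is just a new entry, presumably alongside the other settings in the <config> block:
<use_zebra_facets>1</use_zebra_facets>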
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The previous patches for facet extraction from Zebra indexes set a default
namespace on the following files:
etc/zebradb/marc_defs/marc21/biblios/biblio-koha-indexdefs.xml
etc/zebradb/marc_defs/normarc/biblios/biblio-koha-indexdefs.xml
etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml
and hence the XML file index_subfields can be cleaned by removing the namespace.
To test:
- Apply this patch
- Run
$ for i in marc21 normarc unimarc
do xsltproc etc/zebradb/xsl/koha-indexdefs-to-zebra.xsl \
etc/zebradb/marc_defs/$i/biblios/biblio-koha-indexdefs.xml \
> etc/zebradb/marc_defs/$i/biblios/biblio-zebra-indexdefs.xsl
done
=> SUCCESS: no errors reported
- Run
$ git diff
=> SUCCESS: no differences on the xsl files
- Sign off :-D
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: David Cook <dcook@prosentient.com.au>
Seems to work with DOM and MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds the facets definitions to the biblio-koha-indexdefs.xml, based
on what is hardcoded on C4::Koha::getFacets().
The biblio-zebra-indexdefs.xsl file for NORMARC is generated using the usual:
xsltproc ...koha-indexdefs-to-zebra.xsl ...normarc/biblios/biblio-koha-indexdefs.xml > \
...normarc/biblios/biblio-zebra-indexdefs.xsl
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: David Cook <dcook@prosentient.com.au>
Seems to work with DOM and MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds the facets definitions to the biblio-koha-indexdefs.xml, based
on what is hardcoded on C4::Koha::getFacets().
The biblio-zebra-indexdefs.xsl file for UNIMARC is generated using the usual:
xsltproc ...koha-indexdefs-to-zebra.xsl ...unimarc/biblios/biblio-koha-indexdefs.xml > \
...unimarc/biblios/biblio-zebra-indexdefs.xsl
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: David Cook <dcook@prosentient.com.au>
Seems to work with DOM and MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds the facets definitions to the biblio-koha-indexdefs.xml, based
on what is hardcoded on C4::Koha::getFacets().
The biblio-zebra-indexdefs.xsl file for MARC21 is generated using the usual:
xsltproc ...koha-indexdefs-to-zebra.xsl ...marc21/biblios/biblio-koha-indexdefs.xml > \
...marc21/biblios/biblio-zebra-indexdefs.xsl
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: David Cook <dcook@prosentient.com.au>
Seems to work with DOM and MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch changes koha-indexdefs-to-zebra.xsl to correctly process a new syntax
for defining facet indexes on the XML files.
It also changes the retrieval file to allow access to Zebra's internal data from
Zoom (i.e. access to zebra::facet:*).
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: David Cook <dcook@prosentient.com.au>
Seems to work with DOM and MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Since nobody is currently working on the zebra layer introduced by bug
8233, Solr will never work.
Some code was introduced in 3.10 to prove that several search engines
can coexist in Koha, but no help/funding has been found to go ahead.
It is useless to keep this code and to maintain an ambiguous situation.
I think the indexes configuration page could be restored later if someone
else introduces a new search engine into Koha.
Test plan:
Look at the code introduced by bug 8233 and verify all is removed.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Actually, in a default UNIMARC install, 461$9 is indexed as Host-Item-Number, meaning it is used as the analytical itemnumber.
But most UNIMARC catalogs handle the analytical relation using the unimarc_field_4XX.pl plugin on 461$a; in fact, this plugin is defined in the default UNIMARC frameworks.
If Host-Item-Number is defined but 461$9 is used for something else, it will lead to odd bugs. For example, records containing analytical items cannot be deleted.
This patch comments out the 461$9 indexing in the UNIMARC Zebra config.
Test plan :
- Create a fresh UNIMARC install
- Create a record with 461$9 containing a value
- Index the record
- Perform a search on Host-Item-Number : ccl=Host-Item-Number,alwaysmatches=''
=> Without the patch you get a result
=> With the patch you get no result
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Code is clean, commenting out all the indexing of 461$9.
Trusting the author that this is the correct thing to do :)
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Restore elementSetName to marcxml for DOM indexing in Zconn (Context.pm).
This prevents the need to rebuild the index after restarting the Zebra
server.
It also removes the now incorrect reference to marcxml as 'superfluous' in four
DOM config files.
Test plan:
[1] Do not yet apply this patch.
[2] Rebuild zebra index with the zebra config of commit
036f2a50e1.
[3] (Go back to master.) Restart your zebra server (no config change).
You will have results without details.
Apply this patch: you see details.
Reset to master: no details again.
[4] Install new zebra config from master.
Search again: you still see no details.
Restart zebra server. Search: you see details.
Apply this patch. Search: still details.
Restart zebra server. Search: still details.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested in a non-package environment (manual dev install).
The package environment should work now too (results in step 4c might differ).
Progress on bug 12012 would be appropriate to sync all changes.
Tested the response of the SRU server too.
Signed-off-by: Marc Veron <veron@veron.ch>
I tested starting on a VM with Koha 3.15.00.019 installed.
Did git pull -> Koha 3.15.00.051
Result: No details in search results.
Applied patch.
Result: Search results display fine.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes two problems:
a) Bad PDF when using the Helvetica font.
The current label code assigns the 'italic'/'oblique' variant
to the title. Helvetica-Oblique was not defined, but is present.
b) Bad alignment when using center/right justification.
The problem was a bad font parameter passed to the StrWidth
routine.
To test:
1. Try making a batch using Helvetica; the downloaded PDF does not open.
2. Try a batch of mixed scripts with layout alignment center or
right; only Latin scripts align almost correctly.
3. Apply the patch and update your koha-conf.xml to add the Oblique variant
4. Try 1 again; now the PDF opens
5. Try 2 again; now the alignment is correct
New problem (for another bug): DejaVuSans has good
support for Arabic, but no Oblique variant. As the selection
of italic/oblique is hardcoded, Arabic titles are now
not displayed. I'll try to add a checkbox to enable
or disable this feature.
Added a FIXME for the hardcoded forced oblique -chris_n
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Since the built-in PDF fonts support just Latin-1 encoding, we have
to switch to TrueType fonts to correctly encode all UTF-8 characters
(which we should be getting from the database anyway).
This approach also nicely sidesteps our encoding kludges, but
requires paths to TrueType fonts, which are included in koha-conf.xml
under a new <ttf> section. Without this directive in koha-conf.xml,
the code will still use the built-in Latin-1 PDF fonts.
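A sketch of what such a <ttf> section might look like; the type codes and font paths are examples, not necessarily the shipped defaults:
<ttf>
  <font type="TR">/usr/share/fonts/truetype/dejavu/DejaVuSerif.ttf</font>
  <font type="TB">/usr/share/fonts/truetype/dejavu/DejaVuSerif-Bold.ttf</font>
</ttf>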
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes changes to koha-conf.xml by removing the fallback section
from biblioserver and authserver. The information is in an include file on
the same server (no need to fall back) and moreover, some of the information
is not up to date and should be moved elsewhere.
The patch also simplifies the DOM retrieval-info files for auth and bib.
It also eliminates superfluous F and usmarc from the dom-config files. (I felt
the urge to remove marcxml too, but left it for now; see also the second
patch.) For reference, look at the marcxml example files of Zebra.
NOTE: This patch does not deal with the Debian package installs. In the
same way koha-conf-site.xml.in, and *-retrieval-info-* could be adjusted.
Test plan:
[1] Run at least a dev install in order to copy the new files to your
Zebra folders. Choose for DOM indexing.
Enable the SRU server on port 9998 (small edit in koha-conf.xml).
[2] Restart Zebra and reindex -a -b -x.
[3] Verify if a search from Koha still functions as expected.
Check the SRU output on port 9998. NOTE: If you do not pass recordSchema,
you should get back a marc response now (instead of index schema).
Bonus: Add your server as a Z3950 target to another Koha install. And
perform a Z3950 search from the other server to your new install.
Bonus: Check response from the auth and biblio socket via yaz-client.
[4] Reindex again with -a -b but without -x.
[5] Repeat Koha search, SRU response (Z3950, yaz-client).
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add a separate phrases-icu.xml for phrase indexes.
The file is based on the one distributed with Zebra,
with a couple of additions to reflect Koha usage.
This patch adds a separate tokenizer variable
for phrase indexes so that default.idx is
correctly rewritten for sites using ICU
indexing.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
- Applied patch
- perl Makefile.PL --prev-install-log ../koha-dev/misc/koha-install-log
- make upgrade
- Restarted Zebra server
- Did a full reindex of bibliographic and authorities
- Checked various searches
- Links records to authorities
- Checked created links work correctly
I couldn't find a regression with this patch.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan :
- Create a fresh install with UNIMARC flavor and GRS1 indexing for biblios
- Re-index the database
- Perform a search with index "itemtype" (and then "itype") on an
existing value of 995$r. For example : itemtype:BOOK
=> Check you get results
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The accounts->login tag in SIPconfig.xml can now accept a new
parameter, "encoding". It will mostly be used to encode to utf8.
For this, simply add the parameter: encoding="utf8"
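For example, on an existing login entry (the other attributes shown are just the usual ones and may differ in your SIPconfig.xml):
<login id="term1" password="term1" institution="CPL" encoding="utf8" />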
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Works as advertised, does nothing if encoding is not set.
Blows up all the machines that can't handle utf8 if it is set :) But
that's not Koha's fault. :)
Patch rebased by Christophe Croullebois <christophe.croullebois@biblibre.com>
Signed-off-by: Petter Goksoyr Asen <boutrosboutrosboutros@gmail.com>
But now I did it the right way! And I can confirm that this patch solves
all issues with mangled characters in SIP messages. Confirmed that it
looks good with Norwegian characters in patron name and in book titles.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds :w and :p versions to the index for »Lexile number«
(it has only :n so far) and adds indexes for 653 (Index term
uncontrolled), 655 (Index term Genre/Form), 041 (language-audio) and
041 (language-subtitle). It also adds the »curriculum«-index to
Search.pm.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds locking to rebuild_zebra.pl to ensure that simultaneous
changes are prevented (as one is likely to overwrite the other).
Incremental updates in daemon mode will be skipped if the lock is busy
and they will be picked up on the next pass. Non-daemon mode
invocations will also exit immediately if they cannot get the lock,
unless the new flag -wait-for-lock is specified, in which case they
will wait until they get the lock and then proceed.
Supporting changes made to Makefile.PL and templates for the new
locking directory (paralleling the other zebra lock directories).
We stash the zebra_lockdir in koha-conf.xml so rebuild_zebra.pl
can find it.
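A sketch of the two pieces involved: the koha-conf.xml entry (the path is a placeholder) and an ad-hoc run that waits for the lock instead of aborting:
<zebra_lockdir>/var/lock/koha/instancename</zebra_lockdir>
rebuild_zebra.pl -b -z -wait-for-lock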
To address earlier QA concerns we:
1. added code to check if flock is available and ignore locking if
it's missing (from M. de Rooy)
2. changed default for adhoc invocations to abort if they cannot
obtain the lock. Added option -wait-for-lock if the user prefers
to wait until the lock is free, and then continue processing.
3. added missing entry to t/db_dependent/zebra_config.pl
4. added a fallback locking directory of /tmp
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Doug merged the original patch with the QA changes.
Just for the record, noting here that the original patch was tested
extensively too by Martin Renvoize.
I have added a followup for some exceptional cases.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the following changes to UNIMARC biblio indexing :
A. Changes to UNIMARC conf files
1. add comments to biblio-koha-indexdefs.xml
2. make biblio-koha-indexdefs.xml more compact by grouping some
declarations
Ex : 200$f and 200$g => one declaration for 200$fg
3. suppress unneeded declarations (indexing of some 4XX fields and 6XX
fields not in unimarc format)
4. unindex some (sub)fields unneeded by most users (318, 207,230,210a,
215, 4XXd)
5. change the way 308 field is indexed (no visible changes)
6. replace Title-host with Host-item -- see bug 11119
7. index 208 in Material-Type -- see bug 11119
8. index 100 pos 8-9 and 9-12 in pubdate:y and pubdate:n
9. index 100 pos 8-9 in pubdate:s instead of 210$d
10. Index all subfields of note 334 and 327 in note index
11. Index 304 and 327 in title index as well as note index
327 can contain a list of titles included in a work
304 can contain the title of the original work in case of a
translation
12. Index 314 in author index as well as note index
314 can contain authors not mentioned in 200$f/g (the 4th, 5th etc.
author)
13. Index 328 note in Dissertation-information as well as note
14. Index 328$t in Title
B. Changes to ccl.properties :
1. add a new index Dissertation-information (1056)
2. fix EAN, pubdate and acqdate (they were not linked with bib1 attributes)
C. Changes to Search.pm
1. add Dissertation-information and suppress Title-host and UPC
D. Changes to QP config file queryparser.yaml
1. add Dissertation-information
2. fix EAN, pubdate and acqdate
Test plan :
If you cannot test in GRS1, test only in DOM, as GRS will be deprecated.
1. Apply the patch in a UNIMARC Koha running with DOM and ICU
2. copy src/etc/searchengine/queryparser.yaml into the main config
directory of QP
3. copy src/etc/zebradb/ccl.properties into the main config directory
of Zebra
4. copy src/etc/zebradb/marc_defs/unimarc/biblio/* into the main config
directory of Zebra
5. reindex biblios (rebuild_zebra.pl -r -b -x -v)
6. test note index : make some searches on 334$b or 327$b
7. test author index : make some searches on 314 field
8. test title index : make some searches on 304 and 327 field, make a
search on 328$t subfield
9. test dissertation-information index : make some searches on 328 field
10. In a record, put the values "1000" (1st date) and "1001" (2nd date)
in the dates of the 100 field; try to search for a book written in the year
1000, you should find the record; likewise for the year 1001
11. make some searches and sort by date. It should work better than before,
especially if you have values like "c2009" or "impr. 2010" in the 210
field
12. Regression test : make some searches on several indexes, like EAN,
etc. It should work as before
Test 10-12 with and without Queryparser activated.
Be careful: with Queryparser activated, the index names (title,
dissertation-information...) must be entered in lowercase only.
Of course, to test search and sort by dates, you need to have full
records, with dates in 100 field as well as 210 field.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the MARC21 DOM index definitions to
index the 952$i as 'Number-local-acquisition' rather than
'stocknumber'.
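A sketch of the resulting entry in the MARC21 biblio-koha-indexdefs.xml (the registers shown are an assumption):
<index_subfields tag="952" subfields="i">
  <target_index>Number-local-acquisition:w</target_index>
  <target_index>Number-local-acquisition:p</target_index>
</index_subfields>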
To test (for a MARC21/DOM setup):
[1] Copy the MARC21 biblio-zebra-indexdefs.xsl over to the
active Zebra configuration directory.
[2] Reindex the bib records.
[3] Verify that 'stocknumber', 'inv', and 'number-local-acquisition'
searches work.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the Number-local-acquisition into QueryParser
configuration file.
Like in ccl.properties, "Number-local-acquisition" is the main index
name and "stocknumber" and "inv" are aliases.
Test plan :
Enable QueryParser :
- Enable UseQueryParser syspref
- Edit your koha-conf.xml
- Add to "config" node : <queryparser_config>[your path]/etc/searchengine/queryparser.yaml</queryparser_config>,
adapt [your path] to your install configuration folder
- If needed copy from sources "etc/searchengine/queryparser.yaml" into
your install configuration folder
Test search :
- Add Number-local-acquisition to an existing subfield in record.abs.
For example on item barcode field
- Reindex Zebra database
- Choose a value of this field that will match some results. For
example : "0*" will match all barcodes beginning with zero
- In intranet, enter this URL : <your server>/cgi-bin/koha/catalogue/search.pl?idx=stocknumber&q=0*&sort_by=relevance
=> You get some results
- In intranet, enter this URL : <your server>/cgi-bin/koha/catalogue/search.pl?idx=inv&q=0*&sort_by=relevance
=> You get the same results
- In intranet, enter this URL : <your server>/cgi-bin/koha/catalogue/search.pl?idx=number-local-acquisition&q=0*&sort_by=relevance
=> You get the same results
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Comments on case sensitivity of index names in QueryParser, see Bugzilla.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adding Number-local-acquisition to the known indexes in C4::Search allows
searching without the "ccl=" prefix.
Also corrects ccl.properties: inv must be an alias of
Number-local-acquisition.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 6256 replaced stocknumber with Number-local-acquisition in bib1.att
for attribute number 1062.
In this case, Number-local-acquisition must be used in record.abs and
stocknumber can be an alias of it in ccl.properties.
Test plan (for MARC21/GRS1):
- drop zebra database (rebuild_zebra.pl -r ...)
- reindex
- test in simple search : ccl=Number-local-acquisition,alwaysmatches=''
=> you get all records with a stocknumber
- test in simple search : ccl=stocknumber,alwaysmatches=''
=> you get the same results
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Corrects a double entry for language in the yaml file.
Language should have been language-original.
Test plan:
Check that you have language-original in your zebra install.
Specifically, this index should cover MARC21 041$h.
Enable QueryParser and search for a record with this index.
Note that this patch does not enable searching on this
index without QueryParser. This is true for many more indexes
in record.abs that are not included in the getIndexes routine.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described - make sure you are testing with a current
indexing configuration.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch restores the ability to request a DBI database handle
or a DBIx::Class schema object connected to a PostgreSQL database.
To address the concerns raised in bug 7188, only "mysql" and "Pg"
are recognized as valid DB schemes. If anything else is passed
to C4::Context::db_scheme2dbi or set as the db_scheme in the Koha
configuration file, the DBD driver to load is assumed to be "mysql".
Note that this patch drops any pretense of Oracle support.
To test:
[1] Apply patch, and verify that the database-dependent tests
pass when run against a MySQL Koha database.
[2] To test against PostgreSQL, create a Pg database and
edit koha-conf.xml to set db_scheme to Pg (and adjust
the other DB connection parameters appropriately). The
following tests should pass, at minimum:
t/Context.t
t/db_dependent/Koha_Database.t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described, some additional notes:
- Installed Postgres following
http://wiki.ubuntuusers.de/PostgreSQL
- Created a database user koha
- Created a database koha
- Changed the koha-conf.xml file
<db_scheme>Pg</db_scheme>
<database>koha</database>
<hostname>localhost</hostname>
<port>5432</port>
<user>koha</user>
<pass>xxxx</pass>
- Installed libdbd-pg-perl
- Ran the web installer; up to step 3 everything looked ok
Step 3 complains:
Password for user koha: psql: fe_sendauth: no password supplied
- Both t/Context.t and t/db_dependent/Koha_Database.t pass
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan the same as the original patch
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested according to test plan. Searches tested were:
fic=e
fiction=e
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
ff7-02 1=87020 (position 2 of field 007 in MARC21) should be
ff7-02 1=8702
lf 1=8833
lf fiction
fic fiction
should be
lf 1=8833
fiction lf
fic lf
To test :
1. apply the patch
2. copy the modified ccl.properties into your active Zebra config
directory
3. reindex zebra (rebuild_zebra.pl -b -x -r -v)
4. make some searches using the fixed indexes
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds language-original to the list of search fields
recognized by QueryParser.
To test:
[1] After doing the tests in the main patch, copy the configuration
file etc/searchengine/queryparser.yaml into place, turn on the
UseQueryParser system preference, and verify that searching on
language-original still works.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
It could be useful to index the original language of a document (e.g.
"fre" for the English translation of a French novel).
This patch renames the Bib-1 use attribute 1095 from
Code-language-original to language-original and uses it to index:
- the MARC21 041$h subfield
- the UNIMARC 101$c subfield
It also adds "language-original" to the list of indexes in Search.pm.
Test plan:
A. in a MARC21 GRS1 environment
1. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/marc21/biblios/record.abs) from
your source etc/ directory to your main koha etc/ directory
2. Reindex zebra
3. Make some searches, like "language-original:fre"
B. in a MARC21 DOM environment
4. Copy Zebra config files (zebradb/biblios/etc/bib1.att, zebradb/ccl.properties,
marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl) from your source etc/
directory to your main koha etc/ directory
5. Reindex zebra
6. Make some searches, like "language-original:fre"
C. in a UNIMARC GRS1 environment
7. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/unimarc/biblios/record.abs) from
your source etc/ directory to your main koha etc/ directory
8. Reindex zebra
9. Make some searches, like "language-original:fre"
D. in a UNIMARC DOM environment
10. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl)
from your source etc/ directory to your main koha etc/ directory
11. Reindex zebra
12. Make some searches, like "language-original:fre"
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes a problem where if a staff member sets the
*defaultSortField/*defaultSortOrder system preferences to relevance
ascending while QueryParser is enabled, default keyword search
would break -- the query parser config did not declare relevance asc
as a possible "modifier".
Note that setting the sort order to relevance ascending does not
actually make catalog search return results with the least relevant
records showing up first; Zebra does not support such a mode. In other
words, relevance ascending acts exactly the same as relevance descending.
Test plan:
0/ Create some biblios with "history" in the title and
ensure that the UseQueryParser system preference is enabled.
1/ Define prefs defaultSortField = relevance and defaultSortOrder = asc
2/ Search "history" on the staff interface
3/ Note that no result is returned.
4/ Apply the patch
5/ Verify the queryparser config file in use takes the modification into
account (see the queryparser_config value in your $KOHA_CONF file).
6/ Relaunch the search and verify results are returned
Signed-off-by: Christopher Brannon <cbrannon@cdalibrary.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
With this combination of sysprefs and a UNIMARC configuration, it was
impossible to search on the location, barcode and ccode indexes:
QueryWeightFields is activated
QueryAutoTruncate is set to "only if * is added"
But in UNIMARC, location, barcode and ccode (995 $e, $f, $h) were indexed
only as "word". They need to be indexed as "phrase" as well.
Additionally, in UNIMARC, information about the damaged and withdrawn status
of items is not indexed, while it is in MARC21.
This patch:
- adds 2 new indexes for 995$1 (damaged) and 995$3 (withdrawn)
- indexes location, barcode and ccode as "phrase" as well as "word"
Indexing of items in UNIMARC could be improved later, so this patch also
adds comments explaining the origin of Koha's 995 field; I think this could
be useful for further changes.
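As an illustration of the change (a sketch only; index names and the exact
layout may differ from the patch), adding the "phrase" register next to the
"word" register typically looks like:
  record.abs (GRS-1):
    melm 995$e  location:w,location:p
  biblio-koha-indexdefs.xml (DOM):
    <kohaidx:index_subfields tag="995" subfields="e">
      <kohaidx:target_index>location:w</kohaidx:target_index>
      <kohaidx:target_index>location:p</kohaidx:target_index>
    </kohaidx:index_subfields>
where :w is the word register and :p the phrase register.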
To test, on a UNIMARC configuration:
A. indexed with GRS-1
1) Set sysprefs QueryWeightFields as "activated" and QueryAutoTruncate
as "only if * is added"
2) Select location index in advanced search and search for a value
existing in your records in 995$e => 0 results
3) Apply patch
4) Rebuild zebra
5) Select location index in advanced search and search for a value
existing in your records in 995$e => x results
6) Mark an item as withdrawn; search "withdrawn:1" => x results, and
among them the biblio to which the item is attached
7) Mark an item as damaged; search "damaged:1" => x results, and among
them the biblio to which the item is attached
B. indexed with DOM
Do the same operations
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors
Test
Apply the patch
Begin with GRS-1
Full reindex
Search by location, no results
cp files biblio-*-indexdefs.xml and record.abs to destination on etc/zebra
Full reindex
Search by location, got results
Switch to DOM
reset files
Full reindex
Search by location, no results
cp files
Full reindex
Search by location, results!
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
I took F. Demians' patch as a base, but made a lot of changes, so I
think it is more logical to create a new patch, as the behavior is not
the same as in the previous patch.
I tried to define the DOM config files as a "mirror" of record.abs, so
that the behavior is the same.
If it is OK, we will be able to improve the indexing later, for example
suppressing warnings, managing indicators or subdivisions, etc.
I made some small changes to record.abs:
- comments
- 216 was indexed in Conference-name as well as Trademark. I assume
"Conference-name" is an error, so I indexed it only in Trademark
- indexed 2 new notes: 340 / 356
The only difference between record.abs and DOM is that the DOM config
files do not index complete fields, only subfields.
Example:
melm 200 ===> <kohaidx:index_subfields tag="200" subfields="abcdfgjxyz">
I took all the subfields from the UNIMARC Authorities manual. The only
subfields not indexed are the numeric ones: $7 and $8 for the language
of the record, and $0, $2, $3, $5, $6 for 4XX/5XX/7XX.
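For context, a complete index_subfields element in the koha-indexdefs file
also names the target index and register, roughly like this (the index name
here is chosen only for illustration):
  <kohaidx:index_subfields tag="200" subfields="abcdfgjxyz">
    <kohaidx:target_index>Personal-name:w</kohaidx:target_index>
    <kohaidx:target_index>Personal-name:p</kohaidx:target_index>
  </kohaidx:index_subfields>
The *-koha-indexdefs.xml files are then used to generate the
*-zebra-indexdefs.xsl stylesheets that Zebra applies at indexing time.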
To test:
- index a set of bib and auth records with GRS-1
- make some searches on different kind of authorities
- index the same records with DOM
- make the same searches
- You are not supposed to see differences
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
As I am not a UNIMARC user it's hard for me to test this, but
while testing other authority-related patches I noticed that I couldn't
index the UNIMARC authorities of the sample database. The files are
obviously missing and rebuild_zebra.pl reports this. With this patch
applied, indexing works and authorities are searchable in my installation.
Signed-off-by: Vitor Fernandes <fvernandes@keep.pt>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In UNIMARC DOM indexing, the "item" index was working only for the 995
subfields that are also mapped to specific indexes (e.g. $a, $b...). It
was not working for the other subfields (e.g. $g), because a comment
from record.abs had been carried over into the DOM config files.
This patch removes the comment.
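For reference, the intended mapping in biblio-koha-indexdefs.xml is of this
general shape (a sketch only; the real definition covers the 995 subfields
more broadly, and the index name is shown just for illustration):
  <kohaidx:index_subfields tag="995" subfields="g">
    <kohaidx:target_index>Item:w</kohaidx:target_index>
  </kohaidx:index_subfields>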
To test, in a DOM UNIMARC environment:
1) In an item, enter the value "Test10037" in 995$g
2) Search for this value in simple search, this way : item=Test10037
=> you should have no results
3) Apply the patch. if necessary, copy the modified
etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml and
etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl into
the /etc/... directory in your main Koha directory
4) Reindex Zebra biblios
5) Do the same search as 2) => you should have one result
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors.
Test
NOTE: the default UNIMARC framework doesn't have 995$g,
so I had to add it first.
1) Added test string to 995b on some record
2) Reindex and search as indicated, no results
3) cp files to destination
4) reindex
5) search and result OK!
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>