Commit graph

76 commits

Author SHA1 Message Date
19d9ba176d Bug 23542: Fix SRU import encoding
When importing records from an SRU server, diacritics are badly encoded.
I reproduced this with the BNF server, so it may be a UNIMARC issue.

Tests show that the difference between the Z39.50 server and SRU is that the leader contains 'a' at position 9.
Looking at MARC::Record->encoding() shows that the encoding depends on the leader, even for UNIMARC.
So this patch adds a call to MARC::Record->encoding('UTF-8') in the case of an SRU server in C4::Breeding.

The same call exists in Koha::MetadataRecord::Authority::get_from_breeding().

In the case of import via Z39.50, MarcToUTF8Record() is called,
 which calls SetMarcUnicodeFlag(),
 which calls MARC::Record->encoding('UTF-8').
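
A minimal sketch of the idea, assuming an SRU branch in C4::Breeding's result handling (the servertype check and variable names are illustrative, not the exact patch code):

    # SRU records may lack 'a' at leader position 9, so MARC::Record cannot
    # detect UTF-8 on its own; force the encoding for SRU targets.
    if ( $server->{servertype} eq 'sru' ) {
        $marcrecord->encoding('UTF-8');
    }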

Test plan:
1) Use a UNIMARC database
2) Configure a connection to a UNIMARC SRU server, for example the BNF,
   see https://doc.biblibre.com/koha/autour_de_koha/serveurs_z3950_sru#serveur_de_la_bnf
3) Go to cataloguing module
4) Click on 'New from Z39.50/SRU'
5) Choose only the SRU target
6) Search for ISBN 2266072889
7) Confirm you see good encoding: diacritic on 'a' of title 'Strate-a-gemmes'
8) Click on 'Marc preview'
9) Confirm you see good encoding
10) Click import
11) Confirm you see good encoding
12) Check also Authorities import via SRU
13) Check also SRU imports on a MARC21 database

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: Removed change to new_from_xml call. We should respect the syntax.
But the added MARC::Record encoding does the trick! This is implicit
for Z39.50 targets, where MarcToUTF8Record does the same.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>

Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
2020-08-12 11:46:25 +02:00
2bf171acaf
Bug 24052: Rename XSLT_Handler
It should be Koha::XSLT::Base now.
Removes old XSLT_Handler stub too (from bug 23290).
Result of a git grep | sed statement.

Test plan:
Run qa tools (so modules compile).
Run t/db_dependent/Breeding.t
Run t/db_dependent/Koha/XSLT/Base.t (This test fails when only this patch
has been applied; see subsequent patch.)
Enable XSLT use on results and details display. Check search results and
detail view on OPAC and staff.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
2020-03-24 10:42:23 +00:00
Charles Farmer
bd141a442e
Bug 11297: Add support for custom PQF attributes for Z39.50 server searches.
Adds the "Attributes" field to z3950 servers.

The feature here is not quite the same.

In the old patches, the attributes were applied to individual query parts if the part already contained "@attr" and the additional attribute was not already in the query part.

Here, the content of the new field is prepended to all PQF queries sent to the server.

This new approach is simpler and works for the sponsor.
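
A hedged sketch of the prepend described above (the key name "attributes" and the query variable are assumptions, not necessarily the real column or code):

    # Prepend the server-level PQF attributes, e.g. '@attr 4=1', to every
    # query sent to this target.
    my $query = $zquery;
    $query = $server->{attributes} . ' ' . $query if $server->{attributes};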

Test plan:
 I) Apply the patch
II) Run updatedatabase.pl

1) Add a new z3950 server with the following parameters:
Hostname : catalogue.banq.qc.ca
Port     : 210
Database : IRIS
Syntax   : Marc21

2) Perform a z3950 search on that server.
    Keyword (Any) : egypt
2.1) Nothing Found.

3) Add attributes on the server administration page
    @attr 4=1

4) Perform the same z3950 search
4.1) A lot of results

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
2020-02-19 16:07:59 +00:00
68a92c02df
Bug 21921: Add publication year to biblio Z39.50 search form
Test plan:

1) Apply the patch
2) Have a Z39.50 endpoint with attr 31 defined - Library of Congress
supports this
3) Try to find some biblio records through Z39.50 using the new field
"Publication year"

Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
2020-02-17 13:44:23 +00:00
f480ca803b
Bug 24267: (QA follow-up) Remove two calls, add transaction
The call to GetAuthorizedHeading is already done just before calling ImportBreedingAuth.
The call to GuessAuthTypeCode is not used.
Adds a transaction to the test (check your database, kidclamp ;)
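
A hedged sketch of the usual Koha pattern for wrapping a db_dependent test in a transaction (surrounding test code omitted):

    use Koha::Database;

    my $schema = Koha::Database->new->schema;
    $schema->storage->txn_begin;

    # ... run the ImportBreedingAuth tests here ...

    $schema->storage->txn_rollback;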

Test plan:
Add new authority via Z3950 in the interface.
Run t/db_dependent/Breeding.t

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
2020-01-02 14:04:45 +00:00
7abb7350ef
Bug 24267: Improve ImportBreedingAuth
git grep ImportBreedingAuth - there is only one call to this routine
from SearchZ3950Auth

We pass it a MARC record and '2' for overwrite_auth.

We then check for this record in the DB and get the breeding id.
However, when overwrite_auth is 2 we always add the auth to the batch
and return the new breeding id.

We don't actually use any of the other parameters returned here either.

To recreate:
1 - Browse to Authorities
2 - Select New from Z39.50
3 - Perform a search that returns results
4 - SELECT COUNT(*) FROM import_auths
5 - Repeat the search
6 - SELECT COUNT(*) FROM import_auths
7 - There are 20 more records
8 - SELECT * FROM import_auths - note the repeated rows

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
2020-01-02 14:04:18 +00:00
5867683fe9 Bug 22532: Remove Z39.50 random
Test plan:
Try to search for, preview, and import an authority from Z39.50; everything
should work as expected

Signed-off-by: Hayley Mapley <hayleymapley@catalyst.net.nz>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2019-04-18 10:48:10 +00:00
Charles Farmer
d37da4d24f Bug 12747: (QA follow-up) Treat 010 according to marcflavour
Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>
Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-10-01 13:56:26 +00:00
David Bourgault
f6e86dc0ca Bug 12747: Add extra column in Z3950 search
This patch makes it possible to add an extra column to Z3950 search results.
The system preference AdditionalFieldsInZ3950ResultSearch decides which MARC field/subfields are displayed in the column.

Testing:

I Apply the patch
II Run updatedatabase.pl

ACQUISITIONS
0) Enter a field/subfield in the AdditionalFieldsInZ3950ResultSearch system preference
1) Create a new basket or use an existing one
2) In -Add order to basket-, click "From an external source"
3) Select some search targets and enter a subject heading ex. house
4) Click the Search button
5) Confirm the "Additional fields" column shows the field/subfield value.

CATALOGUING
0) Uses the same syspref as above
1) Go to cataloguing, click New from Z39.50
2) Fill in the form so the search returns results
3) Confirm the "Additional fields" column is populated

prove t/db_dependent/Breeding.t

Sponsored-by: CCSR (https://ccsr.qc.ca)

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-10-01 13:56:26 +00:00
39f4813041 Bug 21404: Refactor _build_query subroutines
Test plan:
1) Apply the patch
2) prove t/db_dependent/Breeding.t
3) Try searching using Z39.50; both authority and biblio searches should still
work

Signed-off-by: Owen Leonard <oleonard@myacpl.org>

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-10-01 11:11:39 +00:00
de40463a7f Bug 21404: Remove unused variables in C4::Breeding->_auth_build_query
Signed-off-by: Owen Leonard <oleonard@myacpl.org>

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-10-01 11:11:39 +00:00
f3c2eae74c Bug 21318: Add control number as an option to search authority using Z39.50
1) Apply the patch
2) Go to administration and set up a Z39.50 authority server which supports
searching by control number (use attribute 12); you can use the Czech
national library server:
host: aleph.nkp.cz
port: 9991
base: aut-utf
format: MARC21
encoding: UTF-8
3) Try to find an authority by control number using Z39.50 - if you use the server
recommended in point 2), there is web access to the base at
http://aleph.nkp.cz/eng/aut

Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Fixed a typo in a code comment and a whitespace issue in the template.

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-10-01 11:11:37 +00:00
15088c67d6 Bug 19436: (QA follow-up) Revert change in _handle_one_result
See Bugzilla comment 7. This change does not belong here and is
dubious on its own. Needs further attention on another report.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

Amended: In consultation with the author the same change is applied to the
corresponding lines in Z3950SearchAuth.

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-08-08 20:51:06 +00:00
Matthias Meusburger
7baa452a6a Bug 19436: Add SRU support for authorities
Test plan:
 - Apply the patch
 - Add an SRU authority server in administration -> Z39.50/SRU servers
   You can try with the French national library, configured as such:
   Hostname: catalogue.bnf.fr
   Port: 80
   Database: api/SRU
   Syntax: Unimarc
   Record type: authority
   Additional SRU options: version=1.2,sru=get
   SRU Search fields mapping example:
	Keyword (any): aut.anywhere
	Name (any): aut.anywhere
	Author (any): (aut.type any "pep org") and aut.accesspoint
	Author (personal): aut.type=pep and aut.accesspoint
	Author (corporate): aut.type=org and aut.accesspoint
	Author (meeting/conference): aut.type=org and aut.accesspoint
	Subject heading: (aut.type any "geo ram_nc ram_ge ram_pe ram_co") and aut.accesspoint
	Subject sub-division: aut.type=ram_pe and aut.accesspoint
	Title (any): (aut.type any "tic tut tum ram_tp ram_tu") and aut.accesspoint
	Title (uniform):(aut.type any "tut tum ram_tu") and aut.accesspoint

 - Try a search from Authorities -> New from Z39.50/SRU
 - Check that the authority is correctly displayed in "Show Marc"
 - Check that the authority is correctly added to Koha in "Import"
 - prove t/db_dependent/Breeding.t

Signed-off-by: François Pichenot <fpichenot@ville-roubaix.fr>

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-08-08 20:31:34 +00:00
f253c72bc3 Bug 20272: Changes for Breeding.pm and Record.pm
In Breeding.pm we let Z3950Search return the XSLT handler error codes back
to the template. They are converted to text messages by using a new include
file (added for OPAC and intranet now). The generic xslt_err code is now
obsolete.

In Record.pm the errstr call is removed. The croak is done with the new
error code in err. This seems sufficient.

Test plan:
[1] Run Breeding.t
[2] Run Record.t
[3] Add a nonexisting XSLT file to one of your Z3950 targets. Search on that
    target and check if you see an error 'XSLT file not found'.

The bonus is that these error messages are now translatable, as they are in
the templates.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>

Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
2018-07-02 12:12:49 +00:00
590cae04fd Bug 19096: Make Default authoritative in core modules
After feedback from the dev mailing list, it seems appropriate here to
propose making the Default framework authoritative for Koha to MARC
mappings. This implies checking only the Default framework in the
routines:

[1] GetMarcFromKohaField: The parameter frameworkcode is removed. A
    follow-up report (19097) will update the calls not adjusted here.
    This is safe since the parameter is silently ignored.
[2] GetMarcSubfieldStructureFromKohaField: Framework parameter is removed
    and calls are adjusted. Includes acquisitions_stats.pl.
[3] TransformKohaToMarc: The parameter is removed; all calls are verified
    or adjusted.
[4] TransformMarcToKoha: The parameter is no longer used and will be
    removed in a follow-up report (19097). It always goes to Default now.
[5] TransformMarcToKohaOneField: The parameter is removed and all calls
    are adjusted. Including: Breeding, XISBN and MetadataRecord modules.
[6] C4::Koha::IsKohaFieldLinked: This routine was called only once (in
    C4::Items::_build_default_values_for_mod_marc). It can be replaced by
    calling GetMarcFromKohaField. If there is no kohafield linked, undef
    is returned. (Corresponding unit test is removed here.)
[7] C4::Items::ModItemFromMarc: The helper routine
    _build_default_values_for_mod_marc no longer has a framework
    parameter. The cache key default_value_for_mod_marc- is no longer
    combined with a frameworkcode. Three admin scripts are adjusted
    accordingly; some tests will be corrected in the next patch.

Test plan:
See next patch. That patch adjusts all tests involved.

Signed-off-by: Josef Moravec <josef.moravec@gmail.com>

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>

Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
2017-12-07 14:44:15 -03:00
01fbe2be99 Bug 10306: Core module changes for multiple mappings
In order to allow multiple Koha to MARC mappings (for one kohafield), we
need to adjust a few key routines in C4/Biblio.pm. This results in a few
changes in dependent modules too.

Note: Multiple mappings also include 'alternating' mappings, such as the
case of MARC21 260 and 264: only one of the two fields will be used. Sub
TransformMarcToKoha will handle that just fine; the opposite transformation
is harder, since we no longer know which field was the source. In that
case TransformKohaToMarc will fill both fields. We only use that operation
in Koha for items (in Acquisition and Cataloging).

Sub GetMarcSubfieldStructure
This sub used a selectall_hashref, which is fine as long as we have only
one mapping for each kohafield. But as the DBI documentation states: if a row
has the same key as an earlier row, it replaces the earlier row. In other
words, we lose the first mapping if we have two.
This patch uses selectall_arrayref with Slice and rearranges the output so
that the returned hash returns an arrayref of hashrefs for each kohafield.
In order to improve consistency, we also add an ORDER BY clause to the SQL
statement.
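
A hedged illustration of the DBI change (columns trimmed and SQL simplified relative to the real query):

    # selectall_arrayref with Slice keeps every row, so two mappings for the
    # same kohafield are no longer collapsed into one as with selectall_hashref.
    my $rows = $dbh->selectall_arrayref(
        'SELECT kohafield, tagfield, tagsubfield
           FROM marc_subfield_structure
          WHERE frameworkcode = "" AND kohafield <> ""
       ORDER BY kohafield',
        { Slice => {} }
    );
    my %structure;
    push @{ $structure{ $_->{kohafield} } }, $_ for @$rows;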

Sub GetMarcFromKohaField
This sub just returned one tag and subfield, but in case of multiple
mappings we should return them all now.
Note: Many calls still expect just one result and will work just fine:
    my ($tag, $sub) = GetMarcFromKohaField(...)
A possible second mapping would be silently ignored. Often the sub is
called for biblionumber or itemnumber. I would not recommend the use of
multiple mappings for such fields btw.
In case the sub is called in scalar context, it will return only the first
tag (instead of the number of tags and subfields).

Sub GetMarcSubfieldStructureFromKohaField
This sub previously returned the hash for one kohafield.
In scalar context it will behave like before: it returns the first hashref
in the arrayref that comes from GetMarcSubfieldStructure.
In list context, it returns an array of all hashrefs (incl. multiple
mappings).
The sub is not used in C4::Ris. Removed the use statement.

Sub TransformKohaToMarc
This sub got a second parameter: frameworkcode.
Historically, Koha more or less assumes kohafields to be defined across all
frameworks (see Koha to MARC mappings). Therefore it falls back to Default
when it is not passed.
When going thru all mappings in building a MARC record, it also supports
multiple mappings. Note that Koha uses this routine in Acquisition and in
Cataloging for items. Normally the MARC record is leading however and the
Koha fields are derivatives for optimization and reporting.

The added third parameter allows for passing a new option no_split => 1.
We use this option in C4::Items::Item2Marc; if two item fields are mapped to
one kohafield but would have different values (which would be very unusual),
these values are glued together. When transforming to MARC again, we do not
want to duplicate the item subfields, but we keep the glued value in both
subfields. This operation only affects items, since we are not doing this
reverse operation for biblio and biblioitem fields.
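
A hedged example of the resulting call signature described above (the hash content is illustrative):

    # frameworkcode falls back to Default when not passed; no_split keeps a
    # glued multi-item value in each mapped subfield instead of splitting it.
    my $marc = TransformKohaToMarc(
        { 'items.itemcallnumber' => '123 A | 123 B' },
        $frameworkcode,
        { no_split => 1 },
    );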

Sub _get_inverted_marc_field_map
This sub is a helper routine of TransformMarcToKoha, the opposite
transformation. When saving a MARC record, all kohafields are extracted
including multiple mappings.
Suppose that you had both 260c and 264c in your record (which you won't),
then both values get saved initially into copyrightdate like A | B. The
additional code for copyrightdate will extract the first year from this
string.
A small fix in TransformMarcToKoha makes that it only saves a value in a
kohafield if it is defined and not empty. (Same for concatenation.)

Sub TransformMarcToKohaOneField
This sub now just calls TransformMarcToKoha and extracts the requested
field. Note that since we are caching the structure, this does not result
in additional database access and is therefore performance-wise
insignificant. We simplify code and maintenance.
Instead of modifying the passed hashref, it simply returns a value. A call
in C4::Breeding is adjusted accordingly. The routine getKohaField in
Koha::MetadataRecord is redirected to TransformMarcToKohaOneField.
NOTE: The fourth patch restructures/optimizes TransformMarcToKoha[OneField].

Sub get_koha_field_from_marc
This sub can be removed. A call is replaced by TransformMarcToKohaOneField
in C4::XISBN.

Note: The commented lines for sub ModZebrafiles are removed (directly under
TransformMarcToKohaOneField).

Test plan:
For unit tests and interface tests, please see follow-ups.
Run qa tools in order to verify that the modules still compile well.
Read the code changes and verify that they make sense.

Signed-off-by: Josef Moravec <josef.moravec@gmail.com>

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>

Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
2017-12-07 14:44:15 -03:00
798d38e4c7 Bug 16011: $VERSION - Remove comments
perl -p -i -e 's/^.*set the version for version checking.*\n//' **/*.pm

+ manual adjustments

Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>

Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
2016-03-24 17:20:29 +00:00
017699c345 Bug 16011: $VERSION - Remove the $VERSION init
Mainly a
  perl -p -i -e 's/^.*3.07.00.049.*\n//' **/*.pm
Then some adjustments

Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>

Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
2016-03-24 17:20:28 +00:00
3830d78d46 Bug 16011: $VERSION - remove use vars $VERSION
perl -p -i -e 's/^(use vars .*)\$VERSION\s?(.*)/$1$2/' **/*.pm

Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>

Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
2016-03-24 17:20:26 +00:00
Blou
d10802f603 Bug 13987: Fix server name in z39.50 authority search results
When doing an Auth search through Z39.50, the resulting table always has its first column (server name) empty.

TEST
1) once logged into the intranet, go to Authorities.
2) Click New from z39.50, fill in appropriately for a successful search.
3) Acknowledge the first column is empty.  Always.
4) Apply the (very simple) patch.
5) Do another search, and validate that the column is not empty anymore.

Signed-off-by: Nick Clemens <nick@quecheelibrary.org>

Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2015-04-22 14:39:00 -03:00
Jonathan Druart
a6c9bd0eb5 Bug 9978: Replace license header with the correct license (GPLv3+)
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>

http://bugs.koha-community.org/show_bug.cgi?id=9987

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2015-04-20 09:59:38 -03:00
Frédérick
eed620c773 Bug 11961 - Add a "Z39.50 search" button to the authority creation and modification pages.
This button lets you replace existing authorities using a Z39.50 search.

http://bugs.koha-community.org/show_bug.cgi?id=11961
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
All tests pass

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2015-01-24 18:19:06 -03:00
Jonathan Druart
fab96202fd Bug 13296: (follow-up) permit grep on AUTHUNIMARC
I would prefer not to hide this "stuff".

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-12-19 14:42:03 -03:00
9ccdbc49c7 Bug 13296 - error when using z3950 with UNIMARC authorities
When using a Z39.50 connection with UNIMARC authorities, you get an error:
Unsupported UNIMARC character encoding [ ] for XML output for UNIMARCAUTH; 100$a -> 20141119

I've seen that Bug 2060, which added authorities import, introduced a special behavior for UNIMARC: the MARC flavour must be UNIMARCAUTH instead of just UNIMARC.

This patch adds the same behavior when using a Z39.50 connection and import.

Test plan:
 - Use a UNIMARC install
 - Define a Z39.50 connection for UNIMARC authorities
 - Go to Authorities module
 - Click on "New from Z39.50"
 - Perform a search
=> Without patch : you get the error
=> With patch : you get results
 - Import one result
=> You get the authority creation form with all data
You may check the same plan with a MARC21 install

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
NOTE: depending on the target, the syntax in the configuration
might not be UNIMARC, but MARC21/USMARC instead!

Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-12-19 14:41:52 -03:00
Dobrica Pavlinusic
3fa6bf051a Bug 12898 - Z39.50 title search doesn't work with multiple words
This fixes a regression introduced by Bug 6536 so that multi-word
title searches work.

Test scenario:

1. try a z39.50 search with more than one word
2. verify that no results appear
3. apply patch and re-run search
4. verify that there are results

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-14 02:02:51 -03:00
db1929094f Bug 6536: QA Follow-up for fixing a unit test under Perl 5.18
The last test on the first series, fails randomly on Perl 5.18:
    not ok 12 - Third query makes no difference
    #   Failed test 'Third query makes no difference'
    #   at t/db_dependent/Breeding.t line 104.
    #          got: ''
    #     expected: '1'
    # Looks like you failed 1 test of 12.
not ok 1 - _build_query

This change makes tests pass. Please consider if this needs to be fixed
(i.e. sort order matters) or the test needs to be rewritten.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
I agree with adding the sort. (The need for doing this in Perl 5.18 is another
topic.)

Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:21 -03:00
168871403c Bug 6536: QA Follow-up for removing warnings from QA tools
Resolved:
[1]  FAIL   C4/Breeding.pm
FAIL   critic ControlStructures::ProhibitMutatingListFunctions

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
No warnings anymore.

Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:19 -03:00
a06b1ac728 Bug 6536: [QA Follow-up] Remove surrounding spaces in sru_fields
This patch only removes surrounding spaces at the comma and equals sign while
passing the options in sru_fields to the ZOOM object.
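
A hedged sketch of the trimming (variable names are illustrative, not the exact code that parses sru_fields into ZOOM options):

    # Tolerate spaces around ',' and '=' when turning sru_fields into options.
    my %options;
    for my $pair ( split /\s*,\s*/, $sru_fields ) {
        my ( $key, $value ) = split /\s*=\s*/, $pair, 2;
        $options{$key} = $value if defined $key && $key ne '';
    }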

Test plan:
If you add spaces between options in sru_fields, searching should still work.
E.g. sru_fields= sru = get , sru_version = 1.1

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:12 -03:00
8c7377d21d Bug 6536: Add XSLT transformation on Z3950 search results
Use the stylesheets listed in field add_xslt of z3950servers to transform
search results of Z3950/SRU search.
Additionally, the template has been changed to make more error messages (or
warnings) visible when displaying results. Until now, error messages were
shown in the results table, and when connection errors occurred, no results
were displayed at all.

Test plan:
Create some stylesheets (or see the sample patch on bug 6536).
Add these stylesheets to some Z3950/SRU servers.
Do Z3950 search and verify the transformations.
Do a search with 2 targets; make one target fail (by manipulating its server
data). Do you see the connection error and the results for the other target?
Generate an XSLT error by modifying one stylesheet. Check search results. You
should see warnings.

Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:10 -03:00
fb0834e8f5 Bug 6536: Include SRU searching in Breeding.pm
This patch makes it possible to include SRU servers in Z3950 search.
It adjusts the Z3950Search routine in Breeding module.
It also replaces SQL code with DBIx statements in Breeding.pm/Z3950Search
and the associated scripts z3950search.pl in cataloguing and acqui.
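
A hedged illustration of the SQL-to-DBIx move (the resultset name follows Koha's schema classes; variable names are illustrative):

    use Koha::Database;

    # Fetch the selected Z39.50/SRU servers through DBIx::Class rather than raw SQL.
    my @servers = Koha::Database->new->schema->resultset('Z3950server')
                                 ->search( { id => \@selected_ids } );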

Test plan:
Verify if a normal Z3950 search still works in cataloging/acqui.
Add an SRU target. (You could just use Koha's port 9998.)
Define sru_options like sru=get.
Use that target in a Z3950 search in cataloging and acqui. (Import.)
Test sru_fields translation by comparing search results between various
settings for some of the fields. For instance, leave title empty and
after that set it to the title field of your SRU target.

Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:07 -03:00
5b0d410d2d Bug 6536: Adjustments for servername and servertype
Replaces name with servername and type with servertype for running the Z39.50 search.
Limit search scripts to zed (z3950) servers until sru is supported.

Test plan:
Perform a Z3950 search in Cataloguing and Acquisition.
Verify that it still works as it did.

Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
2014-09-01 10:09:05 -03:00
Galen Charlton
990bb17e14 Bug 12112: remove disused routine C4::Breeding::ImportBreeding()
This patch removes the ImportBreeding() routine, which lost its
last caller as of the patch for bug 10462.

To test:

[1] Verify that prove -v t/Breeding.t passes.
[2] Perform a Z39.50 search in the staff interface.
[3] Perform a cataloguing reservoir search in the staff
    interface, verifying that cached records from the search
    done in step 2 are retrieved.

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2014-04-25 15:07:52 +00:00
Fridolyn SOMERS
cac06afeb1 Bug 11254: make reservoir search normalize ISBNs
When importing records, the ISBN is normalized and stored
in the database (see C4::ImportBatch::_add_biblio_fields).  But when
searching the reservoir by ISBN, it is not normalized
(see C4::Breeding::BreedingSearch), so the search does not match.

This patch adds the normalisation to the reservoir search.  It also
replaces the call to the private method _isbn_cleanup with GetNormalizedISBN,
the correct public method, and allows the reservoir search
on ISBNs with hyphens.

This is intended to fix only reservoir searches.
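
A hedged sketch of the search-side normalization (surrounding BreedingSearch code omitted):

    use C4::Koha qw( GetNormalizedISBN );

    # Normalize the incoming term the same way ImportBatch normalizes ISBNs on
    # import, so hyphenated and ISBN-13 forms match what is stored.
    my $normalized_isbn = GetNormalizedISBN($search_term);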

Revised Test plan
-----------------
 1) Back up DB
 2) Save copy of attached example somewhere findable
 2) Home -> Tools -> Stage MARC records for import
 3) Click Browse and select the example MARC file
 4) Click Upload file
 5) Tweak as desired then click Stage for import
 6) Click Manage staged records
 7) Click Import this batch into the catalog
 8) Home -> Cataloging
 9) In the Cataloging search text box type 978-0-691-14289-0 and
     click Submit
    -- ISBN13 with hyphens not found in reservoir
10) In the Cataloging search text box type 9780691142890 and
     click Submit
    -- ISBN13 without hyphens not found in reservoir
11) In the Cataloging search text box type 0-691-14289-0 and
     click Submit
    -- ISBN10 with hyphens not found in reservoir
12) In the Cataloging search text box type 0691142890 and
     click Submit
    -- ISBN10 without hyphens found in reservoir
13) Apply patch
14) Repeat steps 9-12, this time it is always found! :)

Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2014-04-19 21:44:30 +00:00
0fc114eee3 Bug 11419: display Z39.50 search errors more completely
When a z39.50 server can't be searched successfully, the yellow
error box came up empty.  This patch fixes the problem.

Test Plan:
1) Go to Administration/z39.50 servers
2) Create a fake z39.50 server with a made up address
3) Go to cataloging, search only that server
4) Note the empty yellow alert box
5) Apply this patch
6) Re-run the search; note that the alert box has a message in it now

Signed-off-by: Nora Blake <nblake@masslibsystem.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works according to test plan.
When one of the selected servers gives results, no dialog
box is shown, before and after applying the patch.

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-12-27 00:25:39 +00:00
264de29621 Bug 10096 - (follow-up) various QA improvements
- improve POD
- remove extraneous comments
- correct license statement in new files
- remove backticks in database update SQL

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-10-04 14:29:18 +00:00
1e0b890b0c Bug 10096 - Add a Z39.50 interface for authority searching
This patch introduces a new Z39.50 interface for searching Z39.50
compliant databases for MARC authority records.

These databases aren't as common as their bibliographic equivalents,
but they're out there and very useful. I have included info at the
bottom of this message for sample authority databases you can try.

To test this patch:

1) Set up Z39.50 client targets for authority databases. (I've included
information at the bottom of this message for LibrariesAustralia's
test server for authorities, as well as instructions on how to use
your own Koha's Z39.50 authority server.) The Library of Congress
also has authority databases available (unsure if these are test or
prod), and you might have access to others through OCLC or RLIN. OCLC
provides login credentials for their test databases.

2) Go to the Authorities module

3) Click on the new "Z39.50 search" button

4) Select your authority search targets from the list.

5) Do a search for an authority you would like using either the "Raw"
input box or the more specific input boxes for names, subjects, subject
sub divisions, or titles. (I like searching Name (personal): Eric on
the LibrariesAustralia test DB.)

6) You should see a table listing the server, heading, authority type,
and two other columns (MARC and a nameless column). "Authority type"
is the type of authority it will become when imported in to Koha. In
the Eric example, "PERSO_NAME".

7) Click on "MARC" next to the results of interest to review the MARC
authority record.

8) When you're satisfied with a record, click on "Import".

9) The pop-up window will close and your original Koha window will
change to the "Adding authority Personal Name" screen (in the Eric
example).

10) All the relevant fields should be filled out for the record. Review
them and make any changes as necessary. (N.B. The 001 will be cleared
when saved, so if you have a use for the imported control number, move
it to the 010, 016, or 035 as appropriate. If you have a default value
for the 003, this will also likely be overwritten. Move it if necessary.
The 005 will also be updated when saved, so do not worry about that.)

11) When you're satisfied, click save.

12) Presto! You've imported your first authority record via Z39.50!

--

Here is the info for the LibrariesAustralia test Z39.50 authority
database:

Z39.50 server: LibrariesAustralia Authorities
Hostname: z3950-test.librariesaustralia.nla.gov.au
Port: 210
Database: AuthTraining
Userid: ANLEZ
Password: z39.50
Syntax: MARC21/USMARC
Encoding: utf8

-

The U.S.A. Library of Congress also provides Z39.50 access to its Name
and Subject Authorities (http://www.loc.gov/z3950/lcserver.html).

Name Authority:
Z39.50 server: Library of Congress Name Authority File
Hostname: lx2.loc.gov
Port: 210
Database: NAF
Syntax: MARC21/USMARC
Encoding: utf8

Subject Authority:
Z39.50 server: Library of Congress Subject Authority File
Hostname: lx2.loc.gov
Port: 210
Database: SAF
Syntax: MARC21/USMARC
Encoding: utf8

(N.B. Both of these databases also include title authorities.)

-

For testing purposes, you can also set up a Z39.50 client target,
which points at your own Koha instance's Z39.50 authority server.

To find the hostname, go to /etc/koha-conf.xml and find the value for
the <listen id="authorityserver"> element. Depending on your
configuration, this could be something like the following:

unix:/zebra/koha/var/run/zebradb/authoritysocket

(N.B. You might be using a different scheme than unix sockets...)

To find the database, scroll down to the bottom of koha-conf.xml until
you reach the <config> element. Within this, look for the value of the
element <authorityserver>. It should probably be "authorities".

To set up this Z39.50 client target in Koha...

Z39.50 server: my koha authorities
Hostname: unix:/zebra/koha/var/run/zebradb/authoritysocket
Port:
Database: authorities
Userid:
Password:
Syntax: MARC21/USMARC (or whichever flavour you need)
Encoding: utf8

Signed-off-by: Mason James <mtj@kohaaloha.com>

Bug 10096 [FOLLOW-UP] - Add a z39.50 interface for authority searching

This patch adds the "recordtype" column to the "z3950servers" table.

The value in this column (biblio or authority) then controls whether
the z3950 server shows up in a bibliographic search (through the
Acq and Cataloguing modules) or in an authority search (through
the Authorities module).

I also edited the z3950 management console to show this value
and allow users to edit it. The default value is "biblio", since
the vast majority of z3950 targets will be bibliographic. However,
there is an option to add/edit a z3950 target as a source of
authority records.

Test Plan:

1) Apply both patches
2) Run updatedatabase.pl (after setting your KOHA_CONF and PERL5
environment variables)
3) Use the test plan from the 1st patch

N.B. Make sure that your Z39.50 client target has a Record Type
of Authority, otherwise it won't display when you're doing a
Z3950 search for authorities.

Signed-off-by: Mason James <mtj@kohaaloha.com>

Bug 10096 [FOLLOW-UP] - fix tabs/whitespace errors to pass QA

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-10-04 14:26:29 +00:00
Galen Charlton
75842c7d62 Bug 10462: (follow-up) remove some undefined variable warning noise
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-07-24 16:55:52 +00:00
4bf04e4239 Bug 10462: QA Followup to resolve LCCN mixup and remove hardcoded marc tags
This patch corrects the mixup for LC call number and control number.

Further, as suggested by Galen, it would be better to not introduce hardcoded
tags in the Z3950Search subs in Breeding.pm.
This patch resolves that by calling TransformMarcToKohaOneField.
Note that this only involves changes to _addrowdata and _isbn_show. These
subs are only used in building the displayed results table.

Additionally, for French UNIMARC installs, publicationyear is used to fill
the Date column (copyrightdate is not used in those installs). The edition
statement is only used in unimarc_lecture_pub, not in unimarc_complet.

Test plan:
Do some Z3950 searches and look for values in all result columns.
For MARC21 on LOC (and/or others):
  Look for isbn 9780415964845 (check LCCN).
  Look for author Rowling.
For UNIMARC on BNF2 (and/or others):
  On BNF2 look for isbn 2070518426: result contains date and multiple isbn's.
  Look for title: Guide des candidats aux emplois de commissaire de police.
  The third result shows the edition statement (if you use 205$a with the pub install).
  Note that there are no results with LCCN here (just as before).

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested for MARC21 and UNIMARC (French lecture_pub install).

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-07-24 16:33:00 +00:00
7e83c7ea38 Bug 10462: Followup for showing multiple ISBNs in Z3950 response
As Jonathan correctly noted, the new Z3950 response only showed one isbn
although more isbn numbers could be in the record and would be imported.
To resolve this display problem, I now traverse them all in the updated
routine _isbn_show. There is no change in the imported records.
Note that before this patch TransformMarcToKoha did put all isbn numbers in
one field, separated by pipes (for display only). This behavior is restored
now. The three regexes on the individual isbn numbers now seem to be
overkill, but I left them there for completeness.

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested this on a fresh French install under UNIMARC with BNF server.
Tested it too for MARC21.

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-07-24 16:32:47 +00:00
52dad05b45 Bug 10462: Some optimizations in Z3950 search paving the way for enhancements
Refactors Z3950Search.
Disable batch record counts for z3950 records.

Test plan:
Do various Z3950 searches on multiple targets from Cataloging and Acquisition.
Behavior should not have changed.

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
2013-07-24 16:32:29 +00:00
aff9d00b71 Bug 9986: Two fixes for Z3950 search
Searching for stdid: Standard ID and srchany: RAW (any) somehow did not work
anymore.
Probably my fault :) Note that these two fields are in the Cataloging Z3950
search and not in Acquisition.

Fixing encoding problems: when adding the -utf8 flag for CGI in acqui/z3950 and
cataloging/z3950, the decoding statements in C4/Breeding's Z3950Search should be
removed.

Test plan:
Search in Cataloging with:
Standard ID: 9782358670043 on LOC
RAW (any): musee [add an accent aigu on first e]  on LOC  -- Add diacritic!!!

Search in Acquisition
Somewhere, does not matter, but use a diacritic.

A note: My git version still has a hard time with utf8. Need to upgrade to version 1.7.10 to resolve this..

Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>

Comment: Works as described. No errors.
Without the patch a z39.50 search for e.g. Std ID or musee gives no results;
with the patch there are.
No problems in acq search.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Good catch, passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2013-04-15 08:44:11 -04:00
d85fa9c5fb 9105: Followup for closing Zoom connections
Housekeeping: close the result sets and connections from Z3950 searches.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
No regression found, all tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2013-02-12 08:49:58 -05:00
3f73f9228b Bug 9105: Second housekeeping followup
Removing some unused variables.
Restoring timeout parameter that was only used in cataloging.
Restoring copyrightdate and editionstatement in row data for template.
Small adjustment at the end of the while loop with template vars.

Discovered while doing so that the paging feature needs some further corrections; I will propose a patch under a separate report.

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-12-22 16:16:59 -05:00
e201d55c21 Bug 9105: Housekeeping followup
Remove some debug warnings, fix indentation

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-12-22 16:16:59 -05:00
e9766f6094 Bug 9105: Move Z3950 search code to Breeding module
As a first step in realizing the goals of report 6536 (Z3950 Search improvements), this patch moves identical code in acquisition and cataloging to module level.
A followup deals with formatting.
Note that this patch should not change any behavior.

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
I did not find any regression

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-12-22 16:16:59 -05:00
Frédérick Capovilla
37340e3718 Normalize records imported from Z39.50 servers.
Some Z39.50 servers may use the MARC-8 encoding, which uses separated
diacritics. By forcing a normalization, all imported records will have
combined diacritics.

Records with separated diacritics might not show up in Zebra searches if
the search terms use accented characters.
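
A hedged sketch of the kind of normalization described (not necessarily the exact patch code):

    use MARC::File::XML;
    use Unicode::Normalize qw( NFC );

    # NFC turns separated (combining) diacritics from MARC-8 sources into
    # precomposed characters before the record is stored.
    my $normalized_xml = NFC( $record->as_xml() );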

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>

http://bugs.koha-community.org/show_bug.cgi?id=8610
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
checked it still works after the patch with UNIMARC and BNF server (that
provide utf-8 records)
2012-10-08 18:46:56 +02:00
Mark Tompsett
514b32898e Bug 8350: warning in logs when searching for nonexistent ISBN
Searching for a 10 or 13 digit numeric string that does not exist in
one's catalog will fail to modify the SQL statement correctly
in C4::Breeding. Moved the string substitution, which was triggering an error
when the search was undefined, and fixed the if statements accordingly.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-02 16:58:41 +02:00
Chris Cormack
509d673f10 Bug 7941 : Fix version numbers in modules
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-11 17:29:38 +02:00
Srdjan
12ff7355bb bug_7613: OCLC Connexion gateway
svc/import_bib:
* takes POST request with parameters in url and MARC XML as DATA
* pushes MARC XML to an import batch queue of type 'webservice'
* returns status and imported record XML
* is a drop-in replacement for svc/new_bib

misc/cronjobs/import_webservice_batch.pl:
* a cron job for processing import batch queues of type 'webservice'
* batches can also be processed through the UI

misc/bin/connexion_import_daemon.pl:
* a daemon that listens for OCLC Connexion requests and is compliant
  with OCLC Gateway spec
* takes request with MARC XML
* takes import batch params from a config file and forwards the lot to
  svc/import_bib
* returns status

ImportBatches:
* Added new import batch type of 'webservice'
* Changed interface to AddImportBatch() - now it takes a hashref (see the
  sketch after this list)
* Replaced batch_type = 'batch' with
  batch_type IN ( 'batch', 'webservice' ) in some SELECTs
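
A hedged sketch of the new hashref interface (key set drawn from the import_batches columns; exact keys and values are illustrative):

    my $batch_id = AddImportBatch({
        overlay_action => 'create_new',
        import_status  => 'staging',
        batch_type     => 'webservice',
        file_name      => $filename,
        comments       => $comments,
    });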

Signed-off-by: MJ Ray <mjr@phonecoop.coop>
2012-04-06 17:26:20 +02:00