This patch makes the cataloging reservoir search results a configurable
DataTable. The empty edition and date columns are removed, and an import
data column is added.
To test, apply the patch and go to Cataloging.
- Perform a cataloging search which will return results from the
reservoir.
- The table of reservoir search results should be a DataTable with
paging, navigation, filtering, column configuration, etc.
- Confirm that all DataTable controls work correctly.
- Go to Administration -> Table settings -> Cataloging -> addbooks.
- Try modifying the default configuration and confirm that the
settings take effect.
Signed-off-by: Barbara Johnson <barbara.johnson@bedfordtx.gov>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
The code in the script and the module attempts to determine whether a term is an ISBN or not. Rather
than try to do this, we can simply search the term on three fields: isbn, title, author.
Additionally, we should search as any of the ISBN variations to broaden our matches.
Note: Currently only an ISBN 10 is stored in import biblios, so for an ISBN 13 that doesn't convert,
the value will be blank - this is another bug.
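To make "any of the ISBN variations" concrete, here is a minimal, hypothetical sketch (not the patch code itself) that builds the list of equivalent forms with Business::ISBN, which Koha already depends on:

#!/usr/bin/perl
use strict;
use warnings;
use Business::ISBN;

# Hypothetical helper: return the plain ISBN 10 and ISBN 13 forms of a term,
# or nothing when the term is not a valid ISBN at all.
sub isbn_variations {
    my ($term) = @_;
    my $isbn = Business::ISBN->new($term);
    return () unless $isbn && $isbn->is_valid;
    my @variations;
    for my $form ( $isbn->as_isbn10, $isbn->as_isbn13 ) {
        push @variations, $form->as_string([]) if $form;    # [] = no hyphens
    }
    return @variations;
}

# A 979-prefixed ISBN 13 has no ISBN 10 counterpart, so only one form comes back.
print join( ', ', isbn_variations('9798200834976') ), "\n";
print join( ', ', isbn_variations('978-0-691-14289-0') ), "\n";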
To test:
1 - Perform a cataloging search for a valid ISBN 13 with no ISBN10 counterpart:
9798200834976
2 - 500 error
3 - Apply patch
4 - Repeat, no results
5 - Import some records
6 - Search by title/author/isbn
7 - Confirm searching works as expected
WNC amended to fix spelling
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
AMENDED: Useless call of ISBNs (plural) when you only pass one parameter.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
On bug 17591 we discovered that there was something weird going on with
the way we export and use subroutines/modules.
This patch tries to standardize our EXPORT to use EXPORT_OK only.
That way we will need to explicitly define the subroutines we want to
use from a module.
This patch is a squashed version of:
Bug 17600: After export.pl
Bug 17600: After perlimport
Bug 17600: Manual changes
Bug 17600: Other manual changes after second perlimports run
Bug 17600: Fix tests
And a lot of other manual changes.
export.pl is a dirty script that can be found on bug 17600.
"perlimport" is:
git clone https://github.com/oalders/App-perlimports.git
cd App-perlimports/
cpanm --installdeps .
export PERL5LIB="$PERL5LIB:/kohadevbox/koha/App-perlimports/lib"
find . \( -name "*.pl" -o -name "*.pm" \) -exec perl App-perlimports/script/perlimports --inplace-edit --no-preserve-unused --filename {} \;
The ideas of this patch are to:
* use EXPORT_OK instead of EXPORT
* perltidy the EXPORT_OK list
* remove '&' before the subroutine names
* remove some unneeded use statements
* explicitly import the subroutines we need within the controllers or
modules
Note that the private subroutines (starting with _) should not be
exported (and not used from outside of the module except from tests).
EXPORT vs EXPORT_OK (from
https://www.thegeekstuff.com/2010/06/perl-exporter-examples/)
"""
Export allows exporting the functions and variables of modules to the user's namespace using the standard import method. This way, we don't need to create objects for the modules to access their members.
@EXPORT and @EXPORT_OK are the two main variables used during export operation.
@EXPORT contains list of symbols (subroutines and variables) of the module to be exported into the caller namespace.
@EXPORT_OK does export of symbols on demand basis.
"""
If this patch caused a conflict with a patch you wrote prior to its
push:
* Make sure you are not reintroducing a "use" statement that has been
removed
* "$subroutine" is not exported by the C4::$MODULE module
means that you need to add the subroutine to the @EXPORT_OK list
* Bareword "$subroutine" not allowed while "strict subs"
means that you didn't import the subroutine from the module:
- use $MODULE qw( $subroutine list );
You can also use the fully qualified namespace: C4::$MODULE::$subroutine
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
When searching for authorities, if an authority server's reply contains invalid records,
none are displayed.
At least the French BNF SRU server doesn't fully follow the norm and can return an error,
confusing Koha's protocol handler, which then returns an empty MARC record.
This patch silently removes the bogus records.
To Test:
1- Add BNF SRU server
2- Go to authorities page
3- Add an authority
4- Search for keyword(any) droits de l'homme
5- No result (Internal Server Error)
6- Apply patch
7- restart starman
8- redo 4
9- Many records are displayed
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
When importing records from an SRU server, the diacritics have bad encoding.
I reproduced this with the BNF server, so it may be a UNIMARC issue.
Tests show that the difference between the Z39.50 server and SRU is that the leader contains 'a' at position 9.
Looking at MARC::Record->encoding() shows that the encoding depends on the leader, even for UNIMARC.
So this patch adds a call to MARC::Record->encoding('UTF-8') in the case of an SRU server in C4::Breeding.
Same use exists in Koha::MetadataRecord::Authority::get_from_breeding().
In case of import via Z3950, MarcToUTF8Record() is called,
which calls SetMarcUnicodeFlag(),
which calls MARC::Record->encoding('UTF-8')
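A minimal, self-contained sketch of the call this patch adds (the real code operates on the record built from the SRU response in C4::Breeding, this just shows the effect on the leader):

#!/usr/bin/perl
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

my $record = MARC::Record->new();
$record->append_fields( MARC::Field->new( '245', '0', '0', a => 'Some title' ) );

print "leader/09 before: '", substr( $record->leader, 9, 1 ), "'\n";

# This is the call added for SRU targets; it marks the record as UTF-8 by
# putting 'a' in leader position 09, which SetMarcUnicodeFlag() already does
# (indirectly) for Z39.50 targets.
$record->encoding('UTF-8');

print "leader/09 after:  '", substr( $record->leader, 9, 1 ), "'\n";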
Test plan :
1) Use a UNIMARC database
2) Configure a connection to a UNIMARC SRU server, for example the BNF,
see https://doc.biblibre.com/koha/autour_de_koha/serveurs_z3950_sru#serveur_de_la_bnf
3) Go to cataloguing module
4) Click on 'New from Z39.50/SRU'
5) Choose only the SRU target
6) Search for ISBN 2266072889
7) Confirm you see good encoding : diacritic on 'a' of title 'Strate-a-gemmes'
8) Click on 'Marc preview'
9) Confirm you see good encoding
10) Click import
11) Confirm you see good encoding
12) Check also Authorities import via SRU
13) Check also SRU imports on a MARC21 database
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: Removed change to new_from_xml call. We should respect syntax.
But the added MARC::Record encoding call does the trick! It is implicit
for Z3950 targets, where MarcToUTF8Record does the same.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Should be XSLT::Base now.
Removes old XSLT_Handler stub too (from bug 23290).
Result of a git grep | sed statement.
Test plan:
Run qa tools (so modules compile).
Run t/db_dependent/Breeding.t
Run t/db_dependent/Koha/XSLT/Base.t (This test fails when only this patch
has been applied; see subsequent patch.)
Enable XSLT use on results and details display. Check search results and
detail view on OPAC and staff.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Adds the "Attributes" field to z3950 servers.
The feature here is not quite de same.
In the old patches, the attributes were applied to individual query parts if the part already contains "@attr" and the additionnal attribute is not already in the query part.
Here, the content of the new field is prepended to all PQF queries sent to the server.
This new way of doing is simpler and works for the sponsor.
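A sketch of that behaviour with illustrative values (the variable names are not the real ones):

#!/usr/bin/perl
use strict;
use warnings;

# Illustrative values: the content of the new "Attributes" field for the
# server, and a PQF query built from the search form.
my $attributes = '@attr 4=1';
my $query      = '@attr 1=1016 egypt';

# The whole field is simply prepended to every PQF query sent to this server.
$query = "$attributes $query" if $attributes;
print "$query\n";    # @attr 4=1 @attr 1=1016 egypt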
Test plan:
I) Apply the patch
II) Run updatedatabase.pl
1) Add a new z3950 server with the following parameters:
Hostname : catalogue.banq.qc.ca
Port : 210
Database : IRIS
Syntax : Marc21
2) Perform a z3950 search on that server.
Keyword (Any) : egypt
2.1) Nothing Found.
3) Add attributes on the server administration page
@attr 4=1
4) Perform the same z3950 search
4.1) A lot of results
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
1) Apply the patch
2) Have a Z39.50 endpoint with attr 31 defined - Library of Congress
supports this
3) Try to find some biblio records through Z39.50 using the new field
"Publication year"
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
The call to GetAuthorizedHeading is already done just before calling ImportBreedingAuth.
The call to GuessAuthTypeCode is not used.
Adding a transaction to the test (check your database, kidclamp ;)
Test plan:
Add new authority via Z3950 in the interface.
Run t/db_dependent/Breeding.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
git grep ImportBreedingAuth - there is only one call to this routine
from SearchZ3950Auth
We pass it a MARC record, '2' for overwrite_auth
We then check for this record in the DB and get the breeding id,
however, when overwrite_auth is 2 we always add the auth to the batch
and return the new breeding id.
We don't actually use any of the other parameters returned here either
To recreate:
1 - Browse to Authorities
2 - Select New from Z39.50
3 - Perform a search that returns results
4 - SELECT COUNT(*) FROM import_auths
5 - Repeat the search
6 - SELECT COUNT(*) FROM import_auths
7 - There are 20 more records
8 - SELECT * FROM import_auths - note the repeated rows
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
Try to search, preview and import an authority from Z39.50; everything
should work as expected
Signed-off-by: Hayley Mapley <hayleymapley@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch makes it possible to add an extra column to Z3950 search results.
The system preference AdditionalFieldsInZ3950ResultSearch decides which MARC field/subfields are displayed in the column.
Testing:
I Apply the patch
II Run updatedatabase.pl
ACQUISITIONS
0) Enter a field/subfield in the AdditionalFieldsInZ3950ResultSearch
1) Create a new basket or use an existing one
2) In -Add order to basket-, click "From an external source"
3) Select some search targets and enter a subject heading ex. house
4) Click the Search button
5) Validate "Additional fields" column with the field/subfield value.
CATALOGUING
0) Shares same syspref as above
1) Go to cataloguing, click New from z3950
2) Fill in the form so that the search succeeds
3) Validate the "Additional fields" column
prove t/db_dependent/Breeding.t
Sponsored-by: CCSR (https://ccsr.qc.ca)
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test plan:
1) Apply the patch
2) prove t/db_dependent/Breeding.t
3) Try to search using Z39.50, both, authority and biblio should still
work
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
1) Apply the patch
2) Go to administration and set up a z39.50 authority server which supports
searching by control number (use attribute 12); you can use the Czech
national library server:
host: aleph.nkp.cz
port: 9991
base: aut-utf
format: MARC21
encoding: UTF-8
3) Try to find an authority by control number using z39.50 - if you use the server
recommended in point 2), there is web access to the base at
http://aleph.nkp.cz/eng/aut
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Fixed a typo in a code comment and a whitespace issue in the template.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
See Bugzilla comment 7. This change does not belong here and is
dubious on its own. Needs further attention on another report.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: In consultation with the author the same change is applied to the
corresponding lines in Z3950SearchAuth.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test plan:
- Apply the patch
- Add an SRU authority server in administration -> Z39.50/SRU servers
You can try with the French national library, configured as such:
Hostname: catalogue.bnf.fr
Port: 80
Database: api/SRU
Syntax: Unimarc
Record type: authority
Additional SRU options: version=1.2,sru=get
SRU Search fields mapping example:
Keyword (any): aut.anywhere
Name (any): aut.anywhere
Author (any): (aut.type any "pep org") and aut.accesspoint
Author (personal): aut.type=pep and aut.accesspoint
Author (corporate): aut.type=org and aut.accesspoint
Author (meeting/conference): aut.type=org and aut.accesspoint
Subject heading: (aut.type any "geo ram_nc ram_ge ram_pe ram_co") and aut.accesspoint
Subject sub-division: aut.type=ram_pe and aut.accesspoint
Title (any): (aut.type any "tic tut tum ram_tp ram_tu") and aut.accesspoint
Title (uniform):(aut.type any "tut tum ram_tu") and aut.accesspoint
- Try a search from Authorities -> New from Z39.50/SRU
- Check that the authority is correctly displayed in "Show Marc"
- Check that the authority is correctly added to Koha in "Import"
- prove t/db_dependent/Breeding.t
Signed-off-by: François Pichenot <fpichenot@ville-roubaix.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
In Breeding.pm we let Z3950Search return the xslt handler error codes back
to the template. They are converted to text messages by using a new include
file (added for OPAC and intranet now). The generic xslt_err code is now
obsolete.
In Record.pm the errstr call is removed. The croak is done with the new
error code in err. This seems sufficient.
Test plan:
[1] Run Breeding.t
[2] Run Record.t
[3] Add a nonexistent xslt file to one of your Z3950 targets. Search on that
target and check that you see an error 'XSLT file not found'.
The bonus is that these error messages are now translatable, as they are in
the templates.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
After feedback from the dev mailing list, it seems appropriate here to
propose making the Default framework authoritative for Koha to MARC
mappings. This implies checking only the Default framework in the
routines:
[1] GetMarcFromKohaField: The parameter frameworkcode is removed. A
follow-up report (19097) will update the calls not adjusted here.
This is safe since the parameter is silently ignored.
[2] GetMarcSubfieldStructureFromKohaField: Framework parameter is removed
and calls are adjusted. Includes acquisitions_stats.pl.
[3] TransformKohaToMarc: The parameter is removed; all calls are verified
or adjusted.
[4] TransformMarcToKoha: The parameter is no longer used and will be
removed in a follow-up report (19097). It always goes to Default now.
[5] TransformMarcToKohaOneField: The parameter is removed and all calls
are adjusted. Including: Breeding, XISBN and MetadataRecord modules.
[6] C4::Koha::IsKohaFieldLinked: This routine was called only once (in
C4::Items::_build_default_values_for_mod_marc). It can be replaced by
calling GetMarcFromKohaField. If there is no kohafield linked, undef
is returned. (Corresponding unit test is removed here.)
[7] C4::Items::ModItemFromMarc: The helper routine
_build_default_values_for_mod_marc no longer has a framework
parameter. The cache key default_value_for_mod_marc- is no longer
combined with a frameworkcode. Three admin scripts are adjusted
accordingly; some tests will be corrected in the next patch.
Test plan:
See next patch. That patch adjusts all tests involved.
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
In order to allow multiple Koha to MARC mappings (for one kohafield), we
need to adjust a few key routines in C4/Biblio.pm. This results in a few
changes in dependent modules too.
Note: Multiple mappings also include 'alternating' mappings. Such as the
case of MARC21 260 and 264: only one of both fields will be used. Sub
TransformMarcToKoha will handle that just fine; the opposite transformation
is harder, since we no longer know which field was the source. In that
case TransformKohaToMarc will fill both fields. We only use that operation
in Koha for items (in Acquisition and Cataloging).
Sub GetMarcSubfieldStructure
This sub used a selectall_hashref, which is fine as long as we have only
one mapping for each kohafield. But as DBI states it: If a row has the same
key as an earlier row then it replaces the earlier row. In other words,
we lose the first mapping if we have two.
This patch uses selectall_arrayref with Slice and rearranges the output so
that the returned hash returns an arrayref of hashrefs for each kohafield.
To improve consistency, we also add an ORDER BY clause to the SQL
statement.
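A rough, self-contained sketch of that difference (the real query, column list and caching in C4::Biblio differ):

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Illustration with an in-memory table; the real code queries
# marc_subfield_structure in the Koha database and caches the result.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '', { RaiseError => 1 } );
$dbh->do('CREATE TABLE marc_subfield_structure (frameworkcode TEXT, tagfield TEXT, tagsubfield TEXT, kohafield TEXT)');
$dbh->do(q{INSERT INTO marc_subfield_structure VALUES
    ('', '260', 'c', 'biblio.copyrightdate'),
    ('', '264', 'c', 'biblio.copyrightdate')});

# selectall_hashref keyed on kohafield would keep only the last of these two
# rows; selectall_arrayref with Slice keeps both, and we group them ourselves.
my $rows = $dbh->selectall_arrayref(
    q{SELECT tagfield, tagsubfield, kohafield
        FROM marc_subfield_structure
       WHERE frameworkcode = ? AND kohafield <> ''
    ORDER BY tagfield, tagsubfield},
    { Slice => {} },
    '',
);

my %structure;
push @{ $structure{ $_->{kohafield} } }, $_ for @$rows;

# biblio.copyrightdate now maps to an arrayref holding two hashrefs (260$c and 264$c).
printf "%s has %d mapping(s)\n", $_, scalar @{ $structure{$_} } for sort keys %structure;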
Sub GetMarcFromKohaField
This sub just returned one tag and subfield, but in case of multiple
mappings we should return them all now.
Note: Many calls still expect just one result and will work just fine:
my ($tag, $sub) = GetMarcFromKohaField(...)
A possible second mapping would be silently ignored. Often the sub is
called for biblionumber or itemnumber. I would not recommend the use of
multiple mappings for such fields btw.
In case the sub is called in scalar context, it will return only the first
tag (instead of the number of tags and subfields).
Sub GetMarcSubfieldStructureFromKohaField
This sub previously returned the hash for one kohafield.
In scalar context it will behave like before: it returns the first hashref
in the arrayref that comes from GetMarcSubfieldStructure.
In list context, it returns an array of all hashrefs (incl. multiple
mappings).
The sub is not used in C4::Ris. Removed the use statement.
Sub TransformKohaToMarc
This sub got a second parameter: frameworkcode.
Historically, Koha more or less assumes kohafields to be defined across all
frameworks (see Koha to MARC mappings). Therefore it falls back to Default
when it is not passed.
When going through all mappings while building a MARC record, it also supports
multiple mappings. Note that Koha uses this routine in Acquisition and in
Cataloging for items. Normally, however, the MARC record is leading and the
Koha fields are derivatives used for optimization and reporting.
The added third parameter allows for passing a new option no_split => 1.
We use this option in C4::Items::Item2Marc; if two item fields are mapped to
one kohafield but would have different values (which would be very unusual),
these values are glued together. When transforming to MARC again, we do not
want to duplicate the item subfields, but we keep the glued value in both
subfields. This operation only affects items, since we are not doing this
reverse operation for biblio and biblioitem fields.
Sub _get_inverted_marc_field_map
This sub is a helper routine of TransformMarcToKoha, the opposite
transformation. When saving a MARC record, all kohafields are extracted
including multiple mappings.
Suppose that you had both 260c and 264c in your record (which you won't),
then both values get saved initially into copyrightdate like A | B. The
additional code for copyrightdate will extract the first year from this
string.
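For the copyrightdate case described above, the extraction amounts to something like this (illustrative, not the literal C4::Biblio code):

#!/usr/bin/perl
use strict;
use warnings;

# Two mapped fields (e.g. 260$c and 264$c) were glued together on save;
# keep the first four-digit year found in the glued string.
my $copyrightdate = 'c2011. | 2011';
my ($year) = $copyrightdate =~ /(\d{4})/;
print "$year\n";    # 2011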
A small fix in TransformMarcToKoha ensures that it only saves a value in a
kohafield if it is defined and not empty. (Same for concatenation.)
Sub TransformMarcToKohaOneField
This sub now just calls TransformMarcToKoha and extracts the requested
field. Note that since we are caching the structure, this does not result
in additional database access and is therefore performance-wise
insignificant. We simplify code and maintenance.
Instead of modifying the passed hashref, it simply returns a value. A call
in C4::Breeding is adjusted accordingly. The routine getKohaField in
Koha::MetadataRecord is redirected to TransformMarcToKohaOneField.
NOTE: The fourth patch restructures/optimizes TransformMarcToKoha[OneField].
Sub get_koha_field_from_marc
This sub can be removed. A call is replaced by TransformMarcToKohaOneField
in C4::XISBN.
Note: The commented lines for sub ModZebrafiles are removed (directly under
TransformMarcToKohaOneField).
Test plan:
For unit tests and interface tests, please see follow-ups.
Run qa tools in order to verify that the modules still compile well.
Read the code changes and verify that they make sense.
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
perl -p -i -e 's/^.*set the version for version checking.*\n//' **/*.pm
+ manual adjustments
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
Mainly a
perl -p -i -e 's/^.*3.07.00.049.*\n//' **/*.pm
Then some adjustments
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
perl -p -i -e 's/^(use vars .*)\$VERSION\s?(.*)/$1$2/' **/*.pm
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
When doing an Auth search through z3950, the resulting table always has the first column (server name) empty.
TEST
1) Once logged into the intranet, go to Authorities.
2) Click New from z39.50, fill in appropriately for a successful search.
3) Acknowledge the first column is empty. Always.
4) Apply the (very simple) patch.
5) Do another search, validate that the column is not empty anymore.
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
http://bugs.koha-community.org/show_bug.cgi?id=9987
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This button lets you replace existing authorities using a Z39.50 search.
http://bugs.koha-community.org/show_bug.cgi?id=11961
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
All tests pass
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
I would prefer not to hide this "stuff".
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
When using a z3950 connection with UNIMARC authorities, you get an error:
Unsupported UNIMARC character encoding [ ] for XML output for UNIMARCAUTH; 100$a -> 20141119
I've seen that Bug 2060, which added authorities import, adds a special behavior for UNIMARC: the MARC flavour must be UNIMARCAUTH instead of just UNIMARC.
This patch adds the same behavior when using a z3950 connection and import.
Test plan :
- Use a UNIMARC install
- Define a z3950 connection for UNIMARC authorities
- Go to Authorities module
- Click on "New from Z39.50"
- Perform a search
=> Without patch : you get the error
=> With patch : you get results
- Import one result
=> You get the authority creation form with all data
You may check the same plan with a MARC21 install
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
NOTE: depending on the target, the syntax in the configuration
might not be UNIMARC, but MARC21/USMARC instead!
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This fixes a regression introduced by Bug 6536 so that multiple
words in a title search will work.
Test scenario:
1. try z39.50 search with more than one word
2. verify that no results appear
3. apply patch and re-run search
4. verify that there are results
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The last test of the first series fails randomly on Perl 5.18:
not ok 12 - Third query makes no difference
# Failed test 'Third query makes no difference'
# at t/db_dependent/Breeding.t line 104.
# got: ''
# expected: '1'
# Looks like you failed 1 test of 12.
not ok 1 - _build_query
This change makes tests pass. Please consider if this needs to be fixed
(i.e. sort order matters) or the test needs to be rewritten.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
I agree with adding the sort. (The need for doing this in Perl 5.18 is another
topic.)
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Resolved:
[1] FAIL C4/Breeding.pm
FAIL critic ControlStructures::ProhibitMutatingListFunctions
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
No warnings anymore.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch only removes surrounding spaces around the comma and equals sign while
passing the options in sru_fields to the ZOOM object.
Test plan:
If you add spaces between options in sru_fields, searching should still work.
E.g. sru_fields= sru = get , sru_version = 1.1
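A minimal sketch of the whitespace-tolerant parsing this makes possible (the option string is the one from the test plan; the real code does the equivalent while feeding the options to the ZOOM object):

#!/usr/bin/perl
use strict;
use warnings;

my $sru_fields = ' sru = get , sru_version = 1.1 ';

# Split on commas, then on the equals sign, trimming surrounding spaces
# so " sru = get " becomes sru => 'get'.
my %options;
for my $pair ( split /\s*,\s*/, $sru_fields ) {
    $pair =~ s/^\s+|\s+$//g;
    my ( $key, $value ) = split /\s*=\s*/, $pair, 2;
    $options{$key} = $value if defined $key && defined $value && length $key;
}

print "$_ => $options{$_}\n" for sort keys %options;
# sru => get
# sru_version => 1.1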
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Use the stylesheets listed in field add_xslt of z3950servers to transform
search results of Z3950/SRU search.
Additionally, the template has been changed to make more error messages (or
warnings) visible when displaying results. Until now, error messages were
shown in the results table, and when connection errors occurred, no results
were displayed at all.
Test plan:
Create some stylesheets (or see the sample patch on bug 6536).
Add these stylesheets to some Z3950/SRU servers.
Do Z3950 search and verify the transformations.
Do a search with 2 targets; make one target fail (by manipulating its server
data). Do you see the connection error and the results for the other target?
Generate an XSLT error by modifying one stylesheet. Check search results. You
should see warnings.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch makes it possible to include SRU servers in Z3950 search.
It adjusts the Z3950Search routine in Breeding module.
It also replaces SQL code with DBIx statements in Breeding.pm/Z3950Search
and the associated scripts z3950search.pl in cataloguing and acqui.
Test plan:
Verify if a normal Z3950 search still works in cataloging/acqui.
Add a SRU target. (You could just use Koha's port 9998.)
Define sru_options like sru=get.
Use that target in a Z3950 search in cataloging and acqui. (Import.)
Test sru_fields translation by comparing search results between various
settings for some of the fields. For instance, leave title empty and
after that set it to the title field of your SRU target.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Replaces name by servername, type by servertype for running Z3950 search.
Limit search scripts to zed (z3950) servers until sru is supported.
Test plan:
Perform a Z3950 search in Cataloguing and Acquisition.
Verify that it still works as it did.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch removes the ImportBreeding() routine, which lost its
last caller as of the patch for bug 10462.
To test:
[1] Verify that prove -v t/Breeding.t passes.
[2] Perform a Z39.50 search in the staff interface.
[3] Perform a cataloguing reservoir search in the staff
interface, verifying that cached records from the search
done in step 2 are retrieved.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When importing records, the ISBN is normalized and stored
in the database (see C4::ImportBatch::_add_biblio_fields). But when
searching the reservoir by ISBN, the search term is not normalized
(see C4::Breeding::BreedingSearch), so the search does not match.
This patch adds the normalisation to the reservoir search. It also
replaces the call to the private method _isbn_cleanup with GetNormalizedISBN,
the correct public method, and allows the reservoir search
on ISBNs with hyphens.
This is intended to fix only reservoir searches.
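A hedged illustration of why normalization makes the hyphenated and ISBN 13 forms match (using Business::ISBN directly rather than the GetNormalizedISBN call the patch actually uses):

#!/usr/bin/perl
use strict;
use warnings;
use Business::ISBN;

# Import stores the ISBN 10, no-hyphen form, so the search term has to be
# reduced to that same form before it can match.
sub normalize_isbn {
    my ($term) = @_;
    my $isbn = Business::ISBN->new($term) or return $term;
    my $isbn10 = $isbn->as_isbn10 or return $term;
    return $isbn10->as_string([]);    # [] = no hyphens
}

print normalize_isbn('978-0-691-14289-0'), "\n";    # 0691142890
print normalize_isbn('0-691-14289-0'),     "\n";    # 0691142890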
Revised Test plan
-----------------
1) Back up DB
2) Save copy of attached example somewhere findable
3) Home -> Tools -> Stage MARC records for import
4) Click Browse and select the example MARC file
5) Click Upload file
6) Tweak as desired then click Stage for import
7) Click Manage staged records
8) Click Import this batch into the catalog
9) Home -> Cataloging
10) In the Cataloging search text box type 978-0-691-14289-0 and
click Submit
-- ISBN13 with hyphens not found in reservoir
11) In the Cataloging search text box type 9780691142890 and
click Submit
-- ISBN13 without hyphens not found in reservoir
12) In the Cataloging search text box type 0-691-14289-0 and
click Submit
-- ISBN10 with hyphens not found in reservoir
13) In the Cataloging search text box type 0691142890 and
click Submit
-- ISBN10 without hyphens found in reservoir
14) Apply patch
15) Repeat steps 10-13; this time the record is always found! :)
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When a z39.50 server can't be searched successfully, the yellow
error box comes up empty. This patch fixes the problem.
Test Plan:
1) Go to Administration/z39.50 servers
2) Create a fake z39.50 server with a made up address
3) Go to cataloging, search only that server
4) Note the empty yellow alert box
5) Apply this patch
6) Re-run the search; note the alert box now has a message in it
Signed-off-by: Nora Blake <nblake@masslibsystem.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works according to test plan.
When one of the selected servers gives results, no dialog
box is shown, either before or after applying the patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- improve POD
- remove extraneous comments
- correct license statement in new files
- remove backticks in database update SQL
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch introduces a new Z39.50 interface for searching Z39.50
compliant databases for MARC authority records.
These databases aren't as common as their bibliographic equivalents,
but they're out there and very useful. I have included info at the
bottom of this message for sample authority databases you can try.
To test this patch:
1) Set up Z39.50 client targets for authority databases. (I've included
information at the bottom of this message for LibrariesAustralia's
test server for authorities, as well as instructions on how to use
your own Koha's Z39.50 authority server. The Library of Congress
also has authority databases available (unsure if these are test or
prod), and you might have access to others through OCLC or RLIN. OCLC
provides login credentials for their test databases.)
2) Go to the Authorities module
3) Click on the new "Z39.50 search button"
4) Select your authority search targets from the list.
5) Do a search for an authority you would like using either the "Raw"
input box or the more specific input boxes for names, subjects, subject
sub divisions, or titles. (I like searching Name (personal): Eric on
the LibrariesAustralia test DB.)
6) You should see a table listing the server, heading, authority type,
and two other columns (MARC and a nameless column). "Authority type"
is the type of authority it will become when imported in to Koha. In
the Eric example, "PERSO_NAME".
7) Click on "MARC" next to the results of interest to review the MARC
authority record.
8) When you're satisfied with a record, click on "Import".
9) The pop-up window will close and your original Koha window will
change to the "Adding authority Personal Name" screen (in the Eric
example).
10) All the relevant fields should be filled out for the record. Review
them and make any changes as necessary. (N.B. The 001 will be cleared
when saved, so if you have a use for the imported control number, move
it to the 010, 016, or 035 as appropriate. If you have a default value
for the 003, this will also likely be overwritten. Move it if necessary.
The 005 will also be updated when saved, so do not worry about that.)
11) When you're satisfied, click save.
12) Presto! You've imported your first authority record via Z39.50!
--
Here is the info for the LibrariesAustralia test Z39.50 authority
database:
Z39.50 server: LibrariesAustralia Authorities
Hostname: z3950-test.librariesaustralia.nla.gov.au
Port: 210
Database: AuthTraining
Userid: ANLEZ
Password: z39.50
Syntax: MARC21/USMARC
Encoding: utf8
-
The U.S.A. Library of Congress also provides Z39.50 access to its Name
and Subject Authorities (http://www.loc.gov/z3950/lcserver.html).
Name Authority:
Z39.50 server: Library of Congress Name Authority File
Hostname: lx2.loc.gov
Port: 210
Database: NAF
Syntax: MARC21/USMARC
Encoding: utf8
Subject Authority:
Z39.50 server: Library of Congress Subject Authority File
Hostname: lx2.loc.gov
Port: 210
Database: SAF
Syntax: MARC21/USMARC
Encoding: utf8
(N.B. Both of these databases also include title authorities.)
-
For testing purposes, you can also set up a Z39.50 client target,
which points at your own Koha instance's Z39.50 authority server.
To find the hostname, go to /etc/koha-conf.xml and find the value for
the <listen id="authorityserver"> element. Depending on your
configuration, this could be something like the following:
unix:/zebra/koha/var/run/zebradb/authoritysocket
(N.B. You might be using a different scheme than unix sockets...)
To find the database, scroll down to the bottom of koha-conf.xml until
you reach the <config> element. Within this, look for the value of the
element <authorityserver>. It should probably be "authorities".
To set up this Z39.50 client target in Koha...
Z39.50 server: my koha authorities
Hostname: unix:/zebra/koha/var/run/zebradb/authoritysocket
Port:
Database: authorities
Userid:
Password:
Syntax: MARC21/USMARC (or whichever flavour you need)
Encoding: utf8
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - Add a z39.50 interface for authority searching
This patch adds the "recordtype" column to the "z3950servers" table.
The value in this column (biblio or authority) then controls whether
the z3950 server shows up in a bibliographic search (through the
Acq and Cataloguing modules) or in an authority search (through
the Authorities module).
I also edited the z3950 management console to show this value
and allow users to edit it. The default value is "biblio", since
the vast majority of z3950 targets will be bibliographic. However,
there is an option to add/edit a z3950 target as a source of
authority records.
Test Plan:
1) Apply both patches
2) Run updatedatabase.pl (after setting your KOHA_CONF and PERL5LIB
environment variables)
3) Use the test plan from the 1st patch
N.B. Make sure that your Z39.50 client target has a Record Type
of Authority, otherwise it won't display when you're doing a
Z3950 search for authorities.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - fix tabs/whitespace errors to pass QA
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch corrects the mixup for LC call number and control number.
Further, as suggested by Galen, it would be better to not introduce hardcoded
tags in the Z3950Search subs in Breeding.pm.
This patch resolves that by calling TransformMarcToKohaOneField.
Note that this only involves changes to _addrowdata and _isbn_show. These
subs are only used in building the displayed results table.
Additionally, for French UNIMARC installs publicationyear is used to fill
the Date column (copyrightdate is not used in those installs). The edition
statement is only used in unimarc_lecture_pub not in unimarc_complet.
Test plan:
Do some Z3950 searches and look for values in all result columns.
For MARC21 on LOC (and/or others):
Look for isbn 9780415964845 (check LCCN).
Look for author Rowling.
For UNIMARC on BNF2 (and/or others):
On BNF2 look for isbn 2070518426: the result contains a date and multiple ISBNs.
Look for title: Guide des candidats aux emplois de commissaire de police.
The third result shows the edition statement (if you use 205$a with a pub install).
Note that there are no results with LCCN here (just as before).
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested for MARC21 and UNIMARC (French lecture_pub install).
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
As Jonathan correctly noted, the new Z3950 response only showed one isbn
although more isbn numbers could be in the record and would be imported.
To resolve this display problem, I traverse them all now in the updated
routine _isbn_show. There is no change in the imported records.
Note that before this patch TransformMarcToKoha did put all isbn numbers in
one field, separated by pipes (for display only). This behavior is restored
now. The three regexes on the individual isbn numbers now seem to be
overkill, but I left them there for completeness.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested this on a fresh French install under UNIMARC with BNF server.
Tested it too for MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Refactors Z3950Search.
Disable batch record counts for z3950 records.
Test plan:
Do various Z3950 searches on multiple targets from Cataloging and Acquisition.
Behavior should not have changed.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Searching for stdid: Standard ID, srchany: RAW (any) somehow did not work
anymore.
Probably my fault :) Note that these two fields are in Cataloging Z3950 search
and not in Acquisition.
Fixing encoding problems: When adding -utf flag for CGI in acqui/z3950 and
cataloging/z3950, the decoding statements in C4/Breeding, Z3950Search should be
removed.
Test plan:
Search in Cataloging with:
Standard ID: 9782358670043 on LOC
RAW (any): musee [add an acute accent on the first e] on LOC -- Add diacritic!!!
Search in Acquisition
Somewhere, does not matter, but use a diacritic.
A note: My git version still has a hard time with utf8. Need to upgrade to version 1.7.10 to resolve this.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. No errors.
Without the patch, a z39.50 search for e.g. Std ID or musee gives no results;
with the patch there are.
No problems in acq search.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Good catch, passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Housekeeping: close the result sets and connections from Z3950 searches.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
No regression found, all tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Removing some unused variables.
Restoring timeout parameter that was only used in cataloging.
Restoring copyrightdate and editionstatement in row data for template.
Small adjustment at the end of the while loop with template vars.
Discovered while doing so that the paging feature needs some further corrections; will propose a patch under a separate report.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>