The serverhost array is not filled. It should be populated with values
from the servers array.
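A minimal sketch of the direction of the fix (names taken from the
description above; $servers is assumed to be the arrayref of queried
targets, the exact context is in the patch):

    # Take the host of each queried target from the servers array
    # instead of leaving @serverhost unfilled.
    my @serverhost = map { $_->{host} } @$servers;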
Test plan:
Nothing exciting here. Read the patch.
Note that we will test in the next patch if the hostname is saved
correctly in the import batch.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
(cherry picked from commit eb75971990)
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
[1] If you have access to a Z3950 MARC8 auth server, search
for an authority record and import it.
[2] If you have access to a Z3950 UTF8 auth server, search
for an authority record and import it.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
(cherry picked from commit 1233480ffa)
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
This patch simply checks if we have a value for copyrightdate and
displays publicationyear if not. Even if copyrightdate is requested (MARC21)
but isn't populated, the publicationyear won't replace it, because we
haven't transformed that field.
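A minimal sketch of the fallback logic (the actual change is presumably
in the display layer; $row here is an illustrative result hashref):

    # Show publicationyear when copyrightdate has no value.
    my $display_date = $row->{copyrightdate} || $row->{publicationyear};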
I think this reads a bit easier, but the RM can weigh in.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
After executing Z39.50 search, the result table is not populated
with publication dates. This is the result of code refactoring
made by Bug 30813. The removed function _add_rowdata treated
in special way the publication date putting it in $row in under
special, non MARC variant dependent key 'date'--since
the z3950_search.tt looks under breeding_loo.date.
Same effect (no data in the result table) with edition statement.
Reason: editionstatement coming from TransformMarcToKoha vs edition
expected by z3950_search.tt.
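A minimal sketch of the direction of the fix, assuming $row is the
hashref built from TransformMarcToKoha in C4::Breeding:

    # Restore the variant-independent keys that z3950_search.tt expects.
    $row->{date}    //= $row->{copyrightdate} // $row->{publicationyear};
    $row->{edition} //= $row->{editionstatement};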
Test plan:
==========
1. Have a Koha instance with some Z39.50 servers defined.
2. In Cataloging, perform a Z39.50 search for any term.
3. In the result table you will not see the publication dates or the
edition statement (even when present in the record).
4. Apply the patch.
5. Repeat the search.
6. You should see the publication dates (according to the current
mapping, i.e. for MARC 21 coming from 260 $c or 264 $c subfield)
and edition statements from the records found.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The C4::AuthoritiesMarc method GetAuthorizedHeading is not exported, so other modules call it with its fully qualified name:
> git grep GetAuthorizedHeading
C4/AuthoritiesMarc.pm:=head2 GetAuthorizedHeading
C4/AuthoritiesMarc.pm: $heading = &GetAuthorizedHeading({ record => $record, authid => $authid })
C4/AuthoritiesMarc.pm:sub GetAuthorizedHeading {
C4/Breeding.pm: $heading = C4::AuthoritiesMarc::GetAuthorizedHeading({ record => $marcrecord });
C4/ImportBatch.pm: $row->{'authorized_heading'} = C4::AuthoritiesMarc::GetAuthorizedHeading( { authid => $row->{'candidate_match_id'} } );
C4/ImportBatch.pm: my $authorized_heading = C4::AuthoritiesMarc::GetAuthorizedHeading({ record => $marc_record });
This patch adds it to the exports, for example for use in Koha plugins.
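With the export in place, callers can import the routine instead of
fully qualifying it; a minimal usage sketch ($record is assumed to be a
MARC::Record):

    use C4::AuthoritiesMarc qw( GetAuthorizedHeading );
    my $heading = GetAuthorizedHeading({ record => $record });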
Test plan:
Check import of authorities from a file is OK
Signed-off-by: Phil Ringnalda <phil@chetcolibrary.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
In C4::Breeding, the method ImportBreedingAuth() should be exported so it can be used, for example, by plugins.
No test plan needed, I bet.
Signed-off-by: Magnus Enger <magnus@libriotech.no>
Looks like a harmless and useful enhancement.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Although we used the namespace when calling, the intention of
bug 17600 was that we import the routines that we are
calling here.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch changes the call to use the fully qualified name,
adds imports to C4::Breeding and Koha::MetaSearcher, and removes an
import from Koha::MetadataRecord.
Additionally, it restores fields that went missing in the update to
using TransformMarcToKoha.
Lastly, it reduces the fields used when saving the breeding record to
the reservoir.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
To test:
1 - prove -v t/db_dependent/Biblio/TransformMarcToKoha.t t/db_dependent/Breeding.t
2 - Test Z3950 search in advanced catalog editor
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch makes the cataloging reservoir search results a configurable
DataTable. The empty edition and date columns are removed, and an import
data column is added.
To test, apply the patch and go to Cataloging.
- Perform a cataloging search which will return results from the
reservoir.
- The table of reservoir search results should be a DataTable with
paging, navigation, filtering, column configuration, etc.
- Confirm that all DataTable controls work correctly.
- Go to Administration -> Table settings -> Cataloging -> addbooks.
- Try modifying the default configuration and confirm that the
settings take effect.
Signed-off-by: Barbara Johnson <barbara.johnson@bedfordtx.gov>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
The code in the script and the module attempts to determine whether a term is an ISBN or not. Rather
than try to do this, we can simply search it on the three fields: isbn, title, author.
Additionally, we should search as any of the ISBN variations to broaden our matches.
Note: Currently only an ISBN-10 is stored in import biblios, so for an ISBN-13 that doesn't convert,
the value will be blank - this is another bug.
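A minimal sketch of the broadened match, assuming the query is
assembled in C4::Breeding::BreedingSearch and $term is the raw search
input:

    use C4::Koha qw( GetVariationsOfISBN );

    # Expand the term into its ISBN variations (ISBN-10/ISBN-13,
    # with and without hyphens) so any stored form matches.
    my @isbns = GetVariationsOfISBN($term);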
To test:
1 - Perform a cataloging search for a valid ISBN 13 with no ISBN10 counterpart:
9798200834976
2 - 500 error
3 - Apply patch
4 - Repeat, no results
5 - Import some records
6 - Search by title/author/isbn
7 - Confirm searching works as expected
WNC amended to fix spelling
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
AMENDED: Useless call of ISBNs (plural) when you only pass one parameter.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
On bug 17591 we discovered that there was something weird going on with
the way we export and use subroutines/modules.
This patch tries to standardize our EXPORT to use EXPORT_OK only.
That way we will need to explicitly define the subroutines we want to
use from a module.
This patch is a squashed version of:
Bug 17600: After export.pl
Bug 17600: After perlimport
Bug 17600: Manual changes
Bug 17600: Other manual changes after second perlimports run
Bug 17600: Fix tests
And a lot of other manual changes.
export.pl is a dirty script that can be found on bug 17600.
"perlimport" is:
git clone https://github.com/oalders/App-perlimports.git
cd App-perlimports/
cpanm --installdeps .
export PERL5LIB="$PERL5LIB:/kohadevbox/koha/App-perlimports/lib"
find . \( -name "*.pl" -o -name "*.pm" \) -exec perl App-perlimports/script/perlimports --inplace-edit --no-preserve-unused --filename {} \;
The ideas of this patch are to:
* use EXPORT_OK instead of EXPORT
* perltidy the EXPORT_OK list
* remove '&' before the subroutine names
* remove some unneeded use statements
* explicitly import the subroutines we need within the controllers or
modules
Note that the private subroutines (starting with _) should not be
exported (and not used from outside of the module except from tests).
EXPORT vs EXPORT_OK (from
https://www.thegeekstuff.com/2010/06/perl-exporter-examples/)
"""
Export allows to export the functions and variables of modules to user’s namespace using the standard import method. This way, we don’t need to create the objects for the modules to access it’s members.
@EXPORT and @EXPORT_OK are the two main variables used during export operation.
@EXPORT contains list of symbols (subroutines and variables) of the module to be exported into the caller namespace.
@EXPORT_OK does export of symbols on demand basis.
"""
If this patch caused a conflict with a patch you wrote prior to its
push:
* Make sure you are not reintroducing a "use" statement that has been
removed
* "$subroutine" is not exported by the C4::$MODULE module
means that you need to add the subroutine to the @EXPORT_OK list
* Bareword "$subroutine" not allowed while "strict subs"
means that you didn't import the subroutine from the module:
- use $MODULE qw( $subroutine list );
You can also use the fully qualified namespace: C4::$MODULE::$subroutine
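A minimal sketch of the pattern this patch standardizes on (module and
subroutine names here are hypothetical):

    package C4::Example;

    use Modern::Perl;
    use parent 'Exporter';

    # Nothing is exported by default; callers request what they use.
    our @EXPORT_OK = qw( do_something );

    sub do_something { return 'done' }

    1;

Callers then either import explicitly with
use C4::Example qw( do_something ); or skip the import and call
C4::Example::do_something() with the fully qualified name.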
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
When searching for authorities, if an authority server's reply contains
invalid records, none are displayed.
At least the French BNF SRU server doesn't fully follow the standard and
can return an error that confuses Koha's protocol handler, which then
returns an empty MARC record.
This patch silently removes the bogus records.
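A minimal sketch of the filtering step (the exact spot in C4::Breeding
is in the patch; @marc_records is an illustrative list of decoded
MARC::Record objects):

    # Skip records the decoder could not turn into usable MARC, so one
    # bogus record no longer empties the whole result list.
    @marc_records = grep { defined $_ && scalar $_->fields } @marc_records;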
To Test:
1- Add BNF SRU server
2- Go to authorities page
3- Add an authority
4- Search for keyword(any) droits de l'homme
5- No result (Internal Server Error)
6- Apply patch
7- restart starman
8- redo 4
9- Many records are displayed
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
When importing records from a SRU server, the diacritics are badly encoded.
I reproduce with the BNF server, so it may be a UNIMARC issue.
Tests show that the difference between a Z39.50 server and SRU is that the leader contains 'a' at position 9.
Looking at MARC::Record->encoding() shows that encoding depends on the leader, even for UNIMARC.
So this patch adds a call to MARC::Record->encoding('UTF-8') in case of a SRU server in C4::Breeding.
The same use exists in Koha::MetadataRecord::Authority::get_from_breeding().
In case of import via Z3950, MarcToUTF8Record() is called,
which calls SetMarcUnicodeFlag(),
which calls MARC::Record->encoding('UTF-8').
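A minimal sketch of the added call (variable names illustrative):

    # Leader position 9 drives MARC::Record's notion of the encoding,
    # so force UTF-8 for SRU responses, as MarcToUTF8Record already
    # does for Z39.50 targets via SetMarcUnicodeFlag.
    $marcrecord->encoding('UTF-8') if $servertype eq 'sru';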
Test plan:
1) Use a UNIMARC database
2) Configure a connexion to a UNIMARC SRU, for example BNF,
see https://doc.biblibre.com/koha/autour_de_koha/serveurs_z3950_sru#serveur_de_la_bnf
3) Go to cataloguing module
4) Click on 'New from Z39.50/SRU'
5) Choose only the SRU target
6) Search for ISBN 2266072889
7) Confirm you see good encoding: diacritic on 'a' of title 'Strate-a-gemmes'
8) Click on 'Marc preview'
9) Confirm you see good encoding
10) Click import
11) Confirm you see good encoding
12) Check also Authorities import via SRU
13) Check also SRU imports on a MARC21 database
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: Removed change to new_from_xml call. We should respect syntax.
But the added MARC::Record encoding does the trick! This is implicit
for Z3950 targets, where MarcToUTF8Record does the same.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Should be XSLT::Base now.
Removes old XSLT_Handler stub too (from bug 23290).
Result of a git grep | sed statement.
Test plan:
Run qa tools (so modules compile).
Run t/db_dependent/Breeding.t
Run t/db_dependent/Koha/XSLT/Base.t (This test fails when only this patch
has been applied; see subsequent patch.)
Enable XSLT use on results and details display. Check search results and
detail view on OPAC and staff.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Adds the "Attributes" field to z3950 servers.
The feature here is not quite the same.
In the old patches, the attributes were applied to individual query parts if the part already contained "@attr" and the additional attribute was not already in the query part.
Here, the content of the new field is prepended to all PQF queries sent to the server.
This new approach is simpler and works for the sponsor.
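A minimal sketch of the prepending (variable names illustrative;
$server->{attributes} holds the new field, $pqf_query the built query):

    # Prepend the server-level attributes, e.g. '@attr 4=1', to every
    # PQF query sent to this target.
    $pqf_query = $server->{attributes} . ' ' . $pqf_query
        if $server->{attributes};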
Test plan:
I) Apply the patch
II) Run updatedatabase.pl
1) Add a new z3950 server with the following parameters:
Hostname : catalogue.banq.qc.ca
Port : 210
Database : IRIS
Syntax : Marc21
2) Perform a z3950 search on that server.
Keyword (Any) : egypt
2.1) Nothing Found.
3) Add attributes on the server administration page
@attr 4=1
4) Perform the same z3950 search
4.1) A lot of results
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
1) Apply the patch
2) Have a Z39.50 endpoint with attr 31 defined - Library of Congress
supports this
3) Try to find some biblio records through Z39.50 using the new field
"Publication year"
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
The call to GetAuthorizedHeading is already done just before calling ImportBreedingAuth.
The call to GuessAuthTypeCode is not used.
Adds a transaction to the test (check your database, kidclamp ;)
Test plan:
Add new authority via Z3950 in the interface.
Run t/db_dependent/Breeding.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
git grep ImportBreedingAuth - there is only one call to this routine
from SearchZ3950Auth
We pass it a MARC record and '2' for overwrite_auth.
We then check for this record in the DB and get the breeding id;
however, when overwrite_auth is 2 we always add the auth to the batch
and return the new breeding id.
We don't actually use any of the other parameters returned here either.
To recreate:
1 - Browse to Authorities
2 - Select New from Z3950
3 - Perform a search that returns results
4 - SELECT COUNT(*) FROM import_auths
5 - Repeat the search
6 - SELECT COUNT(*) FROM import_auths
7 - There are 20 more records
8 - SELECT * FROM import_auths - note the repeated rows
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
Try to search, preview and import authority from Z39.50, everything
should work as expected
Signed-off-by: Hayley Mapley <hayleymapley@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch makes it possible to add an extra column to Z3950 search results.
The system preference AdditionalFieldsInZ3950ResultSearch decides which MARC field/subfields are displayed in the column.
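A minimal sketch of how such a syspref could drive the extra column
(the helper name and the '035$a'-style spec format are assumptions for
illustration; the authoritative parsing is in the patch):

    use C4::Context;

    # Hypothetical helper; assumes comma-separated specs like '035$a'.
    sub additional_fields_for {
        my ($record) = @_;    # a MARC::Record from the result set
        my $pref = C4::Context->preference('AdditionalFieldsInZ3950ResultSearch') // '';
        my @values;
        for my $spec ( split /\s*,\s*/, $pref ) {
            my ( $tag, $subfield ) = $spec =~ /^(\d{3})\$(\w)$/;
            next unless $tag;
            push @values, map { $_->subfield($subfield) } $record->field($tag);
        }
        return @values;
    }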
Testing:
I Apply the patch
II Run updatedatabase.pl
ACQUISITIONS
0) Enter a field/subfield in the AdditionalFieldsInZ3950ResultSearch syspref
1) Create a new basket or use an existing one
2) In -Add order to basket-, click "From an external source"
3) Select some search targets and enter a subject heading ex. house
4) Click the Search button
5) Validate "Additional fields" column with the field/subfield value.
CATALOGUING
0) Shares same syspref as above
1) Go to cataloguing, click New from z3950
2) Fill in the form so as to get a successful search
3) Validate the "Additional fields" column
prove t/db_dependent/Breeding.t
Sponsored-by: CCSR (https://ccsr.qc.ca)
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test plan:
1) Apply the patch
2) prove t/db_dependent/Breeding.t
3) Try to search using Z39.50; both authority and biblio searches
should still work
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
1) Apply the patch
2) Go to administration and set up a z39.50 authority server which does
support searching by control number (use attribute 12); you can use the
Czech national library server:
host: aleph.nkp.cz
port: 9991
base: aut-utf
format: MARC21
encoding: UTF-8
3) Try to find an authority by control number using z39.50 - if you use the server
recommended in point 2), there is web access to the base at
http://aleph.nkp.cz/eng/aut
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Fixed a typo in a code comment and a whitespace issue in the template.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
See Bugzilla comment 7. This change does not belong here and is
dubious on its own. Needs further attention on another report.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: In consultation with the author the same change is applied to the
corresponding lines in Z3950SearchAuth.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test plan:
- Apply the patch
- Add an SRU authority server in administration -> Z39.50/SRU servers
You can try with the French national library, configured as such:
Hostname: catalogue.bnf.fr
Port: 80
Database: api/SRU
Syntax: Unimarc
Record type: authority
Additional SRU options: version=1.2,sru=get
SRU Search fields mapping example:
Keyword (any): aut.anywhere
Name (any): aut.anywhere
Author (any): (aut.type any "pep org") and aut.accesspoint
Author (personal): aut.type=pep and aut.accesspoint
Author (corporate): aut.type=org and aut.accesspoint
Author (meeting/conference): aut.type=org and aut.accesspoint
Subject heading: (aut.type any "geo ram_nc ram_ge ram_pe ram_co") and aut.accesspoint
Subject sub-division: aut.type=ram_pe and aut.accesspoint
Title (any): (aut.type any "tic tut tum ram_tp ram_tu") and aut.accesspoint
Title (uniform):(aut.type any "tut tum ram_tu") and aut.accesspoint
- Try a search from Authorities -> New from Z39.50/SRU
- Check that the authority is correctly displayed in "Show Marc"
- Check that the authority is correctly added to Koha in "Import"
- prove t/db_dependent/Breeding.t
Signed-off-by: François Pichenot <fpichenot@ville-roubaix.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
In Breeding.pm we let Z3950Search return the XSLT handler error codes back
to the template. They are converted to text messages by using a new include
file (added for OPAC and intranet now). The generic xslt_err code is now
obsoleted.
In Record.pm the errstr call is removed. The croak is done with the new
error code in err. This seems sufficient.
Test plan:
[1] Run Breeding.t
[2] Run Record.t
[3] Add a nonexisting XSLT file to one of your Z3950 targets. Search on that
target and check if you see an error 'XSLT file not found'.
The bonus is that these error messages are now translatable, as they are in
the templates.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
After feedback from the dev mailing list, it seems appropriate here to
propose making the Default framework authoritative for Koha to MARC
mappings. This implies checking only the Default framework in the
routines:
[1] GetMarcFromKohaField: The parameter frameworkcode is removed (see the
sketch after this list). A follow-up report (19097) will update the calls
not adjusted here. This is safe since the parameter is silently ignored.
[2] GetMarcSubfieldStructureFromKohaField: Framework parameter is removed
and calls are adjusted. Includes acquisitions_stats.pl.
[3] TransformKohaToMarc: The parameter is removed; all calls are verified
or adjusted.
[4] TransformMarcToKoha: The parameter is no longer used and will be
removed in a follow-up report (19097). It always goes to Default now.
[5] TransformMarcToKohaOneField: The parameter is removed and all calls
are adjusted. Including: Breeding, XISBN and MetadataRecord modules.
[6] C4::Koha::IsKohaFieldLinked: This routine was called only once (in
C4::Items::_build_default_values_for_mod_marc). It can be replaced by
calling GetMarcFromKohaField. If there is no kohafield linked, undef
is returned. (Corresponding unit test is removed here.)
[7] C4::Items::ModItemFromMarc: The helper routine
_build_default_values_for_mod_marc does no longer have a framework
parameter. The cache key default_value_for_mod_marc- is no longer
combined with a frameworkcode. Three admin scripts are adjusted
accordingly; some tests will be corrected in the next patch.
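A minimal sketch of the adjusted call style from [1] (the commented
'before' line shows the old signature):

    use C4::Biblio qw( GetMarcFromKohaField );

    # Before: GetMarcFromKohaField( 'biblio.biblionumber', $frameworkcode );
    # After the patch, the mapping always comes from the Default framework:
    my ( $tag, $subfield ) = GetMarcFromKohaField('biblio.biblionumber');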
Test plan:
See next patch. That patch adjusts all tests involved.
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
In order to allow multiple Koha to MARC mappings (for one kohafield), we
need to adjust a few key routines in C4/Biblio.pm. This results in a few
changes in dependent modules too.
Note: Multiple mappings also include 'alternating' mappings. Such as the
case of MARC21 260 and 264: only one of both fields will be used. Sub
TransformMarcToKoha will handle that just fine; the opposite transformation
is harder, since we do no longer know which field was the source. In that
case TransformKohaToMarc will fill both fields. We only use that operation
in Koha for items (in Acquisition and Cataloging).
Sub GetMarcSubfieldStructure
This sub used a selectall_hashref, which is fine as long as we have only
one mapping for each kohafield. But as DBI states it: If a row has the same
key as an earlier row then it replaces the earlier row. In other words,
we lose the first mapping if we have two.
This patch uses selectall_arrayref with Slice and rearranges the output so
that the returned hash returns an arrayref of hashrefs for each kohafield.
In order to improve consistency, we add an order clause to the SQL
statement used too.
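A minimal sketch of the rearrangement ($dbh and $sql stand in for the
real handle and query):

    # selectall_hashref keeps only the last row per key; fetch all rows
    # and group them so each kohafield maps to an arrayref of hashrefs.
    my $rows = $dbh->selectall_arrayref( $sql, { Slice => {} } );
    my %mss;
    push @{ $mss{ $_->{kohafield} } }, $_ for @$rows;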
Sub GetMarcFromKohaField
This sub just returned one tag and subfield, but in case of multiple
mappings we should return them all now.
Note: Many calls still expect just one result and will work just fine:
my ($tag, $sub) = GetMarcFromKohaField(...)
A possible second mapping would be silently ignored. Often the sub is
called for biblionumber or itemnumber. I would not recommend the use of
multiple mappings for such fields btw.
In case the sub is called in scalar context, it will return only the first
tag (instead of the number of tags and subfields).
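A minimal sketch of the two calling contexts (example mapping values
are illustrative):

    # List context: all mappings, flattened as tag/subfield pairs,
    # e.g. ( '260', 'c', '264', 'c' ) for a double mapping.
    my @fields = GetMarcFromKohaField('biblio.copyrightdate');

    # Scalar context: only the first tag.
    my $tag = GetMarcFromKohaField('biblio.copyrightdate');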
Sub GetMarcSubfieldStructureFromKohaField
This sub previously returned the hash for one kohafield.
In scalar context it will behave like before: it returns the first hashref
in the arrayref that comes from GetMarcSubfieldStructure.
In list context, it returns an array of all hashrefs (incl. multiple
mappings).
The sub is not used in C4::Ris. Removed the use statement.
Sub TransformKohaToMarc
This sub got a second parameter: frameworkcode.
Historically, Koha more or less assumes kohafields to be defined across all
frameworks (see Koha to MARC mappings). Therefore it falls back to Default
when it is not passed.
When going through all mappings in building a MARC record, it also supports
multiple mappings. Note that Koha uses this routine in Acquisition and in
Cataloging for items. Normally, however, the MARC record is leading, and the
Koha fields are derived from it for optimization and reporting.
The added third parameter allows for passing a new option no_split => 1.
We use this option in C4::Items::Item2Marc; if two item fields are mapped to
one kohafield but would have different values (which would be very unusual),
these values are glued together. When transforming to MARC again, we do not
want to duplicate the item subfields, but we keep the glued value in both
subfields. This operation only affects items, since we are not doing this
reverse operation for biblio and biblioitem fields.
Sub _get_inverted_marc_field_map
This sub is a helper routine of TransformMarcToKoha, the opposite
transformation. When saving a MARC record, all kohafields are extracted
including multiple mappings.
Suppose that you had both 260c and 264c in your record (which you won't),
then both values get saved initially into copyrightdate like A | B. The
additional code for copyrightdate will extract the first year from this
string.
A small fix in TransformMarcToKoha makes that it only saves a value in a
kohafield if it is defined and not empty. (Same for concatenation.)
Sub TransformMarcToKohaOneField
This sub now just calls TransformMarcToKoha and extracts the requested
field. Note that since we are caching the structure, this does not result
in additional database access and is therefore performance-wise
insignificant. We simplify code and maintenance.
Instead of modifying the passed hashref, it simply returns a value. A call
in C4::Breeding is adjusted accordingly. The routine getKohaField in
Koha::MetadataRecord is redirected to TransformMarcToKohaOneField.
NOTE: The fourth patch restructures/optimizes TransformMarcToKoha[OneField].
Sub get_koha_field_from_marc
This sub can be removed. A call is replaced by TransformMarcToKohaOneField
in C4::XISBN.
Note: The commented lines for sub ModZebrafiles are removed (directly under
TransformMarcToKohaOneField).
Test plan:
For unit tests and interface tests, please see follow-ups.
Run qa tools in order to verify that the modules still compile well.
Read the code changes and verify that they make sense.
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
perl -p -i -e 's/^.*set the version for version checking.*\n//' **/*.pm
+ manual adjustments
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
Mainly a
perl -p -i -e 's/^.*3.07.00.049.*\n//' **/*.pm
Then some adjustments
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
perl -p -i -e 's/^(use vars .*)\$VERSION\s?(.*)/$1$2/' **/*.pm
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Signed-off-by: Brendan A Gallagher <brendan@bywatersolutions.com>
When doing an Auth search through z3950, the resulting table has the first column (server name) always empty.
TEST
1) Once logged into the intranet, go to Authorities.
2) Click New from z39.50, fill in appropriately for a successful search.
3) Acknowledge that the first column is empty. Always.
4) Apply the (very simple) patch.
5) Do another search, validate that the column is not empty anymore.
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
http://bugs.koha-community.org/show_bug.cgi?id=9987
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This button lets you replace existing authorities using a Z39.50 search.
http://bugs.koha-community.org/show_bug.cgi?id=11961
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
All tests pass
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
I would prefer not to hide this "stuff".
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
When using a z3950 connection with UNIMARC authorities, you get an error:
Unsupported UNIMARC character encoding [ ] for XML output for UNIMARCAUTH; 100$a -> 20141119
I've seen that Bug 2060, which added authorities import, added a special behavior for UNIMARC: the MARC flavour must be UNIMARCAUTH instead of just UNIMARC.
This patch adds the same behavior when using a z3950 connection and import.
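A minimal sketch of the flavour switch (the exact spot in the import
path is in the patch):

    use C4::Context;

    my $marcflavour = C4::Context->preference('marcflavour');
    # Authorities need the UNIMARCAUTH flavour on UNIMARC installs.
    $marcflavour = 'UNIMARCAUTH' if $marcflavour eq 'UNIMARC';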
Test plan:
- Use a UNIMARC install
- Define a z3950 connection for UNIMARC authorities
- Go to Authorities module
- Click on "New from Z39.50"
- Perform a search
=> Without patch: you get the error
=> With patch: you get results
- Import one result
=> You get the authority creation form with all data
You may check the same plan with a MARC21 install
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
NOTE: depending on the target, the syntax in the configuration
might not be UNIMARC, but MARC21/USMARC instead!
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This fixes a regression introduced by Bug 6536 so that multiple-word
title searches work.
Test scenario:
1. try a z39.50 search with more than one word
2. verify that no results appear
3. apply patch and re-run search
4. verify that there are results
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The last test in the first series fails randomly on Perl 5.18:
not ok 12 - Third query makes no difference
# Failed test 'Third query makes no difference'
# at t/db_dependent/Breeding.t line 104.
# got: ''
# expected: '1'
# Looks like you failed 1 test of 12.
not ok 1 - _build_query
This change makes tests pass. Please consider if this needs to be fixed
(i.e. sort order matters) or the test needs to be rewritten.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
I agree with adding the sort. (The need for doing this in Perl 5.18 is another
topic.)
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Resolved:
[1] FAIL C4/Breeding.pm
FAIL critic ControlStructures::ProhibitMutatingListFunctions
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
No warnings anymore.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch only removes surrounding spaces at the comma and equals sign while
passing the options in sru_fields to the ZOOM object.
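A minimal sketch of the tolerant parsing (variable names illustrative):

    # 'sru = get , sru_version = 1.1' becomes clean key/value pairs,
    # with the spaces around commas and equals signs trimmed away.
    my $sru_fields = 'sru = get , sru_version = 1.1';
    my %sru_options;
    for my $pair ( split /\s*,\s*/, $sru_fields ) {
        my ( $key, $value ) = split /\s*=\s*/, $pair, 2;
        $sru_options{$key} = $value if defined $value;
    }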
Test plan:
If you add spaces between options in sru_fields, searching should still work.
E.g. sru_fields= sru = get , sru_version = 1.1
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Use the stylesheets listed in field add_xslt of z3950servers to transform
search results of Z3950/SRU search.
Additionally, the template has been changed to make more error messages (or
warnings) visible when displaying results. Until now, error message were
shown in the results table and when connection errors occurred, no results
were displayed at all.
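A minimal sketch of applying the configured stylesheets (the use of
Koha's XSLT handler here is an assumption; that handler was later
renamed Koha::XSLT::Base, see the patch on the rename above):

    use Koha::XSLT_Handler;

    my $xslth = Koha::XSLT_Handler->new;
    # Run each stylesheet listed in the server's add_xslt column over
    # the incoming MARCXML before it reaches the results table.
    for my $xsl ( split /\s*,\s*/, $server->{add_xslt} // '' ) {
        $marcxml = $xslth->transform( $marcxml, $xsl );
    }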
Test plan:
Create some stylesheets (or see the sample patch on bug 6536).
Add these stylesheets to some Z3950/SRU servers.
Do Z3950 search and verify the transformations.
Do a search with 2 targets; make one target fail (by manipulating its server
data). Do you see the connection error and the results for the other target?
Generate an XSLT error by modifying one stylesheet. Check search results. You
should see warnings.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch makes it possible to include SRU servers in Z3950 search.
It adjusts the Z3950Search routine in Breeding module.
It also replaces SQL code with DBIx statements in Breeding.pm/Z3950Search
and the associated scripts z3950search.pl in cataloguing and acqui.
Test plan:
Verify if a normal Z3950 search still works in cataloging/acqui.
Add a SRU target. (You could just use Koha's port 9998.)
Define sru_options like sru=get.
Use that target in a Z3950 search in cataloging and acqui. (Import.)
Test sru_fields translation by comparing search results between various
settings for some of the fields. For instance, leave title empty and
after that set it to the title field of your SRU target.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Replaces name with servername and type with servertype for running the Z3950 search.
Limits the search scripts to zed (z3950) servers until SRU is supported.
Test plan:
Perform a Z3950 search in Cataloguing and Acquisition.
Verify that it still works as it did.
Signed-off-by: Giuseppe Angilella <giuseppe.angilella@ct.infn.it>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch removes the ImportBreeding() routine, which lost its
last caller as of the patch for bug 10462.
To test:
[1] Verify that prove -v t/Breeding.t passes.
[2] Perform a Z39.50 search in the staff interface.
[3] Perform a cataloguing reservoir search in the staff
interface, verifying that cached records from the search
done in step 2 are retrieved.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When importing records, the ISBN is normalized and stored
in the database (see C4::ImportBatch::_add_biblio_fields). But when
searching the reservoir by ISBN, it is not normalized
(see C4::Breeding::BreedingSearch), so the search does not match.
This patch adds the normalization to the reservoir search. It also
replaces the call to the private method _isbn_cleanup with
GetNormalizedISBN, the correct public method, and allows the reservoir
search on ISBNs with hyphens.
This is intended to fix only reservoir searches.
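A minimal sketch of the normalized match (context assumed to be
C4::Breeding::BreedingSearch; $term is the raw search input):

    use C4::Koha qw( GetNormalizedISBN );

    # Normalize the incoming term the same way ImportBatch normalized
    # the stored ISBN, so hyphenated and ISBN-13 forms match.
    my $isbn = GetNormalizedISBN($term);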
Revised Test plan
-----------------
1) Back up DB
2) Save a copy of the attached example somewhere findable
3) Home -> Tools -> Stage MARC records for import
4) Click Browse and select the example MARC file
5) Click Upload file
6) Tweak as desired then click Stage for import
7) Click Manage staged records
8) Click Import this batch into the catalog
9) Home -> Cataloging
10) In the Cataloging search text box type 978-0-691-14289-0 and
click Submit
-- ISBN13 with hyphens not found in reservoir
11) In the Cataloging search text box type 9780691142890 and
click Submit
-- ISBN13 without hyphens not found in reservoir
12) In the Cataloging search text box type 0-691-14289-0 and
click Submit
-- ISBN10 with hyphens not found in reservoir
13) In the Cataloging search text box type 0691142890 and
click Submit
-- ISBN10 without hyphens found in reservoir
14) Apply patch
15) Repeat steps 10-13; this time it is always found! :)
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When a z39.50 server can't be searched successfully, the yellow
error box comes up empty. This patch fixes the problem.
Test Plan:
1) Go to Administration/z39.50 servers
2) Create a fake z39.50 server with a made up address
3) Go to cataloging, search only that server
4) Note the empty yellow alert box
5) Apply this patch
6) Re-run the search; note that the alert box has a message in it now
Signed-off-by: Nora Blake <nblake@masslibsystem.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works according to test plan.
When one of the selected servers gives results, no dialog
box is shown, either before or after applying the patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>