It fixes xgettext.pl and (maybe) friends:
[12:22:29] Error: Command failed: misc/translator/xgettext.pl --charset=UTF-8 -s -o /tmp/koha-Jaa9rf/Koha-marc-NORMARC.pot -f /tmp/koha-Jaa9rf/files
/tmp/koha-Jaa9rf/Koha-marc-NORMARC.pot at misc/translator/xgettext.pl line 387.
Use of uninitialized value in subroutine entry at misc/translator/xgettext.pl line 388.
Argument ">:encoding(utf-8)" isn't numeric in subroutine entry at misc/translator/xgettext.pl line 388.
Argument "/tmp/koha-Jaa9rf/Koha-marc-NORMARC.pot" isn't numeric in subroutine entry at misc/translator/xgettext.pl line 388.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
On bug 17591 we discovered that there was something weird going on with
the way we export and use subroutines/modules.
This patch tries to standardize our exports to use EXPORT_OK only.
That way we will need to explicitly name the subroutines we want to
use from a module.
This patch is a squashed version of:
Bug 17600: After export.pl
Bug 17600: After perlimport
Bug 17600: Manual changes
Bug 17600: Other manual changes after second perlimports run
Bug 17600: Fix tests
And a lot of other manual changes.
export.pl is a dirty script that can be found on bug 17600.
"perlimport" is:
git clone https://github.com/oalders/App-perlimports.git
cd App-perlimports/
cpanm --installdeps .
export PERL5LIB="$PERL5LIB:/kohadevbox/koha/App-perlimports/lib"
find . \( -name "*.pl" -o -name "*.pm" \) -exec perl App-perlimports/script/perlimports --inplace-edit --no-preserve-unused --filename {} \;
The ideas of this patch are to:
* use EXPORT_OK instead of EXPORT
* perltidy the EXPORT_OK list
* remove '&' before the subroutine names
* remove some unneeded use statements
* explicitly import the subroutines we need within the controllers or
modules
Note that the private subroutines (starting with _) should not be
exported (and not used from outside of the module except from tests).
EXPORT vs EXPORT_OK (from
https://www.thegeekstuff.com/2010/06/perl-exporter-examples/)
"""
Export allows to export the functions and variables of modules to user’s namespace using the standard import method. This way, we don’t need to create the objects for the modules to access it’s members.
@EXPORT and @EXPORT_OK are the two main variables used during export operation.
@EXPORT contains list of symbols (subroutines and variables) of the module to be exported into the caller namespace.
@EXPORT_OK does export of symbols on demand basis.
"""
If this patch caused a conflict with a patch you wrote prior to its
push:
* Make sure you are not reintroducing a "use" statement that has been
removed
* "$subroutine" is not exported by the C4::$MODULE module
means that you need to add the subroutine to the @EXPORT_OK list
* Bareword "$subroutine" not allowed while "strict subs"
means that you did not import the subroutine from the module:
- use $MODULE qw( $subroutine list );
You can also use the fully qualified namespace: C4::$MODULE::$subroutine
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
We should remove the debug statements, or use Koha::Logger when we want
to keep them.
Test plan:
Confirm that the remaining occurrences of DEBUG need to be kept
(historical scripts, for instance)
Confirm that the occurrences removed by this patch can be removed
Confirm that the occurrences replaced by Koha::Logger are correct
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Looks good to me, noting a few minor points on BZ.
JD amended patch: replace "warn #Finished" with "#warn Finished", and
put the statement on a single line
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
We are using Koha::Logger when it makes sense to keep the info,
otherwise we simply remove it
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Bug 28572: Replace missing occurrence in misc/admin/koha-preferences
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
There is a "debug" parameter we are passing from the controller scripts
to C4::Auth::get_template_and_user, but it's not actually used!
Test plan:
Confirm the assumption
Review the changes from this patch
Generated with:
perl -p -i -e 's#\s*debug\s*=\>\s*(0|1),?\s*##gms' **/*.pl
git checkout misc/devel/update_dbix_class_files.pl # Wrong catch
+ Manual fix in acqui/neworderempty.pl
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
For each subfield added, we check whether other subfields exist in the same
field. If that is the case, we use the same tab as the first subfield
found.
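Roughly, the idea is the following (a simplified, hypothetical sketch that queries the marc_subfield_structure table directly; the script itself works differently):

my $dbh = C4::Context->dbh;
# When adding a subfield to $tagfield, reuse the tab of the first subfield
# already defined for that field in the framework, if there is one.
my ($tab) = $dbh->selectrow_array(
    q{SELECT tab FROM marc_subfield_structure
       WHERE frameworkcode = ? AND tagfield = ? AND tab IS NOT NULL
       ORDER BY tagsubfield LIMIT 1},
    undef, $frameworkcode, $tagfield
);
$tab //= $default_tab;    # otherwise fall back to the default from the .yml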
Test plan:
1. Find a biblio subfield in
misc/migration_tools/ifla/data/biblio/default.yml that doesn't exist
in your default biblio MARC framework (or delete one). The field
should exist and have other subfields with a tab set.
2. Change the tab of all subfields within that field so that it's different
from what's in the .yml file
3. Run misc/migration_tools/ifla/update.pl
4. Verify that the subfield has been added and has the same tab as the
other subfields
5. Do the same for authorities (files are in
misc/migration_tools/ifla/data/auth/)
Signed-off-by: Koha team <koha@univ-lyon.fr>
Signed-off-by: sonia <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Test plan:
1. Find a biblio subfield in
misc/migration_tools/ifla/data/biblio/default.yml that you have in
your default biblio MARC framework (or create one).
2. Change the tab of this subfield so that it's different from what's in
the .yml file
3. Run misc/migration_tools/ifla/update.pl --force
4. Verify that the tab of this subfield has not been changed.
5. Do the same for authorities (files are in
misc/migration_tools/ifla/data/auth/)
Signed-off-by: sonia <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
There has been an obvious error in the script since bug 23463, and it has
been known to be broken for 10 years now (since bug 5579). As nobody has
complained, we can safely remove this script.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@helsinki.fi>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
JD Amended patch: Remove perlcritic error
"$d" is declared but not used at line 839, column 5. Unused variables clutter code and make it harder to read. (Severity: 3)
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch adds start, end and elapsed times to the 'records exported'
lines of the verbose output of the rebuild_zebra.pl script.
Test plan
1/ Run rebuild_zebra.pl -a -b -v
2/ Note the new timestamps included on the verbose output
3/ Signoff
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Same as Load, but for Dump.
Test plan:
Edit ES mappings, replace withdrawn's label with "withdrawn ✔️❤️ ★"
Export the mappings
perl misc/search_tools/export_elasticsearch_mappings.pl > admin/searchengine/elasticsearch/mappings.yaml
Reset mappings from the UI
=> Notice that the label is correct
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@helsinki.fi>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
From the YAML pod:
"""
This module has been released to CPAN as YAML::Old, and soon YAML.pm will be changed to just be a frontend interface module for all the various Perl YAML implementation modules, including YAML::Old.
If you want robust and fast YAML processing using the normal Dump/Load API, please consider switching to YAML::XS. It is by far the best Perl module for YAML at this time. It requires that you have a C compiler, since it is written in C.
"""
See also
https://gitlab.com/koha-community/qa-test-tools/-/merge_requests/35
Test plan:
Try some places where YAML::XS is now used and confirm that they work
correctly
QA note: This patch removes some uses of YAML that were not useful
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@helsinki.fi>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
One last instance of ModBiblioMarc in bulkmarcimport still passed the
frameworkcode.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Tested with:
For a framework (not the default):
Creation of a biblio record
Editing of this biblio record
Creation of an item for this record
Creation of an item for this record
./misc/batchRepairMissingBiblionumbers.pl OK
prove t/db_dependent/Biblio/ModBiblioMarc.t OK
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
To prevent
DBD::mysql::db begin_work failed: Already in a transaction at /usr/share/perl5/DBIx/Class/Storage/DBI.pm line 1588, <GEN45> line 1.
Test plan:
1 - Export more than 100 records
2 - Use bulkmarcimport (with -commit=10) to import them
perl misc/migration_tools/bulkmarcimport.pl -b -file testbmi.xml -v -m=MARCXML --commit=10
3 - Modify a record to make the import fail (for instance, with an lccn
that is too long)
4 - Use bulkmarcimport (with -commit=10) to import them
perl misc/migration_tools/bulkmarcimport.pl -b -file testbmi.xml -v -m=MARCXML --commit=10
5 - Notice that the import stops, but the records already imported are kept (apart from the last batch of 10)
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This script checks that an authority is not being used before deleting it.
We do not need to check biblios for the authority id as we have already verified it is unused.
To test:
1 - Reset db to koha test database
2 - perl misc/migration_tools/remove_unused_authorities.pl -t
3 - Note a number of unused authorities
4 - perl misc/migration_tools/remove_unused_authorities.pl -c
5 - Note authorities are removed
6 - Reset db
7 - Apply patch
8 - perl misc/migration_tools/remove_unused_authorities.pl -t
9 - Note results are the same
10 - perl misc/migration_tools/remove_unused_authorities.pl -c
11 - Note results are the same
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch removes the use of AddAuthority with a non-empty
authid argument, since that triggers an update rather than an insert.
In practice, the update also fails, but the error isn't raised,
as the database connection doesn't have RaiseError set.
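Roughly, the change looks like this (a hedged reconstruction, not the literal diff):

# Before: the value taken from the 001 was passed as the authid, so
# AddAuthority attempted an UPDATE of a row that does not exist yet,
# and the failure stayed silent.
$authid = AddAuthority( $record, $authid_from_001, $authtypecode );

# After: pass an empty authid so a new authority is inserted.
$authid = AddAuthority( $record, '', $authtypecode );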
Test plan:
1) Do not apply patch
2) Try to bulkmarcimport an authority file with a 001
3) Observe that the script reports success but no authority is added
4) Apply the patch
5) Try to bulkmarcimport an authority file with a 001
6) Observe that the script reports success and the authority is added
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
bulkmarcimport.pl displays a small typo when a MARC modification
template is being used: "modofication".
This patch fixes that.
Test plan: apply the patch, and confirm the script no longer
displays the typo.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Test plan:
0) do not apply the patch
1) set up OAI and create at least one set
2) run build_oai_sets.pl
3) remember/write down number of records added
4) delete a biblio that is included in the set
5) run build_oai_sets.pl again
6) the set is 1 record smaller
7) apply the patch
8) run build_oai_sets.pl
9) the number of records should be the same as in 3)
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch removes some new error cases introduced during rebase
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch adds a .perlcriticrc (copied from qa-test-tools) and fixes
almost all perlcritic violations according to this .perlcriticrc.
The remaining violations are silenced by appending a '## no critic'
annotation to the offending lines. They can still be seen by using the
--force option of perlcritic.
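For example, a silenced line looks like this (an illustration, not a line taken from the patch):

my $loaded = eval "require $module; 1";    ## no critic (BuiltinFunctions::ProhibitStringyEval)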
This patch also modifies t/00-testcritic.t to check all Perl files using
the new .perlcriticrc.
I'm not sure if this test script is still useful, as it is now equivalent
to `perlcritic --quiet .` and it looks like it is much slower
(approximately 5 times slower on my machine).
Test plan:
1. Run `perlcritic --quiet .` from the root directory. It should output
nothing
2. Run `perlcritic --quiet --force .`. It should output 7 errors (6
StringyEval, 1 BarewordFileHandles)
3. Run `TEST_QA=1 prove t/00-testcritic.t`
4. Read the patch. Check that all changes make sense and do not
introduce undesired behaviour
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
This patch adds the table name to the SQL update statements.
To test:
1) Apply the patch
2) Run the script. Check that there are no errors, and that the script
behaves as expected.
Signed-off-by: Andreas Roussos <a.roussos@dataly.gr>
Created a test record with values in field 440. Applied the patch, ran the
script with the -c -f flags and observed that the values were moved to field
490. Also, the relevant Koha to MARC mappings were changed accordingly.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Like most scripts in misc, add a confirm argument
to ensure the script is not run without the operator knowing what it does.
Test plan:
1) Run misc/migration_tools/remove_unused_authorities.pl -h
2) You see the help line for confirm
3) Run misc/migration_tools/remove_unused_authorities.pl
4) You see the help and the script does nothing
5) Run misc/migration_tools/remove_unused_authorities.pl -c
6) The script runs as expected
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
The script misc/migration_tools/remove_unused_authorities.pl directly checks whether the Zebra search is OK.
This patch changes this so that the check is only performed when Zebra is the search engine.
It also adds a test that searches for any indexed authority number (index 'an').
With Zebra the query is: an,alwaysmatches=''
With Elasticsearch it is: an:*
This test ensures that biblio records are indexed and that not all authorities will be deleted.
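Roughly, the guard looks like this (a simplified sketch; the script differs in the details):

my $searchengine = C4::Context->preference('SearchEngine');
my $query = $searchengine eq 'Elasticsearch' ? 'an:*' : q{an,alwaysmatches=''};
my $searcher = Koha::SearchEngine::Search->new( { index => 'biblios' } );
my ( $error, $results, $count ) = $searcher->simple_search_compat( $query, 0, 1 );
if ( $error or not $count ) {
    die "Searching authority number in biblio records seems not to be available : $query\n";
}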
Test plan:
1) On a catalog create a new authority
2) Use Zebra in systempreference SearchEngine
3) Stop Zebra server
4) Run misc/migration_tools/remove_unused_authorities.pl -c
5) The script does nothing and says :
Zebra server seems not to be available. This script needs Zebra runs.
6) Restart Zebra server
7) Delete biblio index base
8) Run misc/migration_tools/remove_unused_authorities.pl -c
9) The script does nothing and says :
Searching authority number in biblio records seems not to be available : an,alwaysmatches=''
10) Use ElasticSearch in systempreference SearchEngine
11) Delete biblio index base
12) Run misc/migration_tools/remove_unused_authorities.pl -c
13) The script does nothing and says :
Searching authority number in biblio records seems not to be available : an:*
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
The remove_unused_authorities.pl script can be improved.
This patch changes the verbosity so that test mode can be used
to know which authorities are used and which can be deleted.
It also writes a line in the output when limited to specific authority type(s).
This patch also removes the unused vars $thresholdmin and $thresholdmax.
It also changes the query to use SQL with parameters for the authority types.
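For the parameterized query, something along these lines (a sketch with hypothetical variable names):

my @authtypes    = split /,/, $authtypes_option;    # e.g. 'NP,CO'
my $placeholders = join ',', ('?') x @authtypes;
my $sth = $dbh->prepare(
    "SELECT authid, authtypecode FROM auth_header
      WHERE authtypecode IN ($placeholders)"
);
$sth->execute(@authtypes);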
Test plan :
1) On a catalog create a new authority
2) Be sure catalog is well indexed
3) Run misc/migration_tools/remove_unused_authorities.pl -t
4) You will see the line :
*** Testing only, authorities will not be deleted. ***
5) You will see lines of :
authid=x type=y : used X time(s)
6) You will see the line for the authority created in 1) :
authid=x type=y : can be deleted
7) You will see at the end :
x authorities parsed
y can be deleted because unused
z unchanged because used
8) Run misc/migration_tools/remove_unused_authorities.pl
9) You don't see the line :
*** Testing only, authorities will not be deleted. ***
10) You will see lines of :
authid=x type=y : used X time(s)
11) You will see the line for the authority created in 1) :
authid=x type=y : deleted
12) You will see at the end :
x authorities parsed
y deleted because unused
z unchanged because used
13) Run misc/migration_tools/remove_unused_authorities.pl --auth NP --auth CO
14) You see the line :
Restricted to authority type(s) : NP,CO.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
When importing with bulkmarcimport, authorities may or may not be updated, based
on which authority is newer (the 005 fields are compared). This patch makes it
possible to track, in the result YAML file, whether an authority has been updated or not.
Signed-off-by: Jon Knight <J.P.Knight@lboro.ac.uk>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Starting to replace the ModItem calls with Koha::Item->store
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
This patch standardises the encoding name used in direct calls
to new_from_xml() to 'UTF-8' instead of 'utf8' or 'utf-8'.
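For example (an illustration; the record flavour argument varies per call site):

# before: the encoding name was written as 'utf8' or 'utf-8'
my $record = MARC::Record::new_from_xml( $xml, 'utf8', 'MARC21' );
# after: one canonical spelling
my $record = MARC::Record::new_from_xml( $xml, 'UTF-8', 'MARC21' );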
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
At the last development meeting we voted to remove the
QueryParser-related code:
https://wiki.koha-community.org/wiki/Development_IRC_meeting_19_February_2020
Hea tells us that it has not been adopted, and the code/bug tracker shows that
it is not really usable as is. As nobody is willing to work on it, we
decided to remove it instead.
Test plan:
% prove t/db_dependent/Search.t
must return green
See commits from bug 9239 and confirm that the code is removed in this
patch.
Also play with the search on the UI and confirm that you do not see
obvious regressions
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Bug 9978 should have fixed them all, but some were missed.
We want all the license statements that are part of Koha to be identical,
using the GPLv3 statement.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
We certainly faced 3 similar bugs due to this syntax: bug 23006, bug
22941 and bug 17526.
To prevent other issues related to this syntax, this patch suggests
replacing them all in one go.
Test plan:
Confirm that the two syntaxes are equivalent
Eyeball the patch and confirm that there is no typo!
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
SYSPREF_OVERRIDE unfortunately is not supported ;)
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
In an attempt to save time, bulkmarcimport temporarily sets CataloguingLog
and AuthoritiesLog to 0. It does this by disabling syspref caching and saving
the changes to the database (then replacing the original values at completion).
Unfortunately, this prevents other key sysprefs from being cached, and results in
a 50% increase in processing time for the script.
This patch instead utilizes the ENV variable override feature of sysprefs, which
preempts the cache in C4::Context->preference().
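A minimal sketch of that mechanism (the OVERRIDE_SYSPREF_* naming follows C4::Context; treat the exact lines as an assumption, not the patch itself):

# Override the sysprefs for this process only: no database write, and the
# syspref cache for everything else stays intact.
local $ENV{OVERRIDE_SYSPREF_CataloguingLog} = 0;
local $ENV{OVERRIDE_SYSPREF_AuthoritiesLog} = 0;
# C4::Context->preference('CataloguingLog') now returns 0 inside this scope.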
To test:
1. Perform a bulkmarcimport with a reasonable number of biblios (~1000 will do)
2. Note the time taken to complete
3. Apply patch
4. Revert the biblio load performed
5. Perform another bulkmarcimport with the same biblios and commandline options
6. Note the time taken to complete
7. Compare times. The time from step 6 should be about 33% less than the time from step 2
8. Check Cataloguing and Authorities Logs to verify imported records were not logged
9. Profit!
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
If Elasticsearch is used as the search engine, bulkmarcimport.pl will not
handle UTF-8 encoded MARCXML correctly.
Koha::SearchEngine::Search->new uses a require statement to load the correct Search module.
This is done at line 257 of bulkmarcimport.pl:
257 my $searcher = Koha::SearchEngine::Search->new
Koha::SearchEngine::Elasticsearch::Search will `use MARC::File::XML`, and so resets the arguments set before:
216 $MARC::File::XML::_load_args{BinaryEncoding} = 'utf-8';
220 $MARC::File::XML::_load_args{RecordFormat} = $recordformat;
An easy (but dirty) fix could be to move the declaration of my $searcher earlier in the script.
The tricky (but correct) fix would be to remove the long-standing "ugly hack follows" comment.
This patch is the easy, and dirty, fix.
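In other words, the reordering amounts to this (a sketch built only from the lines quoted above):

# Build the searcher first: loading Koha::SearchEngine::Elasticsearch::Search
# runs its `use MARC::File::XML` before the load arguments are set, so they
# are no longer clobbered.
my $searcher = Koha::SearchEngine::Search->new;
$MARC::File::XML::_load_args{BinaryEncoding} = 'utf-8';
$MARC::File::XML::_load_args{RecordFormat}   = $recordformat;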
Test plan:
Use the command line tool to import MARCXML records that contain Unicode characters into Koha
Something like `misc/migration_tools/bulkmarcimport.pl -biblios -file record.marcxml -m=MARCXML`
Without this patch you will notice that Unicode characters are not displayed correctly
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
018 - [Reserved for other International Standard Numbers]
033 - Other System Persistent Record Identifier
183 - Coded Data Field: Type of Carrier
203 - Content Form and Media Type
231 - Digital File Characteristics
283 - Carrier Type
338 - Funding Information Note
Signed-off-by: Sonia BOUIS <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Test plan:
1. Take a look at the files in misc/migration_tools/ifla/data. They
contain all the data that will be inserted into Koha. Their content is
based on the previous patches.
2. Run the script misc/migration_tools/ifla/update.pl and verify
that it effectively added the new fields, subfields, authorised
values and authority types.
3. Run the script again and see that it doesn't update existing fields
4. Run with --force and verify that it updates existing fields (you can
modify unimarc_ifla.yml to see changes)
5. Run with --force --po-file misc/migration_tools/ifla/language/fr.po
and verify that the labels are now in French
There is a POT file in misc/migration_tools/ifla/language/template.pot,
use it to create PO files for other languages.
Signed-off-by: Sonia BOUIS <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>