Why was a refactoring needed for this script?
The export tool (tools/export.pl) can be called from the command line,
and some parts of the code were unnecessarily complicated (just look at
the code and you will understand).
Worse still, the script does not provide the same options for both
interfaces. For instance, you cannot export records given a range of
biblionumbers, authids, callnumbers, etc. from the command line.
What does this patch do?
1/ Important: The script tools/export.pl does not work anymore if called from
the command line (should be in the release notes).
2/ The code used to generate a file (csv, iso2709 or xml) has been moved to a new
module (Koha::Exporter::Record), and tests have been provided.
3/ No change is done on the web interface
4/ Some new options have been added to the command line script
(misc/export.pl); an example invocation is shown after this list:
- starting_authid
- ending_authid
- authtype
- starting_biblionumber
- ending_biblionumber
- itemtype
- starting_callnumber
- ending_callnumber
- start_accession
- end_accession
5/ The behavior when an error occurs has changed; errors such as the following are now displayed:
Can't call method "as_usmarc" on an undefined value at Koha/Exporter/Record.pm line 114.
record (number 5530) is invalid and therefore not exported because its reopening generates warnings above at Koha/Exporter/Record.pm line 117.
Before this patch, these messages were not displayed when using the command line.
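As an illustration only, a hypothetical invocation combining some of the new
range options (the flag spelling here is assumed; check the script's usage
output for the real option names):
  # export the biblio records in a biblionumber range, limited to one item type
  perl misc/export.pl --starting_biblionumber=1 --ending_biblionumber=1000 --itemtype=BK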
What does this patch not do?
It does not provide the 'clean', 'timestamp' and 'deleted_barcodes' options to
the web interface (same as before).
What about performance?
With a DB with ~800 biblios (MARC21)
Before: perl tools/export.pl 14.79s user 0.83s system 71% cpu 21.905 total
After: perl misc/export.pl 17.19s user 0.84s system 75% cpu 24.018 total
With a DB with ~6400 biblios (UNIMARC)
Before: perl tools/export.pl 26.55s user 0.76s system 76% cpu 35.498 total
After: perl misc/export.pl 26.78s user 0.84s system 80% cpu 34.494 total
How to test this patch?
Test plan:
A. Web interface:
1/ On the current master, export some records, biblios and authorities (with
the 3 different export formats), playing with the different filters (item type,
libraries, callnumber, accession date, don't export items, remove
non-local items, don't export fields, etc.).
2/ Apply this patch, export again the same records, and compare the
generated files. They must be identical!
3/ Confirm that the export features on the checkout list
(circ/circulation.pl) work as before this patch.
B. The command line
1/ On the current master, export some records, biblios and authorities (with
the 2 different export formats), playing with the different options (date,
deleted_barcodes, clean).
2/ Apply this patch, export again the same records, and compare the
generated files. They must be identical!
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Since bug 10803 adds a C4::Search::History module, the
PurgeSearchHistory routine should be moved.
Test plan:
- run misc/cronjobs/cleanup_database.pl with the searchhistory param and
verify behavior is the same as before applying this patch.
- run prove t/Search/History.t
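For reference, the invocations exercised by this test plan look like this (the
searchhistory param comes from the test plan above; depending on the Koha
version it may also accept a number of days):
  # purge old search history entries, then run the tests for the moved routine
  perl misc/cronjobs/cleanup_database.pl --searchhistory
  prove t/Search/History.t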
Signed-off-by: Joonas Kylmälä <j.kylmala@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If the table given as a parameter is not in the white list, the script
should die rather than fall back to a default value.
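A minimal sketch of the intended check (the hash and variable names are
hypothetical, not the script's actual identifiers):
  # only accept tables that are explicitly white-listed; die on anything else
  my %allowed_tables = map { $_ => 1 } qw( biblioitems items );
  die "Invalid table parameter: $table" unless $allowed_tables{$table};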
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Currently the --where parameter only allows specifying a condition on
fields in the biblioitems table.
For some needs it would be useful to specify a condition on fields in
the items table.
The use case is the following: you want to reindex biblios with items
modified since a specific timestamp.
Test plan:
1/ Pick an item randomly in your catalogue
2/ Edit it and save
3/ Note that the items.timestamp has been set to today but not the
biblioitems.timestamp
4/ launch rebuild_zebra without the new parameter
perl misc/migration_tools/rebuild_zebra.pl -b -v --where
"timestamp >= XXX"
where XXX is the today date (e.g. "2014-06-05 00:00:00").
Note that the biblio has not been indexed.
5/ launch rebuild_zebra using the new parameter:
perl misc/migration_tools/rebuild_zebra.pl -b -v -t items --where
"timestamp >= XXX"
Note the biblio has been indexed.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When going to the installer (without the DB structure), you have to log
in twice before starting the installation.
Test plan:
0/ Create a new database and fill the database entry in the koha conf
with its name
1/ Go on the mainpage, you should be redirected to the installer
2/ Try to log in
Without this patch, you will get the login form again.
With this patch, you can start the installation
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: Added koha-common.cron.daily.
Note that the cronjob only does something when the pref is set now.
See corresponding change on bug 6810.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
As per Robin's suggestion on report 14840, it would be better to always
run the cronjob and only do something when the pref is set.
This patch adds a test in the cronjob and clears the former default of 14
days.
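A minimal sketch of the guard added to the cronjob, assuming it sits near the
top of membership_expiry.pl (the warning text is illustrative):
  use C4::Context;
  # do nothing when the pref is not set: warn and exit
  my $expdays = C4::Context->preference('MembershipExpiryDaysNotice');
  if ( !$expdays ) {
      warn 'The MembershipExpiryDaysNotice syspref is not set; nothing to do';
      exit;
  }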
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Removed the pref and ran the dbrev again: Fine.
Run the cronjob with -c -v -n: Prints exit warning.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Moved some POD lines and removed some duplicated POD lines.
Corrected some indentation.
Did some minor spelling changes, rewording.
Removed the C4::Dates module. No longer needed.
Removed a label in front of the for statement.
Removed the 'Do you wish to continue?' code. It should not be in a cron job.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
- remove DateTime->now()
- use Koha::DateUtils->dt_from_string;
- use Pod2usage for the usage
- use Modern::Perl
- use branches table
- Change letter code from MEMEXP to MEMBERSHIP_EXPIRY
- review comments implemented
- fix qa script comments
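For reference, a small sketch of the Koha::DateUtils usage this moves to,
replacing direct DateTime->now() calls:
  use Koha::DateUtils qw( dt_from_string );
  # with no argument, dt_from_string returns the current date/time as a DateTime object
  my $now = dt_from_string;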
Bug 6810 - Fix QA failures
- MembershipExpiryDaysNotice system preference arranged in alphabetical order.
Bug 6810 - Add sample notices
- review comments implemented
- default value of is_html field in letter table is 0
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Bug 6810 - Fix QA failures
- Use KohaDates to convert dateexpiry
- remove MySQL-specific methods for date handling in
GetUpcomingMembershipExpires
- make the script membership_expiry.pl write to the Koha system logs
- add tests
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Bug 6810 - Fix QA failures:
- use Koha::DateUtils instead of Koha::Template::Plugin::KohaDates,
- Add test with syspref MembershipExpiryDaysNotice equals 0 and undef,
- fix (new) test failure (when MembershipExpiryDaysNotice is undef).
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
A new crontab-based Perl script to send membership expiry reminders. A
system preference controls how many days in advance of membership
expiry the notices are sent.
To Test:
1) Create a new Patron and set membership expiry date 14 days from the
date of registration.
2) Check your system preference (MemExpDayNotice, default value 14 days)
3) Manual testing: run (perl membership_expiry.pl -h)
It will show you the various options:
This script prepares for membership expiry reminders to be sent to
patrons. It queues them in the message queue, which is processed by
the process_message_queue.pl cronjob.
See the comments in the script for directions on changing the script.
This script has the following parameters:
-c Confirm and remove this help & warning
-n send No mail. Instead, all mail messages are printed on screen.
Useful for testing purposes.
-v verbose
Do you wish to continue? (y/n)
4) Choose an option, for example: perl membership_expiry.pl -c
5) Go to your Koha database and check the message_queue table; you should
see some results.
6) Run (perl process_message_queue.pl); it will send emails to those
patrons whose membership expires 14 days from today.
7) Cron testing: (10 1 * * * $KOHA_CRON_PATH/membership_expiry.pl -c)
8) Set your 15 * * * * $KOHA_CRON_PATH/process_message_queue.pl
9) After running membership_expiry.pl, process_message_queue.pl will
send emails to those patrons whose membership expires 14 days from
today.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Koha needs a script to automate the importing of Lexile score data for
titles that have available scores but are not currently in the title's
record.
This script will take a CSV file of Lexile scores, and locate any
matching records in the Koha database (by ISBN). If the record already
has a score, it will be updated. If not, the Lexile score field will be
created.
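A rough sketch of the per-record update described above, using MARC::Record;
the lookup by ISBN and the call that saves the record back to Koha are
omitted, and subfield handling is simplified:
  use MARC::Field;
  # $record is the matched MARC::Record, $score the Lexile value from the CSV row
  if ( my $field = $record->field('521') ) {
      # the record already has a score: update it in place
      $field->update( a => $score );
  }
  else {
      # no score yet: create the 521 field
      $record->insert_fields_ordered( MARC::Field->new( '521', ' ', ' ', a => $score ) );
  }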
Test Plan:
1) Apply this patch
2) Catalog a record for each of the following ISBNs:
0789170191
9780673779410
3) Download the file LexileTitlesTruncated.txt attached
to this bug report
4) Run the script from the command line:
./misc/migration_tools/import_lexile.pl -v --file /path/to/LexileTitlesTruncated.txt
5) View those records in Koha
6) Note those records now have valid Lexile scores
7) Edit the Lexile score (521$a) and change the value to something else
8) Repeat step 4
9) Note the original Lexile score has been restored
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
NOTE: Before this patch, "./misc/cronjobs/batch_anonymise.pl --help" printed no
message, and neither did the anonymizing tool in the staff client.
After the patch, both show informative messages.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
This patch introduces entries for running the share_usage_with_koha_community.pl
script monthly, both in the packages and in the crontab.example file used
by manual installs.
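For illustration, a monthly crontab.example-style entry could look like this
(day, time and path are placeholders, not the exact line added by the patch):
  # run once a month, on the 1st at 02:00
  0 2 1 * * $KOHA_CRON_PATH/share_usage_with_koha_community.pl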
Edit: I fixed the Copyright line
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Add a script sitemap.pl to process all biblio records from a Koha
instance and generate Sitemap files complying with the protocol
described at http://sitemaps.org. The goal of this script is to give
search engines direct access to biblio records. It avoids having search
engines crawl the Koha OPAC, which generates a lot of traffic and
workload for a poor result.
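To give an idea of the output, a single entry in a generated sitemap file
would look roughly like this (the URL pattern shown assumes the standard OPAC
detail page; lastmod comes from the record's timestamp):
  <url>
    <loc>http://opac.example.org/cgi-bin/koha/opac-detail.pl?biblionumber=1234</loc>
    <lastmod>2015-08-20</lastmod>
  </url>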
Thanks Magnus for testing, and helping to improve the script design.
[2015.04.16] Switched from Moose to Moo.
[2015.08.20] Added more complete unit tests.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
All options to the script work as expected and the output looks
good. Nice enhancement!
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
I signed off my own patch after fixing various QA errors.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Amended patch: replace tabs with spaces.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
To test:
1/ Run the overdue_notices.pl script (don't do this on production
obviously :))
2/ Notice the warns
3/ Apply patch
4/ Run again
5/ Notice no warns, but notices are still generated ok
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
The call to cronlogaction is done in cleanup_database. So there is no use
in keeping the module here.
Test plan:
Run or compile (perl -c) script delete_expired_opac_registrations.pl.
Run or compile (perl -c) script delete_unverified_opac_registrations.pl.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
This patch moves the core code of two selfreg cron jobs into the Members
module. The new routines are called from cleanup_database with two new
parameters. The old cron jobs are now wrappers to cleanup_database.
As a bonus, we can add a unit test now.
In time, we can obsolete the selfreg cron jobs. For now, the code is in one
place and behavior does not change.
A next step (as described on the Bugzilla report) would be: remove the Delay
pref for self regs.
Test plan:
Run the unit test t/db_dependent/Members.t.
Test the two new parameters of cleanup_database.pl.
Verify if delete_expired_opac_registrations.pl still works.
Same for delete_unverified_opac_registrations.pl.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
. Fixed minor merge conflict on UT & cleanup_database.pl
. UT ok
. The two deprecated scripts still work as before, with a warning
message.
. cleanup_database.pl does the deletion job, calling the new C4::Members
functions rather than doing it directly.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Test plan:
0/ Create a new notice under suggestions > TO_PROCESS
You can use the one defined in the other patch.
1/ Create a suggestion and link it to a fund
2/ Add an owner to this fund and make sure this patron has an email
address (the email address used should be the one defined in the
AutoEmailPrimaryAddress syspref).
3/ Execute the cronjob script with the -v and without the -c argument
4/ The output should tell you that an email will be sent
5/ Execute the cronjob script with the -v and with the -c argument
6/ Verify the notice is generated in the message_queue table and it is
correctly formatted.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Most of them were found and fixed using codespell.
Signed-off-by: Stefan Weil <sw@weilnetz.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This is a major problem for Plack installations with UTF-8 encoding.
In this case, we override CGI->new to set the utf-8 flag and
get correctly decoded $cgi->params, and reset the syspref cache using
C4::Context->clear_syspref_cache.
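A simplified sketch of the approach (the real change lives in the Plack
startup file; error handling is omitted):
  use CGI qw( -utf8 );    # have CGI decode parameters as UTF-8
  use C4::Context;
  {
      no warnings 'redefine';
      my $old_new = \&CGI::new;
      *CGI::new = sub {
          my $q = $old_new->(@_);
          # sysprefs may have changed since this persistent worker started
          C4::Context->clear_syspref_cache();
          return $q;
      };
  }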
Test scenario:
1. under plack try to search with utf-8 characters
2. try to find patron with utf-8 characters
Signed-off-by: Gaetan Boisson <gaetan.boisson@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch renames the translation files for the Bengali
language from ben-* to bn-IN-*.
It also adds India as the region.
To test:
1) Apply the patch
2) Run updatedatabase
3) Install Bengali language
cd misc/translator
perl translate install bn-IN
enable
Check correct description
4) Create and install a fake Bengali variant
cd misc/translator
perl translate create bn-XX
perl translate install bn-XX
enable both variants
Check correct rendering of region
Results comply with expected test plan outcome. Signed off for bn-IN
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The output filename is notices_all_<date>.[html|csv|ods] if no
letter_code parameter is given.
If one letter_code is given: notices_<letter_code>_<date>.[html|csv|ods]
If more than one is given:
notices_<letter_code1>_..._<letter_codeN>_<date>.[html|csv|ods]
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Fix --html without --letter_code
Fix --ods which was producing a 2-line ods file
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
No more encoding issues with the html file, no problems with csv or ods
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
If you choose to generate print notices for a specific letter code, the
generated files should be distinct.
The use case is: you want to process print notice for letter codes:
overdue1, overdue2 and overdue3.
The cronjobs will be:
perl misc/cronjobs/gather_print_notices.pl
/tmp --letter_code=overdue1 --csv --ods --html --delimiter=";"
perl misc/cronjobs/gather_print_notices.pl
/tmp --letter_code=overdue2 --csv --ods --html --delimiter=";"
perl misc/cronjobs/gather_print_notices.pl
/tmp --letter_code=overdue3 --csv --ods --html --delimiter=";"
Without this patch, the first 2 files will be overwritten.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds:
- the ability to generate an ods file
From now on, you are able to generate an ods file for print notices,
for instance when you want an ods file rather than a html file.
Test plan:
- same as previous patch but test the following parameters:
perl misc/cronjobs/gather_print_notices.pl /tmp/test --ods
--letter_code=OVERDUE -d=:
you should get an error because csv2ods is not installed.
Follow the installation instructions and try the command again.
An ods file should be generated in your /tmp/test directory.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds:
- the ability to generate a csv file instead of a html file.
- a letter_code parameter.
From now on, you are able to generate a csv file for print notices.
Imagine a template notice defined as:
cardnumber:patron:email:item
<<borrowers.cardnumber>>:<<borrowers.firstname>> <<borrowers.surname>>:<<borrowers.email>>:<<items.barcode>>
You would like to generate a csv file and not a html file.
Test plan:
- define your ODUE notice for the print template as:
cardnumber:patron:email:item
<<borrowers.cardnumber>>:<<borrowers.firstname>> <<borrowers.surname>>:<<borrowers.email>>:<item><<items.barcode>></item>
- define overdues rules for a patron category
- check 2 items out using a due date in order to generate the overdue
notices
- check these 2 items in
- launch the overdue_notices script
- the message_queue table should now contain 2 new entries
- launch the gather_print_notices cronjob with the following parameters:
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
--letter_code=OVERDUE --letter_code=CHECKIN
you should get an error
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
you should get an error
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
--letter_code=OVERDUE -d=:
will produce 1 csv file in your /tmp/test directory
- verify the csv file is correct and contains only 1 csv header line.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
QA note: Keep in mind that you can use all placeholders for the
csv that you can use for the normal templates. If you normally
get the item information from <item></item> you need to use that.
If you can use <<item.barcode>> directly, you can also do so
in the csv.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch refactors the script and adds some good practices:
- use Modern::Perl
- use Pod::Usage
- add POD
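For reference, the kind of boilerplate this refactoring introduces (a generic
sketch, not the script's exact code):
  use Modern::Perl;
  use Getopt::Long;
  use Pod::Usage;

  my $help;
  GetOptions( 'help|?' => \$help ) or pod2usage(2);
  # print the script's POD as the usage message
  pod2usage( -verbose => 1 ) if $help;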
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
koha-qa should be good:
OK misc/devel/coverage.pl
OK critic
OK forbidden patterns
OK pod
OK valid
OK C4/Installer/PerlDependencies.pm
OK critic
OK forbidden patterns
OK pod
OK valid
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Test plan not explicitly stated;
the script runs and generates a lot of data :)
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Fixed a small conflict on PerlDependencies.pm
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This script runs coverage analysis over all the modules to see
which ones are not tested yet. It uses Devel::Cover.
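For context, a typical Devel::Cover run looks like this (generic usage, not
the script's exact commands):
  # collect coverage data while running the test suite, then build an HTML report
  PERL5OPT=-MDevel::Cover prove -r t/
  cover -report html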
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Corrected the misspelling 'specifield'.
Updated the usage statement on the use of the delimiter pref.
Clarified the CONDITION explanation somewhat.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The original concern of bug 9892 was that this borrowers export script
could not handle a tab character to separate columns.
With this patch, the delimiter preference is used as the separator for
the output, to be consistent with other scripts.
This should be highlighted in the release notes, as it can change
behavior.
Test plan:
Confirm that the 'delimiter' pref is used for the output, but that you are able
to override it with the 'separator' parameter
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as expected, respects the preference but is superseded by the command line option
No koha-qa errors
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Links in the RSS templates were hardcoded to library.org.nz. The templates
should pass and use the system's OPACBaseURL instead.
Tested and verified.
Signed-off-by: Eivin Giske Skaaren <eskaaren@yahoo.no>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>