If the table given as a parameter is not in the whitelist, the script
should die rather than fall back to a default value.
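A minimal sketch of the intended check, assuming the option value ends up
in $table and a plain @allowed_tables whitelist (the real option handling
in rebuild_zebra.pl may differ):

  my @allowed_tables = qw( biblioitems items );   # example whitelist
  die "Invalid value for the table parameter: $table\n"
      unless grep { $_ eq $table } @allowed_tables;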
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Currently the --where parameter only allows specifying a condition on
fields of the biblioitems table.
For some needs it would be useful to specify a condition on fields in
the items table.
The use case is the following: you want to reindex biblios with items
modified since a specific timestamp.
Test plan:
1/ Pick an item randomly in your catalogue
2/ Edit it and save
3/ Note that the items.timestamp has been set to today but not the
biblioitems.timestamp
4/ launch rebuild_zebra without the new parameter
perl misc/migration_tools/rebuild_zebra.pl -b -v --where
"timestamp >= XXX"
where XXX is today's date (e.g. "2014-06-05 00:00:00").
Note that the biblio has not been indexed.
5/ launch rebuild_zebra using the new parameter:
perl misc/migration_tools/rebuild_zebra.pl -b -v -t items --where
"timestamp >= XXX"
Note the biblio has been indexed.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When going to the installer (without the DB structure), you have to log
in twice before starting the installation.
Test plan:
0/ Create a new database and fill in the database entry in the Koha
config with its name
1/ Go to the main page; you should be redirected to the installer
2/ Try to log in
Without this patch, you will get the login form again.
With this patch, you can start the installation
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Amended: Added koha-common.cron.daily.
Note that the cronjob now only does something when the pref is set.
See corresponding change on bug 6810.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
As per Robin's suggestion on report 14840, it would be better to always
run the cronjob and only do something when the pref is set.
This patch adds a test in the cronjob and clears the former default of 14
days.
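A rough sketch of the early-exit test, assuming the pref in question is
MembershipExpiryDaysNotice (as in the related bug 6810 patches):

  my $days = C4::Context->preference('MembershipExpiryDaysNotice');
  unless ($days) {
      warn "The MembershipExpiryDaysNotice syspref is not set, nothing to do.\n";
      exit 0;
  }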
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Removed the pref and ran the dbrev again: fine.
Ran the cronjob with -c -v -n: prints the exit warning.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Moved some POD lines and removed some duplicated POD lines.
Corrected some indentation.
Did some minor spelling changes, rewording.
Removed the C4::Dates module. No longer needed.
Removed a label in front of the for statement.
Removed the 'Do you wish to continue?' code; it should not be in a cron job.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
- remove DateTime->now()
- use Koha::DateUtils->dt_from_string;
- use Pod::Usage for the usage message
- use Modern::Perl
- use branches table
- Change letter code from MEMEXP to MEMBERSHIP_EXPIRY
- review comments implemented
- fix qa script comments
Bug 6810 - Fix QA failures
- MembershipExpiryDaysNotice system preference arranged in alphabetical order.
Bug 6810 - Add sample notices
- review comments implemented
- default value of is_html field in letter table is 0
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Bug 6810 - Fix QA failures
- Use KohaDates to convert dateexpiry
- remove MySQL-specific methods for date handling in
GetUpcomingMembershipExpires
- make the script membership_expiry.pl write in Koha system logs
- add tests
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Bug 6810 - Fix QA failures:
- use Koha::DateUtils instead of Koha::Template::Plugin::KohaDates,
- Add tests with the syspref MembershipExpiryDaysNotice equal to 0 and undef,
- fix (new) test failure (when MembershipExpiryDaysNotice is undef).
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
A new crontab-based Perl script to send membership expiry reminders. A
system preference controls the number of days in advance of membership
expiry that the notices will be sent.
To Test:
1) Create a new patron and set the membership expiry date to 14 days
from the date of registration.
2) Check your system preference (MemExpDayNotice, default value 14 days)
3) Manual testing: run perl membership_expiry.pl -h
It will show you the various options:
This script prepares for membership expiry reminders to be sent to
patrons. It queues them in the message queue, which is processed by
the process_message_queue.pl cronjob.
See the comments in the script for directions on changing the script.
This script has the following parameters :
-c Confirm and remove this help & warning
-n send No mail. Instead, all mail messages are printed on screen.
Useful for testing purposes.
-v verbose
Do you wish to continue? (y/n)
4) Choose an option, e.g.: perl membership_expiry.pl -c
5) Go to your Koha database and check the message_queue table; you should
see some results.
6) Run perl process_message_queue.pl; it will send emails to those
patrons whose membership expires 14 days from today.
7) Cron testing: (10 1 * * * $KOHA_CRON_PATH/membership_expiry.pl -c)
8) Set up: 15 * * * * $KOHA_CRON_PATH/process_message_queue.pl
9) After membership_expiry.pl has run, process_message_queue.pl will
send emails to those patrons whose membership expires 14 days from
today.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@quecheelibrary.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Koha needs a script to automate the importing of Lexile score data for
titles that have available scores but are not currently in the title's
record.
This script will take a CSV file of Lexile scores, and locate any
matching records in the Koha database ( by ISBN ). If the record already
has a score, it will be updated. If not, the Lexile score field will be
created.
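A rough sketch of that update-or-create logic, assuming the score lands in
521$a of a MARC::Record already fetched by ISBN (the real script handles
options, matching and saving around this):

  if ( my $field = $record->field('521') ) {
      $field->update( a => $lexile_score );     # record already has a score
  }
  else {
      $record->append_fields(
          MARC::Field->new( '521', ' ', ' ', a => $lexile_score )
      );
  }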
Test Plan:
1) Apply this patch
2) Catalog a record for each of the following ISBNs:
0789170191
9780673779410
3) Download the file LexileTitlesTruncated.txt attached
to this bug report
4) Run the script from the command line:
./misc/migration_tools/import_lexile.pl -v --file /path/to/LexileTitlesTruncated.txt
5) View those records in Koha
6) Note those records now have valid Lexile scores
7) Edit the Lexile score ( 521$a ) and change the value to something else
8) Repeat step 4
9) Note the original Lexile score has been restored
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
NOTE: Before the patch, "./misc/cronjobs/batch_anonymise.pl --help" printed
no message, and neither did the anonymizing tool in the staff client.
After the patch, both print informative messages.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
This patch adds entries for running the share_usage_with_koha_community.pl
script monthly, both to the packages and to the crontab.example file used
by manual installs.
Edit: I fixed the Copyright line
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Add a script sitemap.pl to process all biblio records from a Koha
instance and generate Sitemap files complying with the protocol
described at http://sitemaps.org. The goal of this script is to give
search engines direct access to biblio records. It avoids having search
engines crawl the Koha OPAC, which generates a lot of traffic and
workload for a poor result.
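For reference, each entry in the generated files follows the sitemaps.org
schema, roughly like this (hostname and exact layout are illustrative; the
script's actual output may differ):

  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://opac.example.org/cgi-bin/koha/opac-detail.pl?biblionumber=123</loc>
      <lastmod>2015-08-20</lastmod>
    </url>
  </urlset>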
Thanks Magnus for testing, and helping to improve the script design.
[2015.04.16] Switch from Moose to Moo.
[2015.08.20] Add more complete UT.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
All options to the script work as expected and the output looks
good. Nice enhancement!
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
I signed off my own patch after fixing various QA errors.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Amended patch: replace tabs with spaces.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
To test:
1/ Run the overdue_notices.pl script (don't do this on production
obviously :))
2/ Notice the warns
3/ Apply patch
4/ Run again
5/ Notice no warns, but notices are still generated ok
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
The call to cronlogaction is done in cleanup_database, so there is no
need to keep the module here.
Test plan:
Run or compile (perl -c) script delete_expired_opac_registrations.pl.
Run or compile (perl -c) script delete_unverified_opac_registrations.pl.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
This patch moves the core code of two selfreg cron jobs into the Members
module. The new routines are called from cleanup_database with two new
parameters. The old cron jobs are now wrappers to cleanup_database.
As a bonus, we can add a unit test now.
In time, we can obsolete the selfreg cron jobs. For now, the code is in one
place and behavior does not change.
A next step (as described on the Bugzilla report) would be: remove the Delay
pref for self regs.
Test plan:
Run the unit test t/db_dependent/Members.t.
Test the two new parameters of cleanup_database.pl.
Verify if delete_expired_opac_registrations.pl still works.
Same for delete_unverified_opac_registrations.pl.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
. Fixed minor merge conflict on UT & cleanup_database.pl
. UT ok
. The two deprecated scripts still work as before, with a warning
message.
. cleanup_database.pl does the deletion job, calling the new C4::Members
functions rather than doing it directly.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@unc.edu.ar>
Test plan:
0/ Create a new notice under suggestions > TO_PROCESS
You can use the one defined in the other patch.
1/ Create a suggestion and link it to a fund
2/ Add an owner to this fund and make sure this patron has an email
address (the email address used should be the one defined in the
AutoEmailPrimaryAddress syspref).
3/ Execute the cronjob script with -v and without the -c argument
4/ The output should tell you that an email will be sent
5/ Execute the cronjob script with -v and with the -c argument
6/ Verify the notice is generated in the message_queue table and is
correctly formatted.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Most of them were found and fixed using codespell.
Signed-off-by: Stefan Weil <sw@weilnetz.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This is a major problem for Plack installations with UTF-8 encoding.
In this case, we override CGI->new to set the UTF-8 flag and get
correctly decoded CGI parameters, and we reset the syspref cache using
C4::Context->clear_syspref_cache.
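A minimal sketch of the effect (the patch itself redefines CGI::new inside
the PSGI wrapper; CGI.pm's -utf8 pragma is shown here only to illustrate the
decoding part):

  use CGI qw( -utf8 );                   # decode CGI parameters as UTF-8
  use C4::Context;
  C4::Context->clear_syspref_cache();    # drop cached sysprefs between requests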
Test scenario:
1. under Plack, try to search with UTF-8 characters
2. try to find a patron with UTF-8 characters
Signed-off-by: Gaetan Boisson <gaetan.boisson@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch renames the translation files for the Bengali
language, from ben-* to bn-IN-*.
It also adds India as the region.
To test:
1) Apply the patch
2) Run updatedatabase
3) Install Bengali language
cd misc/translator
perl translate install bn-IN
enable
Check correct description
4) Create and install a fake Bengali variant
cd misc/translator
perl translate create bn-XX
perl translate install bn-XX
enable both variants
Check correct rendering of region
Results comply with expected test plan outcome. Signed off for bn-IN
Signed-off-by: Indranil Das Gupta (L2C2 Technologies) <indradg@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The output filename is notices_all_<date>.[html|csv|ods] if no
letter_code parameter is given.
If one is given: notices_<letter_code>_<date>.[html|csv|ods]
If more than one is given:
notices_<letter_code1>_..._<letter_codeN>_<date>.[html|csv|ods]
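For example (letter codes and dates are illustrative, and the date format
may differ from what the script actually produces):

  notices_all_2015-01-05.html
  notices_OVERDUE_2015-01-05.csv
  notices_OVERDUE_CHECKIN_2015-01-05.ods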
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Fix --html without --letter_code
Fix --ods, which was producing a 2-line ods file
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
No more encoding issues with the html file, no problems with csv|ods
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
If you choose to generate print notices for a specific letter code, the
generated files should be distinct.
The use case is: you want to process print notices for letter codes
overdue1, overdue2 and overdue3.
The cronjobs will be:
perl misc/cronjobs/gather_print_notices.pl /tmp --letter_code=overdue1 --csv --ods --html --delimiter=";"
perl misc/cronjobs/gather_print_notices.pl /tmp --letter_code=overdue2 --csv --ods --html --delimiter=";"
perl misc/cronjobs/gather_print_notices.pl /tmp --letter_code=overdue3 --csv --ods --html --delimiter=";"
Without this patch, the first two files will be overwritten.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds:
- the ability to generate an ods file
From now on you are able to generate an ods file for print notices,
for cases where you would like an ods file and not a html file.
Test plan:
- same as previous patch but test the following parameters:
perl misc/cronjobs/gather_print_notices.pl /tmp/test --ods
--letter_code=OVERDUE -d=:
you should get an error because csv2ods is not installed.
Follow the installation instructions and try the command again.
An ods file should be generated in your /tmp/test directory.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds:
- the ability to generate a csv file instead of a html file.
- a letter_code parameter.
From now on you are able to generate a csv file for print notices.
Imagine a template notice defined as:
cardnumber:patron:email:item
<<borrowers.cardnumber>>:<<borrowers.firstname>> <<borrowers.surname>>:<<borrowers.email>>:<<items.barcode>>
You would like to generate a csv file and not a html file.
Test plan:
- define your ODUE notice for the print template as:
cardnumber:patron:email:item
<<borrowers.cardnumber>>:<<borrowers.firstname>> <<borrowers.surname>>:<<borrowers.email>>:<item><<items.barcode>></item>
- define overdues rules for a patron category
- check 2 items out using a due date in order to generate the overdue
notices
- check these 2 items in
- launch the overdue_notices script
- the message_queue table should now contain 2 new entries
- launch the gather_print_notices cronjob with the following parameters:
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
--letter_code=OVERDUE --letter_code=CHECKIN
you should get an error
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
you should get an error
perl misc/cronjobs/gather_print_notices.pl /tmp/test --csv
--letter_code=OVERDUE -d=:
will produce 1 csv file in your /tmp/test directory
- verify the csv file is correct and contains only one csv header.
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
QA note: Keep in mind that you can use all placeholders for the
csv that you can use for the normal templates. If you normally
get the item information from <item></item> you need to use that.
If you can use <<item.barcode>> directly, you can also do so
in the csv.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch refactors the script and adds some good practices (see the
sketch after this list):
- use Modern::Perl
- use Pod::Usage
- add POD
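A minimal skeleton of those conventions (illustrative only, not the script
itself; Getopt::Long is shown just to give pod2usage something to act on):

  #!/usr/bin/perl
  use Modern::Perl;
  use Getopt::Long;
  use Pod::Usage;

  my $help;
  GetOptions( 'h|help' => \$help ) or pod2usage(2);
  pod2usage( -verbose => 1 ) if $help;

  =head1 NAME

  gather_print_notices.pl - gather print notices into files

  =cut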
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
koha-qa should be good:
OK misc/devel/coverage.pl
OK critic
OK forbidden patterns
OK pod
OK valid
OK C4/Installer/PerlDependencies.pm
OK critic
OK forbidden patterns
OK pod
OK valid
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Test plan not explicitly stated,
script runs and generates a lot of data :)
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Fixed a small conflict on PerlDependencies.pm
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
It's a script that runs coverage over all modules to see
which ones are not tested yet. It uses Devel::Cover.
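For reference, the script automates roughly this Devel::Cover workflow
(illustrative commands):

  PERL5OPT=-MDevel::Cover prove -r t/
  cover -report html     # summarises the collected data under cover_db/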
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Corrected 'specifield'.
Updated the usage statement on use of the delimiter pref.
Clarified the CONDITION explanation somewhat.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The original concern of bug 9892 was that this borrowers export script
could not handle a tab character to separate columns.
With this patch, the delimiter preference is used as the separator for
the output, to be consistent with other scripts.
This should be highlighted in the release notes, as it can produce a
change in behavior.
Test plan:
Confirm that the 'delimiter' pref is used for the output, but that you
are able to override it with the 'separator' parameter
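An illustrative invocation (the script path and exact option spelling are
assumptions here; only the 'separator' parameter is named in the test plan):

  perl misc/export_borrowers.pl --separator $'\t'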
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as expected, respects the preference but is superseded by the
command line. No koha-qa errors
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Links in the RSS templates were hardcoded to library.org.nz. They should
instead pass and use the system's OPACBaseURL.
Tested and verified.
Signed-off-by: Eivin Giske Skaaren <eskaaren@yahoo.no>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch gives a partial solution to this problem.
It ignores strings like "[% something %]", but not
"[% IF ( value ) %][% value %][% END %]".
We get 100+ fewer strings to translate.
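A rough illustration of the kind of check involved (not the exact code in
the translation tooling): a msgid is skipped only when the whole string is
a single TT directive.

  # matches "[% something %]" but not "[% IF ( value ) %][% value %][% END %]"
  next if $msgid =~ /^\s*\[%(?:(?!%\]).)*%\]\s*$/;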
To test:
1) Update translation files for your preferred language
2) Apply the patch
3) Update again
4) Compare; you should find 100+ strings removed from the
translation files
eg.
-msgid "[% SEARCH_RESULT.biblionumber |url %]"
-msgid "[% accepteddate | $KohaDates %]"
-msgid "[% amountoutstanding | format('%.2f') %]"
-msgid "[% authtypetext |html %]"
-msgid "[% barcode_llx |html %]"
-msgid "[% barcode_lly |html %]"
-msgid "[% biblio.quantity.length ? biblio.quantity : 1 %]"
-msgid "[% billingdate | $KohaDates %]"
-msgid "[% borname |html %]"
...
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Verified all strings removed from the po files were
pure TT.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Many libraries would like to be able to import various types of files as
MARC records ( citations, csv files, etc ). We can add a new function to
the plugins system to allow that kind of behavior in a very
customizable way.
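A rough sketch of what such a plugin method might look like; the hook name
to_marc and the argument shape are assumptions made for illustration, and
the record building is deliberately trivial:

  use MARC::Record;
  use MARC::Field;

  sub to_marc {
      my ( $self, $args ) = @_;
      my $data = $args->{data};    # raw content of the uploaded file (assumption)
      my $output = q{};
      for my $line ( grep { length } split /\r?\n/, $data ) {
          my $record = MARC::Record->new();
          $record->append_fields( MARC::Field->new( '245', '0', '0', a => $line ) );
          $output .= $record->as_usmarc();   # raw ISO2709 records, one string (assumption)
      }
      return $output;
  }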
Test Plan:
1) Ensure you have plugins enabled and configured correctly
2) Install the attached version 2.00 of the Kitchen Sink plugin
3) Download the attached text file
4) Browse to "Stage MARC records for import"
5) Select the downloaded text file for staging
6) After uploading, you should see a new area "Transform file to MARC:",
select "Example Kitchen-Sink Plugin" from the pulldown menu
7) Click 'Stage for import'
8) Click 'Manage staged records'
9) You should now see two new MARC records!
Signed-off-by: Aleisha <aleishaamohia@hotmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described - interesting new feature.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch adds information about cron jobs performed and makes it viewable
under Home > Tools > Logs ("Browse system logs").
To test:
Apply patch
- Go to system preferences and set 'CronjobLog' to: [Log] information from
cron jobs.
- Run some cron jobs
- Go to Home > Tools > Logs
- Verify that you have a 'Cron jobs' option in the 'Module' drop-down. Select
it with Action "All" and Submit.
- Output should show Date/time and info about Cron jobs
Rebased to work on top of Bug 6911 (conflict in viewlog.tt) /MV
Rebased after applying patch for Bug 6911 /MV
Conflicts resolved:
misc/cronjobs/overdue_notices.pl
misc/cronjobs/cleanup_database.pl
Signed-off-by: Frederic Demians <f.demians@tamil.fr>
- Merge both patches, and fix updatedatabase.pl
- Works as described. Provides interesting feedback from cronjob scripts.
--
Modified version taking into account the syspref CronjobLog. Handling simplified by introducing a convenience sub cronlogaction in C4/Log.pm /MV
Amended to take into account comments #11, #12, #13 /MV
http://bugs.koha-community.org/show_bug.cgi?id=13899
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch implements separate PO files for
different MARC dialects.
It depends on correct filenames, i.e. it will build PO
files using files with/without "unimarc|normarc|marc21"
in their names.
Changes:
A) LangInstaller.pm
Added definitions to create or update xx-YY-{MARCFLAVOR}.po,
minor change to create and install procedure, and modification
of install procedure to handle multiple target dirs.
Updated documentation.
B) Standardization of filenames
STAFF po file is now xx-YY-staff-prog.po
MARC dialects po files are xx-YY-marc-{MARCFLAVOUR}.po
To test:
1) Update po files for your preferred language, e.g. nn-NO
cd misc/translator
perl translate update nn-NO
2) Do some copying/renaming
cp po/nn-NO-i-staff-t-prog-v-3006000.po po/nn-NO-marc-UNIMARC.po
cp po/nn-NO-i-staff-t-prog-v-3006000.po po/nn-NO-marc-NORMARC.po
cp po/nn-NO-i-staff-t-prog-v-3006000.po po/nn-NO-marc-MARC21.po
mv po/nn-NO-i-staff-t-prog-v-3006000.po po/nn-NO-staff-prog.po
(most MARC dialect strings are in staff, so we use that as the basis)
3) Apply the patch
4) Update again to fix translation files, verbose
perl translate update nn-NO -v
5) Install language, verbose, verify translations
perl translate install nn-NO -v
6) Create translation files
rm po/nn-NO*
perl translate create nn-NO
we must have this list:
po/nn-NO-marc-MARC21.po
po/nn-NO-marc-NORMARC.po
po/nn-NO-marc-UNIMARC.po
po/nn-NO-opac-bootstrap.po
po/nn-NO-pref.po
po/nn-NO-staff-help.po
po/nn-NO-staff-prog.po
Additional tests:
7) Number of msgids
7.a) Before patch and after upgrade, extract and count msgids
for i in $(ls po/nn-NO-*po); \
do msginit -i $i -o nn-old.po --no-translator --no-wrap --locale=nn_NO; \
egrep ^msgid nn-old.po >> old; \
done
sort old | uniq | tee s-old | wc -l > n-old
s-old: have all msgids
n-old: number of msgids
7.b) After patch and after creation of new files
Repeat the procedure with different files (s-new, n-new)
7.c) Compare (diff s-old s-new); they are the same
(save for a strange UNIMARC char in my case, but
it's present in the corresponding PO file)
8) Installed dirs/files
8.a) List of EN dirs/files
cd koha-tmpl
find | egrep "/en/" > en
8.b) List of nn-NO dirs/files. After patch and language install
cd koha-tmpl
find | egrep "/nn-NO/" | sed 's|/nn-NO/|/en/|' > nn
8.c) Compare (diff en nn), they are the same
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Followed the steps outlined by Bernardo, and everything works as
expected. I think the most important points are that "perl translate
create nn-NO" produces the right files, and translating anything in
them, then doing "translate install" makes the translations show
up in the interface. The number of msgids in the nn-NO files corresponds
well with the number of msgids in other sets of .po files.
I bet y'all will be happy when you don't have to see the stupid
Norwegian strings when you translate! ;-)
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>