Adds the ability to use branches.* fields in digest notices and
have them be parsed correctly. Also adds a warning to the notices
editor for digests.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I like the idea to show a warning, but I would perhaps move
it under the message body label to be more obvious.
Patch works nicely, branch data of my user's home library
is displayed in the notice.
Replaced existing MaxFine syspref logic with overduefinescap.
Repurposed MaxFine to be the overall fine limit across all overdue items.
Implemented the new MaxFine logic in UpdateFine().
Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>
Tested according to Srdjan's test plan and everything worked like he said it would. I set the fine equal to $2 and the max fine equal to $1. When I ran the fines script for overdue items, the fines assessed were only $1.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This script prints to standard output what is returned by
GetMemberDetails in CSV format.
Exported fields can be specified with option -f. If no -f option is
specified, all fields are exported.
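A minimal usage sketch (hedged: the export_borrowers.pl path and the repeated -f syntax are illustrative assumptions, not confirmed by this commit message; check the script's --help for the exact invocation):
  # Export only a few columns; without -f, every GetMemberDetails field is exported
  misc/export_borrowers.pl -f surname -f firstname -f cardnumber > patrons.csv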
Signed-off-by: Gaetan Boisson <gaetan.boisson@biblibre.com>
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Amended with some code to better handle bad data.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
$OUTPUT is used but not declared.
When trying to run this script I got a nasty:
15:42 ~/koha.dev/koha-community (new/bug_8063 $%)$ ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 81.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 95.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 102.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 106.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 120.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 127.
Execution of ./misc/cronjobs/gather_print_notices.pl aborted due to compilation errors.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Before the patch:
$perl -wc ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at
[...]./misc/cronjobs/gather_print_notices.pl line 81.
./misc/cronjobs/gather_print_notices.pl had compilation errors.
With this patch:
$perl -wc ./misc/cronjobs/gather_print_notices.pl
./misc/cronjobs/gather_print_notices.pl syntax OK
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Adds the option -s/--split to enable notices to be separated
into different files by borrower home library.
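A sketch of a split run (hedged: the output-directory argument mirrors the script's existing usage and is an assumption here; only -s/--split is introduced by this patch):
  # Writes one print-notice file per borrower home library under /tmp/notices
  misc/cronjobs/gather_print_notices.pl /tmp/notices --split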
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Fixes the following things:
1. Sanitizes log output to prevent an attacker from using a specially
crafted POST to add extra lines to the log
2. Simplifies a regular expression, since "..file" cannot be used to
escape the current directory
3. Makes sure directories are consistent
4. Corrects logic issues in misc/cronjobs/backup.sh
Thanks to Frere Sebastien Marie for catching these issues.
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch builds on work by Lars Wirzenius for the Koha packages.
To date, the only way for a Koha librarian to obtain a complete backup
of their system has been to log into the system via SSH (or FTP) to
download the mysqldump file. This patch makes it possible for
superlibrarians in properly configured systems to download nightly backups
via the staff client's Export tool.
Recognizing that this is functionality with potentially very grave
security implications, system administrators must manually enable these
features in the koha-conf.xml configuration file.
The following configuration settings have been added to the koha-conf.xml
file:
* backupdir => directory where backups should be stored.
* backup_db_via_tools => whether to allow superlibrarians to download
database backups via the Export tool. The default is disabled, and
there is no way -- by design -- to enable this option without manually
editing koha-conf.xml.
* backup_conf_via_tools => whether to allow superlibrarians to download
configuration backups via the Export tool (this may be applicable to
packages only). The default is disabled, and there is no way -- by
design -- to enable this option without manually editing koha-conf.xml.
This commit modifies the following scripts to make use of the new
backupdir configuration option:
* koha-dump and koha-run-backups in the Debian packages
* The sample backup script misc/cronjobs/backup.sh
Note that for security reasons, superlibrarians will not be allowed
to download files that are not owned by the web server's effective user.
This imposes a de facto dependency on ITK (for Apache) or running the
web server as the Koha user (as is done with Plack).
To test:
1. Apply patch.
2. Go to export page as a superlibrarian. Notice that no additional
export options appear because they have not been enabled.
3. Add <backupdir>$KOHADEV/var/spool</backupdir> to the <config> section
of your koha-conf.xml (note that you will need to adjust that so that
it is pointing at a logical directory).
4. Create the aforementioned directory.
5. Go to export page as a superlibrarian. Notice that no additional
export options appear because they have not been enabled.
6. Add <backup_db_via_tools>1</backup_db_via_tools> to the <config>
section of your koha-conf.xml
7. Go to the export page as a superlibrarian. Notice the new tab.
8. Go to the export page as a non-superlibrarian. Notice there is no
new tab.
9. Run: mysqldump -u koha -p koha | gzip > $BACKUPDIR/backup.sql.gz
(substituting appropriate user, password, and database name)
10. Go to the export page as a superlibrarian, and look at the "Export
database" tab. If you are running the web server as your Koha user,
and ran the above command as your Koha user, you should now see the
file listed as an option for download.
11. If you *did* see the file listed, change the ownership to something
else: sudo chown root:root $BACKUPDIR/backup.sql.gz
11a. Confirm that you no longer see the file listed when you look at the
"Export database" tab.
12. Change the ownership on the file to your web server (or Koha) user:
sudo chown www-data:www-data backup.sql.gz
13. Go to the export page as a superlibrarian, and look at the "Export
database" tab. You should now see backup.sql.gz listed.
14. Choose to download backup.sql.gz
15. Confirm that the downloaded file is what you were expecting.
If you are interested, you can repeat the above steps but replace
<backup_db_via_tools> with <backup_conf_via_tools>, and instead of
creating an sql file, create a tar file.
To test packaging: run koha-dump, confirm that it still creates a
usable backup.
------
This signoff contains two changes:
10-1. If no backup/conf files were present, then the message telling you
so doesn't appear and the download button does. Made them behave
correctly.
10-2. The test for a file existing required it to be owned by the
webserver UID. This change makes it so it only has to be readable.
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Fix syntax errors preventing the scripts misc/translator/text-extract2.pl
and misc/cronjobs/thirdparty/TalkingTech_itiva_inbound.pl from compiling.
Remove misc/migration_tools/build6xx.pl entirely since it refers to
columns that no longer exist in the Koha database, and has seemingly
had broken encoding since Koha switched from CVS to git (or before!).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
SIPServer.pm requires that C4/SIP is added to its lib
path. This has been done by passing this directory
to it via -I. By using FindBin it can set the path
for itself correctly. This will also work if the C4/SIP
directory tree is moved to a non-standard location.
Removed the now redundant -I. from sip_run.sh.
Added a variable to sip_run.sh for the Koha tree to
highlight a problem with the script if you have multiple
directories in the PERL5LIB environment variable.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Small script that checks whether each biblio record in the DB is properly indexed.
Use -h to learn more.
(MT #6389)
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Complete rewrite of rebuild_zebra_sliced.zsh (renamed to .sh). Main
improvements are:
- both biblio and authority records are handled
- records are exported only once
It also adds an option --skip-index to rebuild_zebra.pl that permits
using rebuild_zebra.pl as an 'export only' script.
Description:
Indexes Koha records in chunks. This is useful when a record causes
errors and stops the indexing process. With this script, if indexing
of one chunk fails, the chunk is split into 2 (or 3) smaller chunks, and
indexing continues on those chunks.
rebuild_zebra.pl is called only once to export records.
Splitting and indexing are handled by this script (using yaz-marcdump and
zebraidx).
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Because updating the total issues count associated with a bibliographic
record on issue could cause a significant load on the server, this
commit adds the syspref UpdateTotalIssuesOnCirc (which defaults to OFF
to match existing behavior). The syspref has the following description:
Do/Do not update a bibliographic record's total issues count whenever
an item is issued (WARNING! This increases server load significantly;
if performance is a concern, use the update_totalissues.pl cron job
to update the total issues count).
Bug 6557: automatically increment totalissues
Adds the ability to automatically increment biblioitems.totalissues
whenever an item is issued.
To test:
1) Choose a record with at least one item that can circulate
2) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). Most likely there won't be any 942$0 at all
3) Enable UpdateTotalIssuesOnCirc
4) Check out the item you selected
5) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should now be one greater than before
6) Discharge the item
7) Disable UpdateTotalIssuesOnCirc
8) Check out the item you selected again
9) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should not have changed
Bug 6557: add script to update totalissues from stats
NAME
update_totalissues.pl
SYNOPSIS
update_totalissues.pl --use-stats
update_totalissues.pl --use-items
update_totalissues.pl --commit=1000
update_totalissues.pl --since='2012-01-01'
update_totalissues.pl --interval=30d
DESCRIPTION
This batch job populates bibliographic records' total issues count
based on historical issue statistics.
--help Prints this help
-v|--verbose
Provide verbose log information (list every bib modified).
--use-stats
Use the data in the statistics table for populating total
issues.
--use-items
Use items.issues data for populating total issues. Note that
issues data from the items table does not respect the --since
or --interval options, by definition. Also note that if both
--use-stats and --use-items are specified, the count of biblios
processed will be misleading.
-s|--since=DATE
Only process issues recorded in the statistics table since
DATE.
-i|--interval=N
Only process issues recorded in the statistics table in the
last N units of time. The interval should consist of a number
with a one-letter unit suffix. The valid suffixes are h
(hours), d (days), w (weeks), m (months), and y (years). The
default unit is days.
--incremental
Add the number of issues found in the statistics table to the
existing total issues count. Intended so that this script can
be used as a cron job to update popularity information during
low-usage periods. If neither --since nor --interval is
specified, incremental mode will default to processing the
last twenty-four hours (see the example cron entry after the
WARNING section below).
--commit=N
Commit the results to the database after every N records are
processed.
--test Only test the popularity population script.
WARNING
If the time on your database server does not match the time on your Koha
server you will need to take that into account, and probably use the
--since argument instead of the --interval argument for incremental
updating.
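For the --incremental cron use case described above, a hedged example crontab entry (the path, schedule, and $KOHA_CRON_PATH variable are illustrative only):
  # Nightly: add the last 24 hours of issues to the existing totals
  15 1 * * * $KOHA_CRON_PATH/update_totalissues.pl --incremental --commit=1000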
=== TESTING PLAN ===
NOTE: in order to test this script, you will need to have some sort of
circulation data already existing in your Koha installation.
1) Disable UpdateTotalIssuesOnCirc
2) Run: misc/cronjobs/update_totalissues.pl --use-items -t -v
3) If you have total checkout data in your item records (i.e. anything
in 952$l), you should see messages like "Processing bib 43 (1 issues)"
4) Choose one of the lines that shows more than 0 issues, and view the
record with that biblionumber in the staff client, choosing the "Items"
tab (moredetail.pl). Add up the "Total checkouts" listed for each item,
and confirm it matches what the script reported
5) Run: misc/cronjobs/update_totalissues.pl --use-stats -t -v
6) If you have any circulation statistics in your database (i.e. any
'issue' entries in your statistics table), you should see messages
like "Processing bib 43 (1 issues)";
7) Choose one of the lines and view the record with that biblionumber in
the staff client, choosing the "Items" tab (moredetail.pl). If you
count the number of checkouts listed in each item's checkout history,
the total should match what the script reported.
8) Check out an item
9) Run: misc/cronjobs/update_totalissues.pl --use-stats
--incremental --interval=1h -t -v
10) You should see one line reporting a single circ for the bib record
associated with the item you just checked out (there may be more if
you checked out any books in the hour prior to running these tests)
11) If the results in steps 4, 7, and 10 match the predictions, the
script worked
This patch to Koha was sponsored by the Arcadia Public Library and the
Arcadia Public Library Foundation in honor of Jackie Faust-Moreno, late
director of the Arcadia Public Library.
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Tested this with my test data - numbers are correct and updated appropriately.
More importantly - if I do a popularity search, the most popular items *come up first*. Amazing.
The variable $i was being re-used and overwriting the necessary value that was being passed to a subroutine. Renaming $i to $j fixed it. I also added an extra safety check within parse_letter that would also have prevented this bug.
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch adds a new parameter to overdue_notices.pl: a date.
If you add --date=YYYY-MM-DD when running overdue_notices, it will generate overdues as if you were on the date provided.
That's useful if you want to relaunch an overdue calculation that has failed, or after changing your circulation rules.
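Example (a sketch; -t/--triggered is the script's usual cron flag and is assumed here, only --date is added by this patch):
  # Re-run the overdue calculation as if today were 2012-03-31
  misc/cronjobs/overdue_notices.pl -t --date=2012-03-31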
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
In rebuild_zebra.pl, if we are in "unimarc" ("marcflavour" syspref), the sub "fix_unimarc_100" is called and checks whether the length of 100$a is equal to 35.
If that is not the case, the sub inserts the localtime and more, so we lose the data during reindexing.
The standard length is 36.
I have just changed 35 to 36.
Signed-off-by: Sophie Meynieux <sophie.meynieux@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
There is a flaw in C4::Members::Messaging::GetMessagingPreferences where
the system assumes that every transport will use the same letter. This
is not necessarily true. Even with the default preferences of just
'email' and 'sms', we should be able to have different letters
for each, as one has a maximum character length ( sms ) and one
does not. GetMessagingPreferences currently uses the letter code
of the last result of its query as the letter code for every transport type.
The returned data is a hashref with a key 'transport_types' that is
an array of transport_types this borrower has selected for the given
alert.
This commit modifies GetMessagingPreferences such that the
'transport_types' array is now a hash where the name of the transport
type is a key to the value of the letter code set for that transport
type.
It also modifies code calling GetMessagingPreferences where necessary,
and as a side benefit will correctly get the letter codes for email
and sms if they are defined differently.
http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
In use in production by two libraries: Middletown and Washoe
who give their sign off but don't have git to do so.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Implements support for Talking Tech I-tiva phone notification for OVERDUE, PREDUE and HOLD notifications.
Overdues respect triggers as configured for the patron's branch.
Predue and Holds notifications respect patron's messaging preference choices.
A new column for phone notification is added if the TalkingTechItivaPhoneNotification system preference is turned on.
Record of phone messages being sent to patrons is added to the patron's Notices
tab; notice of success or failure can be retrieved from I-tiva.
See the TalkingTech.README for installation and set-up instructions.
Aside from the control system preference, and the necessary changes to Messaging Preferences
forms to make use of phone notifications, the bulk of the code resides in external
cronjobs.
TalkingTech_itiva_outbound.pl generates the Spec C file to send to I-tiva. Actual transmission
of the file must be handled by the system administrator.
TalkingTech_itiva_inbound.pl processes the received Results file from I-tiva. Getting the
file from I-tiva to Koha is the job of the system administrator, as well.
Both scripts have a --help option with full documentation.
The only necessary change to core Koha behavior is in C4::Letters::EnqueueLetter. The return
value was changed from 0 or 1 (successful addition of letter to message_queue or not), to the actual
insert ID of the letter. This was required by the outbound script to present a unique Transaction ID
for the notice added to the patron's record (so a 'sent' or 'failed' status could be updated). Since
the dbh and sth are not shared, and the last_insert_id() command is table-specific, this should be thread-safe.
No changes are necessary to any parts of Koha, as all usage of EnqueueLetter currently ignores the return value.
To Test:
1. Turn on TalkingTechItivaPhoneNotification system preference
2. Verify that 'phone' is now a valid notification option for patrons on both staff and OPAC side
3. Attempt to set a 'phone' preference for PREDUE or HOLD messaging; attempt should succeed
4. Set up the patron for notices to triggers:
a. include checked out items due in a range of days, including the value set up in their messaging preferences.
b. place several holds, some in position, others waiting for pickup, others in transit.
c. set the patron up to have overdues, overdue by a range of days that includes the delay values for
the patron's branch and categorycode
5. Run TalkingTech_itiva_outbound.pl --type=RESERVE --type=PREOVERDUE --type=OVERDUE --outfile=/tmp/talkingtechtest.csv
The resulting talkingtechtest.csv file should include all the items due on X days (where X is the patrons' preference),
and none of the ones due in other increments. Similarly, overdue messages should be added for each item overdue by a delay
value as configured; overdues of other numbers of days should be ignored. Holds that are waiting pickup or in transit should
have messages; those still pending should not.
Messages should be added to the patron's notices tab for each issue sent. Verify these messages exist, and all Notices
tokens are replaced with appropriate information.
Repeat, this time with 4c making use of the default branch overdue triggers, instead of branch-specific triggers.
To test the inbound script, create a CSV with rows in the format "<<Message_id>>","<<SUCCESS or FAIL>>"
Message ID should correspond to the final column of the talkingtechtest.csv file (the transaction id) for the message.
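A sketch of building such a results file by hand for testing (the message ids below are illustrative; they must match transaction ids from your own talkingtechtest.csv):
  # Two fake result rows: one successful call, one failed call
  printf '"1001","SUCCESS"\n"1002","FAIL"\n' > /tmp/talkingtech_results.csv
Then feed that file to TalkingTech_itiva_inbound.pl (see its --help for the input-file option).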
Primary Authorship: Ian Walls
Additional modifications: Kyle M Hall
http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Tested and in use in production by two public libraries: Middletown
and Washoe. Both have given their sign off, but don't have git to
actually sign off.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
(Severity: 5)
Note: Rebased on master 06/09/2012 by jcamins
Signed-off-by: Joy Nelson <joy@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
One consequence is that the -x and -a options are no longer
mutually exclusive.
Also, because of the way that the GRS-1 SGML filter works, if you're
indexing multiple documents, you can't just wrap them in a document
element, but the DOM filter *requires* it. Consequently, two
new config settings in koha-conf.xml are added to indicate the
Zebra filter in use so that the -x option of rebuild_zebra.pl
knows whether to wrap the exported records or not:
- bib_index_mode (defaults to 'grs1' if not specified)
- auth_index_mode (defaults to 'dom')
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
misc/maintenance/make_zebra_dom_cfg_from_record_abs:
generate a DOM filter Zebra index config from a GRS-1 config
Given a Zebra record.abs file containing a set of index definitions for
Zebra's GRS-1 filter, write an equivalent DOM filter configuration.
To generate the XSLT that is to be used by Zebra, run something like
the following on the output of this utility:
xsltproc ZEBRA_CFG_DIR/xsl/koha-indexdefs-to-zebra.xsl \
biblio-koha-indexdefs.xml \
> ZEBRA_CFG_DIR/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
The above example assumes that the output of the program was named
biblio-koha-indexdefs.xml.
This commit also introduces Koha::Indexer::Utils, a new package for
miscellaneous routines that support Koha's indexing definitions.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Adds the necessary bits to enable DOM indexing for bib
records as an option during installation from source.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Simple command-line client which can authorize itself to Koha,
get a MARC XML record based on a biblio number, and update the record.
This script can also be used as a module using require "koha-svc.pl"
from other scripts which can implement MARC XML creation or parsing.
This is a follow-up version which now uses a Content-type: text/xml
header when using the POST method, to be in sync with the documentation at
http://wiki.koha-community.org/wiki/Koha_/svc/_HTTP_API
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Patch by Judit with a small change to the help wording.
Sponsored by CALYX information essentials.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
svc/import_bib:
* takes a POST request with parameters in the URL and MARC XML as DATA
* pushes the MARC XML onto an import batch queue of type 'webservice'
* returns status and imported record XML
* is a drop-in replacement for svc/new_bib
misc/cronjobs/import_webservice_batch.pl:
* a cron job for processing import batch queues of type 'webservice'
* batches can also be processed through the UI
misc/bin/connexion_import_daemon.pl:
* a daemon that listens for OCLC Connexion requests and is compliant
with OCLC Gateway spec
* takes request with MARC XML
* takes import batch params from a config file and forwards the lot to
svc/import_bib
* returns status
ImportBatches:
* Added new import batch type of 'webservice'
* Changed interface to AddImportBatch() - now it takes a hashref
* Replaced batch_type = 'batch' with
batch_type IN ( 'batch', 'webservice' ) in some SELECTs
Signed-off-by: MJ Ray <mjr@phonecoop.coop>
Adds the ability to suspend reserves. The new system preference
AutoResumeSuspendedHolds enables the ability to set a date for
a suspended hold to automatically be resumed.
When a hold is suspended, it will continue to increase in priority
as the holds above it are fulfilled. If the first holds in line
to be filled are suspended, the first non-suspended hold in line
will be used when an item can fulfill a hold that has been placed.
http://bugs.koha-community.org/show_bug.cgi?id=7641
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Tested with the preference on and off:
1. placed several holds in the staff client
2. suspended some with a date
3. suspended some without a date
4. triggered hold message by checking in for hold with suspensions
5. the suspended hold was skipped as it should be
6. tested suspending holds in the OPAC with and without dates
7. ran the cron to clear suspensions with dates
All the above tests worked as expected. Signing off.
Adds a tool to calculate a static fine. For example: 7 days late = 1€ fixed fine
Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This adds the -dedupbarcode option that allows bulkmarcimport to erase
a barcode but keep the item for any items it finds with duplicate
barcodes.
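An import sketch (hedged: -b and -file are bulkmarcimport's usual bibliographic-import options and are assumed here; only -dedupbarcode is new in this patch):
  # Import records.mrc as bibs, blanking any duplicate barcodes it finds
  misc/migration_tools/bulkmarcimport.pl -b -file records.mrc -dedupbarcode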
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
3 features:
- adds social network information in search results
- adds babeltheque data in opac-detail
- adds social network links in opac-detail too (google+, twitter, mail
and co.)
This patch deals with the -v flag that you can pass to the translate script.
If you run without -v, the process should be silent;
if you run with -v, the process should be verbose.
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
I've refactored your patch to handle verbosity directly via a translator
attribute, rather than with a parameter which has to be sent to each object
method call.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
New sql tables:
- oai_sets: contains the list of sets, described by a spec and a name
- oai_sets_descriptions: contains a list of descriptions for each set
- oai_sets_mappings: conditions on MARC fields that must match for a biblio to be
in a set
- oai_sets_biblios: list of biblionumbers for each set
New admin page: allows configuring sets:
- Creation, deletion, modification of spec, name and descriptions
- Define mappings which will be used for building oai sets
Implements OAI Sets in opac/oai.pl:
- ListSets, ListIdentifiers, ListRecords, GetRecord
New script misc/migration_tools/build_oai_sets.pl:
- Retrieve marcxml from all biblios and test if they belong to defined
sets. The oai_sets_biblios table is then updated accordingly
New system preference OAI-PMH:AutoUpdateSets. If on, update sets
automatically when a biblio is created or updated.
Use OPACBaseURL in oai_dc xslt
When the longoverdue.pl script is run and it marks an item as lost (using
LostItem()), it fails to remove the item from the borrower record. So the
item is marked as lost, but is also still listed as checked out to the
borrower.
This commit adds the command line parameter --mark-returned. If used,
longoverdue.pl will remove lost items from the borrower's record.
Functionality will remain the same if it is not used.
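A cron-style sketch (hedged: the --lost days=lostvalue syntax and --confirm are the script's existing options and are assumed here; only --mark-returned is introduced by this commit):
  # Mark items more than 90 days overdue as lost (lost value 1) and remove them from the borrower's record
  misc/cronjobs/longoverdue.pl --lost 90=1 --confirm --mark-returned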
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
http://bugs.koha-community.org/show_bug.cgi?id=7426
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Branches can have their own version of notices - added branchcode to
letter table.
Support html notices - added is_html to letter table.
Support for borrower attributes in templates.
GetPreparedletter() is the interface for compiling letters (notices).
Sysprefs for notice and slips stylesheets
Added TRANSFERSLIP to the letters
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Squashed patch incorporating all previous patches (there is no functional
change compared to the previous version of this patch, this patch merely
squashes the original patch and follow-up, and rebases on latest master).
=== TL;DR VERSION ===
*** Installation ***
1. Run installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt1
and installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt2
2. Make sure you copy the following files from kohaclone to koha-dev:
etc/zebradb/authorities/etc/bib1.att,
etc/zebradb/marc_defs/marc21/authorities/authority-koha-indexdefs.xml,
etc/zebradb/marc_defs/marc21/authorities/authority-zebra-indexdefs.xsl,
etc/zebradb/marc_defs/marc21/authorities/koha-indexdefs-to-zebra.xsl, and
etc/zebradb/marc_defs/unimarc/authorities/record.abs
3. Run misc/migration_tools/rebuild_zebra.pl -a -r
*** New sysprefs ***
* AutoCreateAuthorities
* CatalogModuleRelink
* LinkerModule
* LinkerOptions
* LinkerRelink
* LinkerKeepStale
*** Important notes ***
You must have rebuild_zebra processing the zebraqueue for bibs when testing this
patch.
=== DESCRIPTION ===
*** Cataloging module ***
* Added an additional box to the authority finder plugin for "Heading match,"
which consults not just the main entry but also See-from and See-also-from
headings.
* With this patch, the automatic authority linking will actually work properly
in the cataloging module. As Owen pointed out while testing the patch,
though, longtime users of Koha will not be expecting that. In keeping with
the principles of least surprise and maximum configurability, a new syspref,
CatalogModuleRelink makes it possible to disable authority relinking in the
cataloging module only (i.e. leaving it enabled for future runs of
link_bibs_to_authorities.pl). Note that though the default behavior matches
the current behavior of Koha, it does not match the intended behavior.
Libraries that want the intended behavior rather than the current behavior
will need to adjust the CatalogModuleRelink syspref.
*** misc/link_bibs_to_authorities.pl ***
Added the following options to the misc/link_bibs_to_authorities.pl script:
--auth-limit Only process those headings that match the authorities
matching the user-specified WHERE clause.
--bib-limit Only process those bib records that match the
user-specified WHERE clause.
--commit Commit the results to the database after every N records
are processed.
--link-report Display a report of all the headings that were processed.
Converted misc/link_bibs_to_authorities.pl to use POD.
Added a detailed report of headings that linked, did not link, and linked
in a "fuzzy" fashion (the exact semantics of fuzzy are up to the individual
linker modules) during the run.
*** C4::Linker ***
Implemented new C4::Linker functionality to make it possible to easily add
custom authority linker algorithms. Currently available linker options are:
* Default: retains the current behavior of only creating links when there is
an exact match to one and only one authority record; if the 'broader_headings'
option is enabled, it will try to link headings to authority records for
broader headings by removing subfields from the end of the heading (NOTE:
test the results before enabling broader_headings in a production system
because its usefulness is very much dependent on individual sites' authority
files)
* First Match: based on Default, creates a link to the *first* authority
record that matches a given heading, even if there is more than one
authority record that matches
* Last Match: based on Default, creates a link to the *last* authority
record that matches a given heading, even if there is more than one record
that matches
The API for linker modules is very simple. All modules should implement the
following two functions:
<get_link ($field)> - return the authid for the authority that should be
linked to the provided MARC::Field object, and a boolean to indicate whether
the match is "fuzzy" (the semantics of "fuzzy" are up to the individual plugin).
In order to handle authority limits, get_link should always end with:
return $self->SUPER::_handle_auth_limit($authid), $fuzzy;
<flip_heading ($field)> - return a MARC::Field object with the heading flipped
to the preferred form. At present this routine is not used, and can be a stub.
Made the linking functionality use the SearchAuthorities in C4::AuthoritiesMarc
rather than SimpleSearch in C4::Search. Once C4::Search has been refactored,
SearchAuthorities should be rewritten to simply call into C4::Search. However,
at this time C4::Search cannot handle authority searching. Also fixed numerous
performance issues in SearchAuthorities and the Linker script:
* Correctly destroy ZOOM recordsets in SearchAuthorities when finished. If left
undestroyed, efficiency appears to approach O(log n^n)
* Add an optional $skipmetadata flag to SearchAuthorities that can be used to
avoid additional calls into Zebra when all that is wanted are authority
records and not statistics about their use
*** New sysprefs ***
* AutoCreateAuthorities - When this and BiblioAddsAuthorities are both turned
on, automatically create authority records for headings that don't have
any authority link when cataloging. When BiblioAddsAuthorities is on and
AutoCreateAuthorities is turned off, do not automatically generate authority
records, but allow the user to enter headings that don't match an existing
authority. When BiblioAddsAuthorities is off, this has no effect.
* CatalogModuleRelink - when turned on, the automatic linker will relink
headings when a record is saved in the cataloging module when LinkerRelink
is turned on, even if the headings were manually linked to a different
authority by the cataloger. When turned off (the default), the automatic
linker will not relink any headings that have already been linked when a
record is saved.
* LinkerModule - Chooses which linker module to use for matching headings
(current options are as described above in the section on linker options:
"Default," "FirstMatch," and "LastMatch")
* LinkerOptions - A pipe-separated list of options to set for the authority
linker (at the moment, the only option available is "broader_headings," which
is described below)
* LinkerRelink - When turned on, the linker will confirm the links for headings
that have previously been linked to an authority record when it runs. When
turned off, any heading with an existing link will be ignored.
* LinkerKeepStale - When turned on, the linker will never *delete* a link to an
authority record, though, depending on the value of LinkerRelink, it may
change the link.
*** Other changes ***
* Cleaned up authorities code by removing unused functions, adding
unimplemented functions, and adding some unit tests.
* This patch also modifies the authority indexing to remove trailing punctuation
from Match indexes.
* Replace the old BiblioAddAuthorities subroutines with calls into the new
C4::Linker routines.
* Add a simple implementation for C4::Heading::UNIMARC. (With thanks to F.
Demians, 2011.01.09) Correct C4::Heading::UNIMARC class loading. Create
biblio tag to authority types data structure at initialization rather than
querying DB.
* Ran perltidy on all changed code.
*** Linker Options ***
Enter "broader_headings" in LinkerOptions. With this option, the linker will
try to match the following heading as follows:
=600 10$aCamins-Esakov, Jared$xCoin collections$vCatalogs$vEarly works to
1800.
First: Camins-Esakov, Jared--Coin collections--Catalogs--Early works to 1800
Next: Camins-Esakov, Jared--Coin collections--Catalogs
Next: Camins-Esakov, Jared--Coin collections
Next: Camins-Esakov, Jared (matches! if a previous attempt had matched, it
would not have tried this)
This is probably relevant only to MARC21 and LCSH, but could potentially be of
great use to libraries that make heavy use of floating subdivisions.
=== TESTING PLAN ===
Note: all of these tests require that you have some authority records,
preferably for headings that actually appear in your bibliographic data. At
least one authority record must contain a "see from" reference (remember which
one contains this, as you'll need it for some of the tests). The number shown
in the "Used in" column in the authority module is populated using Zebra
searches of the bibliographic database, so you *must* have
rebuild_zebra.pl -b -z [-x] running in cron, or manually run it after running
the linker.
*** Testing the Heading match in the cataloging plugin ***
1. Create a new record, and open the cataloging plugin for an
authority-controlled field.
2. Search for an authority by entering the "see from" term in the Heading Match
box
3. Confirm that the appropriate heading shows up
4. Search for an authority by entering the preferred heading into the Main
entry or Main entry ($a only) box (i.e., repeat the procedure you usually
use for cataloging, whatever that may be)
5. Confirm that the appropriate heading shows up
*** Testing the cataloging interface ***
6. Turn off BiblioAddsAuthorities
7. Confirm that you cannot enter text directly in an authority-controlled field
8. Confirm that if you search for a heading using the authority control plugin
the heading is inserted (note, however, that this patch does not AND IS NOT
INTENDED TO fix the bugs in the authority plugin with duplicate subfields;
those are wholly out of scope; this check is for regressions)
9. Turn on BiblioAddsAuthorities and AutoCreateAuthorities
10. Confirm that you can enter text directly into an authority-controlled field,
and if you enter a heading that doesn't currently have an authority record,
an authority record stub is automatically created, and the heading you
entered linked
11. Confirm that if you enter a heading with only a subfield $a that fully
*matches* an existing heading (i.e. the existing heading has only
subfield $a populated), the authid for that heading is inserted into
subfield $9
12. Confirm that if you enter a heading with multiple subfields that *matches*
an existing heading, the authid for that heading is inserted into
subfield $9
13. Turn on BiblioAddsAuthorities and turn off AutoCreateAuthorities
14. Confirm that you can enter text directly into an authority-controlled field,
and if you enter a heading that doesn't currently have an authority record,
an authority record stub is *not* created
15. Confirm that if you enter a heading with only a subfield $a that *matches*
an existing heading, the authid for that heading is inserted into
subfield $9
16. Confirm that if you enter a heading with multiple subfields that *matches*
an existing heading, the authid for that heading is inserted into
subfield $9
17. Create a record and link an authority record to an authorized field using
the authority plugin.
18. Save the record. Ensure that the heading is linked to the appropriate
authority.
19. Open the record. Change the heading manually to something else, leaving
the link. Save the record.
20. Ensure that the heading remains linked to that same authority.
21. Change CatalogModuleRelink to "on."
22. Open the record. Use the authority plugin to link that heading to the
same authority record you did earlier.
23. Save the record. Ensure that the heading is linked to the appropriate
authority.
24. Open the record. Change the heading manually to something else, leaving
the link. Save the record.
25. Ensure that the heading is no longer linked to the old authority record.
*** Testing link_bibs_to_authorities.pl ***
26. Set LinkerModule to "Default," turn on LinkerRelink and
BiblioAddsAuthorities, and turn AutoCreateAuthorities and
LinkerKeepStale off
27. Edit one bib record so that an authority controlled field that has already
been linked (i.e. has data in $9) has a heading that does not match any
authority record in your database
28. Run misc/link_bibs_to_authorities.pl --link-report --verbose --test (you may
want to pipe the output into less or a file, as the result is quite a lot of
information)
29. Look over the report to see if the headings that you have authority records
for report being matched, that the heading you modified in step 2 is
reported as "unlinked," and confirm that no changes were actually made to
the database (to check this, look at the bib record you edited earlier, and
check that the authid in the field you edited hasn't changed)
30. Run misc/link_bibs_to_authorities.pl --link-report --verbose (you may want
to pipe the output into less or a file, as the result is quite a lot of
information)
31. Check that the heading you modified has been unlinked
32. Change the modified heading back to whatever it was, but don't use the
authority control plugin to populate $9
33. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
of the record you've been editing)
34. Confirm that the heading has been linked to the correct authority record
35. Turn LinkerKeepStale on
36. Change that heading to something else
37. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
of the record you've been editing)
38. Confirm that the $9 has not changed
39. Turn LinkerKeepStale off
40. Create two authorities with the same heading
41. Run misc/migration_tools/rebuild_zebra.pl -a -z
42. Enter that heading into the bibliographic record you are working with
43. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
of the record you've been editing)
44. Confirm that the heading has not been linked
45. Change LinkerModule to "FirstMatch"
46. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
of the record you've been editing)
47. Confirm that the heading has been linked to the first authority record it
matches
48. Change LinkerModule to "LastMatch"
49. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
of the record you've been editing)
50. Confirm that the heading has been linked to the second authority record it
matches
51. Run misc/link_bibs_to_authorities.pl --link-report --verbose
--auth-limit="authid=${AUTH}" (replacing ${AUTH} with an authid)
52. Confirm that only that heading is displayed in the report, and only those
bibs with that heading have been changed
If all those things worked, good news! You're ready to sign off on the patch
for bug 7284.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master and squashed follow-up, 16 February 2012
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master, 21 February 2012
Signed-off-by: schuster <dschust1@gmail.com>
To test:
git checkout 3.6.x
cd misc/translator
./tmpl_process3.pl update -i ../../koha-tmpl/opac-tmpl/prog/en/ \
-s ./po/fr-FR-i-opac-t-prog-v-3006000.po -r
po/fr-FR-i-opac-t-prog-v-3006000.po contains broken diacritics for NORMARC
strings for example
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
fixes regression caused by bug 6752
This patch reimplements a feature that is on biblibre/master for Koha-community/master.
It adds the following parameters:
* offset = the offset of record. Say 1000 to start rebuilding at the 1000th record of your database
* length = how many records to export. Say 400 to export only 400 records
* where = add a where clause to rebuild only a given itemtype, or anything you want to filter on
Another improvement resulting from the offset & length limits is the rebuild_zebra_sliced.zsh script
that will be submitted in another patch.
rebuild_zebra_sliced will slice your whole database into small chunks and, if something goes wrong for a given slice, will slice that slice and repeat, until it reaches a slice size of 1, showing which record is wrong in your database.
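A hedged example of a sliced export run (option spellings are assumed from the parameter names above; check the script's --help):
  # Process only 400 biblio records, starting at offset 1000
  misc/migration_tools/rebuild_zebra.pl -b -x --offset 1000 --length 400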
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Removed mention of -l option for limiting number of items exported, as requested
by QA manager. This can be re-added in a later patch.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
The scripts run with the caveat that you must specify the path to SIPconfig.xml. The followup previously attached should take care of that issue.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
- Calculates the update date based on the upper age limit defined in the patron categories.
- Allows libraries to work on all branches or only one.
- Allows libraries to specify which Adult patron category to update child categories to.
- Allows libraries to specify a single Child patron category to update to an adult category.
- Has a test mode to display what transforms would be done on the database without executing the changes.
Includes improved help, copyright statement, and uses warnings. Also incorporates Paul's suggestions regarding --help and --man, changes -fromcat and -tocat to -f and -t, and removes a redundant update to categorycode (per M. deRooy).
To test:
Create two patron categories, a child and an adult category. Make sure they
have an upper age limit.
Create or modify some patrons in multiple branches that fall into the category
of "my birthdate is less than or equal to today's date minus the upper age
limit"
1. Run the script with no flags - nothing should happen, it will suggest you try the --help flag.
2. Run the script with the --help flag - you should see the help
3. Run the script with the -f=<child category> -t=<adult category> -v -n - should show you results from all branches but take no action and tell you what its computations are.
4. Run the script with the -f=<child category> -t=<adult category> -b=<branchcode> -v -n - should show you results from your specified branch, but take no action and tell you what its computations are.
5. Run the script with the -f=<child category> -t=<adult category> -v -b=<branchcode> - should show you the computations and tell you how many patrons were modified in your single branch. It will not show you the information on which patrons were updated.
6. Run the script with the -f=<child category> -t=<adult category> -v - should show you the computations and tell you how many patrons were modified across all branches.
7. Run the script without the -v flag, if you care what the non-verbose output looks like.
Fixed in this revision: Known limitation - if you give it an unknown tocat, it will fail with a rather ugly error.
Minor changes to the commit message to reflect new longopts (which I missed the last time)
There is more this script could do, please feel free to take it and run.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Bug 7157 : Follow up, fixing FSF address
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Use :encoding(UTF-8) rather than :utf8 for stricter
encoding.
Marking output as ':utf8' only flags the data as utf8;
using :encoding(UTF-8) also checks it as valid UTF-8.
See binmode in perlfunc for more details.
In accordance with the robustness principle, input
filehandles have not been changed, as code may make
the undocumented assumption that invalid UTF-8 is present
in the input.
Fixes errors reported by t/00-testcritic.t
Where feasible, some filehandles have been made lexical rather than
reusing global filehandle vars.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Currently, the -v option resets Zebra log output to default system values.
This produces the amount of logging specified in the system defaults, which is usually
too low for debugging.
This change explicitly forces all Zebra log output, which creates much more
chatter, so it only triggers at verbosity level 2.
Test scenario:
1. pick koha site to reindex
2. use -v -v options to rebuild_zebra.pl to see additional output
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Verified help corrections and loglevel 2 output vs. loglevel 1 output. No issues found.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch lets cleanup_database also purge older records from the (five) import tables and the action_logs table.
Two new command line parameters are introduced: --import and --logs.
If no number of days is specified for --zebraqueue, --import or --logs, it defaults to 30, 60, and 180 days respectively.
I did not add a default for --sessdays, because this parameter cannot be seen separately from parameter --sessions.
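Example invocation (a sketch; --sessions and --zebraqueue are the script's existing options, --import and --logs are the ones added here, with the day values matching the defaults described above):
  misc/cronjobs/cleanup_database.pl --sessions --zebraqueue 30 --import 60 --logs 180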
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Adds new parameters and code, does not change existing behaviour
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch makes it possible to add fields from the issues table to overdue notices.
Template used for testing:
<item>"<<biblio.title>>" by <<biblio.author>>, <<items.itemcallnumber>>, Barcode: <<items.barcode>> , Checkout date: <<issues.issuedate>>, Due date: <<issues.date_due>> Fine: <fine>GBP</fine> Checkout date from items: <<items.onloan>></item>
Possible improvements:
- Dates are not formatted according to dateformat system preference
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch fixes the SQL query that produces the list of borrowers.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
The selection of items to be listed in an overdue notice included
both limits (upper and lower), so items with an overdue period equal
to a limit appeared on both notices. This patch fixes this by
including the lower limit and excluding the upper limit in the selection.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Adds --maxdays command line flag to longoverdue.pl to allow the user to specify
their own $endrange value. Because sometimes 366 isn't enough!
Also adds help documentation for both --quiet and --maxdays params
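Example (a sketch; the --lost and --confirm options are the script's existing ones and are assumed here, only --maxdays and the --quiet documentation are new):
  # Consider items up to 730 days overdue instead of stopping at the default endrange
  misc/cronjobs/longoverdue.pl --lost 366=1 --maxdays 730 --confirm --quiet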
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Simple change, works well
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
notifyMailsOp.pl is deprecated: it was written by and for Ouest Provence (thus the OP) and is not used anymore.
It's probably not working anymore.
Removing the script.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Checked that other values for the separator are OK (space, semicolon, ... see the "delimiter" syspref).
Pass charge_fee = yes wherever LostItem() is called, which effectively
means that there's no change.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Sometimes zebra needs a tmp dir in order to work. This ensures that it
is created both by koha-create-dirs in the packages, and by
rebuild_zebra when it runs.
--
tested ok, signing off
Signed-off-by: Mason James <mtj@kohaaloha.com>
In advanced_notices.pl you can return the number of due items using the <<count>>
flag.
If you use this flag in overdue_notices, it does not work; no number is
displayed.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This benchmark_staff.pl script is based on the previous benchmark_circulation.pl script.
As it does not test only circulation, the renaming was necessary.
The script has many enhancements compared to the benchmark_circulation.pl one.
The benchmark_staff will run a benchmark on the following pages:
* mainpage.pl
* catalogue/detail.pl
* catalogue/search.pl
* members/member.pl (search on a name)
* members/member.pl (search on a 1st letter)
* circulation/circulation.pl and return.pl (check-out and check-in)
* all those steps at the same time
Run the script without any parameters to get the syntax.
- adds command line options for encoding, defaulting to utf8
- options match the options available in the stage MARC records form
of the staff interface
- activates warnings
- adds copyright statement
To test:
Import records with diacritics using the stage_biblios_file.pl script.
Records can be imported into the catalog using the staff interface
or the commit_biblios_file.pl script.
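A staging sketch (hedged: the --file and --encoding option spellings are assumptions for illustration, not confirmed by this commit message; check the script's --help):
  # Stage a file of records, explicitly passing the default utf8 encoding
  misc/stage_biblios_file.pl --file records_with_diacritics.mrc --encoding utf8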
Signed-off-by: Ulrich Kleiber <ulrich.kleiber@bsz-bw.de>
Successfully tested with default encoding utf8.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Some code coming from BibLibre has been lost in the process of inclusion in
3.4. The result is that fine in days does not work at all (you can set up rules,
but it does nothing).
Steps to reproduce:
- Koha > Admin > circ rules > set a 1 day fine for every day of overdue for the default
rule
- Issue a book with a return date last week
- Check in the book => no debarment is set
The following patch will fix all of those problems by:
* updating borrowers.debarred to a date field (instead of tinyint). It contains
the limit of the debarment
* changing the API of DebarMember and UpdateBorrowerDebarred to pass a date
* displaying debarrdate where applicable. Note that a debarrdate of 31/12/9999 is
considered as unlimited and not displayed
* adding a debarrcomment, useful to explain why a patron is debarred (this is
independent from the debarrdate changes and can be used when placing an unlimited
debarment too)
[2011-05-12] F. Demians. It works as described. And I can confirm this
functionality has been eagerly awaited by French libraries for a year. Thanks
BibLibre for the good work and for contributing this code.
Bug 6328 Followup--update DB structure
Thanks Katrin.
Bug 6328: make comment a textbox / fix debar by notice trigger
Debarring by notice triggers was broken, because the new function
expects a date as the second parameter.
The comment field in patron account details was a very long text field.
Patch changes it to be a textbox instead.
Bug 6328: Lift debarment leaves patron account
'Lift debarment' redirects to an empty circulation page.
BZ6328 follow-up 3
Fixes comment 23 from Fernando L. Canizo: when the patron was debarred and the debarment removed,
he still could not check out.
The changes in IsMemberBlocked (that were on biblibre/master) were lost somewhere.
The sub was still checking old_issues instead of calling CheckBorrowerDebarred
to get a debardate if applicable.
Note: this bug appeared only if you had issuing rules defined for itemtype/categorycode/branch.
It seemed to work if you had only default rules. That's probably why it hadn't been spotted before.
BZ6328 follow-up 4
Comments from Zeno Tajoli: the patch is OK and I sign off on it. Two little changes made to
installer/data/mysql/kohastructure.sql and installer/data/mysql/updatedatabase.pl.
Signed-off-by: koha <koha@kohabase.localdomain>
This patch allows properly handling items containing extended characters and
sending valid XML records to zebraidx.
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This is a fairly hacky solution; a counter-patch would be more than
welcome.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Updated, translated and installed German po files after applying this patch.
No problems found.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Improving the cleanup_database.pl script in two aspects:
1) In some cases CGI::Session seems to place quotes around the atime and ctime
data in the a_session field. Two regexps now take this into account.
2) If the --sessdays parameter is used, the --sessions parameter is now
implicitly enabled too.
With thanks to Ian Bays and Tom Hanstra.
Signed-off-by: Ian Bays <ian.bays@ptfs-europe.com>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Adds purging completed entries from need_merge_authorities table.
If you set dontmerge to ON, you need to periodically remove records.
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This little script, given a --days numeric parameter, will anonymise checkouts
made more than that many days ago. Useful for sites that want to automatically do this on
a periodic (cronnable) basis.
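A cron-style sketch (hedged: the script path is a guess for illustration; only the --days parameter is given by this commit message):
  # Anonymise circulation history older than 90 days
  misc/cronjobs/batch_anonymise.pl --days 90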
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Pref MergeAuthoritiesOnUpdate does not exist; should be dontmerge
(AuthoritiesMarc.pm).
Instead of the modified_authorities folder, a table is now introduced for this
purpose: need_merge_authorities. This eliminates several permissions and
security issues. This change applies to AuthoritiesMarc.pm and
merge_authority.pl.
POD lines added for ModAuthority. Deprecated parameter $merge removed.
Test this patch by applying the db revision first from the second patch.
August 4, 2011: Rebased.
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Thanks Marcel. It works as advertised. Both modes are functional
(back):
- Immediate with dontmerge=0: After modifying an authority record, its
linked biblios are immediately modified. This isn't the case in 3.4.5.
- Delayed with dontmerge=1: After modifying an authority record, its
linked biblios are not modified. But an entry is added to the new
need_merge_authorities table and the 'merge_authorities.pl -b' script
updates the biblios.
Comment: need_merge_authorities, like zebraqueue, should be cleared
from time to time.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Call LostItem() whenever an item is lost.
LostItem() has a new arg: mark returned.
Disabled Lost Status on catalogue item edit.
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
For follow up we need to explain how to hide the 952$1 (lost) from
the framework by putting it in the 'ignore' tab.
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch fixes an issue whereby biblios with many items (often > 500) would index,
but not the biblionumber itself, resulting in search results with a) inaccurate item counts
and b) no biblionumber to use in the link to the details page. This is due to Net::Z3950::ZOOM not providing
a mechanism for specifying different connection attributes; the maximumRecordSize ZOOM connection attribute,
if not specified, defaults to 1MB, which is less than the size of a MARC record with many, many 952 fields. Since
it is unlikely we can fix Net::Z3950::ZOOM in a timely fashion, this patch aims to build a workaround on the Koha end.
This patch changes EmbedItemsInMarcBiblio to use append_fields instead of insert_ordered_fields,
so the 999$c will come before the item records. It's VERY unlikely we will encounter more than 1MB of biblio-level MARC
content, as this would break the ISO-2709 standard by a large factor.
To this end, it also moves the fix_biblio_ids portion of get_corrected_marc_record out of rebuild_zebra.pl,
and makes it a part of GetMarcBiblio (right before EmbedItemsInMarcBiblio, so the 952s still come last). fix_biblio_ids
is kept as a subroutine for the deletion portion of rebuild_zebra.pl, which still uses it.
It also uses the subroutine parameter in GetMarcBiblio to do the EmbedItemsInMarcBiblio action, rather than having
rebuild_zebra.pl perform it on the itemless record returned from GetMarcBiblio. Simpler and cleaner that way.
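A small MARC::Record sketch of the ordering change described above (the field
values are made-up; the use of append_fields is the point):
use MARC::Record;
use MARC::Field;
my $record = MARC::Record->new();
$record->append_fields(
    MARC::Field->new( '245', '0', '0', a => 'A Short History of Western Civilization' ),
    MARC::Field->new( '999', ' ', ' ', c => '1234', d => '1234' ),
);
# append_fields() adds the item (952) fields at the end of the record, so the
# 999$c biblionumber stays near the start even when hundreds of 952s follow.
$record->append_fields(
    MARC::Field->new( '952', ' ', ' ', p => '08030003', o => '909.09821 H2451' ),
);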
To verify bug issue:
1. Find a biblio with over 700 items (or enough that the resulting MARCXML is greater than 1MB)
2. search for this biblio (in a search that would return multiple results, not just this title). You should get the title in
the results list
3. attempt to click the link to this biblio's details page; the biblionumber should be blank, leading to a 404
To test solution:
1. Apply patch
2. modify the biblio slightly (click the 005 for example) and save
OR manually add the biblio to zebraqueue for reindexing
3. after rebuild_zebra.pl -z -b -x runs, use the same search as above. The title should still appear.
4. click the link, and find yourself on the biblio detail page as desired
Signed-off-by: D Ruth Bavousett <ruth@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Display links to parent biblios, show linked items in holdings, allow holds on
linked items. This uses MARC to maintain relationships.
Sponsored by the Mississippi Department of Archives and History and RapidRadio
Solution. Originally developed by Savitra Sirohi and Amit Gupta at OSSLabs, with
UNIMARC support added by Zeno Tajoli. Commits squashed and merge conflicts
resolved by Chris Cormack from Catalyst. Respect for NORMARC and some small
framework portability fixes made by Jared Camins-Esakov of C & P Bibliography
Services.
IMPORTANT NOTE: A bug in the 773 coding for MARC21 was corrected from the
original OSS Labs code. The 773s generated by the pre-release code did not have
the first indicator set to '0', which means that they were not supposed to
display. Going forward, the first indicator will be set correctly, but existing
records created with this code will no longer appear (they appeared before only
due to another bug). To correct this, you could globally (or, to make sure you
only modify records created with the Analytics tool, for records with 773$0)
change the first indicator of the 773 from blank to '0'.
== Background ==
An analytic record for an item is a more detailed, monographic biblio for an
item attached to a serial record. This is often used for special issues of a
journal that are released as books in their own right (assigned an ISBN, as well as an
ISSN/volume/issue). It is important for researchers to be able to search for
these items both as issues of the serial and as monographs. It is equally
important for the library not to have duplicate item records for the item in
question that have to be kept synchronized.
== Establishing relationships ==
Analytical records are connected to items belonging to parent or host
bibliographic records. This can be accomplished by:
* From an analytical bibliographic record, by linking to a host item, providing
the item barcode as input
* From a host item, by using the "analyze" option; this creates a new empty
bibliographic record with field 773 (MARC21) populated
* Running a new CLI script that establishes a relationship between the
analytical record and the host item identified by the barcode in the
analytical record's 773$o (MARC21)
== Connecting Records ==
The relationships are maintained in the MARC records; we have not used database
tables at all.
== MARC Representation ==
In MARC21/NORMARC we have used:
* 773$9 to store the Koha item number of the host item
* 773$0 to store the Koha biblio number of the host bibliographic record
The above fields are used to display the relationships in various screens in the
OPAC and the staff interface. Additionally, when populating field 773 with the host
item's details, we have used the following MARC21 mapping:
* 'a' <= 100/110/111 $a (author main)
* 'b' <= 250$a (edition)
* 'd' <= 260$a, 260$b, 260$c (place, publisher, year)
* 'o' <= barcode
* 't' <= 245$a (title)
* 'w' <= (003)001 --> if no 001 is available, we can populate biblionumber
* 'x' <= 022$a (issn)
* 'z' <= 020$a (isbn)
In UNIMARC, this code uses:
* 461$9 to store the Koha item number of the host item
* 461$0 to store the Koha biblio number of the host bibliographic record
When populating field 461 in UNIMARC, the following mapping is used:
* 't' <= 200$a (title)
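As an illustration of the MARC21 mapping above, a 773 host entry built with
MARC::Field might look like this (all values are made-up examples; note the
first indicator set to '0' as discussed earlier):
use MARC::Field;
my $host_field = MARC::Field->new( '773', '0', ' ',
    a => 'Harrison, John B',                         # 100$a of the host
    t => 'A Short History of Western Civilization',  # 245$a of the host
    d => 'New York : Knopf, 1955',                   # 260 $a, $b, $c
    o => '08030003',                                 # host item barcode
    w => '(OSt)4242',                                # (003)001 control number
    9 => '1001',                                     # Koha itemnumber of the host item
    0 => '4242',                                     # Koha biblionumber of the host record
);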
== Treatment of Holds ==
A key requirement was to allow holds to be placed on host items from the
analytical record. We have accomplished this by allowing holds on specific
copies only. Biblio level holds are not allowed. This ensures that holds are
placed on specific items that are relevant to the analytical record.
== Deleting host items with linked analytical records ==
As we have not used database tables to maintain relationships, we had to use
search to find out whether any linked analytical records are present. If one or more
analytical records are present, we do not allow deletion of items. This is similar to
what we see when we try to delete authority records.
== Importing analytical records ==
Analytical records can be imported using bulkmarcimport or the GUI tools. The
new CLI script can be executed after the import to establish relationships with
host items. The script will establish relationships using the host item's
barcode; the barcode must be present in 773$o of the analytical record.
== What if there are two or more copies of the host item? ==
The current design will require that there be two host (773) fields, one for
each copy.
== What if there is no barcode available for the host item? ==
It is still possible to establish a relationship by populating 773$9 with the
host's item number. However, the CLI script uses the barcode in 773$o to establish
relationships, so it won't work where barcodes are unavailable. Likewise, the option
of establishing a relationship from an analytical record to a host item by
providing the barcode as input will not be available.
Commits that added the following features were squashed by Chris Cormack (this
is not a list of every commit):
* Display links to host records from biblio detail screens
* Support for UNIMARC, respecting the system preference 'marcflavor'
* Support holds from the OPAC
* Ability to link to items belonging to host records from an analytical record
* Display items belonging to host records in the moredetail page
* Ability to edit items belonging to host records, also ability to delink from
them
* Move get host items code into a C4 routine, also calling the new routine in
related perl scripts
* Move host field population to a C4 routine, all changes in pl files to call
new routine
* Allow only specific copy holds for analytical records plus changes to use new
C4 routines
* Support for holds on items linked via host records
* Storing bibnumber and itemnumber in subfields 0 and 9, plus other mapping
changes
* New command line script that establishes relationships between analytical
records and host items and bibs. The script looks for host field (MARC21 773)
in records, and based on barcode in subfield 'o' populates host bibnumber in
subfield '0' and host itemnumber in subfield '9'. The script can be run after
an import of analytical records, it can also be run in the crontab to maintain
the relationships
* Ability to create analytical records from items, to view linked analytics, and
prevent deletion of items that have linked analytics
* New template for catalogue/detail.pl (NOTE: not a new template file, just a
new way of displaying analytics), template displays linked analytics and
allows creation of analytical records
* New zebra index for item number in host fields. This index will be used to
display links to analytical records from host records
* Display title of host record instead of the phrase host record
* Using detail.tmpl for analytics tab instead of a new template file
* Improved qualification info preparation in Prephostmarcfield
* Check for linked analytics before deleting item
* Display link to host record and more meaningful anchor text for edit item link
* Analytical record: Unimarc index in record.abs and help in
create_analytical_rel.pl
* Adding a sys pref that controls display of options to create analytical
relationships
* Add host entry in XSLT stylesheet in staff item detail
* Added host record support to OPAC detail XSLT
* Adding 773$0 and 773$9 to all frameworks
* Adding 773 subfields 0 and 9 to default marc framework via updatedatabase.pl
* Display 'create analytics' and 'used in' links in catalog detail
* Fixed problem where analytical records were not showing in OPAC search results
because GetMarcBiblio now needs a flag to add item records
* Fixed problem where analytics count was set to 1 for all records, not just
those with analytics
* Fixed catalogue detail page not to show analytics counts if count is 0
Conflicts:
installer/data/mysql/updatedatabase.pl
koha-tmpl/intranet-tmpl/prog/en/modules/cataloguing/addbiblio.tt
kohaversion.pl
Co-author: Savitra Sirohi <savitra.sirohi@osslabs.biz>
Co-author: Zeno Tajoli <tajoli@cilea.it>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Note: this script really needs a rewrite, but this patch does fix up the
things it's supposed to fix up.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This will re-add any leading spaces, so formatting is not messed up.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
- all examples from the bug report are fixed now
- verified system preferences are still translated
- verified xslt displays are still translated
- verified javascript alerts are still translated
- verified switching languages works
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
This patch fixes an error with bugfix 5236; any item-table information in the PREDUE letter
was being parsed with biblionumber as the key, instead of itemnumber. Unless itemnumber == biblionumber,
this will ALWAYS return the wrong information.
I've moved the item table parsing line to within the if ($itemnumber) conditional check, and changed
the key to use the itemnumber instead of the biblionumber.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch solves the situation where news is in another language than
the Koha interface, AND makes sure the themelanguage routine is always called
the same way in order to prevent mixed display.
It also fixes a bug related to language preselection based on the web
browser's preferred language.
September 9: Adjusted with input from Frederic Demians.
September 10: Avoid circular dependency, as pointed out by Chris Cormack.
Template-related functions are moved from C4::Output to C4::Templates.
Signed-off-by: Alex Arnaud <alex.arnaud@biblibre.com>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This test validates Template Toolkit (TT) Koha files.
For the time being a single validation is done: test whether TT files
contain a TT directive within an HTML tag. For example:
<li[% IF
This kind of construction MUST be avoided because it breaks the Koha
translation process.
This patch also transforms translation-specific modules into C4 modules
so that they can be used in test cases:
C4::TTParser
C4::TmplToken
C4::TmplTokenType
This patch is a Perl adaptation of a Haskell script from Frère Sébastien
Marie.
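A minimal illustration of the kind of construction the test rejects (this is a
sketch, not the actual xt/tt_valid.t implementation):
my @lines = (
    '<li[% IF ( loop.odd ) %] class="odd"[% END %]>',   # bad: TT directive inside the tag
    '[% IF ( loop.odd ) %]<li class="odd">[% END %]',   # fine: directive outside the tag
);
for my $line (@lines) {
    print "TT directive inside an HTML tag: $line\n" if $line =~ /<[^>]*\[%/;
}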
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Notes on testing:
- translate install de-DE - worked ok
- translate update de-DE > translate install de-DE - worked ok
- running the test xt/tt_valid.t - worked ok and pointed out lots of problems.
Found no problems.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Adding support for including fields from the Issues table in advanced due notices.
This is primarily to allow the inclusion of the due date for each item in the
advanced due notice, but will allow the inclusion of any field from the ISSUES
table.
This also adds code to exclude timestamp fields as these are irrelevant to the
end user in this context.
Note: Documentation should be updated to reflect the availability of the additional
fields in all circulation notices.
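For example, a notice definition could now include per-item due dates with
something like the following hypothetical template text (the exact fields
available depend on the letter being edited):
The following item is due on <<issues.date_due>>: <<biblio.title>>, Barcode: <<items.barcode>>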
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
To test:
Verify that there is a listed job for purge_suggestions.pl in the crontab.example
Bug 6478 - adding --days support and help verbiage to the cronjob.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This both adds a bit of a failsafe to get_raw_biblio, and prevents
records that have been deleted from being updated by the same instance
of rebuild_zebra.
Minor amendment to remove duplication of 6433
Signed-off-by: MJ Ray <mjr@phonecoop.coop>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Display a warning for strings that don't have the same count of %s placeholders
as the English original strings. Don't warn for untranslated strings and
'fuzzy' strings because those strings are not installed, and it is the
translator's responsibility to examine them.
Based on Frère Sébastien Marie's work.
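An illustrative sketch of the check (not the literal translator-tool code):
count the %s placeholders in the original and in the translation and warn on a
mismatch.
my $msgid  = 'Upload %s of %s';
my $msgstr = 'Téléverser %s';   # translation is missing one %s
my $en_count = () = $msgid  =~ /%s/g;
my $tr_count = () = $msgstr =~ /%s/g;
warn "placeholder count mismatch for: $msgid\n" if $en_count != $tr_count;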
Signed-off-by: Frère Sébastien Marie <semarie-koha@latrappe.fr>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This squash commit takes the two patches from Srdjan and adds a minor fix to work with
template toolkit variable renames.
Squashed commit of the following:
commit 2cab669d1fd072600942e1e6fbf3378944255a68
Author: Ian Walls <ian.walls@bywatersolutions.com>
Date: Thu Jun 2 14:08:40 2011 -0400
Bug 5929: Fix advanced_notices to use new template-toolkit compatible message names
Uses 'item_due' and 'advance_notice' for advance notices names; letters do not send otherwise
commit caded04702d5eebd0f63a3b93cdddce28257f092
Author: Srdjan Jankovic <srdjan@catalyst.net.nz>
Date: Tue Mar 29 12:38:49 2011 +1300
wr77490 (bug 5929): removed debugging leftover
commit 1944de0de40f937b1d8748500f24a119390db3f0
Author: Srdjan Jankovic <srdjan@catalyst.net.nz>
Date: Tue Mar 22 19:05:23 2011 +1300
wr77490 (bug 5929): use branch email in preference for due notices
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch adds a progress indicator to remove_items_from_biblioitems.pl, with
the user option of silencing it with --silent.
Signed-off-by: Liz Rea <lrea@nekls.org>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
For the Translation Manager (myself...). A translation installation may fail because
a translator hasn't entered the string parameters (%s) properly. With this patch, the
message ID is displayed, not only the parameter.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Fixes bug where a bib record imported by bulkmarcimport.pl
could become unindexable by ensuring that ModBiblioMarc()
is always called by bulkmarcimport.pl to finalize saving the
bib record (as it was initially created by AddBiblio with the
defer_marc_save option).
Also introduces a utility routine, C4::Biblio::_strip_item_fields.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Now that item data is inserted into the MARC bib record
at the point of indexing, sync_items_in_marc_bib.pl is
no longer needed.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Claire Hernandez <claire.hernandez@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Adds a new routine, C4::Biblio::EmbedItemsInMarcBiblio, to
embed the items in the bib record when necessary:
* cataloging/additem.pl
* rebuild_zebra.pl
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Claire Hernandez <claire.hernandez@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
* add copyright statement
* make --run switch work; this is now required to cause
the update to run
* improve help text
* make remove_items_from_biblioitems.pl executable
Signed-off-by: Galen Charlton <gmcharlt@esilibrary.com>
Signed-off-by: Claire Hernandez <claire.hernandez@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This is a squash of four patches by Henri-Damien Laurent
starting work on removing the copy of item record information
in the 9XX field of bibliographic records. The reason
for doing this is primarily to improve performance, in particular,
the expense of having to add/modify the bib record whenever an
item changes. Now, whenever an item changes, the bib record is
put in the queue to be reindexed; when the bib is indexed, the 9XX
fields are inserted into the version of the bib that Zebra indexes.
Since rebuild_zebra.pl runs in a separate process, the processing of the
bib record will not delay (e.g.) circulation.
As part of upgrading to 3.4, the following batch script should be run:
misc/maintenance/remove_items_from_biblioitems.pl --run
This should be followed by a complete reindexing of the bib records, e.g.,
misc/migration_tools/rebuild_zebra.pl -b -r
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Claire Hernandez <claire.hernandez@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
The import script shouldn't remove information present in incoming biblio
records. With this patch, by default, ISBNs are not cleared anymore.
[2011.04.12] Rebased on HEAD
DOCUMENTATION: There is a new parameter --isbn|--noisbn
Signed-off-by: Colin Campbell <colin.campbell@ptfs-europe.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch adds support for using a gmail account as an SMTP server.
It includes a basic HOWTO.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Remove some unnecessary checks when a check of the error is
sufficient. Make the order in some cases more logical.
Should remove some possibilities of runtime warning noise.
Although some calls belong to the 'nothing could
ever go wrong' school, some warnings have been added.
Signed-off-by: Christophe Croullebois <christophe.croullebois@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch changes the holds queue build process in order to
require that items not be damaged in order to appear in the
holds queue report.
Revision adds a check for the AllowHoldsOnDamagedItems preference to
determine whether a damaged item should be included in the holds
queue report.
Signed-off-by: Christophe Croullebois <christophe.croullebois@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch creates a --quiet flag for longoverdue that will squelch
the summary at the end of the run. It also silences an unnecessary
warn in C4/Accounts.pm.
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Add scripts that call ModBiblio and ModItem on all or some of the records in a
given catalog. For use when an upgrade changes the behavior of ModBiblio or
ModItem, and the change needs to be retroactively applied to records already in
the system. Usage is as follows:
misc/maintenance/touch_all_[biblios|items].pl [-v] [--where=STRING]
When invoked with a --where argument, the scripts will only modify those biblios
or items that match.
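A hypothetical invocation (the WHERE clause shown is only an example):
misc/maintenance/touch_all_items.pl -v --where="homebranch='CPL'"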
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Variable names for systempreferences are now case sensitive.
Changing check_sysprefs in relation to this change.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This makes sending reports via e-mail with runreport.pl work properly. It also
adds a --format option to allow the user to select between text, html, csv, and
tsv. At the moment text is not implemented, and falls back to tsv, but that is
still more readable than the HTML that used to be produced.
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch is a new script that deletes suggestions that have been processed by librarians.
It takes one argument: the number of days to keep suggestions. Suggestions older than TODAY - $days will be deleted.
This script should be used to purge suggestions and clean the table in the intranet.
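A hypothetical invocation keeping 30 days of processed suggestions (the path
and argument form are assumptions, based on the description above and the
--days follow-up mentioned earlier):
misc/cronjobs/purge_suggestions.pl --days 30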
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
It is now possible to specify a command line argument --since so that the
borrowers-force-messaging-defaults script only changes patrons created starting
on a certain day. If the optional argument is not specified, the script applies
to all borrowers.
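A hypothetical invocation limiting the change to recently created patrons (the
script path and the date format are assumptions):
misc/maintenance/borrowers-force-messaging-defaults --since 2011-09-01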
Signed-off-by: Jared Camins-Esakov <jcamins@bywatersolutions.com>
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
If the EnhancedMessagingPreferences syspref is enabled after borrowers
have been created in the DB, those borrowers won't have the default messaging
transport preferences defined for their borrower category, or even any
transport preferences at all. So you would have to
modify each borrower one by one if you would like to send them a 'Hold
Filled' notice, for example.
I propose this script to create transport preferences for all existing
borrowers and set them to default values defined for the category they
belong to.
[DOC] Should be documented somewhere.
Signed-off-by: Jared Camins-Esakov <jcamins@bywatersolutions.com>
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
New script misc/admin/koha-preferences - Allows getting and setting of
sysprefs one-at-a-time or in bulk. More info:
misc/admin/koha-preferences help
or:
misc/admin/koha-preferences manual
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Reimplements support for -r, as well as for -reset
Signed-off-by: D Ruth Bavousett <ruth@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
<items.content> in overdue notices prints issuedate instead of duedate by default.
This patch changes the default to issues.date_due.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
In the opac/staff templates .po files, we have comment lines contextualizing
extracted text in templates. Paths to templates are absolute. For example, we can
have:
#: /home/katrin/kohaclone/koha-tmpl/intranet-tmpl/prog/en/...
modules/cataloguing/addbiblio.tmpl:585
The first part of the pathname is useless. With this patch, we just keep the
relative path to the template from the Koha template main directory. The above example
becomes:
#: intranet-tmpl/prog/en/modules/cataloguing/addbiblio.tmpl:585
To be applied on [3.2]
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
[1] Update all .po files at once with this command:
translate update
[2] For sysprefs, quoted text wasn't properly retrieved from the .po file and so
quoted strings weren't translatable.
[3] The install process (translate -p install de-DE) was rewriting the syspref
.po file, which isn't required anymore.
MUST be applied to [3.2] to get all sysprefs properly translated.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
If the zebra server directories don't exist, zebra will spit the dummy.
This makes rebuild_zebra.pl smart enough to create them if they're not
there. If that fails, it'll scream loudly so you know zebra isn't
reindexing.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Adds a conditional around the opening and closing of STYLESHEET, testing on whether the $stylesheet variable is set
or not.
Signed-off-by: Nicole Engard <nengard@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This was due to passing the biblionumber to GetFines rather than the itemnumber.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
More podchecker cleanups to eliminate warnings / errors
Signed-off-by: Andrew Elwell <Andrew.Elwell@gmail.com>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Working through the master branch to eliminate all
podchecker warnings/errors.
Actual improvement to the quality of the POD will
come later (hopefully with the assistance of others).
Signed-off-by: Andrew Elwell <Andrew.Elwell@gmail.com>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Thx to Brooke for helping with the wiki.
This is the last patch.
We will have to change some more links, since translate.koha.org was moved.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Includes a bit of cleanup of the enhancement patch
for bug 5074 - adding comments about old and new
behavior isn't necessary for such small changes.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Currently, the misc/cronjobs script cleanup_database truncates the session table (deleting all records, including active sessions).
With an additional parameter sessdays, this behavior could be changed or (perhaps better) extended. If the parameter sessdays is passed along with a number of days, the script only deletes older session records. This is accomplished by examining the values of lasttime, atime or ctime in the record.
So, calling the script like:
./cleanup_database.pl -v -sessions -sessdays 7
will only delete sessions records older than 7 days. The "old style" call
./cleanup_database.pl -v -sessions
still works too and truncates the table as before.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
tmpl_process is patched to properly handle specific XML directives.
UNIMARC XSL files are modified to gain knowledge of HTML entities,
which isn't the case by default. It may be necessary to do the same
thing for MARC21 XSL.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
By default the packages now set up the cron jobs to handle things like
overdues and email etc. By default, email is off; 'koha-email-enable'
and 'koha-email-disable' can manage this.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This prevents it from leaving files lying around in /tmp.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
my $formatted_fine = currency_format("$1", "$fine", FMT_SYMBOL);
is already utf-8.
Resend with additional change, removing 2 lines (no strict, use strict) from the code.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
-csv was not working as advertised.
This feature is now working as intended.
Signed-off-by: Henri-Damien LAURENT <henridamien.laurent@biblibre.com>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
The wrong variable was used to select the number of days-until-due; this fixes it to use
the borrower's preference setting.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
* 'bug2505_patches' of git://git.catalyst.net.nz/koha: (24 commits)
Bug 2505 - use strict and warnings in sax_parser_test
Bug 2505 - enable warnings for link_bibs_to_authorities
Bug 2505 - add strict and warnings to perlmodule_ls
Bug 2505 - add strict and warnings to check_sysprefs
Bug 2505 - Add commented use warnings where missing in *.t
Bug 2505 - Add commented use warnings where missing in *.pm
Bug 2505 - Add commented use warnings where missing in the cataloguing/ directory
Bug 2505 - Add commented use warnings where missing in the misc/ directory
Bug 2505 - Add commented use warnings where missing in the tools/ directory
Bug 2505 - Add commented use warnings where missing in the installer/ directory
Bug 2505 - Add commented use warnings where missing in the rotating_collections/ directory
Bug 2505 - Add commented use warnings where missing in the C4/ directory
Bug 2505 - Add commented use warnings where missing in the serials/ directory
Bug 2505 - Add commented use warnings where missing in the catalogue/ directory
Bug 2505 - Add commented use warnings where missing in the sms/ directory
Bug 2505 - Add commented use warnings where missing in the opac/ directory
Bug 2505 - Add commented use warnings where missing in the virtualshelves/ directory
Bug 2505 - Add commented use warnings where missing in the suggestion/ directory
Bug 2505 - Add commented use warnings where missing in the admin/ directory
Bug 2505 - Add commented use warnings where missing in the circ/ directory
...
Conflicts:
C4/Auth_with_cas.pm
acqui/supplier.pl
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This is done by saving the notices in the message_queue table with
type 'print'. The notices are generated from a notice named
HOLD_PRINT. At the end of the day, they are dumped to an HTML file and
marked as sent by a new cronjob.
This setup is intended to be temporary; modules/batch/ shouldn't be around
forever.
Mandatory SQL:
INSERT INTO message_transport_types (message_transport_type) values ('print');
Modified overdue_notices.pl to support output of html for printing.
The -html option will e-mail notices to those with e-mail, and output
html to print for borrowers without e-mail.
When system preference PrintNoticesMaxLines is set to a positive
integer, it will limit the number of items on the notice to that
number, and append a message to the end telling the borrower to
check his or her account for the full listing of items. This only
affects print notices, not emailed ones.
Mandatory SQL:
INSERT INTO `systempreferences`
( `variable` , `value` , `options` , `explanation` , `type` )
VALUES ( 'PrintNoticesMaxLines', '0', '',
'If greater than 0, sets the maximum number of lines an overdue notice will print. If the number of items is greater than this number, the notice will end with a warning asking the borrower to check their online account for a full list of overdue items.',
'Integer' );
Conflicts:
installer/data/mysql/en/mandatory/sysprefs.sql
installer/data/mysql/fr-FR/1-Obligatoire/unimarc_standard_systemprefs.sql
misc/cronjobs/overdue_notices.pl
* export C4::Reserves::CancelExpiredReserves
* rename misc/cronjobs/cancel_expired_reserves.pl
to misc/cronjobs/holds/cancel_expired_holds.pl
* added cancel_expired_holds.pl to example crontab
* fix staff crash if AllowHoldDateInFuture is on
* expirationdate is now nullable instead of relying
on 0000-00-00
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This is a much improved re-implementation of the reserves updates from dev_week.
Less new code has been added, and more existing functions are used instead of adding new ones.
The 'Lock Hold' function has been removed due to it not working as intended.
[RM note for documentation: this adds the following features:
* ability to specify an expiration date for a hold request
when placing it via the staff interface or OPAC
* daily batch job to cancel expired holds
* nice interface to change the priority of hold
requests for a bib in the staff interface]
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Add the use_memcached question to rewrite-config.pl and all three questions
to koha-install-log, so that we don't have to keep answering these questions
when we upgrade/install with the --prev-install-log option.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Note: overdue_notices.pl really needs to be completely re-written.
The script does not process all fields advertised in tools/letter.pl.
This patch adds code to process all fields advertised as well as any
from the items table.
It also adds two additional tags for use in the letter templates:
<item></item> which should enclose all fields from the biblio, biblioitems,
and items tables.
<fine></fine> which should be enclosed by the item tag and should
enclose a currency identifier per ISO 4217. If this tag is present with
a proper identifier, the fine for that item will be displayed in the
proper currency format. Note: ISO 4217 changes from time to time, therefore
not all currencies may be supported. If you find one that is not
supported, please file a bug with the Locale::Currency::Format author
Tan D Nguyen <tnguyen at cpan dot org>.
An example of the implementation of these two tags in a notice template
might look like:
The following item(s) is/are currently overdue:
<item>"<<biblio.title>>" by <<biblio.author>>, <<items.itemcallnumber>>, Barcode: <<items.barcode>> Fine: <fine>GBP</fine></item>
Which, assuming two items were overdue, would result in a notice like:
The following item(s) is/are currently overdue:
"A Short History of Western Civilization" by Harrison, John B, 909.09821 H2451, Barcode: 08030003 Fine: £3.50
"History of Western Civilization" by Hayes, Carlton Joseph Huntley, 909.09821 H3261 v.1, Barcode: 08030004 Fine: £3.50
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Based on David Schuster's improvement patch.
For David:
- To send the output into an HTML file, there is no need to add a
parameter to this script; just redirect to a file:
check-url --html --host-prot=http://koha-pro.mylib.org \\
> /usr/local/koha/koha-tmpl/badurls.html
- If you want the resulting table to have alternating row colors, use CSS and
JavaScript. For example, with jQuery (found with Google):
<style type="text/css">
table {width:400px; border:1px solid blue;}
.oddrow {background-color:#E5E5E5;}
</style>
<script type="text/javascript"
src="http://code.jquery.com/jquery-latest.min.js"></script>
<script type="text/javascript">
$(function(){
$("table.tiger-stripe tr:even").addClass("oddrow");
});
</script>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
- Modify install-code.pl to install prefs with templates
- Update the .po preferences file in order to get the latest 'en' preferences
For 3.4, I will do a script which will handle the 3 .po files together:
opac, intranet and preferences (and .tt files if necessary). Don't do it
now, since it will change the file naming convention.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Add to the previous patch (and replace it):
- update function
- translation of tab subsection labels
- GPL2
- fix last-minute bug caught by Galen
Cut-and-paste of the pref-trans script perldoc:
NAME
pref-trans - Handle preferences translation
SYNOPSIS
pref-trans init fr-FR
pref-trans update fr-FR
pref-trans install fr-FR
USAGE
pref-trans init lang
Create a .po file in the po directory, named lang-pref.po. This
file contains text to translate extracted from .pref files.
pref-trans update lang
Update a .po file in the po directory, named lang-pref.po. This
file contains new text to translate extracted from .pref files.
Previously translated text is kept. There is a minor bug, which can't
be fixed due to the preferences data structure: preferences tab
subsection labels are lost when updating the .po file.
pref-trans install lang
Use the lang-pref.po file to translate the English version of the
preferences files and copy those files into the appropriate
directory.
DESCRIPTION
Koha preferences are stored in a data structure found in
koha-tmpl/intranet-tmpl/en/module/admin/preferences/ files.
Depending on the user language, other files are used. This script extracts text
from the 'en' preference files and puts it in one .po file. This .po
file can be updated. When completed, a .po file can be applied to create
localized versions of the preferences templates.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Updated references to Portuguese translation in Release Notes
for the 3.0.4 Release:
- For OPAC, removed pt-PT from the list of Partial translations
(because it already correctly appears in the list of complete ones)
- For Staff client, added pt-PT to the list of Partial translations
(cherry picked from commit f4eb63c728)
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
The list involved in the release notes for koha-devel was wrong.
Thanks ricardo.
(cherry picked from commit 0521828140)
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>