Modify the misc/translator/translate script so that it properly manages
alternate OPAC templates.
To test it with the new 'ccsr' template:
- Create the .po file:
./translate create fr-FR
Result: existing .po files are not modified. A new fr-FR-opac-ccsr.po file is
available.
- Install all templates:
./translate install fr-FR
Result: A new koha-tmpl/opac-tmpl/ccsr/fr-FR directory contains translated
templates.
- Update .po files:
./translate update fr-FR
Result: fr-FR .po files are updated, including fr-FR-opac-ccsr.po.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Simple addition of the koha user to the sample cron file. Might help
non-technical users get things like incremental indexing working.
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
The current script check-url.pl checks URLs found in 856$u by sending HTTP
requests one by one. The next request can't be sent until the previous
one gets a result, which can be very slow for dead URLs. I propose a new
script which sends multiple requests simultaneously, drastically improving
URL checking execution time.
This script is based on AnyEvent and AnyEvent::HTTP CPAN modules.
Add new dependencies AnyEvent & AnyEvent::HTTP.
See doc: perldoc check-url-quick.pl
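A minimal sketch of the concurrent approach (an illustration, not the
actual script; @urls is assumed to hold the 856$u values):

    use AnyEvent;
    use AnyEvent::HTTP;

    my $cv = AE::cv;
    for my $url (@urls) {
        $cv->begin;
        http_get $url, timeout => 10, sub {
            my ( $body, $headers ) = @_;
            # report anything that is not a 2xx success
            print "$url: $headers->{Status} $headers->{Reason}\n"
                unless $headers->{Status} =~ /^2/;
            $cv->end;
        };
    }
    $cv->recv;    # returns once every pending request has completed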
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
This script batch-deletes biblios whose biblionumber is listed in the
file passed as a parameter.
If a biblio has items, it is not deleted.
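A hedged sketch of the core loop (assuming the C4::Biblio::DelBiblio and
C4::Items::GetItemsCount routines; not the actual script):

    use Modern::Perl;
    use C4::Biblio;
    use C4::Items;

    open my $fh, '<', $ARGV[0] or die "Cannot open file: $!";
    while ( my $biblionumber = <$fh> ) {
        chomp $biblionumber;
        next unless $biblionumber =~ /^\d+$/;
        if ( GetItemsCount($biblionumber) ) {
            warn "Biblio $biblionumber has items, skipping\n";
            next;
        }
        # DelBiblio returns an error string on failure, undef on success
        my $error = DelBiblio($biblionumber);
        warn "Error deleting biblio $biblionumber: $error\n" if $error;
    }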
http://bugs.koha-community.org/show_bug.cgi?id=8674
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Created file with biblionumbers for bibs with and without items.
Only the bibs without items were deleted.
An eval { eval "require $module;" }; was replaced with
eval { eval { require $module; }; }; which is a no-op, meaning that
the linker was not getting loaded, and the catalog module was throwing
up a big nasty error every time someone tried to save a record with a
heading. This patch replaces the require with can_load from
Module::Load::Conditional, which is PBP-friendly, and offers equivalent
functionality.
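The replacement pattern looks roughly like this:

    use Module::Load::Conditional qw(can_load);

    unless ( can_load( modules => { $module => undef } ) ) {
        warn "Unable to load $module";
    }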
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
The patches for bug 7001 removed the parseletter subroutine from
C4::Letters without updating the talking tech script to use the
new alternative. This patch rectifies that situation.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Expose authority import functionality to the command line import
scripts, and rename them from commit_biblios_file.pl and
stage_biblios_file.pl to commit_file.pl and stage_file.pl.
To test (note that these instructions assume you have a MARC21
installation and are using the provided sample file):
1. Find a file of authorities (a sample file with MARC21 authorities
is attached to bug 7475) and download it to your server
2. Stage the file using the following command (replace <filename> with
the name of the file you saved in step 1):
> misc/stage_file.pl --file <filename> --authorities
3. Note the batch number the script assigns to your batch
4. Commit the records using the following command (replace <batchnumber>
with the batch number you made note of in step 3):
> misc/commit_file.pl --batch-number <batchnumber>
5. Index the authorities Zebraqueue (or wait)
6. Confirm that the new authorities appear.
7. Create a matching rule with the following settings:
Code: AUTHTEST
Description: Personal name main entry
Match threshold: 999
Record type: Authority record
Search index: Heading-main
Score: 1000
Tag: 100
Subfields: a
Offset: 0
Length: 0
(note the ID of this matching rule)
8. Stage the authority file again, this time using the following
command:
> misc/stage_file.pl --file <filename> --authorities \
--match <matchingrule>
9. Revert the import with the following command:
> misc/commit_file.pl --batch-number <batchnumber> --revert
10. Index the authorities Zebraqueue (or wait)
11. Confirm that the records have been removed
12. Import an authority record with the Stage MARC/Manage staged MARC
tools in exactly the way you would for a bibliographic record,
but choose "Authority" instead of "Bibliographic" for the record
type.
Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>
Testing plan delivers as it should.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master 11 September 2012
* Add the code necessary to handle authorities with matching rules and
import batches.
* Update all the scripts that use the matcher and import batch code
to use the new API.
* Add authority records to the matching rules interface in the staff
client.
http://bugs.koha-community.org/show_bug.cgi?id=2060
Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master 11 September 2012
This should make saved reports more manageable.
Group/Subgroup hierarchy is stored in authorised_values,
categories REPORT_GROUP and REPORT_SUBGROUP, connected by
REPORT_SUBGROUP.lib_opac -> REPORT_GROUP.authorised_value
Database changes:
* authorised_values: expanded category to 16 chars
* created default set of REPORT_GROUP authorised values to match
hardcoded report areas
* reports_dictionary: replaced area int with report_area text, converted
values
* saved_sql: added report_area, report_group and report_subgroup;
report_area is not currently used, saved for the record
C4/Reports/Guided.pm:
* Replaced Area numeric values with the mnemonic codes
* get_report_areas(): returns hardcoded areas list
* created get_report_groups(): returns full hierarchy (groups with their
subgroups)
* save_report(): changed interface, accepts fields hashref as input (see
the example after this list)
* update_sql(): changed interface, accepts id and fields hashref as input
* get_saved_reports():
- join to authorised_values to pick group and subgroup name
- accept group and subgroup filter params
* get_saved_report():
- changed interface, returns record hashref
- join to authorised_values to pick group and subgroup name
* build_authorised_value_list(): new sub, moved code from
reports/guided_reports.pl
* Updated interfaces in:
cronjobs/runreport.pl, svc/report, opac/svc/report: get_saved_report()
reports/dictionary.pl: get_report_areas()
reports/guided_reports.pl
reports/guided_reports_start.tt:
* Reports list:
- added group/subgroup filter
- display area/group/subgroup for the reports
* Create report wizard:
- carry area to the end
- select group and subgroup when saving the report; group defaults to area,
useful when report groups match areas
* Update report and Create from SQL: added group/subgroup
* Amended reports/guided_reports.pl accordingly
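A hedged example of the new save_report() calling convention (field names
assumed from the schema changes above, not the exact signature):

    my $report_id = save_report({
        borrowernumber => $borrowernumber,
        sql            => $sql,
        name           => $name,
        area           => $area,
        group          => $group,
        subgroup       => $subgroup,
        type           => $type,
        notes          => $notes,
        cache_expiry   => $cache_expiry,
        public         => $public,
    });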
Conflicts:
C4/Reports/Guided.pm
admin/authorised_values.pl
installer/data/mysql/kohastructure.sql
installer/data/mysql/updatedatabase.pl
koha-tmpl/intranet-tmpl/prog/en/modules/reports/dictionary.tmpl
koha-tmpl/intranet-tmpl/prog/en/modules/reports/guided_reports_start.tmpl
misc/cronjobs/runreport.pl
reports/dictionary.pl
reports/guided_reports.pl
Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
indexing not indexation
some minor grammatical changes
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Bug 8378 - <fine> syntax broken NFC and charset utf8
NFC normalize enqueued letters and add content-type charset=utf-8
This prevents utf8 characters from causing MySQL to truncate the 'content'
column at the point of certain characters when stored in the message_queue
table. This was happening with the currency symbol generated by the
Locale::Currency::Format currency_format routine. NFC normalization
was only done on attachment content whose content-type contains "text",
as in text/plain.
For emails AND attachments, the charset="utf-8" was added to the
content-type so mail clients would correctly interpret the utf8 codes,
thus preventing mojibake.
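A minimal sketch of the two changes (hash keys are illustrative, not the
exact C4::Letters code):

    use Unicode::Normalize qw(NFC);

    # NFC-normalize the body before it is enqueued in message_queue
    $letter->{content} = NFC( $letter->{content} );

    # declare the charset explicitly so mail clients decode it correctly
    my $content_type = 'text/plain; charset="utf-8"';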
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Ran through test plan before and after applying patch. Verified
that fine syntax does not work pre-patch and does work post-patch
for both direct emails and emails to the KohaAdminEmailAddress.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
On some Linux distributions, such as RedHat, Fedora, and CentOS, you can use SELinux for enhanced security. Among other things, this involves file labeling (security contexts). On other distributions, SELinux can be installed separately.
The attached script lets you update and restore such labels on the perl scripts in a Koha installation.
July 18, 2012: Added opac/svc.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This patch adds the Koha::Indexer::RecordNormalizer and
Koha::Indexer::MARC::RecordNormalizer::EmbedSeeFromHeadings packages
to enable the inclusion of alternate forms of headings in bibliographic
searches. When the new syspref IncludeSeeFromInSearches is turned on
(default is off) rebuild_zebra.pl will insert see from headings from
authority records into bibliographic records when indexing, so that a
search on an obsolete term will turn up relevant records.
To test:
1) Enable IncludeSeeFromInSearches
2) Add a heading that has an alternate form to a record (for example,
"Cooking" has the alternate form "Cookery," if you have authority
records from LC)
3) Index the zebraqueue (or reindex if you haven't indexed your system
yet)
4) Confirm that if you search for "Cookery" you get the record you
just modified
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on master 5 August 2012
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on master 11 September 2012
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Also checked:
- Verified database update works correctly
- Checked system preference and its description
- Checked staff/opac detail pages with feature on/off
- Checked staff/opac search facets
- Downloaded and tested records in various formats
- Tried different searches for 'see from' entries of authorities
- Ran all unit tests
No problems found.
Creates the transport_cost table and adds the UseTransportCostMatrix
syspref. The transport_cost table contains branch-to-branch transfer
costs. These are used when filling inter-branch hold transfers.
Moved GetHoldsQueueItems() from .pl to HoldsQueue.pm
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Prior patches to this bug had lots of comments like "I don't have a way to test this, so..."
In the OCLC Connexion web client, when you choose the option to export to MARC, it'll *send* it and
say, "Record Exported," but the web client does nothing whatever to confirm that the record
actually landed in Koha. That's a flaw in their software, but it can easily be checked by
looking in Koha to see if an import batch got created. The desktop client is a little
smarter about this, but needed much more testing, too.
With this patch, both the desktop client and the web client actually work. With a config file and set up as
previously described, the record will be staged and/or imported, and the desktop client returns
a useful message about what happened, *and* the staff client URL for the record.
Oodles of gobs of bunches of thanks to Virginia Military Institute, for loaning me their OCLC
authorization credentials so this could be tested, as well as for great suggestions of cosmetic
improvements to the mechanism and output.
Suspended holds are showing up in both the holds queue and holds to pull reports.
This patch adds to the sql queries such that any hold that is suspended
is not selected.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
GET requests in benchmark_staff.pl test 6 do not work if a space character is part of the barcode. That seems highly unlikely to happen in barcodes, but is possible if no real barcodes are used but a substitute, such as a copy of the call number. The space character needs to be changed to %20 for the request to work.
Also fixes a typo.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
The script is unusable.
The variable $date is not replaced with its content.
Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Adds the ability to use branches.* fields in digest notices and
have them be parsed correctly. Also adds a warning to the notices
editor for digests.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I like the idea to show a warning, but I would perhaps move
it under the message body label to be more obvious.
Patch works nicely, branch data of my user's home library
is displayed in the notice.
Replaced existing MaxFine syspref logic with overduefinescap.
Repurposed MaxFine to be the overall limit for all overdue fines
combined. Implemented the new MaxFine logic in UpdateFine().
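A hedged sketch of the intended capping logic in UpdateFine()
($overduefinescap and $total_fines_so_far are illustrative names, not
the actual variables):

    use C4::Context;

    my $maxfine = C4::Context->preference('MaxFine');

    # per-item cap from the circulation rule
    $amount = $overduefinescap
        if $overduefinescap && $amount > $overduefinescap;

    # overall cap across all of the borrower's accrued overdue fines
    if ( $maxfine && $total_fines_so_far + $amount > $maxfine ) {
        $amount = $maxfine - $total_fines_so_far;
    }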
Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>
Tested according to Srdjan's test plan and everything worked like he said it would. I set fined equal to $2 and max fine equal to $1. When I ran the fines script for overdue items fines assessed were only $1.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This script prints to standard output what is returned by
GetMemberDetails in CSV format.
Exported fields can be specified with option -f. If no -f option is
specified, all fields are exported.
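A hypothetical invocation (the script path below is assumed, not
confirmed by this message; -f may be repeated):

    misc/export_borrowers.pl -f cardnumber -f surname -f email > borrowers.csv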
Signed-off-by: Gaetan Boisson <gaetan.boisson@biblibre.com>
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Amended with some code to better handle bad data.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
$OUTPUT was being used without being declared.
When trying to run this script I got a nasty:
15:42 ~/koha.dev/koha-community (new/bug_8063 $%)$ ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 81.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 95.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 102.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 106.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 120.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 127.
Execution of ./misc/cronjobs/gather_print_notices.pl aborted due to compilation errors.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Before the patch:
$ perl -wc ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at
[...]./misc/cronjobs/gather_print_notices.pl line 81.
./misc/cronjobs/gather_print_notices.pl had compilation errors.
With this patch:
$ perl -wc ./misc/cronjobs/gather_print_notices.pl
./misc/cronjobs/gather_print_notices.pl syntax OK
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Adds the option -s/--split to enable notices to be separated
into different files by borrower home library.
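Example (the output directory argument is assumed):

    misc/cronjobs/gather_print_notices.pl /tmp/notices --split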
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Fixes the following things:
1. Sanitizes log output to prevent an attacker from using a specially
crafted POST to add extra lines to the log
2. Simplifies a regular expression, since "..file" cannot be used to
escape the current directory
3. Makes sure directories are consistent
4. Corrects logic issues in misc/cronjobs/backup.sh
Thanks to Frere Sebastien Marie for catching these issues.
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch builds on work by Lars Wirzenius for the Koha packages.
To date, the only way for a Koha librarian to obtain a complete backup
of their system has been to log into the system via SSH (or FTP) to
download the mysqldump file. This patch makes it possible for
superlibrarians in properly configured systems to download nightly backups
via the staff client's Export tool.
Recognizing that this is functionality with potentially very grave
security implications, system administrators must manually enable these
features in the koha-conf.xml configuration file.
The following configuration settings have been added to the koha-conf.xml
file:
* backupdir => directory where backups should be stored.
* backup_db_via_tools => whether to allow superlibrarians to download
database backups via the Export tool. The default is disabled, and
there is no way -- by design -- to enable this option without manually
editing koha-conf.xml.
* backup_conf_via_tools => whether to allow superlibrarians to download
configuration backups via the Export tool (this may be applicable to
packages only). The default is disabled, and there is no way -- by
design -- to enable this option without manually editing koha-conf.xml.
This commit modifies the following scripts to make use of the new
backupdir configuration option:
* koha-dump and koha-run-backups in the Debian packages
* The sample backup script misc/cronjobs/backup.sh
Note that for security reasons, superlibrarians will not be allowed
to download files that are not owned by the web server's effective user.
This imposes a de facto dependency on ITK (for Apache) or running the
web server as the Koha user (as is done with Plack).
To test:
1. Apply patch.
2. Go to export page as a superlibrarian. Notice that no additional
export options appear because they have not been enabled.
3. Add <backupdir>$KOHADEV/var/spool</backupdir> to the <config> section
of your koha-conf.xml (note that you will need to adjust that so that
it is pointing at a logical directory).
4. Create the aforementioned directory.
5. Go to export page as a superlibrarian. Notice that no additional
export options appear because they have not been enabled.
6. Add <backup_db_via_tools>1</backup_db_via_tools> to the <config>
section of your koha-conf.xml
7. Go to the export page as a superlibrarian. Notice the new tab.
8. Go to the export page as a non-superlibrarian. Notice there is no
new tab.
9. Run: mysqldump -u koha -p koha | gzip > $BACKUPDIR/backup.sql.gz
(substituting appropriate user, password, and database name)
10. Go to the export page as a superlibrarian, and look at the "Export
database" tab. If you are running the web server as your Koha user,
and ran the above command as your Koha user, you should now see the
file listed as an option for download.
11. If you *did* see the file listed, change the ownership to something
else: sudo chown root:root $BACKUPDIR/backup.sql.gz
11a. Confirm that you no longer see the file listed when you look at the
"Export database" tab.
12. Change the ownership on the file to your web server (or Koha) user:
sudo chown www-data:www-data backup.sql.gz
13. Go to the export page as a superlibrarian, and look at the "Export
database" tab. You should now see backup.sql.gz listed.
14. Choose to download backup.sql.gz
15. Confirm that the downloaded file is what you were expecting.
If you are interested, you can repeat the above steps but replace
<backup_db_via_tools> with <backup_conf_via_tools>, and instead of
creating an sql file, create a tar file.
To test packaging: run koha-dump, confirm that it still creates a
usable backup.
------
This signoff contains two changes:
10-1. If no backup/conf files were present, then the message telling you
so doesn't appear and the download button does. Made them behave
correctly.
10-2. The test for a file existing required it to be owned by the
webserver UID. This change makes it so it only has to be readable.
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Fix syntax errors preventing the scripts misc/translator/text-extract2.pl
and misc/cronjobs/thirdparty/TalkingTech_itiva_inbound.pl from compiling.
Remove misc/migration_tools/build6xx.pl entirely since it refers to
columns that no longer exist in the Koha database, and has seemingly
had broken encoding since Koha switched from CVS to git (or before!).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
SIPServer.pm requires that C4/SIP is added to its lib path. This has
been done by passing this directory to it via -I. By using FindBin it
can set the path for itself correctly. This will also work if the C4/SIP
directory tree is moved to a non-standard location.
Removed the now-redundant -I from sip_run.sh.
Added a variable to sip_run.sh for the Koha tree, to highlight a problem
with the script if you have multiple directories in the PERL5LIB
environment variable.
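The FindBin idiom amounts to (a sketch of the top of SIPServer.pm):

    use FindBin qw($Bin);
    use lib $Bin;    # add SIPServer.pm's own directory to @INC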
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Small script that checks whether each biblio record in the DB is properly
indexed.
Use -h to learn more.
(MT #6389)
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Complete rewrite of rebuild_zebra_sliced.zsh (renamed to .sh). Main
improvements are:
- both biblio and authority records are handled
- records are exported only once
It also adds an option --skip-index to rebuild_zebra.pl that permits
using rebuild_zebra.pl as an 'export only' script.
Description:
Indexes Koha records in chunks. This is useful when some record causes
errors and stops the indexing process. With this script, if indexing
of one chunk fails, the chunk is split into 2 (or 3) smaller chunks, and
indexing continues on those chunks.
rebuild_zebra.pl is called only once to export records.
Splitting and indexing is handled by this script (using yaz-marcdump and
zebraidx).
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Because updating the total issues count associated with a bibliographic
record on issue could cause a significant load on the server, this
commit adds the syspref UpdateTotalIssuesOnCirc (which defaults to OFF
to match existing behavior). The syspref has the following description:
Do/Do not update a bibliographic record's total issues count whenever
an item is issued (WARNING! This increases server load significantly;
if performance is a concern, use the update_totalissues.pl cron job
to update the total issues count).
Bug 6557: automatically increment totalissues
Adds the ability to automatically increment biblioitems.totalissues
whenever an item is issued.
To test:
1) Choose a record with at least one item that can circulate
2) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). Most likely there won't be any 942$0 at all
3) Enable UpdateTotalIssuesOnCirc
4) Check out the item you selected
5) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should now be one greater than before
6) Discharge the item
7) Disable UpdateTotalIssuesOnCirc
8) Check out the item you selected again
9) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should not have changed
Bug 6557: add script to update totalissues from stats
NAME
update_totalissues.pl
SYNOPSIS
update_totalissues.pl --use-stats
update_totalissues.pl --use-items
update_totalissues.pl --commit=1000
update_totalissues.pl --since='2012-01-01'
update_totalissues.pl --interval=30d
DESCRIPTION
This batch job populates bibliographic records' total issues count
based on historical issue statistics.
--help Prints this help
-v|--verbose
Provide verbose log information (list every bib modified).
--use-stats
Use the data in the statistics table for populating total
issues.
--use-items
Use items.issues data for populating total issues. Note that
issues data from the items table does not respect the --since
or --interval options, by definition. Also note that if both
--use-stats and --use-items are specified, the count of biblios
processed will be misleading.
-s|--since=DATE
Only process issues recorded in the statistics table since
DATE.
-i|--interval=S
Only process issues recorded in the statistics table in the
last N units of time. The interval should consist of a number
with a one-letter unit suffix. The valid suffixes are h
(hours), d (days), w (weeks), m (months), and y (years). The
default unit is days.
--incremental
Add the number of issues found in the statistics table to the
existing total issues count. Intended so that this script can
be used as a cron job to update popularity information during
low-usage periods. If neither --since nor --interval is
specified, incremental mode will default to processing the
last twenty-four hours.
--commit=N
Commit the results to the database after every N records are
processed.
--test Only test the popularity population script.
WARNING
If the time on your database server does not match the time on your Koha
server you will need to take that into account, and probably use the
--since argument instead of the --interval argument for incremental
updating.
=== TESTING PLAN ===
NOTE: in order to test this script, you will need to have some sort of
circulation data already existing in your Koha installation.
1) Disable UpdateTotalIssuesOnCirc
2) Run: misc/cronjobs/update_totalissues.pl --use-items -t -v
3) If you have total checkout data in your item records (i.e. anything
in 952$l), you should see messages like "Processing bib 43 (1 issues)"
4) Choose one of the lines that shows more than 0 issues, and view the
record with that biblionumber in the staff client, choosing the "Items"
tab (moredetail.pl). Add up the "Total checkouts" listed for each item,
and confirm it matches what the script reported
5) Run: misc/cronjobs/update_totalissues.pl --use-stats -t -v
6) If you have any circulation statistics in your database (i.e. any
'issue' entries in your statistics table), you should see messages
like "Processing bib 43 (1 issues)";
7) Choose one of the lines and view the record with that biblionumber in
the staff client, choosing the "Items" tab (moredetail.pl). If you
count the number of checkouts listed in each item's checkout history,
the total should match what the script reported.
8) Check out an item
9) Run: misc/cronjobs/update_totalissues.pl --use-stats
--incremental --interval=1h -t -v
10) You should see one line reporting a single circ for the bib record
associated with the item you just checked out (there may be more if
you checked out any books in the hour prior to running these tests)
11) If the results in steps 4, 7, and 10 match the predictions, the
script worked
This patch to Koha was sponsored by the Arcadia Public Library and the
Arcadia Public Library Foundation in honor of Jackie Faust-Moreno, late
director of the Arcadia Public Library.
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Tested this with my test data - numbers are correct and updated appropriately.
More importantly - if I do a popularity search, the most popular items *come up first*. Amazing.
The variable $i was being re-used and overwriting the necessary value that was being passed to a subroutine. Renaming $i to $j fixed it. I also added an extra safety check within parse_letter that would also have prevented this bug.
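In miniature, the corrected loop looks like this (hypothetical code, not
the actual script):

    for ( my $i = 0 ; $i < @overdue_items ; $i++ ) {
        # the inner counter used to be $i as well, so by the time
        # parse_letter was called the outer index had been clobbered
        for ( my $j = 0 ; $j < @fields ; $j++ ) {
            parse_letter( $overdue_items[$i], $fields[$j] );
        }
    }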
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This patch adds a new parameter to overdue_notices.pl: a date.
If you add --date=YYYY-MM-DD when running overdue_notices, it will generate overdues as if you were on the date provided.
That's useful if you want to relaunch an overdue calculation that has failed, or after changing your circulation rules.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
In rebuild_zebra.pl, if we are in "unimarc" ("marcflavour" syspref), the sub "fix_unimarc_100" is called and checks whether the length of 100$a is equal to 35.
If it is not, the sub inserts the localtime and more, so we lose the data on reindexing.
The standard length is 36.
I have just changed 35 to 36.
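Roughly the shape of the sub (a sketch, not the exact code):

    sub fix_unimarc_100 {
        my $record = shift;
        my $string = $record->subfield( 100, 'a' );
        # UNIMARC 100$a holds 36 characters of coded fixed-length data;
        # the old comparison against 35 treated valid fields as broken
        # and rebuilt them from localtime, losing the original data
        if ( !defined $string || length($string) != 36 ) {
            # ... rebuild 100$a ...
        }
    }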
Signed-off-by: Sophie Meynieux <sophie.meynieux@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
There is a flaw in C4::Members::Messaging::GetMessagingPreferences where
the system assumes that every transport will use the same letter. This
is not necessarily true. Even with the default preferences of just
'email' and 'sms', we should be able to have different letters
for each, as one has a maximum character length ( sms ) and one
does not. GetMessagingPreferences currently uses the letter code
of the last result of its query as the letter code for every transport type.
The returned data is a hashref with a key 'transport_types' that is
an array of transport_types this borrower has selected for the given
alert.
This commit modifies GetMessagingPreferences such that the
'transport_types' array is now a hash where the name of the transport
type is a key whose value is the letter code set for that transport
type.
It also modifies code calling GetMessagingPreferences where necessary,
and as a side benefit will correctly get the letter codes for email
and sms, if they are defined differently.
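Illustration of the change in the returned structure (the letter codes
here are made up):

    # before: one list of transport types, one letter code for all
    $prefs->{transport_types} = [ 'email', 'sms' ];

    # after: each transport type keys its own letter code
    $prefs->{transport_types} = {
        email => 'DUE',
        sms   => 'DUE_SMS',
    };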
http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
In use in production by two libraries: Middletown and Washoe
who give their sign off but don't have git to do so.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Implements support for Talking Tech I-tiva phone notification for OVERDUE, PREDUE and HOLD notifications.
Overdues respect triggers as configured for the patron's branch.
Predue and Holds notifications respect patron's messaging preference choices.
A new column for phone notification is added if the TalkingTechItivaPhoneNotification system preference is turned on.
A record of phone messages sent to patrons is added to the patron's Notices
tab; notice of success or failure can be retrieved from I-tiva.
See the TalkingTech.README for installation and set-up instructions.
Aside from the control system preference, and the necessary changes to Messaging Preferences
forms to make use of phone notifications, the bulk of the code resides in external
cronjobs.
TalkingTech_itiva_outbound.pl generates the Spec C file to send to I-tiva. Actual transmission
of the file must be handled by the system administrator.
TalkingTech_itiva_inbound.pl processes the received Results file from I-tiva. Getting the
file from I-tiva to Koha is the job of the system administrator, as well.
Both scripts have a --help option with full documentation.
The only necessary change to core Koha behavior is in C4::Letters::EnqueueLetter. The return
value was changed from 0 or 1 (successful addition of letter to message_queue or not), to the actual
insert ID of the letter. This was required by the outbound script to present a unique Transaction ID
for the notice added to the patron's record (so a 'sent' or 'failed' status could be updated). Since
the dbh and sth are not shared, and the last_insert_id() command is table-specific, this should be thread-safe.
No changes are necessary to any other parts of Koha, as all existing usage of EnqueueLetter ignores the return value.
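A sketch of how the outbound script can use the new return value
(parameter list abridged):

    my $message_id = C4::Letters::EnqueueLetter(
        {
            letter                 => $letter,
            borrowernumber         => $borrowernumber,
            message_transport_type => 'phone',
        }
    );
    # $message_id is the message_queue insert id, used as the unique
    # Transaction ID in the outbound Spec C file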
To Test:
1. Turn on TalkingTechItivaPhoneNotification system preference
2. Verify that 'phone' is now a valid notification option for patrons on both staff and OPAC side
3. Attempt to set a 'phone' preference for PREDUE or HOLD messaging; attempt should succeed
4. Set up the patron for notices to triggers:
a. include checked out items due in a range of days, including the value set up in their messaging preferences.
b. place several holds, some in position, others waiting for pickup, others in transit.
c. set the patron up to have overdues, overdue by a range of days that includes the delay values for
the patron's branch and categorycode
5. Run TalkingTech_itiva_outbound.pl --type=RESERVE --type=PREOVERDUE --type=OVERDUE --outfile=/tmp/talkingtechtest.csv
The resulting talkingtechtest.csv file should include all the items due in X days (where X is the patron's preference),
and none of the ones due in other increments. Similarly, overdue messages should be added for each item overdue by a delay
value as configured; overdues of other numbers of days should be ignored. Holds that are waiting pickup or in transit should
have messages; those still pending should not.
Messages should be added to the patron's notices tab for each issue sent. Verify these messages exist, and all Notices
tokens are replaced with appropriate information.
Repeat, this time with 4c making use of the default branch overdue triggers, instead of branch-specific triggers.
To test the inbound script, create a CSV with rows in the format "<<Message_id>>","<<SUCCESS or FAIL>>"
Message ID should correspond to the final column of the talkingtechtest.csv file (the transaction id) for the message.
Primary Authorship: Ian Walls
Additional modifications: Kyle M Hall
http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Tested and in use in production by two public libraries: Middletown
and Washoe. Both have given their sign off, but don't have git to
actually sign off.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
(Severity: 5)
Note: Rebased on master 06/09/2012 by jcamins
Signed-off-by: Joy Nelson <joy@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
One consequence is that the -x and -a options are no longer
mutually exclusive.
Also, because of the way that the GRS-1 SGML filter works, if you're
indexing multiple documents, you can't just wrap them in a document
element, but the DOM filter *requires* it. Consequently, two
new config settings in koha-conf.xml are added to indicate the
Zebra filter in use so that the -x option of rebuild_zebra.pl
knows whether to wrap the exported records or not:
- bib_index_mode (defaults to 'grs1' if not specified)
- auth_index_mode (defaults to 'dom')
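With the defaults above, the relevant koha-conf.xml entries would look
like:

    <bib_index_mode>grs1</bib_index_mode>
    <auth_index_mode>dom</auth_index_mode>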
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>