Commit graph

1496 commits

Author SHA1 Message Date
44c3657cfc Bug 9001: Remove Zebraqueue_daemon
[This patch was split out from tcohen's excellent patches for bug 8519
 --jcamins]

It also removes the obsolete zebraqueue_daemon.pl and koha-zebraqueue-ctl.sh
scripts.

Several files are modified to address the removal/addition of these files.

I didn't run the install procedure, as I was working on my laptop with a dev
setup; I just set the symlinks. Things are now fixed as proposed by wajasu
in comment #4. Any other suggestions, please let me know.

Tested to work on an up-to-date Ubuntu 12.04.

As requested by wajasu, remaining obsolete zebraqueue code is also removed.

Sponsored-by: Universidad Nacional de Córdoba

Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Passed-QA-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-11-06 16:41:11 -05:00
7f628b12d4 Bug 8253 - fix fine doubling, when upgrading from 3.6 to 3.8 - Add de-duplication script
This script (fix_accountlines_rmdupfines_bug8253.pl) will remove these duplicate
fines. To use, repeatedly run this script until there are no more duplicates in
the database.

Duplicate fines would happen if you upgraded to a 3.8 version that does not have the
bug8253 patch and misc/cronjobs/fines.pl was run. In 3.8 the switch to a more
granular date/time was not handled for pre-existing fine entries, and this script
removes the resulting duplicates. It also intelligently preserves the amount outstanding
for payments already applied. If your version already had the bug8253 patch at the time
of the upgrade, duplicate fines should not have been generated.
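
A typical run (the exact location of the script is an assumption here; it may be
installed under misc/maintenance/) would simply be:

  perl fix_accountlines_rmdupfines_bug8253.pl

repeated until no more duplicates are reported.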

Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-28 19:12:30 +01:00
6a4cbbba60 Bug 8633 Manage OPAC alternate templates
Modify the misc/translator/translate script so that it properly manages alternate
OPAC templates.

To test it with new 'ccsr' template:

- Create the .po file:

  ./translate create fr-FR

  Result: existing .po files are not modified. A new fr-FR-opac-ccsr.po file is
  available.

- Install all templates :

  ./translate install fr-FR

  Result: A new koha-tmpl/opac-tmpl/ccsr/fr-FR directory contains translated
  templates.

- Update .po files:

  ./translate update fr-FR

  Result: fr-FR .po files are updated, including fr-FR-opac-ccsr.po

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-11 11:15:07 +02:00
d5cf28406b Bug 8741 - crontab.example missing username, fails in some systems
Simple addition of the koha user to the sample cron file. Might help non-tech
users to get things like incremental indexing to work.

Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-09 17:55:12 +02:00
Paul Poulain
f88f11b4f8 Bug 7963 follow-up: die nicely if AnyEvent libraries not installed
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-05 11:44:56 +02:00
1e1422e8ba Bug 7963 Parallel HTTP requests when checking URLs
The current script check-url.pl checks URLs found in 856$u by sending HTTP
requests one by one. The next request can't be sent before the previous
one gets a result, which can be very slow for dead URLs. I propose a new
script which sends multiple requests simultaneously, which drastically
improves URL checking execution time.

This script is based on the AnyEvent and AnyEvent::HTTP CPAN modules,
which are added as new dependencies.

See doc: perldoc check-url-quick.pl
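
For illustration only (this is not the shipped check-url-quick.pl code), a minimal
sketch of the parallel-request approach with AnyEvent and AnyEvent::HTTP, using a
condition variable to wait for all outstanding requests:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use AnyEvent;
  use AnyEvent::HTTP;

  my @urls = @ARGV;                  # e.g. URLs extracted from 856$u
  my $cv   = AnyEvent->condvar;

  for my $url (@urls) {
      $cv->begin;                    # register one pending request
      http_get $url, timeout => 10, sub {
          my ( $body, $headers ) = @_;
          print "$url => $headers->{Status} $headers->{Reason}\n";
          $cv->end;                  # this request is finished
      };
  }

  $cv->recv;                         # block until every request has completed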

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
2012-10-05 11:44:52 +02:00
Paul Poulain
ca80293756 Bug 8674 follow-up Fix perlcritic error 2012-10-02 17:48:03 +02:00
Jonathan Druart
7251a9339d Bug 8674: Followup: Add POD for misc/batchdeletebiblios.pl
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-02 17:23:17 +02:00
Jonathan Druart
c349336725 Bug 8674: Adds script batchdeletebiblios
This script batch-deletes the biblios whose biblionumbers are listed in the
file passed as a parameter.
If a biblio has items, it is not deleted.

http://bugs.koha-community.org/show_bug.cgi?id=8674

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>

Created file with biblionumbers for bibs with and without items.
Only the bibs without items were deleted.
2012-10-02 17:23:15 +02:00
Jared Camins-Esakov
813c744008 Bug 8818: make sure we load modules before using them
An eval { eval "require $module;" }; was replaced with
eval { eval { require $module; }; }; which is a no-op, meaning that
the linker was not getting loaded, and the catalog module was throwing
up a big nasty error every time someone tried to save a record with a
heading. This patch replaces the require with can_load from
Module::Load::Conditional, which is PBP-friendly, and offers equivalent
functionality.

Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-01 19:01:50 +02:00
Koha user
025528a157 Bug 8606 - Talking Tech broken by Bug 7001
The patches for bug 7001 removed the parseletter subroutine from
C4::Letters without updating the talking tech script to use the
new alternative. This patch rectifies that situation.

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
2012-09-20 14:12:41 +02:00
Jared Camins-Esakov
7ad5e203da Bug 2060: Update command line MARC import scripts
Expose authority import functionality to the command line import
scripts, and rename them from commit_biblios_file.pl and
stage_biblios_file.pl to commit_file.pl and stage_file.pl.

To test (note that these instructions assume you have a MARC21
installation and are using the provided sample file):
1. Find a file of authorities (a sample file with MARC21 authorities
   is attached to bug 7475) and download it to your server
2. Stage the file using the following command (replace <filename> with
   the name of the file you saved in step 1):
   > misc/stage_file.pl --file <filename> --authorities
3. Note the batch number the script assigns to your batch
4. Commit the records using the following command (replace <batchnumber>
   with the batch number you made note of in step 3):
   > misc/commit_file.pl --batch-number <batchnumber>
5. Index the authorities Zebraqueue (or wait)
6. Confirm that the new authorities appear.
7. Create a matching rule with the following settings:
   Code: AUTHTEST
   Description: Personal name main entry
   Match threshold: 999
   Record type: Authority record
   Search index: Heading-main
   Score: 1000
   Tag: 100
   Subfields: a
   Offset: 0
   Length: 0
   (note the ID of this matching rule)
8. Stage the authority file again, this time using the following
   command:
   > misc/stage_file.pl --file <filename> --authorities \
     --match <matchingrule>
9. Revert the import with the following command:
   > misc/commit_file.pl --batch-number <batchnumber> --revert
10. Index the authorities Zebraqueue (or wait)
11. Confirm that the records have been removed
12. Import an authority record with the Stage MARC/Manage staged MARC
    tools in exactly the way you would for a bibliographic record,
    but choose "Authority" instead of "Bibliographic" for the record
    type.

Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>

Testing plan delivers as it should.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master 11 September 2012
2012-09-19 17:16:18 +02:00
Jared Camins-Esakov
6e71b80ca3 Bug 7475: Teach matching rules to handle authorities
* Add the code necessary to handle authorities with matching rules and
  import batches.
* Update all the scripts that use the matcher and import batch code
  to use the new API.
* Add authority records to the matching rules interface in the staff
  client.

http://bugs.koha-community.org/show_bug.cgi?id=2060
Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master 11 September 2012
2012-09-19 17:15:56 +02:00
Srdjan
42acfbf75b Bug 7993: Save reports with Group/Subgroup hierarchy
This should make saved reports more manageable.
Group/Subgroup hierarchy is stored in authorised_values,
categories REPORT_GROUP and REPORT_SUBGROUP, connected by
REPORT_SUBGROUP.lib_opac -> REPORT_GROUP.authorised_value

Database changes:
* authorised_values: expanded category to 16 chars
* created default set of REPORT_GROUP authorised values to match
  hardcoded report areas
* reports_dictionary: replaced area int with report_area text, converted
  values
* saved_sql: added report_area, report_group and report_subgroup;
  report_area is not currently used, saved for the record

C4/Reports/Guided.pm:
* Replaced Area numeric values with the mnemonic codes
* get_report_areas(): returns hardcoded areas list
* created get_report_groups(): returns the full hierarchy (groups with their
  subgroups)
* save_report(): changed interface, accepts a fields hashref as input
* update_sql(): changed interface, accepts an id and a fields hashref as input
* get_saved_reports():
- join to authorised_values to pick group and subgroup name
- accept group and subgroup filter params
* get_saved_report():
- changed interface, returns a record hashref
- join to authorised_values to pick group and subgroup name
* build_authorised_value_list(): new sub, moved code from
  reports/guided_reports.pl
* Updated interfaces in:
cronjobs/runreport.pl, svc/report, opac/svc/report: get_saved_report()
reports/dictionary.pl: get_report_areas()
reports/guided_reports.pl

reports/guided_reports_start.tt:
* Reports list:
- added group/subgroup filter
- display area/group/subgroup for the reports
* Create report wizard:
- carry area to the end
- select group and subgroup when saving the report; group defaults to area,
  useful when report groups match areas
* Update report and Create from SQL: added group/subgroup
* Amended reports/guided_reports.pl accordingly

Conflicts:

    C4/Reports/Guided.pm
    admin/authorised_values.pl
    installer/data/mysql/kohastructure.sql
    installer/data/mysql/updatedatabase.pl
    koha-tmpl/intranet-tmpl/prog/en/modules/reports/dictionary.tmpl
    koha-tmpl/intranet-tmpl/prog/en/modules/reports/guided_reports_start.tmpl
    misc/cronjobs/runreport.pl
    reports/dictionary.pl
    reports/guided_reports.pl

Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-18 17:49:08 +02:00
Colin Campbell
722701d596 Bug 8727 Minor stylistic change to help text
indexing not indexation
some minor grammatical changes

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-17 18:47:40 +02:00
wajasu
40f9914e60 Bug 8378 - <fine> syntax not working on overdues anymore
Bug 8378 - <fine> syntax broken NFC and charset utf8

NFC normalize enqueued letters and add content-type charset=utf-8

This prevents utf8 codes from causing mysql to truncate the 'content'
from the point of certain codes, when stored in the message_queue table.
This was happening with the currency symbol generated by the
Locale::Currency::Format currency_format routine. NFC normalization
was only done on the attachment content with its content-type
containing "text", as in text/plain.

For emails AND attachments, the charset="utf-8" was added to the
content-type so mail clients would correctly interpret the utf8 codes,
thus preventing mojibake.
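
As a rough illustration of the normalization step (not the actual patch code; the
sample string is invented), NFC normalization in Perl looks like this:

  use strict;
  use warnings;
  use Unicode::Normalize qw(NFC);

  my $content = "Amount due: 3,50 \x{20ac}";   # letter text containing the euro sign
  $content = NFC($content);                    # normalize to NFC before enqueueing

  # The generated email part is then declared with an explicit charset, e.g.
  # Content-Type: text/plain; charset="utf-8"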

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>

Ran through test plan before and after applying patch. Verified
that fine syntax does not work pre-patch and does work post-patch
for both direct emails and emails to the KohaAdminEmailAddress.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-14 17:52:25 +02:00
7571a0e48b 7675 New script for changing selinux file labels on perl scripts
On some Linux distributions, such as Red Hat, Fedora, and CentOS, you can use SELinux for enhanced security. Among other things, this involves file labeling (security context). In other distributions SELinux can be installed additionally.

The attached script lets you update and restore such labels on the perl scripts in a Koha installation.

July 18, 2012: Added opac/svc.

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2012-09-14 17:17:04 +02:00
Fridolyn SOMERS
8469d53f02 Bug 8420: tool statisticfines.pl and hourly loan
Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-14 16:19:25 +02:00
Jared Camins-Esakov
bc05b5d163 Bug 7417: Include see from references in bibliographic searches
This patch adds the Koha::Indexer::RecordNormalizer and
Koha::Indexer::MARC::RecordNormalizer::EmbedSeeFromHeadings packages
to enable the inclusion of alternate forms of headings in bibliographic
searches. When the new syspref IncludeSeeFromInSearches is turned on
(default is off) rebuild_zebra.pl will insert see from headings from
authority records into bibliographic records when indexing, so that a
search on an obsolete term will turn up relevant records.

To test:
1) Enable IncludeSeeFromInSearches
2) Add a heading that has an alternate form to a record (for example,
   "Cooking" has the alternate form "Cookery," if you have authority
   records from LC)
3) Index the zebraqueue (or reindex if you haven't indexed your system
   yet)
4) Confirm that if you search for "Cookery" you get the record you
   just modified

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on master 5 August 2012
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on master 11 September 2012

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>

Also checked:
- Verified database update works correctly
- Checked system preference and its description
- Checked staff/opac detail pages with feature on/off
- Checked staff/opac search facets
- Downloaded and tested records in various formats
- Tried different searches for 'see from' entries of authorities
- Ran all unit tests

No problems found.
2012-09-13 14:19:28 +02:00
Srdjan
86c2c4626d bug_5911: Transport Cost Matrix
Creates the transport_cost table and adds the UseTransportCostMatrix syspref.
The transport_cost table contains branch-to-branch transfer
costs. These are used for filling inter-branch hold transfers.

Moved GetHoldsQueueItems() from .pl to HoldsQueue.pm

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-12 14:49:25 +02:00
Paul Poulain
0d4acbba5c Merge remote-tracking branch 'origin/new/bug_7613' 2012-09-05 14:54:33 +02:00
D Ruth Bavousett
fafd3e87f5 Bug 7613: OCLC Connexion web service and desktop client, followup patch
Prior patches to this bug had lots of comments like "I don't have a way to test this, so..."

In the OCLC Connexion web, when you choose the option to export to MARC, it'll *send* it, and
say, "Record Exported," but the web client does nothing whatever to confirm that the record
actually landed in Koha.  That's a flaw in their software, but can be easily checked by
looking in Koha to see if an import batch got created.  The desktop client is a little
smarter about this, but needed much more testing, also.

With this patch, both the desktop client and the web client will actually work. With a config file and setup as
previously described, the record will be staged and/or imported, and the desktop client returns
a useful message about what happened, *and* the staff client URL to the record.

Oodles of gobs of bunches of thanks to Virginia Military Institute, for loaning me their OCLC
authorization credentials so this could be tested, as well as for great suggestions of cosmetic
improvements to the mechanism and output.
2012-09-05 14:53:13 +02:00
4a139b51c5 Bug 8419 - Suspended holds appear on the daily holds queue
Suspended holds are showing up in both the holds queue and holds to pull reports.

This patch adds to the sql queries such that any hold that is suspended
is not selected.

Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-31 17:56:46 +02:00
Mirko Tietgen
608402e27b Bug 8413 Space in barcodes breaks GET request in benchmark_staff.pl
GET requests in benchmark_staff.pl test 6 do not work if a space character is part of the barcode. That seems highly unlikely to happen with real barcodes, but is possible if a substitute is used instead of real barcodes, such as a copy of the call number. The space character needs to be changed to %20 for the request to work.

Also fixes a typo.
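
For illustration, one way to build such a request safely is to URL-encode the
barcode before interpolating it; this sketch uses URI::Escape, and the staff
client URL shown is only a placeholder:

  use strict;
  use warnings;
  use URI::Escape qw(uri_escape);

  my $barcode = 'CALL NUM 123';        # substitute "barcode" containing a space
  my $url     = 'http://staff.example.org/cgi-bin/koha/circ/circulation.pl'
              . '?barcode=' . uri_escape($barcode);
  print "$url\n";                      # ...?barcode=CALL%20NUM%20123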

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-31 17:55:41 +02:00
Fridolyn SOMERS
88e66b674e Bug 8576: Software error on authority edition when using merge
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-28 17:55:33 +02:00
Jonathan Druart
ee0a778eaa Bug 8607: FIX overdues_notices script: $date is not replaced
The script is unusable.
The variable $date is not replaced with its content.

Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-28 17:48:50 +02:00
7561d9f433 Bug 3383 - Followup - Switch from GetMemberDetails to GetMember
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-03 15:58:26 +02:00
fd23761d4d Bug 3383 - Item due reminder digests - cannot display title information
Adds the ability to use branches.* fields in digest notices and
have them be parsed correctly. Also adds a warning to the notices
editor for digests.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I like the idea to show a warning, but I would perhaps move
it under the message body label to be more obvious.

Patch works nicely, branch data of my user's home library
is displayed in the notice.
2012-08-03 15:58:24 +02:00
Paul Poulain
de07f00630 Bug 7420 tiny bugfix
The print at the end, in verbose mode, was not displaying variables properly because it used '' (single quotes) rather than "" (double quotes), so variables were not interpolated.
2012-07-25 18:35:36 +02:00
Elliott Davis
381794ff4e [PATCH] bug_7420: Added overduefinescap to issuingrules
Replaced existing MaxFine syspref logic with overduefinescap.
Repurposed MaxFine to be the overall overdue limit for all items
overdue. Implemented new MaxFine logic in UpdateFine().

Signed-off-by: Elliott Davis <elliott@bywatersolutions.com>
Tested according to Srdjan's test plan and everything worked like he said it would. I set the fine equal to $2 and the max fine equal to $1. When I ran the fines script for overdue items, the fines assessed were only $1.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-25 18:33:20 +02:00
Julian Maurice
30ee49ddb4 Bug 8376: New script to export borrowers misc/export_borrowers.pl
This script prints to standard output what is returned by
GetMemberDetails in CSV format.
Exported fields can be specified with option -f. If no -f option is
specified, all fields are exported.
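
One possible invocation (the field names are only examples, and it is assumed
that -f may be repeated once per field):

  misc/export_borrowers.pl -f surname -f email > borrowers.csv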

Signed-off-by: Gaetan Boisson <gaetan.boisson@biblibre.com>
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>

Amended with some code to better handle bad data.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-24 17:36:14 +02:00
20b5228af6 Bug 8063 - Followup - Bug fix
$OUTPUT being used but not being declared.

When trying to run this script I got a nasty:
15:42 ~/koha.dev/koha-community (new/bug_8063 $%)$ ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 81.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 95.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 102.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 106.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 120.
Global symbol "$OUTPUT" requires explicit package name at ./misc/cronjobs/gather_print_notices.pl line 127.
Execution of ./misc/cronjobs/gather_print_notices.pl aborted due to compilation errors.

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Before the patch:
$perl -wc ./misc/cronjobs/gather_print_notices.pl
Global symbol "$OUTPUT" requires explicit package name at
[...]./misc/cronjobs/gather_print_notices.pl line 81.
./misc/cronjobs/gather_print_notices.pl had compilation errors.

With this patch:
$perl -wc ./misc/cronjobs/gather_print_notices.pl
./misc/cronjobs/gather_print_notices.pl syntax OK

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-16 14:54:02 +02:00
e1df3a573b Bug 8063 - Hold print notices do not sort by branch
Adds the option -s/--split to enable notices to be separated
into different files by borrower home library.

Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
2012-07-16 14:35:05 +02:00
Paul Poulain
f08698a406 Bug 8353 follow-up adding a tiny sh in misc/maintenance
This script will help the sysadmin know there's a test he can use
during maintenance

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-07-13 14:42:55 +02:00
Jared Camins-Esakov
8affddc52d Bug 8268 follow-up: incorporate QA comments
Fixes the following things:
1. Sanitizes log output to prevent an attacker from using a specially
   crafted POST to add extra lines to the log
2. Simplifies a regular expression since "..file" cannot be used to
   escape the current directory
3. Makes sure directories are consistent
4. Corrects logic issues in misc/cronjobs/backup.sh

Thanks to Frere Sebastien Marie for catching these issues.

Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-12 17:40:22 +02:00
Jared Camins-Esakov
bbcb1d784b Bug 8268: Add database dump to export tool
This patch builds on work by Lars Wirzenius for the Koha packages.

To date, the only way for a Koha librarian to obtain a complete backup
of their system has been to log into the system via SSH (or FTP) to
download the mysqldump file. This patch makes it possible for
superlibrarians in properly configured systems to download nightly backups
via the staff client's Export tool.

Recognizing that this is functionality with potentially very grave
security implications, system administrators must manually enable these
features in the koha-conf.xml configuration file.

The following configuration settings have been added to the koha-conf.xml
file:
* backupdir => directory where backups should be stored.
* backup_db_via_tools => whether to allow superlibrarians to download
  database backups via the Export tool. The default is disabled, and
  there is no way -- by design -- to enable this option without manually
  editing koha-conf.xml.
* backup_conf_via_tools => whether to allow superlibrarians to download
  configuration backups via the Export tool (this may be applicable to
  packages only). The default is disabled, and there is no way -- by
  design -- to enable this option without manually editing koha-conf.xml.

This commit modifies the following scripts to make use of the new
backupdir configuration option:
* koha-dump and koha-run-backups in the Debian packages
* The sample backup script misc/cronjobs/backup.sh

Note that for security reasons, superlibrarians will not be allowed
to download files that are not owned by the web server's effective user.
This imposes a de facto dependency on ITK (for Apache) or running the
web server as the Koha user (as is done with Plack).

To test:
1. Apply patch.
2. Go to export page as a superlibrarian. Notice that no additional
   export options appear because they have not been enabled.
3. Add <backupdir>$KOHADEV/var/spool</backupdir> to the <config> section
   of your koha-conf.xml (note that you will need to adjust that so that
   it is pointing at a logical directory).
4. Create the aforementioned directory.
5. Go to export page as a superlibrarian. Notice that no additional
   export options appear because they have not been enabled.
6. Add <backup_db_via_tools>1</backup_db_via_tools> to the <config>
   section of your koha-conf.xml
7. Go to the export page as a superlibrarian. Notice the new tab.
8. Go to the export page as a non-superlibrarian. Notice there is no
   new tab.
9. Run: mysqldump -u koha -p koha | gzip > $BACKUPDIR/backup.sql.gz
   (substituting appropriate user, password, and database name)
10. Go to the export page as a superlibrarian, and look at the "Export
    database" tab. If you are running the web server as your Koha user,
    and ran the above command as your Koha user, you should now see the
    file listed as an option for download.
11. If you *did* see the file listed, change the ownership to something
    else: sudo chown root:root $BACKUPDIR/backup.sql.gz
11a. Confirm that you no longer see the file listed when you look at the
     "Export database" tab.
12. Change the ownership on the file to your web server (or Koha) user:
    sudo chown www-data:www-data backup.sql.gz
13. Go to the export page as a superlibrarian, and look at the "Export
    database" tab. You should now see backup.sql.gz listed.
14. Choose to download backup.sql.gz
15. Confirm that the downloaded file is what you were expecting.

If you are interested, you can repeat the above steps but replace
<backup_db_via_tools> with <backup_conf_via_tools>, and instead of
creating an sql file, create a tar file.

To test packaging: run koha-dump, confirm that it still creates a
usable backup.

------

This signoff contains two changes:

10-1. If no backup/conf files were present, then the message telling you
so doesn't appear and the download button does. Made them behave
correctly.
10-2. The test for a file existing required it to be owned by the
webserver UID. This change makes it so it only has to be readable.

Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
2012-07-12 17:40:21 +02:00
Jared Camins-Esakov
3616eee996 Bug 8384: Some Perl scripts do not compile
Fix syntax errors preventing the scripts misc/translator/text-extract2.pl
and misc/cronjobs/thirdparty/TalkingTech_itiva_inbound.pl from compiling.

Remove misc/migration_tools/build6xx.pl entirely since it refers to
columns that no longer exist in the Koha database, and has seemingly
had broken encoding since Koha switched from CVS to git (or before!).

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-10 10:50:58 +02:00
Colin Campbell
8e485c6112 Bug 8271 teach SIPServer.pm to set its own lib path
SIPServer.pm requires that C4/SIP is added to its lib
path. This has been done by passing this directory
to it via -I. By using FindBin it can set the path
for itself correctly. This will also work if the C4/SIP
directory tree is moved to a non-standard location.
Removed the now redundant -I. from sip_run.sh.

Added a variable to sip_run.sh for the Koha tree to
highlight a problem with the script if you have multiple
directories in the PERL5LIB environment variable.
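
The FindBin approach boils down to something like the following sketch near the
top of SIPServer.pm (assuming the C4/SIP modules live alongside the script):

  use FindBin qw($Bin);
  use lib $Bin;    # modules under C4/SIP are now found relative to the script itself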

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-06 18:28:11 +02:00
Christophe Croullebois
665136f8a0 Bug 6566 Checking if DB's records are properly indexed
Small script that checks if each biblio record in the DB is properly indexed.
Use -h to learn more.
(MT #6389)

Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-06 17:11:39 +02:00
Jonathan Druart
623f3a2c84 Bug 8233 : SearchEngine: Add a Koha::SearchEngine module
First draft introducing solr into Koha :-)

List of files :
  $ tree t/searchengine/
  t/searchengine
  |-- 000_conn
  |   `-- conn.t
  |-- 001_search
  |   `-- search_base.t
  |-- 002_index
  |   `-- index_base.t
  |-- 003_query
  |   `-- buildquery.t
  |-- 004_config
  |   `-- load_config.t
  `-- indexes.yaml
  just do `prove -r t/searchengine/**/*.t`

  t/lib
  |-- Mocks
  |   `-- Context.pm
  `-- Mocks.pm
  provide a mock to SearchEngine syspref (set_zebra and set_solr).

  $ tree Koha/SearchEngine
  Koha/SearchEngine
  |-- Config.pm
  |-- ConfigRole.pm
  |-- FacetsBuilder.pm
  |-- FacetsBuilderRole.pm
  |-- Index.pm
  |-- IndexRole.pm
  |-- QueryBuilder.pm
  |-- QueryBuilderRole.pm
  |-- Search.pm
  |-- SearchRole.pm
  |-- Solr
  |   |-- Config.pm
  |   |-- FacetsBuilder.pm
  |   |-- Index.pm
  |   |-- QueryBuilder.pm
  |   `-- Search.pm
  |-- Solr.pm
  |-- Zebra
  |   |-- QueryBuilder.pm
  |   `-- Search.pm
  `-- Zebra.pm

How to install and configure Solr?
  See the wiki page: http://wiki.koha-community.org/wiki/SearchEngine_Layer_RFC

http://bugs.koha-community.org/show_bug.cgi?id=8233
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
2012-07-06 16:51:58 +02:00
Julian Maurice
57424a9fdc Bug 7286: rebuild_zebra_sliced for biblios and authorities
Complete rewrite of rebuild_zebra_sliced.zsh (renamed to .sh). Main
improvements are:
  - both biblio and authority records are handled
  - records are exported only once

It also adds an option --skip-index to rebuild_zebra.pl that permits using
rebuild_zebra.pl as an 'export only' script.

Description:
Indexes Koha records in chunks. This is useful when some record causes
errors and stops the indexing process. With this script, if indexing
of one chunk fails, the chunk is split into 2 (or 3) chunks, and
indexing continues on these chunks.
rebuild_zebra.pl is called only once to export records.
Splitting and indexing are handled by this script (using yaz-marcdump and
zebraidx).

Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-07-06 15:06:40 +02:00
Jared Camins-Esakov
c0714a99f2 Bug 6557: Record bib popularity in totalissues
Because updating the total issues count associated with a bibliographic
record on issue could cause a significant load on the server, this
commit adds the syspref UpdateTotalIssuesOnCirc (which defaults to OFF
to match existing behavior). The syspref has the following description:

  Do/Do not update a bibliographic record's total issues count whenever
  an item is issued (WARNING! This increases server load significantly;
  if performance is a concern, use the update_totalissues.pl cron job
  to update the total issues count).

Bug 6557: automatically increment totalissues

Adds the ability to automatically increment biblioitems.totalissues
whenever an item is issued.

To test:
1) Choose a record with at least one item that can circulate
2) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). Most likely there won't be any 942$0 at all
3) Enable UpdateTotalIssuesOnCirc
4) Check out the item you selected
5) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). That value should now be one greater than before
6) Discharge the item
7) Disable UpdateTotalIssuesOnCirc
8) Check out the item you selected again
9) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). That value should not have changed

Bug 6557: add script to update totalissues from stats

NAME
       update_totalissues.pl

SYNOPSIS
         update_totalissues.pl --use-stats
         update_totalissues.pl --use-items
         update_totalissues.pl --commit=1000
         update_totalissues.pl --since='2012-01-01'
         update_totalissues.pl --interval=30d

DESCRIPTION
       This batch job populates bibliographic records' total issues count
       based on historical issue statistics.

       --help  Prints this help

       -v|--verbose
               Provide verbose log information (list every bib modified).

       --use-stats
               Use the data in the statistics table for populating total
               issues.

       --use-items
               Use items.issues data for populating total issues. Note that
               issues data from the items table does not respect the --since
               or --interval options, by definition. Also note that if both
               --use-stats and --use-items are specified, the count of biblios
               processed will be misleading.

       -s|--since=DATE
               Only process issues recorded in the statistics table since
               DATE.

       -i|--interval=S
               Only process issues recorded in the statistics table in the
               last N units of time. The interval should consist of a number
               with a one-letter unit suffix. The valid suffixes are h
               (hours), d (days), w (weeks), m (months), and y (years). The
               default unit is days.

       --incremental
               Add the number of issues found in the statistics table to the
               existing total issues count. Intended so that this script can
               be used as a cron job to update popularity information during
               low-usage periods. If neither --since nor --interval is
               specified, incremental mode will default to processing the
               last twenty-four hours.

       --commit=N
               Commit the results to the database after every N records are
               processed.

       --test  Only test the popularity population script.

WARNING

If the time on your database server does not match the time on your Koha
server you will need to take that into account, and probably use the
--since argument instead of the --interval argument for incremental
updating.

=== TESTING PLAN ===

NOTE: in order to test this script, you will need to have some sort of
circulation data already existing in your Koha installation.

1) Disable UpdateTotalIssuesOnCirc
2) Run: misc/cronjobs/update_totalissues.pl --use-items -t -v
3) If you have total checkout data in your item records (i.e. anything
   in 952$l), you should see messages like "Processing bib 43 (1 issues)"
4) Choose one of the lines that shows more than 0 issues, and view the
   record with that biblionumber in the staff client, choosing the "Items"
   tab (moredetail.pl). Add up the "Total checkouts" listed for each item,
   and confirm it matches what the script reported
5) Run: misc/cronjobs/update_totalissues.pl --use-stats -t -v
6) If you have any circulation statistics in your database (i.e. any
   'issue' entries in your statistics table), you should see messages
   like "Processing bib 43 (1 issues)";
7) Choose one of the lines and view the record with that biblionumber in
   the staff client, choosing the "Items" tab (moredetail.pl). If you
   count the number of checkouts listed in each item's checkout history,
   the total should match what the script reported.
8) Check out an item
9) Run: misc/cronjobs/update_totalissues.pl --use-stats
   --incremental --interval=1h -t -v
10) You should see one line reporting a single circ for the bib record
    associated with the item you just checked out (there may be more if
    you checked out any books in the hour prior to running these tests)
11) If the results in steps 4, 7, and 10 match the predictions, the
    script worked

This patch to Koha was sponsored by the Arcadia Public Library and the
Arcadia Public Library Foundation in honor of Jackie Faust-Moreno, late
director of the Arcadia Public Library.

Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Tested this with my test data - numbers are correct and updated appropriately.

More importantly - if I do a popularity search, the most popular items *come up first*. Amazing.
2012-06-29 14:29:22 +02:00
ac5e09a0f0 Bug 8267 - Overdue notices not working
The variable $i was being re-used and overwriting the necessary value that was being passed to a subroutine. Renaming $i to $j fixed it. I also added an extra safety check within parse_letter that would also have prevented this bug.

Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-20 23:14:56 +02:00
Paul Poulain
d094e24af9 Bug 7447 allow to specify a date in overdue_notice.pl
This patch adds a new parameter to overdue_notices.pl: a date.
If you add --date=YYYY-MM-DD when running overdue_notices, it will generate overdues as if you were on the date provided.

That's useful if you want to relaunch an overdue calculation that has failed, or after changing your circ rules.
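
For example, to regenerate overdues as if today were 1 June 2012 (assuming the
script is in its usual misc/cronjobs location):

  misc/cronjobs/overdue_notices.pl --date=2012-06-01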

Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-20 21:50:21 +02:00
christophe croullebois
082bb5049d Bug 8136 Changes the expected length of 100$a in rebuild_zebra.pl
In rebuild_zebra.pl, if we are in "unimarc" ("marcflavour" syspref), the sub "fix_unimarc_100" is called and checks if the 100$a length is equal to 35.
If that is not the case, the sub inserts the localtime and more, so we lose the data on reindexing.
The standard length is 36.
I have just changed 35 to 36.

Signed-off-by: Sophie Meynieux <sophie.meynieux@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-20 09:39:27 +02:00
Paul Poulain
3752be09d3 Revert "Subroutine prototypes used at line XXX, column 1. See page 194 of PBP."
This reverts commit 583abead1b.
The translator does not work anymore after this patch has been applied, so reverting it
2012-06-11 15:38:35 +02:00
1d53bd778b Talking Tech Support - Phase I - Followup 3 - Follow PBP
* Fixes violations of Perl Best Practices, where possible
* perltidy both scripts

http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-10 17:47:03 +02:00
47e4f3ed84 Talking Tech Support - Phase I - Followup - Fix Messaging Preferences
There is a flaw in C4::Members::Messaging::GetMessagingPreferences where
the system assumes that every transport will use the same letter. This
is not necessarily true. Even with the default preferences of just
'email' and 'sms', we should be able to have different letters
for each, as one has a maximum character length ( sms ) and one
does not. GetMessagingPreferences currently uses the letter code
of the last result of its query as the letter code for every transport type.

The returned data is a hashref with a key 'transport_types' that is
an array of transport_types this borrower has selected for the given
alert.

This commit modifies GetMessagingPreferences such that the
'transport_types' array is now a hash where the name of the transport
type is now a key to the value of the letter code set for that transport
type.

It also modifies code calling GetMessagingPreferences where necessary,
and as a side benefit will correctly get the letter codes for email
and sms, if they are defined differently.
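
Roughly, callers now read the letter code per transport type like this (a sketch
only; the message name and letter codes shown are illustrative):

  use C4::Members::Messaging;

  my $borrowernumber = 42;             # example patron
  my $prefs = C4::Members::Messaging::GetMessagingPreferences(
      { borrowernumber => $borrowernumber, message_name => 'Item_Due' }
  );
  my $email_letter = $prefs->{transport_types}->{email};   # e.g. 'DUE'
  my $sms_letter   = $prefs->{transport_types}->{sms};     # may differ, e.g. 'DUEDGST'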

http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>

In use in production by two libraries: Middletown and Washoe
who give their sign off but don't have git to do so.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-10 17:46:58 +02:00
Ian Walls
d29efac4f3 Talking Tech Support - Phase I
Implements support for Talking Tech I-tiva phone notification for OVERDUE, PREDUE and HOLD notifications.
Overdues respect triggers as configured for the patron's branch.
Predue and Holds notifications respect patron's messaging preference choices.
A new column for phone notification is added if the TalkingTechItivaPhoneNotification system preference is turned on

Record of phone messages being sent to patrons is added to the patron's Notices
tab; notice of success or failure can be retrieved from I-tiva.

See the TalkingTech.README for installation and set-up instructions.

Aside from the control system preference, and the necessary changes to Messaging Preferences
forms to make use of phone notifications, the bulk of the code resides in external
cronjobs.

TalkingTech_itiva_outbound.pl generates the Spec C file to send to I-tiva.  Actual transmission
of the file must be handled by the system administrator.

TalkingTech_itiva_inbound.pl processes the received Results file from I-tiva.  Getting the
file from I-tiva to Koha is the job of the system administrator, as well.

Both scripts have a --help option with full documentation.

The only necessary change to core Koha behavior is in C4::Letters::EnqueueLetter.  The return
value was changed from 0 or 1 (successful addition of letter to message_queue or not), to the actual
insert ID of the letter.  This was required by the outbound script to present a unique Transaction ID
for the notice added to the patron's record (so a 'sent' or 'failed' status could be updated).  Since
the dbh and sth are not shared, and the last_insert_id() command is table-specific, this should be thread-safe.
No changes are necessary to any parts of Koha, as all usage of EnqueueLetter currently ignores the return value.

To Test:

1. Turn on TalkingTechItivaPhoneNotification system preference
2. Verify that 'phone' is now a valid notification option for patrons on both staff and OPAC side
3. Attempt to set a 'phone' preference for PREDUE or HOLD messaging; attempt should succeed
4. Set up the patron for notices to triggers:
   a. include checked out items due in a range of days, including the value set up in their messaging preferences.
   b. place several holds, some in position, others waiting for pickup, others in transit.
   c. set the patron up to have overdues, overdue by a range of days that includes the delay values for
the patron's branch and categorycode
5. Run TalkingTech_itiva_outbound.pl --type=RESERVE --type=PREOVERDUE --type=OVERDUE --outfile=/tmp/talkingtechtest.csv

The resulting talkingtechtest.csv file should include all the items due in X days (where X is the patron's preference),
and none of the ones due in other increments. Similarly, overdue messages should be added for each item overdue by a delay
value as configured; overdues of other numbers of days should be ignored. Holds that are waiting for pickup or in transit should
have messages; those still pending should not.

Messages should be added to the patron's notices tab for each issue sent.  Verify these messages exist, and all Notices
tokens are replaced with appropriate information.

Repeat, this time with 4c making use of the default branch overdue triggers, instead of branch-specific triggers.

To test the inbound script, create a CSV with rows in the format "<<Message_id>>","<<SUCCESS or FAIL>>"
Message ID should correspond to the final column of the talkingtechtest.csv file (the transaction id) for the message.

Primary Authorship: Ian Walls
Additional modifications: Kyle M Hall

http://bugs.koha-community.org/show_bug.cgi?id=4246
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>

Tested and in use in production by two public libraries : Middletown
and Washoe. Both have given their sign off, but don't have git to
actually sign off.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-10 17:46:52 +02:00
583abead1b Subroutine prototypes used at line XXX, column 1. See page 194 of PBP.
(Severity: 5)

Note: Rebased on master 06/09/2012 by jcamins
Signed-off-by: Joy Nelson <joy@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-10 15:11:18 +02:00
8caef64680 Bug 6267: [SIGNED-OFF]Fix a typo
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-09 14:44:17 +02:00
Galen Charlton
daca5edc52 Bug 7818: -x option of rebuild_zebra.pl now works with DOM filter
One consequence is that the -x and -a options are no longer
mutually exclusive.

Also, because of the way that the GRS-1 SGML filter works, if you're
indexing multiple documents, you can't just wrap them in a document
element, but the DOM filter *requires* it.  Consequently, two
new config settings in koha-conf.xml are added to indicate the
Zebra filter in use so that the -x option of rebuild_zebra.pl
knows whether to wrap the exported records or not:

- bib_index_mode (defaults to 'grs1' if not specified)
- auth_index_mode (defaults to 'dom')
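
In koha-conf.xml this looks like the following (the values shown are just
examples):

  <!-- inside the <config> section -->
  <bib_index_mode>dom</bib_index_mode>
  <auth_index_mode>dom</auth_index_mode>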

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-09 11:44:09 +02:00
Galen Charlton
4559fa3a27 Bug 7818: utility to generate DOM indexing configs
misc/maintenance/make_zebra_dom_cfg_from_record_abs:
  generate a DOM filter Zebra index config from a GRS-1 config

Given a Zebra record.abs file containing a set of index definitions for
Zebra's GRS-1 filter, write an equivalent DOM filter configuration.

To generate the XSLT that is to be used by Zebra, run something like
the following on the output of this utility:

xsltproc ZEBRA_CFG_DIR/xsl/koha-indexdefs-to-zebra.xsl \
  biblio-koha-indexdefs.xml \
  > ZEBRA_CFG_DIR/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl

The above example assumes that the output of the program was named
biblio-koha-indexdefs.xml.

This commit also introduces Koha::Indexer::Utils, a new package for
miscellaneous routines that support Koha's indexing definitions.

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-09 11:44:00 +02:00
Galen Charlton
f50d433781 Bug 7818: update installer for biblio DOM indexing
Adds the necessary bits to enable DOM indexing for bib
records as an option during installation from source.

Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-09 11:43:56 +02:00
Paul Poulain
60186fa42f Merge remote-tracking branch 'origin/new/bug_6858' 2012-05-28 16:35:53 +02:00
Matthias Meusburger
4dc4563396 Bug 6858: Adds staticfines.pl for static fines processing
Add a tool to calculate static fines. For example, 7 days late = 1€ fixed fine

Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
2012-05-28 16:29:48 +02:00
Chris Cormack
dd864696de Bug 7213 : Follow up fixing license information
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-05-15 15:44:33 +02:00
Dobrica Pavlinusic
63bc7ebc39 Bug 7213 - simple /svc/ HTTP example
Simple command-line client which can authenticate itself to Koha,
get a MARC XML record based on a biblio number, and update the record.

This script can also be used as a module, using require "koha-svc.pl"
from other scripts which can implement MARC XML creation or parsing.

This is a follow-up version which now uses the Content-type: text/xml
header when using the POST method, to be in sync with the documentation at
http://wiki.koha-community.org/wiki/Koha_/svc/_HTTP_API

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-05-14 18:22:17 +02:00
Paul Poulain
5c32a9f811 Release notes for 3.8.0 2012-04-23 12:42:07 +02:00
Paul Poulain
3a4ea95fe6 fixing missing " in french syspref file 2012-04-23 12:40:16 +02:00
f47ad12f9d Koha 3.8.0 Translation Update 2012-04-21 20:02:22 +02:00
MJ Ray
1aef5ab44e Bug 6267 custom http user-agent in check-url.pl (fix for books.google.com 401 error)
Patch by Judit with a small change to the help wording.
Sponsored by CALYX information essentials.

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-04-19 12:08:17 +02:00
Chris Cormack
e3669815a0 Bug 7613 follow up to fix perlcritic errors
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-04-06 17:26:36 +02:00
Srdjan
12ff7355bb bug_7613: OCLC Connexion gateway
svc/import_bib:
* takes a POST request with parameters in the URL and MARC XML as DATA
* pushes MARC XML to an import batch queue of type 'webservice'
* returns status and imported record XML
* is a drop-in replacement for svc/new_bib

misc/cronjobs/import_webservice_batch.pl:
* a cron job for processing import batch queues of type 'webservice'
* batches can also be processed through the UI

misc/bin/connexion_import_daemon.pl:
* a daemon that listens for OCLC Connexion requests and is compliant
  with OCLC Gateway spec
* takes request with MARC XML
* takes import batch params from a config file and forwards the lot to
  svc/import_bib
* returns status

ImportBatches:
* Added new import batch type of 'webservice'
* Changed interface to AddImportBatch() - now it takes a hashref
* Replaced batch_type = 'batch' with
  batch_type IN ( 'batch', 'webservice' ) in some SELECTs

Signed-off-by: MJ Ray <mjr@phonecoop.coop>
2012-04-06 17:26:20 +02:00
Paul Poulain
b7a6071cf5 bug 7641 follow-up: activate use strict (see coding guidelines) 2012-03-29 15:10:16 +02:00
f446b3d03d Bug 7641: Suspend Reserves
Adds the ability to suspend reserves. The new system preference
AutoResumeSuspendedHolds enables the ability to set a date for
a suspended hold to automatically be resumed.

When a hold is suspended, it will continue to increase in priority
as the holds above it are fulfilled. If the first holds in line
to be filled are suspended, the first non-suspended hold in line
will be used when an item can fulfill a hold that has been placed.

http://bugs.koha-community.org/show_bug.cgi?id=7641
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>

Tested with the preference on and off:
1. placed several holds in the staff client
2. suspended some with a date
3. suspended some without a date
4. triggered hold message by checking in for hold with suspensions
5. the suspended hold was skipped as it should be
6. tested suspending holds in the OPAC w and w/out dates
7. ran the cron to clear suspensions with dates

All the above tests worked as expected. Signing off.
2012-03-29 14:37:49 +02:00
Paul Poulain
cfa444d583 bug 6858 follow-up indenting with spaces 2012-03-28 18:10:40 +02:00
Matthias Meusburger
15c8a453fc Bug 6858: Adds staticfines.pl for static fines processing
Add a tool to calculate static fines. For example, 7 days late = 1€ fixed fine

Signed-off-by: Delaye Stephane <stephane.delaye@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-28 17:59:50 +02:00
Paul Poulain
0486d0c6b7 Merge remote-tracking branch 'origin/new/bug_6199' 2012-03-28 17:54:55 +02:00
Robin Sheat
b96c8b7ffa Bug 6199 - allow bulkmarcimport.pl to remove duplicate barcodes
This adds the -dedupbarcode option that allows bulkmarcimport to erase
the barcode but keep the item for any items it finds with duplicate
barcodes.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-28 17:30:54 +02:00
Jonathan Druart
f35a1cce68 Bug 7470: Babeltheque integration
3 features:
- adds social network information in search results
- adds babeltheque data in opac-detail
- adds social network links in opac-detail too (google+, twitter, mail
  and co.)
2012-03-26 14:24:04 +02:00
Paul Poulain
d029ec835d Bug 7780: make silent/verbose flag for translation installing
This patch deals with the -v flag that you can pass to the translate script.
If you run without -v, the process should be silent;
if you run with -v, the process should be verbose.

Signed-off-by: Frédéric Demians <f.demians@tamil.fr>

I've refactored your patch to handle verbosity directly via a translator
attribute, rather than with a parameter which has to be sent to each object
method call.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-26 11:01:40 +02:00
Chris Cormack
0c40ff9f98 Merge remote-tracking branch 'kc/master' into merged_5549
Fixed conflicts

Conflicts:
	catalogue/moredetail.pl
	installer/data/mysql/updatedatabase.pl
	koha-tmpl/intranet-tmpl/prog/en/modules/admin/smart-rules.tt
2012-03-22 09:36:55 +13:00
Matthias Meusburger
d91bb113f0 Bug 6025: Adds a script that re-create missing statistics from issues and old_issues tables
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-20 17:02:33 +01:00
Julian Maurice
3b0d4e04e0 Bug 6440: Implement OAI-PMH Sets
New sql tables:
  - oai_sets: contains the list of sets, described by a spec and a name
  - oai_sets_descriptions: contains a list of descriptions for each set
  - oai_sets_mappings: conditions on marc fields to match for biblio to be
    in a set
  - oai_sets_biblios: list of biblionumbers for each set

New admin page: allows configuring sets:
  - Creation, deletion, modification of spec, name and descriptions
  - Define mappings which will be used for building oai sets

Implements OAI Sets in opac/oai.pl:
  - ListSets, ListIdentifiers, ListRecords, GetRecord

New script misc/migration_tools/build_oai_sets.pl:
  - Retrieve marcxml from all biblios and test if they belong to defined
    sets. The oai_sets_biblios table is then updated accordingly

New system preference OAI-PMH:AutoUpdateSets. If on, update sets
automatically when a biblio is created or updated.

Use OPACBaseURL in oai_dc xslt
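
Once sets are defined, they can be queried with standard OAI-PMH requests against
opac/oai.pl, for example (the base URL prefix and the MYSET spec are placeholders;
oai_dc is the standard OAI-PMH Dublin Core metadata prefix):

  .../opac/oai.pl?verb=ListSets
  .../opac/oai.pl?verb=ListRecords&metadataPrefix=oai_dc&set=MYSET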
2012-03-20 11:38:26 +01:00
24dc37a490 Bug 7526 - longoverdue.pl leaves items marked as lost as still checked out to patron
When the longoverdue.pl script is run and it marks an item as lost (using
LostItem()), it fails to remove the item from the borrower record. So, the
item is marked as lost, but is also still listed as checked out to the
borrower.

This commit adds the command line parameter --mark-returned. If used,
longoverdue.pl will remove lost items from the borrower's record.
Functionality will remain the same if it is not used.

Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>

http://bugs.koha-community.org/show_bug.cgi?id=7426
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-20 11:22:52 +01:00
Colin Campbell
624edf3dba Bug 5549 : Refactor fines.pl
Clean code in fines.pl to remove unnecessary complexity;
remove constructs now thought suspect or
not good practice
2012-03-20 13:27:12 +13:00
Colin Campbell
39d1b7e61b Bug 5549 : Overdues : Handle some date comparison and display issues 2012-03-20 13:21:19 +13:00
Colin Campbell
d55405047b Bug 5549 : Fix calculation of duedates in fines.pl and advance_notices.pl
Cleaned up some no longer used parameters in
Overdues::CalcFine
2012-03-20 13:20:01 +13:00
Paul Poulain
ba6c8485ca Merge remote-tracking branch 'origin/new/bug_7368' 2012-03-16 11:50:42 +01:00
8a1ce25939 7368 Typo in cart_to_shelf
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-16 11:50:29 +01:00
Srdjan Jankovic
a9ded4fa00 bug_7001: Issue and Reserve slips are notices.
Branches can have their own version of notices - added branchcode to
letter table.
Support html notices - added is_html to letter table.
Support for borrower attributes in templates.
GetPreparedletter() is the interface for compiling letters (notices).
Sysprefs for notice and slips stylesheets
Added TRANSFERSLIP to the letters

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-09 10:11:20 +01:00
Jared Camins-Esakov
5207699f98 signed off Bug 7284: Authority matching improvements
Squashed patch incorporating all previous patches (there is no functional
change compared to the previous version of this patch, this patch merely
squashes the original patch and follow-up, and rebases on latest master).

=== TL;DR VERSION ===
*** Installation ***
1. Run installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt1
and installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt2
2. Make sure you copy the following files from kohaclone to koha-dev:
etc/zebradb/authorities/etc/bib1.att,
etc/zebradb/marc_defs/marc21/authorities/authority-koha-indexdefs.xml,
etc/zebradb/marc_defs/marc21/authorities/authority-zebra-indexdefs.xsl,
etc/zebradb/marc_defs/marc21/authorities/koha-indexdefs-to-zebra.xsl, and
etc/zebradb/marc_defs/unimarc/authorities/record.abs
3. Run misc/migration_tools/rebuild_zebra.pl -a -r

*** New sysprefs ***
* AutoCreateAuthorities
* CatalogModuleRelink
* LinkerModule
* LinkerOptions
* LinkerRelink
* LinkerKeepStale

*** Important notes ***
You must have rebuild_zebra processing the zebraqueue for bibs when testing this
patch.

=== DESCRIPTION ===

*** Cataloging module ***
* Added an additional box to the authority finder plugin for "Heading match,"
  which consults not just the main entry but also See-from and See-also-from
  headings.

* With this patch, the automatic authority linking will actually work properly
  in the cataloging module. As Owen pointed out while testing the patch,
  though, longtime users of Koha will not be expecting that. In keeping with
  the principles of least surprise and maximum configurability, a new syspref,
  CatalogModuleRelink makes it possible to disable authority relinking in the
  cataloging module only (i.e. leaving it enabled for future runs of
  link_bibs_to_authorities.pl).  Note that though the default behavior matches
  the current behavior of Koha, it does not match the intended behavior.
  Libraries that want the intended behavior rather than the current behavior
  will need to adjust the CatalogModuleRelink syspref.

*** misc/link_bibs_to_authorities.pl ***
Added the following options to the misc/link_bibs_to_authorities.pl script:
--auth-limit        Only process headings belonging to authorities that match
                    the user-specified WHERE clause.
--bib-limit         Only process those bib records that match the
                    user-specified WHERE clause.
--commit            Commit the results to the database after every N records
                    are processed.
--link-report       Display a report of all the headings that were processed.
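
For example, these options can be combined (assuming --commit takes the number
of records to process between commits, as described above):

  misc/link_bibs_to_authorities.pl --verbose --link-report --commit 100 \
      --bib-limit="biblionumber < 1000"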

Converted misc/link_bibs_to_authorities.pl to use POD.

Added a detailed report of headings that linked, did not link, and linked
in a "fuzzy" fashion (the exact semantics of fuzzy are up to the individual
linker modules) during the run.

*** C4::Linker ***
Implemented new C4::Linker functionality to make it possible to easily add
custom authority linker algorithms. Currently available linker options are:
* Default: retains the current behavior of only creating links when there is
  an exact match to one and only one authority record; if the 'broader_headings'
  option is enabled, it will try to link headings to authority records for
  broader headings by removing subfields from the end of the heading (NOTE:
  test the results before enabling broader_headings in a production system
  because its usefulness is very much dependent on individual sites' authority
  files)
* First Match: based on Default, creates a link to the *first* authority
  record that matches a given heading, even if there is more than one
  authority record that matches
* Last Match: based on Default, creates a link to the *last* authority
  record that matches a given heading, even if there is more than one record
  that matches

The API for linker modules is very simple. All modules should implement the
following two functions:
<get_link ($field)> - return the authid for the authority that should be
linked to the provided MARC::Field object, and a boolean to indicate whether
the match is "fuzzy" (the semantics of "fuzzy" are up to the individual plugin).
In order to handle authority limits, get_link should always end with:
    return $self->SUPER::_handle_auth_limit($authid), $fuzzy;

<flip_heading ($field)> - return a MARC::Field object with the heading flipped
to the preferred form. At present this routine is not used, and can be a stub.
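
A minimal sketch of a linker module implementing this API (the package name is
hypothetical, inheriting from C4::Linker is assumed from the section above, and
a real module would perform an authority search inside get_link):

  package C4::Linker::NeverMatch;    # hypothetical example module

  use strict;
  use warnings;
  use base qw(C4::Linker);

  # Return the authid the heading should link to, plus a "fuzzy" flag.
  # This stub never finds a match.
  sub get_link {
      my ( $self, $field ) = @_;
      my $authid;    # a real module would search the authority file here
      my $fuzzy = 0;
      return $self->SUPER::_handle_auth_limit($authid), $fuzzy;
  }

  # Not used at present, so returning the field unchanged is sufficient.
  sub flip_heading {
      my ( $self, $field ) = @_;
      return $field;
  }

  1;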

Made the linking functionality use the SearchAuthorities in C4::AuthoritiesMarc
rather than SimpleSearch in C4::Search. Once C4::Search has been refactored,
SearchAuthorities should be rewritten to simply call into C4::Search. However,
at this time C4::Search cannot handle authority searching. Also fixed numerous
performance issues in SearchAuthorities and the Linker script:
* Correctly destroy ZOOM recordsets in SearchAuthorities when finished. If left
  undestroyed, efficiency appears to approach O(log n^n)
* Add an optional $skipmetadata flag to SearchAuthorities that can be used to
  avoid additional calls into Zebra when all that is wanted are authority
  records and not statistics about their use

*** New sysprefs ***
* AutoCreateAuthorities - When this and BiblioAddsAuthorities are both turned
  on, automatically create authority records for headings that don't have
  any authority link when cataloging. When BiblioAddsAuthorities is on and
  AutoCreateAuthorities is turned off, do not automatically generate authority
  records, but allow the user to enter headings that don't match an existing
  authority. When BiblioAddsAuthorities is off, this has no effect.
* CatalogModuleRelink - when turned on, the automatic linker will relink
  headings when a record is saved in the cataloging module when LinkerRelink
  is turned on, even if the headings were manually linked to a different
  authority by the cataloger. When turned off (the default), the automatic
  linker will not relink any headings that have already been linked when a
  record is saved.
* LinkerModule - Chooses which linker module to use for matching headings
  (current options are as described above in the section on linker options:
  "Default," "FirstMatch," and "LastMatch")
* LinkerOptions - A pipe-separated list of options to set for the authority
  linker (at the moment, the only option available is "broader_headings," which
  is described below)
* LinkerRelink - When turned on, the linker will confirm the links for headings
  that have previously been linked to an authority record when it runs. When
  turned off, any heading with an existing link will be ignored.
* LinkerKeepStale - When turned on, the linker will never *delete* a link to an
  authority record, though, depending on the value of LinkerRelink, it may
  change the link.

*** Other changes ***
* Cleaned up authorities code by removing unused functions and adding
  unimplemented functions and added some unit tests.

* This patch also modifies the authority indexing to remove trailing punctuation
  from Match indexes.

* Replace the old BiblioAddAuthorities subroutines with calls into the new
  C4::Linker routines.

* Add a simple implementation for C4::Heading::UNIMARC. (With thanks to F.
  Demians, 2011.01.09) Correct C4::Heading::UNIMARC class loading. Create
  biblio tag to authority types data structure at initialization rather than
  querying DB.

* Ran perltidy on all changed code.

*** Linker Options ***
Enter "broader_headings" in LinkerOptions. With this option, the linker will
try to match the following heading as follows:
=600  10$aCamins-Esakov, Jared$xCoin collections$vCatalogs$vEarly works to 1800.

First: Camins-Esakov, Jared--Coin collections--Catalogs--Early works to 1800
Next: Camins-Esakov, Jared--Coin collections--Catalogs
Next: Camins-Esakov, Jared--Coin collections
Next: Camins-Esakov, Jared (matches! if a previous attempt had matched, it
would not have tried this)

This is probably relevant only to MARC21 and LCSH, but could potentially be of
great use to libraries that make heavy use of floating subdivisions.

=== TESTING PLAN ===

Note: all of these tests require that you have some authority records,
preferably for headings that actually appear in your bibliographic data. At
least one authority record must contain a "see from" reference (remember which
one contains this, as you'll need it for some of the tests). The number shown
in the "Used in" column in the authority module is populated using Zebra
searches of the bibliographic database, so you *must* have
rebuild_zebra.pl -b -z [-x] running in cron, or manually run it after running
the linker.
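
A plausible crontab entry for this (user, path, and schedule are illustrative):

  */10 * * * * koha /path/to/koha/misc/migration_tools/rebuild_zebra.pl -b -z >/dev/null 2>&1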

*** Testing the Heading match in the cataloging plugin ***
1.  Create a new record, and open the cataloging plugin for an
    authority-controlled field.
2.  Search for an authority by entering the "see from" term in the Heading Match
    box
3.  Confirm that the appropriate heading shows up
4.  Search for an authority by entering the preferred heading into the Main
    entry or Main entry ($a only) box (i.e., repeat the procedure you usually
    use for cataloging, whatever that may be)
5.  Confirm that the appropriate heading shows up

*** Testing the cataloging interface ***
6.  Turn off BiblioAddsAuthorities
7.  Confirm that you cannot enter text directly in an authority-controlled field
8.  Confirm that if you search for a heading using the authority control plugin
    the heading is inserted (note, however, that this patch does not AND IS NOT
    INTENDED TO fix the bugs in the authority plugin with duplicate subfields;
    those are wholly out of scope; this check is for regressions)
9.  Turn on BiblioAddsAuthorities and AutoCreateAuthorities
10. Confirm that you can enter text directly into an authority-controlled field,
    and if you enter a heading that doesn't currently have an authority record,
    an authority record stub is automatically created, and the heading you
    entered linked
11. Confirm that if you enter a heading with only a subfield $a that fully
    *matches* an existing heading (i.e. the existing heading has only
    subfield $a populated), the authid for that heading is inserted into
    subfield $9
12. Confirm that if you enter a heading with multiple subfields that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
13. Turn on BiblioAddsAuthorities and turn off AutoCreateAuthorities
14. Confirm that you can enter text directly into an authority-controlled field,
    and if you enter a heading that doesn't currently have an authority record,
    an authority record stub is *not* created
15. Confirm that if you enter a heading with only a subfield $a that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
16. Confirm that if you enter a heading with multiple subfields that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
17. Create a record and link an authority record to an authorized field using
    the authority plugin.
18. Save the record. Ensure that the heading is linked to the appropriate
    authority.
19. Open the record. Change the heading manually to something else, leaving
    the link. Save the record.
20. Ensure that the heading remains linked to that same authority.
21. Change CatalogModuleRelink to "on."
22. Open the record. Use the authority plugin to link that heading to the
    same authority record you did earlier.
23. Save the record. Ensure that the heading is linked to the appropriate
    authority.
24. Open the record. Change the heading manually to something else, leaving
    the link. Save the record.
25. Ensure that the heading is no longer linked to the old authority record.

*** Testing link_bibs_to_authorities.pl ***
26. Set LinkerModule to "Default," turn on LinkerRelink and
    BiblioAddsAuthorities, and turn AutoCreateAuthorities and
    LinkerKeepStale off
27. Edit one bib record so that an authority controlled field that has already
    been linked (i.e. has data in $9) has a heading that does not match any
    authority record in your database
28. Run misc/link_bibs_to_authorities.pl --link-report --verbose --test (you may
    want to pipe the output into less or a file, as the result is quite a lot of
    information)
29. Look over the report to check that the headings you have authority records
    for are reported as matched, that the heading you modified in step 27 is
    reported as "unlinked," and that no changes were actually made to
    the database (to check this, look at the bib record you edited earlier, and
    confirm that the authid in the field you edited hasn't changed)
30. Run misc/link_bibs_to_authorities.pl --link-report --verbose (you may want
    to pipe the output into less or a file, as the result is quite a lot of
    information)
31. Check that the heading you modified has been unlinked
32. Change the modified heading back to whatever it was, but don't use the
    authority control plugin to populate $9
33. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
34. Confirm that the heading has been linked to the correct authority record
35. Turn LinkerKeepStale on
36. Change that heading to something else
37. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
38. Confirm that the $9 has not changed
39. Turn LinkerKeepStale off
40. Create two authorities with the same heading
41. Run misc/migration_tools/rebuild_zebra.pl -a -z
42. Enter that heading into the bibliographic record you are working with
43. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
44. Confirm that the heading has not been linked
45. Change LinkerModule to "FirstMatch"
46. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
47. Confirm that the heading has been linked to the first authority record it
    matches
48. Change LinkerModule to "LastMatch"
49. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
50. Confirm that the heading has been linked to the second authority record it
    matches
51. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --auth-limit="authid=${AUTH}" (replacing ${AUTH} with an authid)
52. Confirm that only that heading is displayed in the report, and only those
    bibs with that heading have been changed

If all those things worked, good news! You're ready to sign off on the patch
for bug 7284.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master and squashed follow-up, 16 February 2012
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master, 21 February 2012

Signed-off-by: schuster <dschust1@gmail.com>
2012-03-07 17:34:11 +01:00
35b344b609 Bug 6895 Diacritics in Pootle/po files are broken in source text
To test:

  git checkout 3.6.x
  cd misc/translator
  ./tmpl_process3.pl update -i ../../koha-tmpl/opac-tmpl/prog/en/ \
    -s ./po/fr-FR-i-opac-t-prog-v-3006000.po -r

  po/fr-FR-i-opac-t-prog-v-3006000.po now contains broken diacritics, for
  example in NORMARC strings

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Fixes a regression caused by bug 6752.
2012-02-27 11:21:50 +01:00
Paul Poulain
1fd8c8a4de Bug 7246 add offset/length and where options to rebuild_zebra
This patch reimplements a feature that is on biblibre/master for Koha-community/master

It adds the following parameters:
* offset = the record offset; e.g. 1000 to start rebuilding at the 1000th record of your database
* length = how many records to export; e.g. 400 to export only 400 records
* where = add a where clause to rebuild only a given itemtype, or anything else you want to filter on

Another improvement resulting from the offset & length limits is rebuild_zebra_sliced.zsh,
which will be submitted in another patch.
rebuild_zebra_sliced slices your whole database into small chunks and, if something goes
wrong for a given slice, slices that slice and repeats until it reaches a slice size of 1,
showing which record in your database is broken.
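
For example (option spellings assumed from the parameter names above), to
export 400 records starting at the 1000th, limited to one item type:

  misc/migration_tools/rebuild_zebra.pl -b --offset 1000 --length 400 \
      --where "itemtype='BK'"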

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Removed mention of -l option for limiting number of items exported, as requested
by QA manager. This can be re-added in a later patch.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-02-17 10:59:23 +01:00
Chris Cormack
43b3fb9701 Bug 7238 : Shifting SIPconfig.xml to the etc dir and the scripts to misc/bin
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
The scripts run with the caveat that you must specify the path to SIPconfig.xml. The followup previously attached should take care of that issue.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-02-03 15:15:22 +01:00
Paul Poulain
e8b83c665a Bug 7157 follow-up: j2a.pl executable again
This patch just updates the permissions of the script, which must be executable
2012-01-27 12:18:25 +01:00
Liz Rea
fe1a642a1a Bug 7157 - Improve the j2a.pl cronjob
- Calculates the update date based on the upper age limit defined in the patron categories.
- Allows libraries to work on all branches or only one.
- Allows libraries to specify which Adult patron category to update child categories to.
- Allows libraries to specify a single Child patron category to update to an adult category.
- Has a test mode to display what transforms would be done on the database without executing the changes.

Includes improved help, copyright statement, and uses warnings. Also incorporates Paul's suggestions regarding --help and --man, changes -fromcat and -tocat to -f and -t, and removes a redundant update to categorycode (per M. deRooy).

To test:

Create two patron categories, a child and an adult category. Make sure they
have an upper age limit.

Create or modify some patrons in multiple branches that fall into the category
of "my birthdate is less than or equal to today's date minus the upper age
limit"

1. Run the script with no flags - nothing should happen; it will suggest you try the --help flag.
2. Run the script with the --help flag - you should see the help
3. Run the script with the -f=<child category> -t=<adult category> -v -n - should show you results from all branches but take no action and tell you what its computations are.
4. Run the script with the -f=<child category> -t=<adult category> -b=<branchcode> -v -n - should show you results from your specified branch, but take no action and tell you what its computations are.
5. Run the script with the -f=<child category> -t=<adult category> -v -b=<branchcode> - should show you the computations and tell you how many patrons were modified in your single branch. It will not show you the information on which patrons were updated.
6. Run the script with the -f=<child category> -t=<adult category> -v - should show you the computations and tell you how many patrons were modified across all branches.
7. Run the script without the -v flag, if you care what the non-verbose output looks like.
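
For example (the category and branch codes are purely illustrative), a verbose
dry run limited to one branch would look like:

  j2a.pl -f=J -t=PT -b=CPL -v -n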

Fixed in this revision: Known limitation - if you give it an unknown tocat, it will fail with a rather ugly error.

Minor changes to the commit message to reflect new longopts (which I missed the last time)
There is more this script could do; please feel free to take it and run with it.

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>

Bug 7157 : Follow up, fixing FSF address

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-27 12:17:29 +01:00
Colin Campbell
263dded818 Bug 6752: Be stricter with utf-8 encoding of output
Use :encoding(UTF-8) rather than :utf8 for stricter
encoding.
Marking output as ':utf8' only flags the data as UTF-8;
using :encoding(UTF-8) also checks that it is valid UTF-8
(see binmode in perlfunc for more details).
In accordance with the robustness principle, input
filehandles have not been changed, as code may make
the undocumented assumption that invalid UTF-8 is present
in the input.
Fixes errors reported by t/00-testcritic.t.
Where feasible, some filehandles have been made lexical rather than
reusing global filehandle variables.
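
A minimal illustration of the difference (the file name is hypothetical):

  # ':utf8' merely flags the handle as UTF-8; ':encoding(UTF-8)' also
  # validates that what passes through it is well-formed UTF-8.
  use strict;
  use warnings;

  open my $out, '>:encoding(UTF-8)', 'report.txt'    # hypothetical file name
      or die "Cannot open report.txt: $!";
  binmode STDOUT, ':encoding(UTF-8)';
  print {$out} "some text\n";
  close $out or die "Cannot close report.txt: $!";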

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-27 12:11:06 +01:00
Dobrica Pavlinusic
90d68d6f5c Bug 7247 - rebuild_zebra.pl -v should show all Zebra log output
Currently, the -v option resets Zebra log output to the default system values.

This produces the amount of logging specified in the system defaults, which is
usually too low for debugging.

This change explicitly forces all Zebra log output, which creates much more
chatter, so it is only triggered at verbosity level 2.

Test scenario:
1. pick koha site to reindex
2. use -v -v options to rebuild_zebra.pl to see additional output
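
For example (the -b -z flags are simply the usual zebraqueue run):

  misc/migration_tools/rebuild_zebra.pl -b -z -v -v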

Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Verified help corrections and loglevel 2 output vs. loglevel 1 output. No issues found.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-17 17:31:25 +01:00
73c5cb3277 Bug 7240: Cleaning up import tables and action_logs
This patch lets cleanup_database also purge older records from the (five) import tables and the action_logs table.
Two new command line parameters are introduced: --import and --logs.
If no number of days is specified for --zebraqueue, --import, or --logs, they default to 30, 60, and 180 days respectively.
I did not add a default for --sessdays, because this parameter cannot be seen separately from the --sessions parameter.
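
For example (argument syntax assumed from the description above), a nightly run
purging sessions plus the three categories at their default ages:

  misc/cronjobs/cleanup_database.pl --sessions --zebraqueue 30 --import 60 --logs 180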

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>

Adds new parameters and code, does not change existing behaviour

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-13 12:20:51 +01:00
Marc Balmer
c9c6bbdea8 Bug 7356 - Fix various typos and mis-spellings
Fix typos: the the -> the, wether -> whether, developper -> developer.

http://bugs.koha-community.org/show_bug.cgi?id=7356
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-13 11:51:26 +01:00
Sophie Meynieux
940652a6c7 BUG 5607 : Adds parsing of issues fields in overdue letters
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch makes it possible to add fields from the issues table to overdue notices.

Template used for testing:
<item>"<<biblio.title>>" by <<biblio.author>>, <<items.itemcallnumber>>, Barcode: <<items.barcode>> , Checkout date: <<issues.issuedate>>, Due date: <<issues.date_due>> Fine: <fine>GBP</fine> Checkout date from items: <<items.onloan>></item>

Possible improvements:
- Dates are not formatted according to dateformat system preference

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-27 18:35:21 +01:00
Chris Cormack
525c7bc4af Bug 7370 : Missing use call
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-27 18:22:42 +01:00
Sophie Meynieux
1be4678a57 Bug 6292 : Followup 2. Several letters were generated if a borrower had overdues with different due_date values triggering the same level
This patch fixes the SQL query that builds the list of borrowers

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-16 17:45:39 +01:00
Chris Cormack
410975a7f7 Bug 6292 : Overdue notices not being generated when borrower had an overdue older than the max value of the notice triggers
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
2011-12-16 17:45:37 +01:00
Sophie Meynieux
d86d62adba Bug 6292 followup
The selection of items to be listed in an overdue notice included
both limits (upper and lower), so items with an overdue equal
to a limit appeared on both notices. This patch fixes this by
including the lower limit and excluding the upper limit in the selection.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
2011-12-16 17:45:36 +01:00
Marc Balmer
d9f7922ad7 Bail out early if an invalid language is used.
Signed-off-by: Marc Balmer <marc@msys.ch>

http://bugs.koha-community.org/show_bug.cgi?id=7346
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-15 18:33:36 +01:00
Ian Walls
1f5f4981c0 Bug 7326: longoverdue.pl hardcoded to 366 days maximum
Adds --maxdays command line flag to longoverdue.pl to allow the user to specify
their own $endrange value.  Because sometimes 366 isn't enough!

Also adds help documentation for both --quiet and --maxdays params
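
For example (the --lost days/value pair is illustrative), to consider items up
to 1000 days overdue:

  misc/cronjobs/longoverdue.pl --lost 900=2 --maxdays 1000 --quiet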

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>

Simple change, works well

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-15 17:13:24 +01:00
6dc234a001 7185 Fixing typo in release notes
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Trivial typo fix in text file.
2011-12-15 13:33:12 +01:00