This patch modifies GetSoonestRenewDate() so that it returns
undef if the patron, item, or loan cannot be found. This
better reflects how the routine is actually used, as none of its
callers checked the second return value containing an error
code.
This patch also updates the POD to match.
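For illustration, a minimal sketch of how a caller might treat the revised
return value (the argument list and the DateTime return are assumptions based
on the description above, not taken from the patch):
    use C4::Circulation;
    # $borrowernumber and $itemnumber identify the loan being checked.
    # Returns the soonest renewal date on success, or undef when the patron,
    # item, or loan cannot be found; there is no second error code to check.
    my $soonest = C4::Circulation::GetSoonestRenewDate( $borrowernumber, $itemnumber );
    if ( defined $soonest ) {
        print "Renewal possible from " . $soonest->ymd . "\n";
    }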
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch modifies CanBookBeRenewed so that, based on
issuingrules.norenewalbefore, a new error "too_soon" can be returned.
It also adds a new subroutine, GetSoonestRenewDate.
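A rough sketch of the intended calling pattern, assuming the usual
($ok, $error) return convention of CanBookBeRenewed:
    use C4::Circulation;
    my ( $ok, $error ) = C4::Circulation::CanBookBeRenewed( $borrowernumber, $itemnumber );
    if ( !$ok && $error eq 'too_soon' ) {
        # Renewal is blocked by issuingrules.norenewalbefore; show the
        # soonest renewal date instead of offering a renew link.
        my $soonest = C4::Circulation::GetSoonestRenewDate( $borrowernumber, $itemnumber );
    }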
To test:
1) Create an issuing rule with "No renewal before" set to value X
and "Unit" set to days.
2) Test the following steps for both:
Home > Patron > Patron details
Home > Circulation > Checkouts
3) On the checkout page, test for today's issues as well as previous
issues. (Check something out on one day and something else on the
next day, then do the testing.)
4) Confirm that items can't be renewed if current date is more than
X days before due date.
5) Confirm that the date and time of the soonest possible renewal are
displayed in the format specified by global sysprefs "dateformat"
and "TimeFormat".
6) Confirm that items can be renewed if "No renewal before" is
undefined or the current date is X days or fewer before the due date.
7) Confirm that if the number of allowed renewals is exceeded
"Not renewable" is displayed, no matter what "No renewal before"
is set to.
8) Test the same things with "Unit" set to hours.
Sponsored-by: Hochschule für Gesundheit (hsg), Germany
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
The WHERE clause should not erase $query.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This adds the ability to specify whether staff, OPAC,
or slip news entries apply to all libraries or just a
particular library.
With the branch parameter added to key functions in
C4/NewsChannels.pm, the corresponding function calls in C4/Members.pm,
mainpage.pl, opac/opac-main.pl, tools/koha-news.pl, and
t/db_dependent/NewsChannels.t needed to be updated.
Some license texts were updated.
Templates were modified to display the selected branches and to allow
for their entry and editing.
TEST PLAN
---------
1) Having logged into the staff client, is the news displaying
correctly? Have you entered a news item which should not
display for the branch of the logged-in user?
2) Find a patron (with some items checked out?)
3) Print a slip
- News which is labelled 'All Branches' or for the same branch
as the one printing the slip should display on the slip.
- THIS DOES NOT AFFECT QUICK SLIPS
4) Home -> Tools -> News
- Can you edit a news item?
- Does the change save correctly?
- Can you filter based on location and branch correctly?
- Can you add a new entry correctly?
- Can you delete an entry correctly?
5) Open an OPAC client.
- Does only the news for all branches display?
6) Log into the OPAC client.
- Does the news for all branches and the specific branch display?
7) prove -v t/db_dependent/NewsChannels.t
- Does it run and all succeed?
- Does the code seem to catch the required cases?
8) Comparing the patched and unpatched versions of files affected,
are the license changes missing anything?
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Changed the add and update functions to use a hash reference
for the parameter, so that adding or subtracting parameters
should be easier. Added some POD for the add_opac_news and
upd_opac_news functions, so that developers would know how to
call them.
The hashref changes resulted in being able to return 0 for
failure and 1 for success. This meant adding a couple tests
to the test file.
While testing, a logic problem also turned up with '' meaning 'all':
selecting 'all' only showed entries set for all, and excluded those
set for particular languages or other interfaces.
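For instance, a hashref call might look like the following; the exact key
names are illustrative assumptions rather than a definitive list:
    use C4::NewsChannels;
    my $ok = add_opac_news(
        {
            title          => 'Holiday opening hours',
            new            => 'We close at noon on Friday.',  # the news text
            lang           => '',                             # '' means all interfaces/languages
            expirationdate => '2014-12-31',
        }
    );
    print "Saved\n" if $ok;    # 1 on success, 0 on failure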
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
"When all the data has been fetched from a SELECT statement,
the driver will automatically call finish for you. So you should
not call it explicitly except when you know that you've not
fetched all the data from a statement handle and the handle
won't be destroyed soon."
(http://search.cpan.org/~timb/DBI-1.627/DBI.pm#finish)
All the $sth variables were scoped within the functions,
and would be destroyed immediately. Additionally, there was
one finish call after a SELECT that fetched only a single idnew,
so it was not necessary.
TEST PLAN
---------
1) prove -v t/db_dependent/NewsChannels.t
2) apply patch
3) prove -v t/db_dependent/NewsChannels.t
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
ok 1 - use C4::NewsChannels;
ok 2 - Successfully added the first dummy news item!
ok 3 - Successfully added the second dummy news item!
ok 4 - Successfully updated second dummy news item!
ok 5 - Successfully tested get_opac_new id1!
ok 6 - Successfully tested get_opac_new id2!
ok 7 - Successfully tested get_opac_news!
ok 8 - Successfully tested GetNewsToDisplay!
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Grabbed the current license from
http://wiki.koha-community.org/wiki/Coding_Guidelines#Licence
and changed the use strict; use warnings; lines into
use Modern::Perl instead.
TEST PLAN
---------
1) Log into staff client.
- Does news look okay?
2) Apply patch
3) Refresh staff client.
- Does news look the same?
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Safe no-op action
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes an issue originally reported by bug 11702.
RM note: the patch is clear enough and doesn't break existing tests,
but on the other hand, I have been completely unable to reproduce
the original issue.
To test:
[1] Verify that prove -v t/db_dependent/Holds.t passes
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When calling C4::Context::Zconn twice with different parameters,
the same ZOOM::Connection object is returned both times (the parameters
of the second call are not used). This patch fixes that.
This happens in part because the connection cache is keyed on the server
name only. The patch corrects this by keying on all parameters.
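Conceptually, the fix keys the connection cache on every parameter instead
of the server name alone; a hedged sketch (variable names are assumptions,
not the actual Zconn code):
    # One cached connection per distinct parameter combination,
    # rather than one per server name.
    my $cache_key = join ':', $server,
        map { $_ // '' } ( $async, $auth, $piggyback, $syntax );
    $context->{Zconn}->{$cache_key} //= $context->_new_Zconn( $server, $async, $auth, $piggyback, $syntax );
    return $context->{Zconn}->{$cache_key};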
TEST PLAN
---------
1) apply patch
2) run koha qa test tools
3) prove -v t/Context.t
The unit tests properly trigger the modified routine for
testing. Additionally, in hunting for ways it could break,
no nested synchronous or asynchronous Zconn's were found.
And even if they were, the keying on all parameters should allow
it to function properly without messing up the other connection.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When the ability to stage authority records was added to Koha, sorting
record batches by citation (i.e. title) caused "authorized_heading" to be
added to the sort. When sorting by title descending, this makes the ORDER BY
clause "title, authorized_heading DESC", which means sort by title ascending,
then authorized_heading descending. This is incorrect and causes regular
biblio batches to always be sorted ascending.
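In other words, the chosen direction has to be applied to both columns; a
minimal sketch of the intended clause building (variable names are
assumptions, not the actual GetImportRecordsRange code):
    my $direction = ( $order_by_direction && uc($order_by_direction) eq 'DESC' ) ? 'DESC' : 'ASC';
    my $order_clause = $order_by eq 'title'
        ? "title $direction, authorized_heading $direction"   # keep both columns in the same direction
        : "$order_by $direction";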
Test plan:
1) Stage a batch of biblio records from a file
2) View the staged batch
3) Attempt to sort by title descending
4) Note it is still sorted by title ascending
5) Apply this patch
6) Note the sorting now works correctly
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised. The code pertaining to sorting in routine
GetImportRecordsRange will probably not win beauty prizes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Koha Team Lyon 3 <koha@univ-lyon3.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Added Sign off line.
Passes all tests and QA script, including t/db_dependent/Serials.t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10649 introduced a new include file for adding DataTables-related
JavaScript assets. This patch adds use of this include file to all
circ-related pages which use DataTables.
Apply the patch and test the following pages to confirm that table
sorting works correctly:
- Reports -> Guided reports -> Use saved
(reports/guided_reports.pl?phase=Use saved):
"Creation date" sorting has been reconfigured to use the title-string
method for sorting on an unformatted date. C4::Reports::Guided.pm has
been modified to pass an unformatted date to the template. Sorting
should work correctly for all settings of the dateformat system pref.
- Reports -> Catalog by item type
(reports/manager.pl?report_name=itemtypes)
- Reports -> Serials statistics wizard (reports/serials_stats.pl):
The subscription begin and subscription end columns have been modified
to use the title-string filter for sorting. An unformatted date is now
passed from reports/serials_stats.pl to the template, where the
KohaDates filter is used for formatting. Sorting is based on the
unformatted date. Sorting should work correctly for all settings of
the dateformat system pref.
- Sorting of titles should now exclude leading articles.
- Minor template improvements:
- Vendor name now links to vendor details.
- Subscription title now links to subscription details.
- Library name is now shown instead of branchcode.
Signed-off-by: Aleisha <aleishaamohia@hotmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Checked all pages, no regressions or Javascript errors detected.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
- Define a notice containing <<borrowers.streettype>>
- Trigger an event that generates this notice
Without the patch, <<borrowers.streettype>> is replaced by the ROADTYPE
authorised value code. With the patch it is replaced by its
description.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
This works as described, passes tests and QA script.
Note: it seems it's not possible currently to use B_streettype from
the interface, but it might be worth adding it as a follow up for later
use.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If GetOrder is called with a nonexistent ordernumber or without any
ordernumber, it should return undef.
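A sketch of the kind of assertions this adds (assumed, not copied from
Acquisition.t):
    use Test::More;
    use C4::Acquisition qw( GetOrder );
    is( GetOrder(),          undef, 'GetOrder without an ordernumber returns undef' );
    is( GetOrder(424242424), undef, 'GetOrder with a nonexistent ordernumber returns undef' );
    done_testing();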
Test plan:
prove t/db_dependent/Acquisition.t
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Updated number of tests to 68, tests and QA script all happy now.
Looked at a few pages in acquisition using GetOrder as well.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch finishes the work started in one of the previous
follow-ups and allows CardnumberLength to be set to a value
like ',5'. In conjunction with not including cardnumber in
BorrowerMandatoryField, this allows a cardnumber to not be
required but, if present, to not exceed the specified length.
This patch also updates t/db_dependent/Members.t so that
it runs in a transaction, tests the new return value
of checkcardnumber, and manages the CardnumberLength syspref.
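The ",5" form simply leaves the minimum side empty. A hedged sketch of how
such a preference value can be interpreted (not the actual checkcardnumber
code):
    use Modern::Perl;
    use C4::Context;
    my $cardnumber = '12345';    # the value being validated
    my $pref = C4::Context->preference('CardnumberLength') // '';
    my ( $min, $max ) = ( '', '' );
    if ( $pref =~ /^(\d*),(\d*)$/ ) {
        ( $min, $max ) = ( $1, $2 );        # ",5" gives no minimum and a maximum of 5
    }
    elsif ( $pref =~ /^\d+$/ ) {
        ( $min, $max ) = ( $pref, $pref );  # a single number means an exact length
    }
    my $len = length $cardnumber;
    my $ok  = ( !length($min) || $len >= $min )
           && ( !length($max) || $len <= $max );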
To test:
[1] Verify that prove -v t/db_dependent/Members.t and
prove -v t/Members/cardnumber.t pass.
[2] Set CardnumberLength to ",5" and take cardnumber out of
the BorrowerMandatoryField list.
[3] Verify that you can save a patron record without a cardnumber,
but if you supply one, that it can be at most 5 characters long.
[4] Add cardnumber back to BorrowerMandatoryField. This time, the
minimum length is 1 even though CardnumberLength is ",5".
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch refactors the previous code and moves the logic from the pl
to a new routine.
Same test plan as previous patch.
/!\ new unit test filename.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Bug 10861: Reintroduced the cardnumber length check (client side)
Previous patches had removed the pattern attribute of the input; it was
not needed. This patch reintroduces it. It will only work for newer
browser versions.
Moreover, it handles the ',XX' format (see UT).
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Squashed the last two follow-ups. The pattern test did not work fully for me
in Firefox 26 (very recent). But I see the message when I clear the field.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Some libraries would like to add a check on the cardnumber length.
This patch adds the ability to restrict the cardnumber to a specific
length (strictly equal to XX, or length > XX or min < length < max).
This restriction is checked on inserting/updating a patron or on importing
patrons.
This patch adds:
- 1 new syspref CardnumberLength. 2 formats: a number or a range
(xx,yy).
- 1 new unit test file t/Members/checkcardnumber.t for the
C4::Members::checkcardnumber routine.
Test plan:
1/ Fill the pref CardnumberLength with '5,8'
2/ Create a new patron with an invalid cardnumber (123456789)
3/ Check that you cannot save
4/ With Firebug, replace the pattern attribute value (for the cardnumber
input) with ".{5,10}"
5/ You are allowed to save, but an error occurs.
6/ Try the same steps for update.
7/ Go to the import borrowers tool.
8/ Play with the import borrowers tool. We must test add/update patrons
and the "record matching" field (cardnumber or a uniq patron attribute)
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested adding, updating and importing; ran the unit test.
Preliminary QA comments on Bugzilla
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The SearchOrders routine should return the booksellerid and this
patch adds it.
This fixes several problems:
[1] The link to the vendor on the order receive page breadcrumbs
was broken.
[2] The tax calculation in finishreceive.pl didn't run.
[3] The item booksellerid field never got updated during
receipt.
Booksellerid was returned before bug 10723.
Quick test plan:
Go on orderreceive.pl and verify that the vendor link is correct.
Followed test plan. Vendor link is now correct.
Signed-off-by: Marc Véron <veron@veron.ch>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch also adds POD and UT for the change in SearchOrders()
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The order status ordered is set when the basket is closed.
The parcel page should only display status "ordered" and "partial".
Test plan:
- create a basket.
- create an order.
- verify the order is not listed on the parcel page (i.e. you cannot
receive it).
- close the basket.
- verify the order is listed on the parcel page.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes some dead code concerning the handling of patrons
that are members of other, institutional patrons. This code did not
work; removing it clears the field if somebody wants to do a better
implementation of such relationships between patrons.
This patch:
[1] Removes the memberofinstitution system preference.
[2] Removes the following routines:
C4::Members::get_institutions()
C4::Members::add_member_orgs() (and removing this routine
removes a reference to a borrowers_to_borrowers table that
does not exist).
There should be no changes whatsoever to system functionality with this
patch (with the trivial exception of the absence of the
memberofinstitution system preference).
Test plan:
[1] Look at the code and use grep, git grep, etc. to verify that this
patch does not remove something in use.
[2] Verify that there are no regressions upon adding or editing
a patron record.
[3] Verify that the memberofinstitution system preference has been
removed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
DBD::Mysql provides a mysql_auto_reconnect flag. Using it avoids
the time required to do a $dbh->ping().
Benchmarks:
use Modern::Perl;
use C4::Context;
for ( 1 .. 1000 ) {
    my $dbh = C4::Context->dbh;
}
* without this patch on a local DB:
perl t.pl 0,49s user 0,02s system 98% cpu 0,525 total
* without this patch on a remote DB:
perl t.pl 0,52s user 0,05s system 1% cpu 37,358 total
* with this patch on a local DB:
perl t.pl 0,46s user 0,04s system 99% cpu 0,509 total
* with this patch on a remote DB:
perl t.pl 0,49s user 0,02s system 56% cpu 0,892 total
Testing the auto reconnect:
use Modern::Perl;
use C4::Context;
my $dbh = C4::Context->dbh;
my $ping = $dbh->ping;
say $ping;
$dbh->disconnect;
$ping = $dbh->ping;
say $ping;
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Real improvement. No koha-qa errors
prove t/db_dependent/Circulation_issuingrules.t produces no error
prove t/db_dependent/Context.t produces no error
Test
1) dumped Koha DB, load it on a non-local server
2) run sample script with and without patch, local and remote
use Modern::Perl;
use C4::Context;
for ( 1 .. 100000 ) {
my $dbh = C4::Context->dbh;
}
Main difference I note is with remote server
a) without patch
real 0m16.357s
user 0m2.592s
sys 0m2.132s
b) with patch
real 0m0.259s
user 0m0.240s
sys 0m0.012s
I think this could be good for DBs placed on
remote servers
Bug 10611: add a "new" parameter to C4::Context->dbh
When dbh->disconnect is called and the mysql_auto_reconnect flag is set,
the dbh is not recreated: the old one is used.
By adding a new flag, we can now force the C4::Context->dbh method to
return a new dbh.
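A short sketch of the two calling styles described above (the parameter
spelling follows the commit summary; treat details as assumptions):
    use C4::Context;
    my $dbh  = C4::Context->dbh;                # cached handle, with mysql_auto_reconnect set
    my $dbh2 = C4::Context->dbh({ new => 1 });  # force a brand new handle, bypassing the cache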
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 10611: Followup: remove useless calls to dbh->disconnect
These 3 calls to disconnect are done at the end of the script, so they
are useless.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The report also known as "Overdues with fines"
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
All tests pass, this adds data to the Patron column on the
overdues with fines report to show the patron's cardnumber
and phone number.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
This works as described and passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
While reviewing the main patch for this bug to verify that the
holds queue routines and C4::Reserves had the same conception of
when a damaged item could fill a hold request, I noticed that
GetItemsAvailableToFillHoldRequestsForBib() duplicated the code
for adding an SQL clause to filter out damaged items. This patch
removes the duplication and improves the POD for that routine.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
AllowHoldsOnDamagedItems will stop item-specific holds from being placed
on damaged items, but does not stop Koha from using damaged items to
fill holds. This seems like incorrect behavior.
Test Plan:
1) Set 'AllowHoldsOnDamagedItems' to "Don't Allow"
2) Pick an item, set it to damaged
3) Place a bib-level hold on this item's record
4) Scan the item though the returns system
5) Koha will ask to use this item to fill the hold, click "ignore"
6) Apply this patch
7) Repeat step 4
8) Koha will not ask to use this item to fill the hold
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The priority of new hold requests was not calculated when using ILS-DI.
A new routine is added, C4::Reserves::CalculatePriority(), to calculate
the priority prior to placing a request.
A separate bug report, 11640, covers the changes in reserves to
use this new routine more generally.
This patch therefore only affects ILS-DI.
Note: ILS-DI already allows you to generate multiple holds on a biblio or
item for the same patron. This patch does not change that behavior.
Test plan:
[1] Place multiple holds using ILS-DI HoldTitle service:
/cgi-bin/koha/ilsdi.pl?service=HoldTitle&patron_id=BORROWERNUMBER&bib_id=BIBLIONUMBER&request_location=test
Check the priority.
[2] Do the same using HoldItem service:
/cgi-bin/koha/ilsdi.pl?service=HoldItem&patron_id=BORROWERNUMBER&bib_id=BIBLIONUMBER&item_id=ITEMNUMBER
Check the priority again.
[3] Use a biblio with multiple items. Place item level holds on both.
Check in one of these items in another branch. Confirm transfer.
Check in the other item in the original branch. Confirm hold.
Now you have a waiting and a transit hold.
Test HoldTitle and HoldItem service again a few times.
[4] Enable AllowHoldDateInFuture and add a future hold.
Now test HoldTitle and HoldItem again and check if these holds are
inserted before the future hold (lower priority).
January 29, 2014: Rebased this patch and amended it to make a distinction
between fixing the ILS-DI bug and using the new routine.
Updated commit message and test plan (marcelr).
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Acquisition needs more UTs, and more robust ones. This patch
adds some.
This patch adds UT to
- GetOrder
- GetOrders
- GetCancelledOrders
- GetLateOrders
It refactors the UTs for SearchOrders.
The new UTs use 2 new routines, used to check the list of fields returned
by a routine:
_check_fields_of_order
_check_fields_of_orders
These 2 routines could later be used by other UTs.
_check_fields_of_order has its own UTs (tests 14, 15 and 16).
To test:
prove -v t/db_dependent/Acquisition.t
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Unit tests pass, passes koha-qa.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa and t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This corrects line 1250 of C4/Context.pm to be:
return ($userenv->{flags}//0) % 2;
and thus avoids an uninitialized value being used in the modulus.
TEST PLAN
---------
1) Apply the first patch (to update t/Context.t)
2) prove -v t/Context.t
-- This should fail tests 7 and 8
3) Apply this patch (to fix C4/Context.pm)
4) prove -v t/Context.t
-- All tests should succeed
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch addresses a number of issues with the main patch:
- regression on bug 2060 (i.e., displaying authority import batches
correctly)
- regression on bug 10170 (translation of import record states)
- use of datatables.inc
- lack of clarity as to the licensing of tools/batch_records_ajax.pl
- insufficient sanitizing of input used to generate an SQL statement
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Some libraries would like to sort by columns for the records of an
import batch. This seems like a good use of Ajax DataTables.
Test plan:
1) Apply this patch
2) Import a record batch into Koha
a) Use some form of matching
b) Have some records that will match and some that won't
c) Have at least 30 records so you can test the pager
3) Verify the new table is functionally equivalent to the old static one
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Tests fine and looks good with the exception of the corrections I put in
a follow-up.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes Koha <-> Zebra use MARCXML for the serialization when
using DOM, and USMARC for GRS-1.
* The following functions are modified to set the Zebra record syntax
according to the current sysprefs and configuration:
- C4::Context->Zconn
- C4::Context->_new_Zconn
* A new function 'new_record_from_zebra' is introduced, which checks the
context we are in, and creates the MARC::Record object using the right
constructor.
The following packages get touched to make use of the new function:
- C4::Search
- C4::AuthoritiesMarc
and the same happens to the UI scripts that make use of them (both in
the OPAC and STAFF interfaces).
* Calls to the unsafe ZOOM::Record->render()[1] method are removed.
Due to this last change, the code for building facets was rewritten. And
for performance of the facets creation I pushed higher version
dependencies for MARC::File::XML and MARC::Record (we rely on
MARC::Field->as_string).
* Calls to MARC::Record->new_from_xml and MARC::Record->new_from_usmarc
are wrapped with eval for catching problems [2].
* As of bug 3087, UNIMARC uses the 'unimarc' record syntax. This case is
correctly handled.
* As of bug 7818, misc/migration_tools/rebuild_zebra.pl behaves as follows:
- bib_index_mode (defaults to 'grs1' if not specified)
- auth_index_mode (defaults to 'dom')
Here we do exactly the same.
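A hedged sketch of the wrapper described above (the real new_record_from_zebra
lives in C4::Search; argument order, config keys and defaults here follow the
commit text but should be treated as assumptions):
    use MARC::Record;
    use MARC::File::XML;
    use C4::Context;

    sub new_record_from_zebra {
        my ( $server, $raw ) = @_;
        # DOM indexing stores MARCXML in Zebra; GRS-1 stores ISO2709 (USMARC).
        my $index_mode = $server eq 'authorityserver'
            ? ( C4::Context->config('zebra_auth_index_mode') // 'dom' )
            : ( C4::Context->config('zebra_bib_index_mode')  // 'grs1' );
        my $record = eval {
            $index_mode eq 'dom'
                ? MARC::Record->new_from_xml( $raw, 'UTF-8' )
                : MARC::Record->new_from_usmarc($raw);
        };
        return if $@;    # an unparseable record is simply skipped (bug 10684)
        return $record;
    }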
To test:
- prove t/db_dependent/Search.t should pass.
- Searching should remain functional.
- Indexing and searching for a big record should work (that's what the
unit tests do).
- Test an index scan search (on the staff interface):
Search > More options > Check "Scan indexes".
- Enable 'itemBarcodeFallbackSearch' and try to circulate any word, it
shouldn't break.
- Searching for a biblio in a new subscription shouldn't break.
- Running bulkmarcimport.pl shouldn't break.
- And so on... for the rest of the .pl files.
[1] http://search.cpan.org/~mirk/Net-Z3950-ZOOM/lib/ZOOM.pod#render()
[2] a record that cannot be parsed by MARC::Record is simply skipped (bug 10684)
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10649 introduced a new include file for adding DataTables-related
JavaScript assets. This patch adds use of this include file to the Koha
news page.
To test you should have existing news items with varying creation and
expiration dates. Apply the patch and confirm that table sorting works
correctly for all settings of the dateformat system preference.
C4::NewsChannels.pm has been modified so that it now passes an
unformatted date to the template, where the KohaDates plugin is used to
apply the correct formatting. Sorting is based on the unformatted date.
Also corrected: Capitalization errors.
Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described, no problems found.
Also passes tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This is just some code cleanup, no behavior change expected.
Also replacing errstr with err in testing the results. (See DBI.)
Test plan:
Modify an item and save it.
Followed test plan. No problems found.
Signed-off-by: Marc Véron <veron@veron.ch>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This routine is no longer used.
Test plan:
Do a grep on the name.
(Bonus points:) Verify if you can perform some actions on lists.
No more occurrences of _biblionumber_sth found
Signed-off-by: Marc Véron <veron@veron.ch>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Currently road types are stored in a specific table in DB. Moreover, an
admin page is present in order to manage them.
This patch proposes to remove this table and this page in favour of a
new authorised value category 'ROADTYPE'.
This patch:
- adds a new AV category 'ROADTYPE' (created from the roadtype table
content).
- removes the roadtype table.
- removes the admin/roadtype .pl and .tt files.
- removes the 2 routines C4::Members::GetRoadTypes and
C4::Members::GetRoadTypeDetails.
Test plan:
1/ Execute the updatedatabase entry and verify existing roadtypes are
now stored in the AV 'ROADTYPE'.
2/ Verify you can add/update a streettype for patrons.
3/ Verify on following pages the streettype is displayed in patron
information (top left):
circ/circulation.pl
members/memberentry.pl
members/moremember.pl
members/routing-lists.pl
Signed-off-by: Sophie Meynieux <sophie.meynieux@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If a search gives results with 6 facets, one of those facets won't be
displayed. This is due to a bug in the code that only considers greater
than 6 facets in one area, and fewer than 6 in another.
Test Plan:
1) Perform a search that should give results for 6 different libraries
2) Note you only see 5 libraries in the facets with no option to expand
3) Apply this patch
4) Repeat step 1
5) Note you now have the option to expand the facets list
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
This patch should provide a regression test but I really don't know how
to write it.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Replace constructs using given and when with if/else.
The given/when feature now generates compilation warnings in Perl 5.18
and is liable to change behaviour.
This patch:
* replaces the construct with if/else
* reformats the if branching using perltidy
to remove the now redundant indent
To test:
[1] Verify that prove -v t/db_dependent/MarcModificationTemplates.t
passes.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When you run the Reserves test, you have the warnings:
Use of uninitialized value $branchcode in hash element at /usr/share/koha/testclone/C4/Letters.pm line 138.
Use of uninitialized value $branchcode in hash element at /usr/share/koha/testclone/C4/Letters.pm line 148.
This patch removes those warnings.
Test plan:
Run the Reserves.t again.
Revised Test Plan
-----------------
Run the following on the command line prompt before and after
applying the patch:
perl -e "use C4::Letters; *C4::Context::userenv= sub { return {} }; my \$blah=C4::Letters::getletter('circulation','DUE', 'BRA');"
Before the patch there will be errors (as above), after there will not.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
IndependentBranches must be on.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the following changes to UNIMARC biblio indexing :
A. Changes to UNIMARC conf files
1. add comments to biblio-koha-indexdefs.xml
2. make biblio-koha-indexdefs.xml more compact by grouping some
declarations
Ex : 200$f and 200$g => one declaration for 200$fg
3. suppress unneeded declarations (indexing of some 4XX fields and 6XX
fields not in unimarc format)
4. unindex some (sub)fields unneeded by most users (318, 207,230,210a,
215, 4XXd)
5. change the way 308 field is indexed (no visible changes)
6. replace Title-host with Host-item -- see bug 11119
7. index 208 in Material-Type -- see bug 11119
8. index 100 pos 8-9 and 9-12 in pubdate:y and pubdate:n
9. index 100 pos 8-9 in pubdate:s instead of 210$d
10. Index all subfields of note 334 and 327 in note index
11. Index 304 and 327 in title index as well as note index
327 can contain a list of titles included in a work
304 can contain the title of the original work in case of a
translation
12. Index 314 in author index as well as note index
314 can contain authors not mentioned in 200$f/g (the 4th, 5th etc.
author)
13. Index 328 note in Dissertation-information as well as note
14. Index 328$t in Title
B. Changes to ccl.properties :
1. add a new index Dissertation-information (1056)
2. fix EAN, pubdate and acqdate (they were not linked with bib1 attributes)
C. Changes to Search.pm
1. add Dissertation-information and suppress Title-host and UPC
D. Changes to QP config file queryparser.yaml
1. add Dissertation-information
2. fix EAN, pubdate and acqdate
Test plan :
If you cannot test in GRS1, test only in DOM, as GRS will be deprecated.
1. Apply the patch in a UNIMARC Koha running with DOM and ICU
2. copy src/etc/searchengine/queryparser.yaml into the main config
directory of QP
3. copy src/etc/zebradb/ccl.properties into the main config directory
of Zebra
4. copy src/etc/zebradb/marc_defs/unimarc/biblio/* into the main config
directory of Zebra
5. reindex biblios (rebuild_zebra.pl -r -b -x -v)
6. test note index : make some searches on 334$b or 327$b
7. test author index : make some searches on 314 field
8. test title index : make some searches on 304 and 327 field, make a
search on 328$t subfield
9. test dissertation-information index : make some searches on 328 field
10. In a record, put the values "1000" (1st date) and "1001" (2nd date)
    in the dates of the 100 field; try to search for a book written in year
    1000, you should find the record; the same for year 1001
11. make some searches and sort by date. It should work better than before,
especially if you have values like "c2009" or "impr. 2010" in 210
field
12. Regression test : make some searches on several indexes, like EAN,
etc. It should work as before
Test 10-12 with and without Queryparser activated.
Be careful: with Queryparser activated, the index names (title,
dissertation-information...) must be entered in lowercase only.
Of course, to test search and sort by dates, you need to have full
records, with dates in 100 field as well as 210 field.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adding Number-local-acquisition to the indexes known to C4::Search allows
searching without using the "ccl=" prefix.
Also corrects ccl.properties: inv must be an alias of
Number-local-acquisition.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This fixes a regression introduced by the patches for bug 10723.
To Test:
1) Create budget and fund under budget administration.
2) Create Vendor in acquisitons module.
3) Create basket under vendor.
4) Create order and choose budget while creating order.
5) Click on Receive shipment button.
6) Click on the receive link on the right hand side; you
will be able to see a staff user name in the "created by"
field.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If a patron has a record-level hold that is unavailable, any patron
information request will send back an empty CD field when this field
should have an item barcode in it [RM note: this actually isn't
universally true -- the SIP2 standard is silent as to what is supposed
to go in the CD field. Some SIP2 devices do indeed want an item
barcode, but others are known to just want a display of the title
and author of the request in question. Providing an option is the
topic of a new enhancement request, however.]
This is due to a minor error in ILS::Patron::_get_outstanding_holds
where GetItemnumbersForBiblio is assumed to return an array but in
reality returns an arrayref.
Test Plan:
1) Create a record level hold for a patron and record
2) Using SIP2, make a patron information request
3) Note the empty CD fields
4) Apply this patch, restart SIP server
5) Repeat step 2
6) Note the CD field now has a barcode
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
I did not test this patch but the following code shows me it is correct:
use Modern::Perl;
use C4::Items;
use Data::Dumper;
my $biblionumber = 5035;
my $itemnumber = (GetItemnumbersForBiblio($biblionumber))[0];
say Dumper $itemnumber;
$itemnumber = (GetItemnumbersForBiblio($biblionumber))->[0];
say Dumper $itemnumber;
displays:
$VAR1 = [
'23168',
'23169',
'23170',
'23171',
'23172'
];
$VAR1 = '23168';
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Charset::SetMarcUnicodeFlag() fetches system preference
values, so since it invokes routines in C4::Context, it should
load the module.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Acquisition contained a number of unnecessary calls to
$sth->finish. Removed these and the associated variables introduced to
cache query results between the fetch and the return.
Where finish was the end of the routine I have added an
explicit return to document that no data is returned.
A number of places made query calls and fetched a single
row. Such a case could require an explicit finish.
These assume that they are looking up with a unique key.
To remove assumptions and isolate the code from future changes
I've switched these to fetching all and returning the
first row. I have commented these cases.
For fuller explanation see perldoc DBI
What I tested:
Edit existing basket, changed name
Modify order line, change vendor price
Create new basket and add order
Delete this order
Delete this basket
New basket, new order, user added, user removed
Add contract to vendor, change details, delete contract
Search order biblio
Create basket group, add basket to group, remove basket from group
Delete basket group
Receive order
Everything behaved as I expected
Signed-off-by: Marc Veron <veron@veron.ch>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The last use of the POE family of Perl modules went away with
the removal of zebraqueue_daemon.pl per bug 9001. Consequently,
this patch removes POE as a dependency.
To test:
[1] Verify that "git grep POE" and "git grep libpoe" report
nothing.
[2] Verify that koha_perl_deps.pl -a does not report POE
as a dependency.
[3] (extra credit) verify that Debian packages can be built
that do not list libpoe-perl as a dependency.
This patch also updates some distro-specific installation
instructions and scripts, but makes no representations about
whether those instructions currently work.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The SQL option for MARC framework imports was subject to a bug whereby
somebody could use it to gain access to arbitrary information in the
database by uploading an SQL file containing unexpected statements.
As it is difficult to securely sanitize SQL, this patch removes the
option to use SQL as an import or export format.
To test:
[1] Verify that SQL no longer appears as an import or export option
for the MARC frameworks.
[2] Verify that exports and imports in CSV, Excel XML, and ODS formats
still work.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Works as advertised. The UI doesn't offer exporting/importing in the SQL format.
Crafting the URL to export SQL falls back to a spreadsheet format (ODS).
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described, passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In Koha 3.8, if a standard catalog search was performed and the user
clicked the Z39.50 search button, the search string would automatically
be placed in the isbn field for the Z39.50 search form.
Changes to the code have since broken this functionality.
Test Plan:
1) From mainpage.pl, use "Search the catalog" to search for the string
"9781570672835"
2) Click the Z39.50 Search button
3) Note the string is placed in the title field
4) Apply this patch
5) Repeat steps 1-2
6) Note the string is placed in the isbn field
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested old and new ISBN with and without hyphens.
Also tested some other keyword searches.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Note that the behavior will be a bit odd if you do a 'replace via
Z39.50' from a bib record whose title happens to be an ISBN, but
this scenario seems unlikely enough to ignore.
This patch fixes the following warnings:
FAIL C4/Serials.pm
FAIL valid
Useless use of a constant (43) in void context
Useless use of a constant (41) in void context
Useless use of a constant (44) in void context
Useless use of a constant (42) in void context
Useless use of a constant (4) in void context
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
4 new statuses to represent variations on "missing" are added by this
patch: "never received", "sold out", "damaged", and "lost".
These statuses have the same behavior as the plain Missing status.
Test plan:
- Find a serial to claim.
- Modify the status of this serial with one of these new statuses.
- Try to find it with the "serials to claim" search.
- Verify that the status is displayed on the serial module pages and on
the OPAC.
Signed-off-by: Nicolas Bravais <nicolas.bravais@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The return from GetReservesFromBiblionumber contains an unnecessary
extra variable. In scalar context an array returns its element count.
Maintaining a separate count can lead to unforeseen bugs
and imposes ugly constructions on the subroutine's users.
Remove the useless count variable from the return.
This patch also changes the parameters: now the routine takes a hashref.
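Callers then look roughly like this (the hash key name is an assumption
based on the description):
    use C4::Reserves;
    my $biblionumber = 1;    # the bib whose holds we want
    my @reserves = GetReservesFromBiblionumber( { biblionumber => $biblionumber } );
    my $count    = scalar @reserves;   # in scalar context the array itself gives the count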
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Placed biblio holds, future holds and item holds. Works as expected.
Tested Holds.t and Reserves.t. Pass.
Tested /cgi-bin/koha/ilsdi.pl?service=GetRecords&id=999 with two holds on
one item. Fine.
C4/SIP/ILS/Item.pm: Looked for "whatever" and "arrayref" and could not find
them anymore. Looks good.
Handled a few unneeded calls in QA follow-up.
Left only one point to-do for serials/routing-preview.pl. See Bugzilla.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes an issue where choosing 'None' as the sort order
for an authority search would result in zero hits if QueryParser is
enabled.
This patch also adds some additional test cases.
To test:
[1] Enable QueryParser.
[2] Perform an authority search in the staff interface that
uses 'Heading A-Z' as the sort order and returns hits.
[3] Run the same search, but with the sort order set to 'None'.
No hits are returned.
[4] Apply the patch.
[5] Do step 3 again. This time, hits should be returned.
[6] Verify that prove -v t/db_dependent/Search.t passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If parent_ordernumber is not set in the NewOrder parameters, it is
automatically set to ordernumber.
This patch only avoids code duplication.
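i.e. the defaulting now lives in one place inside NewOrder, roughly
(a sketch, not the exact code):
    # Once the new ordernumber is known, fall back to it when no parent was given:
    $parent_ordernumber //= $ordernumber;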
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
This solution is better!
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script. Also all tests in
t/db_dependent/Acquisitions/.
Confirmed bug and that the patch fixes it.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To reproduce the issue:
- transfer an order from a basket to another. Note the previous
ordernumber (X) and the new one (Y).
- receive the order
- cancel the receipt
- verify the order has been deleted:
select count(*) from aqorders where ordernumber=Y;
select * from aqorders_transfers where ordernumber_from = X;
The value for ordernumber_to is null.
To test this patch:
- apply this patch
- transfer an order from a basket to another
- receive the order
- cancel the receipt
- verify the order still exists in the basket where the transfer was
done.
Signed-off-by: Sonia BOUIS <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the words 'biblio' and 'item' to the 'info'
of the cataloguing logs which were missing them (such as biblio
delete, biblio mod, item mod, upload cover image).
This patch also adds 'authority' for authority mod.
_TEST PLAN_
Before applying:
1) Create/view mods for items, biblios, and authorities.
2) Create/view biblio deletion
3) Create/view upload cover image log
4) Note that none of these contain the words 'biblio', 'item', or
'authority' in their "Info" columns.
Apply patch.
5) Repeat steps 1-3
6) Note that the new logs contain 'biblio', 'item', and 'authority'
in their "Info" column, while the past ones don't.
7) Note also that 'biblio' and 'item' will have 'Biblio' and 'Item'
appear in their "Object" column for the new logs
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
1/ CURRENT_DATE() is a MySQLism and should be replaced with CAST(now() AS
DATE).
2/ The date formatting should be done in the template (using the TT
plugin).
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Pasting comment from the Bugzilla report:
Looking a bit longer at this code, it is kind of strange to find it
there in the first place. Adding maxpickupdelay in Letters.pm should
not be there, but it is.
Also this date is not used normally in the default HOLD Available for
Pickup notice (that we are generating in this case). And if it were
undef, the expiration date should imo be empty instead of today+0.
(before adding maxreservespickupdelay, you should test the allowexpire
pref first) So it is an (invisible) bug on its own.
Test plan:
See former patch. Kyle just discovered this bug, apparently by
deleting the maxpickupdelay pref.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Before bug 9788 the alldates parameter of GetReservesFromItemnumber was
actually not used in the codebase.
The first patch of bug 9788 did change that and passed true by default.
But a closer look revealed that we do not really need it.
The parameter is removed by this patch; the SQL statement is slightly
adjusted: if reservedate<=now or a waitingdate is filled for the
requested itemnumber, GetReservesFromItemnumber will return the reserve.
This includes so-called future waits: a future hold that has been confirmed
ahead of time with pref ConfirmFutureHolds > 0 days.
Note that future item-level holds are not really interesting to return; this
just corresponds to original behavior. Future next-available holds are not
in view at all; they do not contain an item number.
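Roughly, the adjusted selection reads like this (a sketch of the condition
only, not the exact statement in Reserves.pm):
    my $query = q{
        SELECT *
        FROM   reserves
        WHERE  itemnumber = ?
          AND  ( reservedate <= CAST(now() AS DATE) OR waitingdate IS NOT NULL )
    };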
Test plan:
Actually, the test plan of the first patch is valid. But for completeness I
repeat it here:
[1] Enable future holds and set ConfirmFutureHolds to 2 days.
[2] Place a future next-available hold for 2 days ahead.
[3] Check item status on catalogue detail. Available? That is fine.
[4] Confirm the future hold by checking it in. ('future wait')
[5] Look at item status again on catalogue detail. Must be Waiting now.
[6] Switch to OPAC and login as another opac user. Goto Place a hold.
[7] Check item status with item level hold info. Is it waiting?
[8] Try to place hold in staff, check item level status again. Waiting?
[9] Make a transfer for the item. Switch branch. Check hold status on
Transfers to receive.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes GetReservesFromItemnumber also return the waiting
date and removes some repeated code.
It improves the item status display on catalogue detail, when placing a hold
at opac-reserve and in staff, and on the transfers to receive form.
This patch builds on work from reports 9367 and 9761.
Test plan:
Place a future next-av. hold (enable future holds prefs), say 2 days ahead.
Check item status on catalogue detail. Nothing to see.
Enable ConfirmFutureHolds by inserting a number of days, say 2.
Confirm earlier hold by checking it in. Look at item status again on detail.
Switch to other opac user. Try to place a hold again. Check item status with
item level hold info. Try to place hold in staff, check item level status.
Make a transfer for that item. Switch branch. Look at transfers to receive.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes C4::Barcodes::PrinterConfig, which is
used by no other code in the codebase.
Signed-off-by: Emma Heath <emmaheath.student@wegc.school.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
No instances of PrinterConfig found in the codebase
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Couldn't find any reference to those files in Koha.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch restores the ability to request a DBI database handle
or a DBIx::Class schema object connected to a PostgreSQL database.
To address the concerns raised in bug 7188, only "mysql" and "Pg"
are recognized as valid DB schemes. If anything else is passed
to C4::Context::db_scheme2dbi or set as the db_scheme in the Koha
configuration file, the DBD driver to load is assumed to be "mysql".
Note that this patch drops any pretense of Oracle support.
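The mapping itself is tiny; a sketch consistent with the description above
(not necessarily the exact code):
    sub db_scheme2dbi {
        my $scheme = shift // '';
        # Only mysql and Pg are recognized; anything else falls back to the mysql driver.
        return $scheme eq 'Pg' ? 'Pg' : 'mysql';
    }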
To test:
[1] Apply patch, and verify that the database-dependent tests
pass when run against a MySQL Koha database.
[2] To test against PostgreSQL, create a Pg database and
edit koha-conf.xml to set db_scheme to Pg (and adjust
the other DB connection parameters appropriately). The
following tests should pass, at minimum:
t/Context.t
t/db_dependent/Koha_Database.t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described, some additional notes:
- Installed Postgres following
http://wiki.ubuntuusers.de/PostgreSQL
- Created a database user koha
- Created a database koha
- Changed the koha-conf.xml file
<db_scheme>Pg</db_scheme>
<database>koha</database>
<hostname>localhost</hostname>
<port>5432</port>
<user>koha</user>
<pass>xxxx</pass>
- Installed libdbd-pg-perl
- Ran the web installer; everything looked ok until step 3.
Step 3 complains:
Password for user koha: psql: fe_sendauth: no password supplied
- Both t/Context.t and t/db_dependent/Koha_Database.t pass
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes sure that the search history from an
anonymous session is cleared from the session after a user
logs in (and the session history is saved to that user's
record in the database). This fixes a problem where the
search history from the session got repeatedly added to the
database each time the user did something while logged
into the OPAC.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This is recommended in the CGI::Session documentation.
Signed-off-by: Charlene Criton <charlene.criton@univ-lyon2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- Remove unit tests for ParseSearchHistoryCookie, which doesn't exist
anymore
- Add unit tests for ParseSearchHistorySession and
SetSearchHistorySession
- Remove/Modify comments about search history cookie
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Tests fixed and moved, and comments tidied up
Signed-off-by: Charlene Criton <charlene.criton@univ-lyon2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Storing the search history in a cookie can cause problems, due to the
cookie size limitation of 4KB.
The solution here is to store the search history in the CGI::Session
object, so there is no size limitation (but the anonymous search history
still remembers up to 15 requests at most).
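Conceptually, the storage side becomes something like this (a sketch; the
real helpers are ParseSearchHistorySession/SetSearchHistorySession mentioned
in the follow-up, and their internals are assumed):
    # $session is the CGI::Session object for the current visitor.
    my @history = @{ $session->param('search_history') // [] };
    push @history, { query_desc => $query_desc, query_cgi => $query_cgi, time => time() };
    splice @history, 0, @history - 15 if @history > 15;   # anonymous history keeps at most 15 entries
    $session->param( search_history => \@history );
    $session->flush;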
Test plan:
- Go to OPAC in anonymous mode.
- Check that the "Search history" link is *not* shown in the top right
corner of the page
- Make some searches on /cgi-bin/koha/opac-search.pl
- The "Search history" link should appear. Click.
- Your search history should be displayed.
- Try to log in with invalid username/password
- Go back to search history, it's still there
- Now log in with valid username/password
- Your anonymous search history should be saved into your own search
history.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Restoring original sign offs and comments below
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors.
Search history saving is similar before and after the patch,
i.e. the anonymous search is saved when the user logs in, but the cookie
KohaOpacRecentSearches is empty.
Shows current and previous session searches.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
All tests and QA script pass, works as described.
Signed-off-by: Charlene Criton <charlene.criton@univ-lyon2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The way GetBranches was written, it will issue one query to get all
branches, and then one query per branch for the branch relations.
This patch pre-fetches the relations table (as we need it all anyway)
and so makes the whole process happen in two queries, rather than
1+N, where N is the number of branches.
This might not seem like much, but when you do a search, GetBranches is
called once for each result, so 25 times. And you might have 10 branches. This
causes 275 database requests when you could get away with 50.
From profiling, when you run a search, this is the thing that calls
DBI::st::execute the most. Refer:
http://debian.koha-community.org/~robin/opac-search/usr-share-koha-lib-C4-Branch-pm-146-line.html#125
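A hedged sketch of the two-query shape (table and column names assumed from
the standard Koha schema, not copied from Branch.pm):
    use C4::Context;
    my $dbh      = C4::Context->dbh;
    my $branches = $dbh->selectall_hashref( 'SELECT * FROM branches', 'branchcode' );
    my $relations = $dbh->selectall_arrayref(
        'SELECT branchcode, categorycode FROM branchrelations', { Slice => {} } );
    # Attach every relation to its branch in memory instead of issuing one query per branch.
    for my $rel (@$relations) {
        $branches->{ $rel->{branchcode} }{category}{ $rel->{categorycode} } = 1;
    }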
Test Plan:
* Have a database with branches and relationships between the branches
(these are 'Library groups' in the UI).
* Make sure the relationships show up correctly after applying the
patch.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Silence warns in C4::Bookseller::GetBooksellersWithLateOrders()
To test:
1/ run prove t/db_dependent/Bookseller.t
Notice lots of Use of uninitialized value $delay in numeric lt (<) at /var/lib/jenkins/jobs/Koha_master/workspace/C4/Bookseller.pm line 134 type lines
2/ apply patch
3/ run prove t/db_dependent/Bookseller.t
Notice warns are gone
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tiny change, positive consequences.
Passes QA script and all tests.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch corrects a typo that broke ModReserveFill(). This
patch also adds a unit test that (via two levels of indirection)
exercises ModReserveFill().
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch replaces a given/when statement by an if so we
do not get warnings on compilation when using Perl 5.18 or later.
Note the perldoc for the subroutine was not correct;
the code was not testing that the parameter equalled the values
but that it contained them. I have amended the documentation accordingly
and have documented the behaviour in case the parameter contains
neither value.
The subroutine does not appear to be used elsewhere in the
codebase.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script, especially t/DateUtils.t.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch improves the POD for C4::Reserves::_FixPriority()
to (hopefully) describe its function thoroughly. It also
adjusts the call of _FixPriority() by CancelReserve() to
omit passing reserve_id, since by that point no row in
the reserves table for that request still exists.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In various places, deleting a hold request did not trigger recalculating
the priority of the other holds on the bib:
To reproduce the bug:
- select or create 2 users U1 and U2
- select or create an holdable item
- place it on hold for both U1 and U2. U1 has priority 1 and U2 has
priority 2.
- delete the hold for U1
- go on circ/circulation.pl?borrowernumber=XXXX for U2 (or in the DB
directly) and verify the priority has not been set to 1
The issue is repeatable (at least) on these 2 pages:
* circ/circulation.pl?borrowernumber=XXXX (tab 'Holds', select "yes"
in the dropdown list and submit the form)
* reserve/request.pl?biblionumber=XXXX (click on the red cross)
Signed-off-by: Christopher Brannon <cbrannon@cdalibrary.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Reran my tests:
Preparations:
- Create holds for different patrons on a record:
* 1st - title level hold
* 2nd - item level hold
* 3rd - title level hold
* 4th - title level hold
- AllowOnShelfHolds = On/Allow (items were not checked out)
Tests:
Deleted holds from various pages, confirming bugs first,
then testing with applied patches. Reloading database
after each test.
1) Cancel holds from OPAC patron account
/cgi-bin/koha/opac-user.pl#opac-user-holds
- Cancel 4th - ok, before and after applying the patch
- Cancel 2nd - ok, after applying the patch
2) Cancel hold from holds tab on staff detail page
/cgi-bin/koha/reserve/request.pl?biblionumber=7
a) Setting priority to 'del', submitting with 'Update holds'
- Cancel first (1st) - ok, before and after
- Cancel hold in the middle (was 3rd) - ok, before and after
- Cancel last (was 4th) - ok, before and after
b) Using red X
- Repeating tests from a) - before the patch is applied holds
get totally 'out of order' - after applying the patch, it works
correctly
Additional tests done on this page:
- Change priority using up, down, to top, to bottom icons
- Change priority with 'toggle to lowest'
3) Cancel hold from the patron's account
a) Check out tab - Delete? Yes, 'Cancel marked holds'
/cgi-bin/koha/circ/circulation.pl?borrowernumber=X
- Cancel first (1st) - ok, after applying the patch
- Cancel hold in the middle (was 3rd) - ok, after applying the patch
- Cancel last (was 4th) - ok, after applying the patch
b) Details tab - Delete? yes, 'Cancel marked holds'
/cgi-bin/koha/members/moremember.pl?borrowernumber=X
- Cancel first (1st) - ok, after applying the patch
- Cancel hold in the middle (was 3rd) - ok, after applying the patch
- Cancel last (was 4th) - ok, after applying the patch
Without the patch, holds priorities get out of order.
Additional tests done:
- Check in one item to trigger first hold
- Check in one item to trigger second hold
- Check out first item
Priorities are kept while the item is waiting, when it's
checked out, priorities of remaining holds get reset correctly.
Conclusion:
Big improvement, no regressions found.
Passes all tests in t, xt and QA script.
Also: t/db_dependent/Holds.t
t/db_dependent/HoldsQueue.t
t/db_dependent/Reserves.t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
RIS and Bibtex exports from staff side and OPAC edited to
allow for additional publisher RDA tag (264). Script will
look first for 264 then fall back to 260 when pulling publisher
data from MARC21 records.
Test Plan:
1. Create RDA and non RDA record
2. In OPAC, export both as RIS and Bibtex - verify publisher information
is exported
3. On staff side, export records as RIS and Bibtex, verify publisher
information is exported.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed some tabs pointed out by the QA script.
Works nicely in my tests, no regressions found.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The summary is built using the authtypecode selected from the interface.
So when a search is launched on all auth types, the summary is not
correctly built by the BuildSummary routine.
It should get the authtypecode from the authority (call to
GetAuthTypeCode).
To test:
1/ go to authorities/authorities-home.pl
2/ search <something> by authtype personal name
3/ results are displayed with summary
4/ now select the default entry and search again; the
results display but without the summary
5/ apply the patch
6/ search default again, now summary shows
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested with a UNIMARC database, works as described.
All tests and QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes an error caught by t/db_dependent/Acquisition.t, and
adjusts C4::Context::IsSuperLibrarian() to return true if no
userenv is set. This is done on the basis that if no userenv is set,
calls to C4::Context routines are being made from a command-line script,
and if you have access to the command line of a running Koha instance,
you implicitly already have better than superlibrarian access.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For DUE messages (and PREDUE, etc.) there is no check before sending the
message to the message_queue table.
This check avoids trying to send the same message again and again. Now
it is marked as "failed".
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Without the patch a sms notice will remain as 'pending' forever.
With the patch applied, the status is set to 'failed'.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
That's it. A guide box cannot be created if invalid data is passed.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script, includes new unit tests.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Note that I modify the return value. Before this patch, it returned an
empty string or 1. Now it returns 0 or 1.
Test plan:
- same as the original patch
- verify that unit tests pass:
prove t/Context.t
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script, including new tests.
Checked the code and tested superlibrarian behaviour in some places:
moremember.pl:
With IndyBranches only superlibrarian can delete borrowers from
other branches. Accessing the borrower with a direct link.
OK
C4/Members.pm
With IndyBranches only superlibrarian can search for borrowers
from other branches.
OK
tools/holidays.pl
With IndyBranches only superlibrarian can edit holidays for other
branches.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The method of checking the logged in user for superlibrarian privileges
is obtuse ( $userenv && $userenv->{flags} % 2 != 1 ) to say the least.
The codebase is littered with these lines, with no explanation given. It
would be much better if we had one subroutine that returned a boolean
value to tell us if the logged in user is a superlibrarian or not.
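A minimal sketch of what such a helper can look like (the no-userenv case
follows the C4::Context::IsSuperLibrarian() behaviour described earlier in
this log; this is not the literal implementation):
    sub IsSuperLibrarian {
        my $userenv = C4::Context->userenv;
        # no userenv: called from a command-line script, which is treated
        # as already having better than superlibrarian access
        return 1 unless defined $userenv and defined $userenv->{flags};
        # bit 0 of the flags value marks a superlibrarian
        return ( $userenv->{flags} % 2 == 1 ) ? 1 : 0;
    }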
Test Plan:
1) Apply this patch
2) Verify superlibrarian behavior remains unchanged
Signed-off-by: Joel Sasse <jsasse@plumcreeklibrary.net>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on second patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes a problem where a patron could receive duplicate
hold waiting notifications. For example, this could happen if a
circ operator checked in an item more than once and confirmed the
same hold each time.
To test:
[1] Set up a test patron that received hold waiting notifications.
[2] Put an item on hold for the patron, then check the item in
and confirm the hold. Verify that a hold notification is
sent (or inspect the message_queue table).
[3] Check the item in again and confirm the hold again. A duplicate
hold notification will be generated.
[4] Apply the patch.
[5] Repeat steps 2 and 3. This time, only one notification should be
generated.
[6] Verify that prove -v t/db_dependent/Reserves.t passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works as described.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When a z39.50 server isn't able to be searched successfully, the yellow
error box came up empty. This patch fixes the problem.
Test Plan:
1) Go to Administration/z39.50 servers
2) Create a fake z39.50 server with a made up address
3) Go to cataloging, search only that server
4) Note the empty yellow alert box
5) Apply this patch
6) Re-run the search; note the alert box has a message in it now
Signed-off-by: Nora Blake <nblake@masslibsystem.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works according to test plan.
When one of the selected servers gives results, no dialog
box is shown before and after applying the patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In C4::Items::DelItemCheck, there are two SQL queries: one to check
if item is on loan, the other if item is reserved.
Those two queries use "SELECT * FROM table", fetch the data with
"$var = $sth->fetchrow", and use "$var" as a boolean condition.
This is not correct; the SQL query should be "SELECT COUNT(*) FROM table".
As a consequence, it was possible to delete an item without warning to
the operator even if it was waiting on the hold shelf or in transit to
fill a hold.
This patch corrects the SQL queries and sets my ($var) to show that
fetchrow returns an array.
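A sketch of the corrected pattern (the real code checks the issues and
reserves tables; the exact SQL here is illustrative):
    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare("SELECT COUNT(*) FROM issues WHERE itemnumber = ?");
    $sth->execute($itemnumber);
    my ($countissues) = $sth->fetchrow;   # list context: first (and only) column
    # $countissues is now 0 or a positive count, safe to use as a boolean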
Test plan :
- Set an item A onloan
- Set an item B reserved and the reserve waiting
- Go to items cataloguing : cgi-bin/koha/cataloguing/additem.pl?biblionumber=XXX
- Try to delete item A
=> You get an alert and item is not deleted
- Try to delete item B
=> You get an alert and item is not deleted
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Works, and has the added bonus of being a tiny bit faster.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes t, xt and QA script tests.
Also tried deleting via batch delete - correct warnings are displayed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
It could be useful to index the original language of a document (e.g.
"fre" for the English translation of a French novel).
This patch renames the Bib-1 use attribute 1095 from
Code-language-original to language-original and uses it to index:
- MARC21 041$h subfield
- UNIMARC 101$c subfield
It adds "language-original" in the list of index in Search.pm.
Test plan :
A. in a MARC21 GRS1 environment
1. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/marc21/biblios/record.abs) from
your source etc/ directory to your main koha etc/ directory
2. Reindex zebra
3. Make some searches, like "language-original:fre"
B. in a MARC21 DOM environment
4. Copy Zebra config files (zebradb/biblios/etc/bib1.att, zebradb/ccl.properties,
marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl) from your source etc/
directory to your main koha etc/ directory
5. Reindex zebra
6. Make some searches, like "language-original:fre"
C. in a UNIMARC GRS1 environment
7. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/unimarc/biblios/record.abs) from
your source etc/ directory to your main koha etc/ directory
8. Reindex zebra
9. Make some searches, like "language-original:fre"
D. in a UNIMARC DOM environment
10. Copy Zebra config files (zebradb/biblios/etc/bib1.att,
zebradb/ccl.properties, marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl)
from your source etc/ directory to your main koha etc/ directory
11. Reindex zebra
12. Make some searches, like "language-original:fre"
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When an item is transferred from the items table to deleteditems, all fields
must be copied except "timestamp".
This value must be updated to record when the item was deleted.
Test plan:
- Look at an item's timestamp :
mysql> select timestamp from items where itemnumber = 2690;
+---------------------+
| timestamp |
+---------------------+
| 2011-09-09 15:30:21 |
+---------------------+
1 row in set (0.00 sec)
- Delete this item in cataloguing module
- Check it is not in items table anymore :
mysql> select timestamp from items where itemnumber = 2690;
Empty set (0.00 sec)
- Look in deleteditems table :
mysql> select timestamp from deleteditems where itemnumber = 2690;
+---------------------+
| timestamp |
+---------------------+
| 2013-12-05 15:33:20 |
+---------------------+
1 row in set (0.00 sec)
=> timestamp has been set to the current date/time
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Patch set passes koha-qa.pl, works as advertised!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This is supplementary to the main patch for
bug 6331. Having removed the attribute marc from
items DelItem, we should not try to populate it.
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
'ctId' as a column name conflicts with one of the system
columns that PostgreSQL uses for each table, and consequently
needs to be renamed to enable deploying the schema to a Pg
database. This patch makes this change.
To test:
[1] Apply the patch and run the SQL specified in the database
update.
[2] Verify that the collections_tracking table no longer has
a ctId column, but now has collections_tracking_id.
[3] Verify that prove -v t/db_dependent/RotatingCollections.t
passes.
[4] Verify that installer/data/mysql/kohastructure.sql runs
cleanly in an empty database.
This patch does not affect user-visible behavior given the fact
that the rotating collections feature is currently disabled.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
It is not necessary to process the case where the number of quotes
is just one, as int(rand(1)) will always produce 0, which is a valid
offset.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If there is a gap in the id sequence for the quotes table, it
is possible that no new quote will be selected. This will happen
particularly when a lot of the older quotes with low ids have been
deleted.
This patch improves the selection of a new quote.
To test:
- Load sample quotes
- Delete the first half of the quotes.
Note: With 34 quotes, delete the quotes with ids from 1-17
- Activate the QuoteOfTheDay system preference
- Check if a quote is displayed in OPAC
- Reload the page a few times, no quote should be displayed
Note: make sure you don't have a quote with the current
date in your quotes table before running those tests
- Run 'perl t/db_dependent/Koha.t'
Note: requires sample quotes!
- Apply patch
- Reload the OPAC start page
- Verify a quote was now picked
- Run 'perl t/db_dependent/Koha.t' again - all tests should still pass
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Patch modified to use "LIMIT 1 OFFSET ?" rather than "LIMIT ?, 1"; the
latter construction does not work in PostgreSQL.
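A sketch of the offset-based selection (the quotes table is real; the exact
queries are illustrative):
    my $dbh = C4::Context->dbh;
    my ($count) = $dbh->selectrow_array("SELECT COUNT(*) FROM quotes");
    my $offset  = int( rand($count) );   # 0 .. $count - 1, valid even when $count == 1
    my $quote   = $dbh->selectrow_hashref(
        "SELECT * FROM quotes LIMIT 1 OFFSET ?", undef, $offset );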
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add date fields to track when an item was marked as lost or withdrawn.
Display those fields on catalogue/moredetail.pl
Test Plan:
1) Apply patch
2) Run updatedatabase.pl
3) Pick a record with items, browse to the 'items' tab ( moredetail.pl )
4) Mark an item as lost, verify the field "Lost on:" displays below
the "Lost status" field with todays date.
5) Mark the item as not lost, verify the field no longer displays
6) Repeat steps 4 and 5 with the Withdrawn field.
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Under certain circumstances, a search term without quotation marks
returns the expected results while the same search with a
double quote embedded in it would fail.
Koha should ignore the quotation marks and return results anyway.
This appears when QueryWeightFields syspref is activated (and
QueryAutoTruncate is off), as field weighting builds a complex CCL
query using double quotes around search words. This patch simply
replaces double quotes in search words by a space.
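A one-line sketch of that substitution, applied to each search operand
before the weighted query is built (variable name illustrative):
    $operand =~ s/"/ /g;   # drop embedded double quotes so CCL quoting stays balanced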
Test plan :
- Set QueryAutoTruncate off (you may also need to set QueryFuzzy to off)
- Set QueryWeightFields off
- Perform a search on two words where you have results, like : centre "ville
=> you get results
- Set QueryWeightFields on
- Perform the same search
=> you get the same results
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds 3 filters for the serials search:
- location
- callnumber
- expiration date
To test:
- Search serials by location and/or callnumber and/or expiration date
and check that results are consistent.
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch repairs a regression introduced by the main
patch where it became impossible to search for cancelled
orders from the advanced order search form.
This patch also tweaks the wording on the order status
drop-down on the order search form to clarify that the
default status filter is orders that have any status
except cancelled.
To test:
[1] Before applying this patch, perform an advanced
order search (acqui/histsearch.pl) for orders
with status cancelled. Observe that no hits are returned.
[2] Apply the patch and run the search again. This time,
the cancelled orders should be returned.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds a new tab "Acquisition details" on the catalogue detail
page. It provides a list of orders made for this biblio.
New system preference:
AcquisitionDetails: Hide/Show the new tab. The default for
new and upgraded installations is to display the new tab.
Test plan:
1/ Apply the patch.
2/ Select the "placing an order" value for the AcqCreateItem pref.
3/ Create a new order with X items.
4/ Go on the catalogue detail page for the selected biblio.
5/ Click on the "Acquisition details" tab and check that your order is
displayed. Itemnumbers are present in the last column. Check that links
are not broken.
6/ Close your basket.
7/ Status becomes "Ordered"
8/ Receive X-1 items.
9/ Come back on the catalogue detail page. There are 2 orders: 1
complete and 1 partial. The complete one has a receive date.
10/ Receive the last item.
11/ Now you have 2 orders with a complete status.
12/ Cancel the last receipt.
13/ You have 1 ordered and 1 complete (2 items).
14/ Cancel the first receipt.
15/ You have 1 ordered (3 items).
16/ Delete your order
17/ You have 1 deleted order.
18/ Switch the AcqCreateItem pref to "receiving an order"
19/ Do again steps 3 to 17.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This dependency is used in load testing (misc/load_testing/*)
Test plan:
Check if you see the dependency listed on About/Perl modules.
Verify if the version information is correct.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The subroutine _filter_fields is not used by the module,
and the sub _columns is only used by it.
This patch removes the dead code.
To test:
[1] Verify that the following tests pass
t/Budgets.t
t/Budgets/CanUserModifyBudget.t
t/Budgets/CanUserUseBudget.t
t/db_dependent/Acquisition.t
t/db_dependent/Acquisition/GetOrdersByBiblionumber.t
t/db_dependent/Acquisition/Invoices.t
t/db_dependent/Acquisition/OrderFromSubscription.t
t/db_dependent/Acquisition/TransferOrder.t
t/db_dependent/Acquisition/close_reopen_basket.t
t/db_dependent/Bookseller.t
t/db_dependent/Budgets.t
t/db_dependent/Serials.t
t/db_dependent/Serials_2.t
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Looks good to me.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Intermittently, problems in the calling environment
cause a C4::Biblio routine to be called with an undefined
MARC::Record object. This results in the process
dying and returning to the end user a low-level
message such as 'cannot call method x on an undefined
object'.
For exported subroutines taking a MARC::Record object,
check that the object is defined; otherwise return a logical
return value and log a stack trace to the error log.
A couple of cases were checking but dying; this may have
unwelcome results in a persistent environment, so croak has
been downgraded to carp.
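A sketch of the guard pattern described above (the subroutine name is
illustrative; Carp::carp logs the caller, Carp::cluck would add a full
stack trace):
    use Carp;
    sub ModBiblioMarcExample {
        my ( $record, @rest ) = @_;
        unless ( ref($record) && $record->isa('MARC::Record') ) {
            carp "ModBiblioMarcExample called without a MARC::Record object";
            return;   # logical false value instead of dying
        }
        # ... normal processing of $record ...
    }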
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Adds lots of checks for $record in various places, should
not affect behaviour.
Passed all tests and QA script, including new unit tests.
Tested adding and saving a new record.
Also tested detail and result pages without XSLT.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Mandatory item fields are not indicated the same way in all places. This
patch corrects two places where required fields were shown in bold
rather than using the standard "required" class: When adding an order
from a staged file and when adding an item for a new issue of a serial.
This patch also normalizes the text input size on item entry forms: In
some places it was 50, others 67. I have changed the latter to 50.
Unrelated changes: Added $KohaDates formatting of date and time and
corrected capitalization on a heading on the add order from staged file
page.
It would be nice to be able to use the same method for displaying the
item form as we use on neworderentry.tt -- pulling in the form from a
separate include. However that system is designed for handling multiple
items and would need to be adapted for these cases.
To test, you must have a staged file from which to add an order. Open an
existing basket or create a new one and choose to add an order "From a
staged file." Choose a staged file from which to order. The item entry
form under the "Import all" heading should show required fields in red.
To test in serials: Begin the process for receiving an item from an
existing subscription. On the serials-edit page, find the "Click to add
item" links and click to open the item edit forms. There should be one
under the numbered issue and the supplemental issue forms. In both cases
the item edit screen should show the mandatory item fields in red.
Confirm that the cataloging add item form looks correct and works
correctly.
Revision: Left out the "required" note which should appear after each
required field.
Signed-off-by: David Cook <dcook@prosentient.com.au>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The clear JS function parses the input's text, but an input filled by a plugin
does not contain the type attribute.
Test plan:
- fill the barcode field to the barcode plugin
- go on the new order page
- verify the barcode plugin works as before
- verify the clear link clears the barcode field and all other fields.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as expected, passes all tests and QA script.
Template change only.
Barcode and date acquired are now also cleared with the
'clear' link.
But: it only works when you enter a barcode manually currently,
because the AutoBarcode functionality is broken on master (bug 11273).
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If you select an index in the search dropdown and then enter a QP
query starting with the field, Koha will prepend the index you do not
want to use at the beginning of the search, resulting in a search that
probably does not match what you were hoping for.
To test:
1) Select an index in the search dropdown in the OPAC. Author is fine.
2) Enter a search term using manually entered indexes. For example:
ti:cat in the hat
3) Note that the search fails.
4) Apply patch.
5) Repeat steps 1 and 2.
6) Note that the search succeeds.
7) Sign off.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes C4::ILSDI::Utility on the basis
that one of its routines (BorrowerExists) was not used
and that the other routine can be (and is) moved to
C4/ILSDI/Services.pm.
Test:
This should be a noop. Regression testing required:
/cgi-bin/koha/ilsdi.pl functionality, in particular:
GetAvailability - ?service=Describe&verb=GetAvailability
AuthenticatePatron - ?service=Describe&verb=AuthenticatePatron
ILS-DI syspref must be turned on
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The descriptions for fines are stored in English in the DB
(accountlines.description), so they are not translatable.
This patch removes the automatically added descriptions and generates
the string in the template instead.
Test plan:
1/ Execute the updatedatabase entry.
2/ Verify in the following pages the description is consistent:
- members/pay.pl?borrowernumber=XXXX
- members/boraccount.pl?borrowernumber=XXXX
- opac-account.pl
3/ Launch the translate script and update the po files in order to
translate the new strings.
4/ Verify the strings are translated in the interface.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised. Corrected a few typos in the commit message.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
Go to the acqui/parcel.pl?invoiceid=XX page and verify the basket group name
is displayed in the 2nd column of the pending orders and already
received tables.
Signed-off-by: Ed Veal <ed.veal@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Changed basketgroup to basket group to match spelling on other
pages.
Works as described, passes tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This gets rid of some more warnings.
It also corrects a noisy ne condition.
$userid = $retuserid if ( $retuserid ne '');
became
$userid = $retuserid if ( $retuserid );
It also integrates Srdjan Jankovic's patch with Petter Goksoyrsen's
patch, while correcting the problems found.
This includes:
my $q_userid = $query->param('userid') // '';
along with:
my $s_userid = '';
and:
my $s_userid = $session->param('id') // '';
Indentation does not reflect actual scoping.
A missing system preference would have triggered a ubiquitous
undef compare check failure message. This makes the flooding
message more useful, so as to help correct it.
The change to accomplish this was:
my $pki_field = C4::Context->preference('AllowPKIAuth');
if (!defined($pki_field)) {
    print STDERR "Error: Missing AllowPKIAuth System Preference!\n";
    $pki_field = 'None';
}
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
During login at the Staff interface you get warnings in the logs
regarding an uninitialized value for the $pki_field variable.
To test:
- tail -f /path/to/your-intranet-logs
- Point your browser to your staff login page
- Login
- Three warnings are shown
- Apply the patch
- Log out
- Log in
- No new warnings, and you can still log in.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Petter Goksoyr Asen <boutrosboutrosboutros@gmail.com>
Followed test plan; it works as advertised.
Also works when I deleted AllowPKIAuth system pref.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
MARC::Record 2.0.6+ enables the warnings pragma, and as a
consequence, started logging cases where a routine in
C4::Search was calling MARC::Field->subfield() with an undef
subfield label. This patch removes the log noise.
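The warning comes from passing an undef subfield code, so a guard of this
kind removes it (loop and variable names are illustrative, not the patch's
exact code):
    for my $subfield_code (@subfield_codes) {
        next unless defined $subfield_code && $subfield_code ne '';
        my $value = $field->subfield($subfield_code);
        # ... use $value ...
    }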
To test:
- Run prove -v t/db_dependent/Search.t
- There will be warnings about
"Use of uninitialized value $code_wanted in string" in MARC::Field.
- Apply the patch.
- Those warnings are gone.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Tests now pass
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Fix the supplier, shipment date, and library filters
on the invoice search. An invoice's library is
(in parallel with order search) defined as the library
of the staff member that approved the basket. Before this
patch, the code was referring to an aqorders.branchcode
column that doesn't exist.
This patch also improves the author, title, ISBN/EAN/ISSN,
publisher, and publication year filters to no longer require
exact matches; substring matches now suffice.
Finally, this patch considers biblio.copyrightdate in addition
to biblioitems.publicationyear for publication date searches, as
the MARC21 frameworks use the former column but not the latter.
This patch also fixes the current test cases for invoices
so that they pass and adds regression tests.
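A sketch of the substring-match filters described above (the query-building
style and variable names are illustrative, not the patch's exact code):
    if ($author) {
        $query .= " AND biblio.author LIKE ?";
        push @bind_values, "%$author%";
    }
    if ($publicationyear) {
        $query .= " AND ( biblioitems.publicationyear = ? OR biblio.copyrightdate = ? )";
        push @bind_values, $publicationyear, $publicationyear;
    }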
Test plan:
[1] Create two invoices for different vendors.
[2] Do an invoice search and filter on shipment
date. Verify that the expected invoice(s)
are returned.
[3] Do an invoice search and filter on branch
(of the staff member that approved the basket).
Verify that the expected invoice(s) are returned.
[4] Do an invoice search and filter on supplier.
Verify that the expected invoice(s) are returned.
[5] Do invoice searches on author, title, ISBN/EAN/ISSN,
publisher, and publication year and verify that the
results are as expected.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Patch passes all tests, test plan and QA script.
(Adding from Katrin's earlier notes) I agree with the
possible improvements:
- Document the behaviour of the library search as there are
lots of branches all over acquisitions with different meaning.
- Add the shipment date to the results list table
- Change label ISBN/EAN/ISSN to not include EAN for MARC21
installations
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The default value for date fields is undef, so if a date field contains
an empty string, we should insert NULL in the DB, not 0000-00-00.
The format_date_in_iso routine should only be called if a date is
defined, is not an empty string and does not already match the ISO
regex.
This patch fixes a bug where editing or creating a patron record
without setting the birth date results in 0000-00-00 rather than null
being set as the dateofbirth value.
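A sketch of the guard described above (the CGI access and variable names are
illustrative; format_date_in_iso is the routine mentioned above):
    my $dateofbirth = $input->param('dateofbirth');
    if ( defined $dateofbirth && $dateofbirth ne '' ) {
        # only reformat when the value is not already ISO (YYYY-MM-DD)
        $dateofbirth = format_date_in_iso($dateofbirth)
            unless $dateofbirth =~ /^\d{4}-\d{2}-\d{2}/;
    } else {
        $dateofbirth = undef;   # store NULL, not 0000-00-00
    }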
Partial test plan:
1. Create a new patron. Leave dateofbirth empty.
2. Save the record.
3. Open the record for editing.
4. Save the record without making changes.
5. Koha gives no error.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Now when no date is given NULL is saved to the database.
Tested:
- Adding a patron without date of birth
- Editing the patron, entering a date of birth
- Editing the patron, deleting date of birth
All worked as expected.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When a record fails to decode during a search, Koha dies with an error.
Koha should ignore bad records and continue on ( and log the error ).
An example of a record that Zebra will happily ingest but which MARC::Record
doesn't like is one that contains a punctuation character in a tag label.
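A sketch of the skip-and-log idea, inside the result loop
(MARC::Record->new_from_usmarc is the usual constructor; the surrounding
variables are illustrative):
    my $marc_record = eval { MARC::Record->new_from_usmarc($raw_record) };
    if ( $@ || !$marc_record ) {
        warn "Skipping search result that could not be decoded: $@";
        next;   # move on to the next result instead of dying
    }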
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
- Check out at least 3 items, due dates should be today, sometime in
the past and one day after tomorrow.
- Edit the message options - activate DUE and PREDUE notices with
days in advance = 2
- Run the advance_notices.pl script with -v -c
Result: Only a PREDUE notice is generated
- Run the advance_notices.pl script with -v -c -m 2
Result: Only the PREDUE message is generated correctly.
- Run t/db_dependent/Circulation.t
- without first patch: all tests pass.
- with first patch: some tests will fail.
- Apply patch.
- Rerun script, now PREDUE and DUE notices should be
generated.
- Run t/db_dependent/Circulation.t again, all tests should pass.
Add more items with different due dates, rerun and check results.
Run t/Circulation.t to confirm all tests pass.
- Apply the patch
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Passes functional tests and automated tests.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A "busc" param is cleared if the template name is not opac-.*detail.tt.
So if a user adds a biblio to a list, they cannot continue to browse
the results.
Test plan:
- launch a search at the OPAC (opac-search.pl).
- click on a result and browse results (using previous/next links).
- a title attracts your attention and you add it to a list
  ("Save to your lists" link on the right).
- save the list.
- browse again results.
Signed-off-by: Joy Nelson <joy@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested according to test plan, also checked some other pages and actions
accessible from the detail page.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This followup fixes some QA issues:
- replace the MySQLism SQL_CALC_FOUND_ROWS
- use Koha::DateUtils instead of C4::Dates
- replace "branch" and "location" with "library"
- fix wrong capitalisation on "Clear all" and "Select all"
and fixes some behaviors:
- the inventory tool can be used without a barcode file (fixed for the
CSV export too).
- mark a non-scanned item as not scanned.
- update the datelastseen once per biblio (and fix the displayed
count)
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Koha Team Amu <koha.aixmarseille@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
* when a file was uploaded and a comparison with a catalogue range was
requested, the comparison logic was wrong
* items that were not supposed to be scanned (i.e. supposed to be on another shelf)
didn't have the author and title, so it was hard to retrieve them on the shelves
* some useful fields were missing, like homebranch, location, status
* the CSV export contained all the item information. It should contain the same
information as the screen
Behaviour now:
* scan a list of barcode & select a range of location
* if a barcode has been scanned and should not be (misplaced item),
the information is displayed
* if you choose "compare barcodes list to result option", the
resulting list contains all items that have been scanned and those
that were supposed to be. Any item not in both lists appears with a
specific message in the last column
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Signed-off-by: Koha Team Amu <koha.aixmarseille@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Bug 8015: Fix complains from qa tools
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 8015: Get rid of the eval in ModifyRecordWithTemplate
This patch removes the use of eval in the
C4::MarcModificationTemplates::ModifyRecordWithTemplate routine.
Now this routine calls the wanted modification routine with the list of
parameters.
The call is made only if the condition is met.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 8015: Get rid of eval for evaluating =~ m//
Koha::SimpleMarc::field_equals uses eval in order to check if a string
matches a pattern.
Now this eval is removed and the "regex" variable does not contain the
regex separator character (/ or |).
Regression: before this patch, the user was able to use a modifier. Now
it is not possible.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 8015: Get rid of the eval for substitution
Before this patch, the regex substitution was contained in only one
variable (e.g. my $regex = "/foo/bar/i").
Now each part of the regex is stored in a field in the
marc_modification_template_actions SQL table.
In order to avoid complex code, only the modifiers i and g are taken into
account.
Note: if you already added the mmta table, you have to drop it.
This patch also adds a foreign key from mmta to mmt tables.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 8015: FIX ui issue
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Bug 8015: The template name is a required field
Test plan:
Try to add a template with an empty string as name.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 8015: Fix template capitalization and other template issues
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Fix error where field object is returned instead of field value for fields without subfields
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Fix bad ordering on function parameters
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Escape escape characters for strings
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Fix bad parameter list for direct external call to update_field
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Fix problem with moving existing subfield value to nonexistent field/subfield
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: FIX QA issues
This patch fixes some issues failing the QA tests: POD, indentation (tabs),
perlcritic.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Bug 8015: Catch error in the SetUTF8Flag routine
The eval prevents the interface from running endlessly if an error occurs.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The MARC Modification Templates system gives Koha users
the power to make alterations to MARC records automatically
while staging MARC records for import.
This tool is useful for altering MARC records from
various vendors to work with your MARC framework.
The system essentially allows one to create a basic script
using actions to Copy, Move, Add, Update and Delete fields.
Each action can also have an optional condition to check
the value or existence of another field.
The Copy & Move actions also support Regular Expressions,
which can be used to automatically modify field values during the
copy/move. An example would be to strip out the '$' character
in field 020$c.
Furthermore, the value for an update can include variables
that change each time the template is used. Currently,
the system supports two variables, __BRANCHCODE__ which
is replaced with the branchcode of the library currently
using the template, and __CURRENTDATE__ which is replaced
with the current date in ISO format ( YYYY-MM-DD ).
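A sketch of how such placeholders can be expanded (illustrative, not the
module's literal code):
    use POSIX qw(strftime);
    my $branchcode = C4::Context->userenv ? C4::Context->userenv->{branch} : '';
    my $today      = strftime( '%Y-%m-%d', localtime );
    $value =~ s/__BRANCHCODE__/$branchcode/g;
    $value =~ s/__CURRENTDATE__/$today/g;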
At its simplest, it can perform functions such as:
Copy field 092$a to 952$c
At its most complex it can run actions like:
Copy field 020$c to 020$c using RegEx s/\$// if 020$c equals RegEx m/^\$/
Signed-off-by: Leila <koha.aixmarseille@gmail.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The system preference 'maxoutstanding' is defined as the maximum amount
of fees owed by the patron before Koha should block placing holds (
terrible naming on this one ).
However, although the Koha OPAC respects this preference, placing holds
via a SIP2 device will not.
Test Plan:
1) Set maxoutstanding to $10
2) Pick a patron owing more than $10 in fees
3) Attempt to place a hold for this patron from a SIP2 device
This attempt should succeed
4) Apply this patch
5) Restart your SIP2 server
6) Attempt to place a hold for this patron from a SIP2 device again
This attempt should now fail
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The routines sql_file_list and marcflavour_list in Installer.pm are not used.
There are some references to them in probably obsolete '/lib' test units.
I changed these test units for the record too.
Also: removed the nonexistent marcflavour parameter of sample_data_sql_list
from its call in install.pl.
Test plan:
Run a new install.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- Add branch info to baskets
- Add a list of borrowers that are allowed to manage a basket (one list
for each basket).
- Add a new subpermission: acquisition => order_manage_all
If user is superlibrarian, or if that user has permission acquisition = 1
(GranularPermissions = OFF), or subpermission acquisition =>
order_manage_all (GranularPermissions = ON), that user is authorised to manage
all baskets.
Depending on syspref AcqViewBaskets:
'all': user can manage all baskets
'branch': user can manage baskets of their branch (the basket branch is
taken into account, not the branch of the basket's creator).
If basket branch is not defined, all users can manage this
basket.
'user': user can manage baskets she created, and baskets in their
user list
There are unit tests in t/Acquisition/CanUserManageBasket.t, which
require Test::MockModule
You can edit basket's branch and users list in basket modification page
(acqui/basket.pl)
Signed-off-by: Sonia Bouis <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 7688 broke exporting serial claims as CSV (see bug 10854).
For C4::Serials::GetLateOrMissingIssues(), $supplierid is not
meant to be mandatory. This patch fixes that.
Test plan:
try to export a serial claim.
Without this patch, the csv is always empty.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch allows exporting late orders as CSV.
Test plan:
- Go on the late orders page (acqui/lateorders.pl)
- Select one or more orders and click on the "Export as CSV" button.
- The generated file should contain some information on the orders
(order date, estimated delivery date, vendor name, information field,
cost, basket name (and basketid), claims count and the claimed date).
The last line of the file is the total of the orders.
- You are not allowed to select orders from different vendors.
- The check/uncheck all links appear only if a vendor is selected.
- Check that the check/uncheck works for all pages of the table.
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: sonia <koha@univ-lyon3.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Testing comments on last patch in this series.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In the serials module, we want to hide serials from other libraries.
However, to permit central serials management, this patch introduces a
new permission, 'superserials'. If a staff member has this permission,
that person can override the restriction.
Test plan:
- Switch on the IndependantBranches syspref
- Add the permission 'superserials' for a patron and test you can
navigate and see all serials
- Remove this permission and test you cannot manage/view subscriptions
from other libraries
Signed-off-by: Frederic Durand <frederic.durand@unilim.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The first patch does a left join on aqorders_items which returns too
many order lines.
This patch follows Galen's suggestion: it removes the join and calls
the GetItemnumbersFromOrder routine to retrieve the itemnumbers.
Bonus: the "parcelitems" variable is badly named and obfuscates the code.
I renamed it to "orders".
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adds a column to indicate holds on received items, as well as adding
a new column for fund and showing the subtotals per fund above
the total subtotal.
To test:
[1] Create an order basket containing at least one title and
ensure that an item is created for that title. Close the
basket.
[2] Place a hold on the title.
[3] Receive the order. After receiving it, but before finishing
the invoice, the table of already received orders should now
have columns for the order budget and number of holds on the
title as well as lines with the subtotal per fund.
Signed-off-by: Pierre Angot <tredok.pierre@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds a more extensible and flexible debarments system to Koha. The fields
borrowers.debarred and borrowers.debarredcomment are retained for compatibility and
speed.
This system supports having debarments for multiple reasons. There are currently
three types of debarments:
OVERDUES - Generated by overdue_notices.pl if the notice should debar a patron
SUSPENSION - A punitive debarment generated on checkin via an issuing rule
MANUAL - A debarment created manually by a librarian
OVERDUE debarments are cleared automatically when all overdue items have been returned,
if the new system preference AutoRemoveOverduesRestrictions is enabled. It is disabled
by default to retain current default functionality.
Whenever a borrower's debarments are modified, the system updates the borrower's debarment
fields with the highest expiration date from all of that borrower's debarments, and concatenates
the comments from the debarments together.
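A sketch of that sync step (column names for borrower_debarments are
assumptions here, and the indefinite/NULL expiration case is left out for
brevity):
    my $dbh = C4::Context->dbh;
    my ($expiration) = $dbh->selectrow_array(
        "SELECT MAX(expiration) FROM borrower_debarments WHERE borrowernumber = ?",
        undef, $borrowernumber );
    my $comment = join "\n", @{ $dbh->selectcol_arrayref(
        "SELECT comment FROM borrower_debarments WHERE borrowernumber = ?",
        undef, $borrowernumber ) };
    $dbh->do(
        "UPDATE borrowers SET debarred = ?, debarredcomment = ? WHERE borrowernumber = ?",
        undef, $expiration, $comment, $borrowernumber );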
Test plan:
1) Apply patch
2) Run updatedatabase.pl
3) Verify the borrower_debarments table has been created and
populated with the pre-existing debarments
4) Run t/db_dependent/Borrower_Debarments.t
5) Manually debar a patron, with an expiration date
6) Verify the patron cannot be issued to
7) Add another manual debarment with a different expiration date
8) Verify the 'restricted' message lists the date farthest into the future
9) Add another manual debarment with no expiration date
10) Verify the borrower is now debarred indefinitely
11) Delete the indefinite debarment
12) Verify the debarment message lists an expiration date again
13) Enable the new system preference AutoRemoveOverduesRestrictions
14) Set an overdue notice to debar after 1 day of being overdue
15) Check out an item to a patron and backdate the due date to yesterday
16) Run overdue_notices.pl
17) Verify the OVERDUES debarment was created
18) Return the item
19) Verify the OVERDUES debarment was removed
20) Disable AutoRemoveOverduesRestrictions
21) Repeat steps 15 through 18, verify the OVERDUES debarment was *not* removed
22) Add issuing rules so that an overdue item causes a temporary debarment
23) Check out an item to a patron and backdate the due date by a year
24) Return the item
25) Verify the SUSPENSION debarment was added to the patron
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Currently, the location facet only shows if you have singlebranch
mode enabled. In other words, you can either see the library branch
or the shelving location.
This patch simply changes the location facet so that it will always
show the shelving location (if one is available), regardless of the
singlebranch system preference.
Test Plan:
BEFORE APPLYING:
0) Disable singlebranch mode if it is on
1) Do an OPAC or Staff Client search for a record that has items with
shelving locations.
2) Note that you can see the library branch facet under Libraries
but no shelving locations.
3) Enable singlebranch mode
4) Repeat your search
5) Note that you can no longer see the library branch facet under
Libraries. However, you can see the shelving location under Location
N.B. If you don't have more than one branch or the search results
are all from one branch, you might not get a library branch facet.
If this is the case, create additional branches and/or change the
branch for items in your search results so that you have multiple
branches to prompt the appearance of a library branch facet.
AFTER APPLYING
1) Do an OPAC or Staff Client search for a record that has items with
shelving locations.
2) Note that you see a facet under Location on the left sidebar,
regardless of there being a singlebranch mode or the number of branches
there are being represented in the search results.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, works as advertised.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the ability to overlay by either itemnumber,
or barcode. Itemnumbers take precedence over barcodes, which
allows us to batch update item barcodes with an overlay.
Test Plan:
1) Create a new record with 2 items, make sure to give it a unique ISBN
2) Download the record as MARCXML
3) Edit the MARC XML
a) Delete one of the two items
b) Change the barcode in the barcode field to something unused
4) Transform the xml file into marc with xml2marc
5) Browse to 'Stage MARC records for import'
6) Upload the binary marc file
7) Choose the following options:
Record matching rule: ISBN
Action if matching record found: Ignore incoming record
Action if no match is found: Ignore incoming record
Check for embedded item record data: Yes
How to process items: Replace items if matching bib was found
8) Click 'Stage for import' button
9) Verify a matching record was found, then click 'Manage staged records' link
10) Verify the rules are still set correctly
11) Click 'Import this batch into the catalog'
12) The import should tell you:
1 record was ignored
1 item was replaced
13) View the record details and verify the item's barcode was replaced
with your updated barcode value
14) Download the record as MARCXML
15) Edit the MARC XML
a) Delete one of the two items
b) Delete the itemnumber field for the remaining item
c) Alter the item's callnumber to a new value
16) Repeat steps 4 through 12
17) View the record details and verify the item's callnumber was replaced
with your updated callnumber value
Signed-off-by: Henry Bankhead <hbankhead@losgatosca.gov>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When staging biblios with items attached you previously had only two
options, add or don't add.
This patch adds a third option to replace an item record if a match is
found on itemnumber or barcode, else it adds the item.
Test Plan:
1) Stage a file of biblios with items attached.
2) Import the batch into the catalog.
3) Run the indexer so the matcher will match
4) Modify the item data for at least one bib in the file
5) Re-stage the file with the item matching option set to "Replace
items if matching bib was found"
6) Let the indexer run again
7) You should see updated item information after the overlay
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Henry Bankhead <hbankhead@losgatosca.gov>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Using the TransportCostMatrix can cause the same issue. Removing the
last-ditch use of the first item causes the subroutine to continue
with the traditional matching, which will respect the hold policies.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For some reason MapItemsToHoldRequests will, as a last ditch effort,
grab what seems to be an arbitrary available item to fill a hold request,
even if it will violate the circulation rules for holds.
In other words, even if an item matches a "Holds policy by item type"
that says "From home library", a request to transfer that item to
another library will be added to the holds queue!
Test Plan:
1) Create a record with an item at BranchA of item type BOOK
2) Set the holds policy such that itemtype BOOK for BranchA is set
to "From home library"
3) Place a bib-level hold request for a patron with a pickup at BranchB
4) Run build_holds_queue.pl
5) You should now see a request for that item to be transferred to
BranchB, even though the rules should not allow this.
6) Apply this patch
7) Run build_holds_queue.pl again
8) View the holds queue again, that request should no longer exist
Signed-off-by: Heather Braum <hbraum@nekls.org>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch also corrects an error in the description of
NewSubscription().
Named parameters for this function cannot come soon enough.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A lot of tests were just a call to a function without any arguments,
whereas the function expects at least one argument.
These tests were kept, but all return values are now undef when a
mandatory argument is missing, so return values are consistent.
The part where subscription periodicity is changed could not work
because of ',' appended to each key in ModSubscription call. So it's
rewritten, taking into account the new API for subscription frequencies.
This script should leave your database intact because it reverts any
modifications made.
Also fix some warnings in C4::Serials and in C4::Items.
And fix a typo in koha-tmpl/.../subscription-numberpatterns.tt
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. No errors.
Tested, again, without trouble.
With a subscription, tests successful.
No koha-qa errors
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Serials numbering patterns and frequencies are no longer hard-coded. Now
it's possible to create, edit and delete numbering patterns (and
frequencies). This patch adds two new sql tables
(subscription_numberpatterns and subscription_frequencies)
Numbering patterns behave almost as before, there are still the same
values to configure (addX, everyX, settoX, whenmorethanX). lastvalueX
and innerloopX remain in subscription tables.
There is a new value in numbering patterns: numberingX. For each
"column" (X, Y or Z) you can tell how to format the number. Actually
numberingX can be set to:
- 'dayname' (name of the day) (0-6 or 1-7 depending on which day is the
first of the week)
- 'monthname' (name of the month) (0-11)
- 'season' (name of the season) (0-3) (0 is Spring)
These names are localized by using POSIX::setlocale and POSIX::strftime
and setting a 'locale' value to the subscription. Locale have to be
installed on the system.
Note that season names are not localized using POSIX::strftime (it can't
do this), so names are hardcoded into the code (available languages: en,
fr). This could be fixed in the future by using a Perl localization
framework.
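A minimal sketch of the POSIX-based localization described above (the locale
name is an example and must be installed on the system):
    use POSIX qw(setlocale strftime LC_TIME);
    setlocale( LC_TIME, 'fr_FR.UTF-8' );
    my $month     = 0;   # 0-11, matching the 'monthname' numbering described above
    my $monthname = strftime( '%B', 0, 0, 0, 1, $month, 114 );   # "janvier"; 114 = 2014 - 1900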
Frequencies can be configured using 3 parameters:
- 'unit': one of 'day', 'week', 'month', 'year'
- 'issuesperunit': integer >= 1, the number of received issues per
'unit'
- 'unitsperissue': integer >= 1, the number of 'unit' between two
issues
One of 'issuesperunit' and 'unitsperissue' must be equal to 1.
Examples:
unit = 'day', issuesperunit=3, unitsperissue=1 => 3 issues per day
unit = 'week', issuesperunit=1, unitsperissue=3 => 1 issue each 3
weeks
Prediction pattern is now computed server-side and is more consistent
with what Koha will do. The publication date is displayed alongside the
serial number.
Irregularities can now be checked one by one, in the prediction pattern
table, or if frequency is 'day-based' (unit is 'day'), there is the
possibility to check all issues for a week day at once.
When an irregularity is found, there is the possibility to keep the
serial number unchanged, or to skip it. It is configured at subscription
creation or modification.
For instance, with a daily subscription you can have:
skip serial number | keep serial number
----------------------+----------------------
2012-01-01 ¦ No 1 | 2012-01-01 ¦ No 1
2012-01-03 ¦ No 3 | 2012-01-03 ¦ No 2
To lighten the subscription modification page, manual history has been
moved in its own page subscription-history.pl which is accessible on
subscription-detail.pl, tab 'Planning'.
Important note: updatedatabase.pl script takes into account existing
subscriptions and create appropriate numbering patterns for them (it
tries to create as few patterns as possible). Frequency is
mapped to the correct entry in subscription_frequencies table.
This patch includes kohastructure.sql and updatedatabase.pl changes
+ sample frequencies data and sample numberpatterns data for fresh
installs (sample data is included in updatedatabase.pl)
=== TEST PLAN: ===
Create a new subscription:
- Go to Serials module and click "New subscription" button
- On the first page, choose a biblio and click next to go to the
second page
- Pick a first issue publication date
- Choose frequency '1/day'
- Choose a subscription length of 15 issues
- Choose a subscription start date
- Choose numbering pattern 'Volume, Number'
- A table appears, fill 'Begins with' cells with '1'
- Click on 'Test prediction pattern' button
The prediction pattern is displayed at the right of the page. You can
see in it the serial number, the publication date and a checkbox to
allow you to choose which serials will not be received (irregularities).
You can see that serial numbers start from "Vol 1, No 1", continue to "Vol
1, No 12" and then restart with "Vol 2, No 1".
The frequency is '1/day', so you can see that the publication date is
incremented by one day from line to line.
- Now you can play a little with frequencies and numbering patterns,
change one of them (or both) and click again on 'Test prediction
pattern'
- For example, choose frequency '3/weeks' and click on the 'Test
prediction pattern' button.
There is a small behaviour change compared with current master.
The publication date will not be guessed within the week, as Koha can't
know when you will receive issues. So the publication date stays the same
(Monday of each week) for 3 consecutive issues and then jumps to the next
week.
- Now choose frequency '1/3 months' and numbering pattern 'Seasonal'
- Fill 'Begins with' cells with '2012' for Year and '0' for Season
- Click on 'Test prediction pattern'
- You should have something like 'Spring 2012', 'Summer 2012', ...,
'Winter 2012', 'Spring 2013'
- Note that you can have seasons for the southern hemisphere by entering
'2' in 'Year/Inner counter'
- 2nd note: if you have some locales installed on your system, you can
type one of their names in the 'Locale' field (currently this does not
work for season names, only for month names and day names)
If you want to modify the numbering pattern you can still do it here:
- Click on 'Show/Hide advanced pattern' link. The advanced pattern
table is shown but all fields are readonly
- Click on the 'Modify pattern' button. All readonly fields are now
editable. Note that the 'Begins with' and 'Inner counter' lines are
repeated here and any modification in the small table will be
replicated in the big table, and vice versa.
- The pattern name is emptied; if you type a new name, a new pattern will
be created, and if you type the same name as an existing numbering
pattern, that one will be modified (after a confirmation message)
- There are two new lines in this table:
- Label: what is displayed in the smaller table headers above
- Numbering: used to format numbers in different ways. Can be
'seasons', 'monthname' or 'dayname'. Month names and day names can be
localized using the 'Locale' field; seasons can't (values for
English and French are hard-coded in Serials.pm)
- You can modify what you want in the table and click on 'Test
prediction pattern' button each time you want to see your
modifications. (Note that checkboxes for irregularities aren't displayed
in this mode, and you can't save the subscription until you have saved
or cancelled your changes).
- To cancel your modifications, just click on 'Cancel modifications'
button.
- To save them, click on 'Save as new pattern'. If the pattern name
already exists, a confirmation box will ask you if you want to
modify the existing numbering pattern. Otherwise a new pattern will be
created and automatically selected.
Once you have finished modifying numbering pattern, you can click again
on 'Test prediction pattern' to define irregularities, and then click on
'Save subscription'.
Now you can check the serials module still works correctly:
- Check the subscription detail page to confirm that nothing is
missing. Especially the 'Frequency' and 'Number pattern' information.
- Try to receive some issues. Check that the serial number is correctly
generated and if irregularities you have defined are taken into
account (if you have defined some).
- Check that receiving is blocked once you have reached the number of
issues you have defined in subscription length (or once you have
reached the subscription end date)
In the serials menu (to the left of almost every page of the serials
module) you have two new links: 'Manage frequencies' and 'Manage numbering
patterns'.
'Manage numbering patterns' leads to a page which lists all numbering
patterns and allows you to create, edit or delete them. The interface is
almost the same as the numbering pattern modification in subscription-add.pl
'Manage frequencies' leads to a page which lists all frequencies and allows
you to create, edit or delete them.
Try to create a new frequency:
- Click on 'Manage frequencies' link in the serials menu and then click
on 'New frequency':
- Fill in the description (mandatory).
- Unit is one of 'day', 'week', 'month', 'year' or 'None' ('None' is for
an irregular subscription)
- If unit is different from 'None' you have to fill the two following
fields (Issues per unit, and Units per issue)
- Note that at least one of those must be equal to 1
- Issues per unit is the number of received issues by 'unit' and Units
per issue is the number of 'unit' between two issues
- Display order is used to build the drop-down list. Leave empty and it
will be set to 0 (top of the list)
- Then click on 'Save'
- Check that this new frequency appears in the frequencies table and in
the drop-down list in subscription-add.pl
Subscription history has been moved to its own page. To test that it still
works, choose a subscription with manual history enabled (or modify an
existing subscription to turn on manual history).
- On the detail page, tab 'Planning', you should have a link 'Edit history'.
- Click on it
- Modify history and click on Save
- In the 'Summary' tab you should see the information you just entered
And finally, you can check that old subscriptions (by old I mean
subscriptions that existed before the update) are correctly linked to an
existing numbering pattern and an existing frequency. Numbering patterns
should be named 'Backup pattern X' where X is a number.
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Great development! Work as described. No koha-qa errors
(with all patches applied). Please QA this fast.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Squashed commits:
-----------------
Bug 7688 follow-up: Small fixes for QA
- # Subroutines::ProhibitExplicitReturnUndef: Got 1 violation(s) in
C4::Serials::GetSubscriptionIrregularities
- Bad template constructions fixed in serials/subscription-add.tt
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
----
Bug 7688 follow-up: Small fixes for QA #2
- "return undef" -> "return"
- ":utf8" -> ":encoding(UTF-8)"
- TAB -> SPACES
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
----
Bug 7688: Translate sample frequencies for french
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
----
Bug 7688: Fix generating next serial when there is no 'Expected' issue
It can happen when the Expected issue is claimed. In this case the
status of the last serial is 'Claimed'
This patch changes the API of GetNextSeq and GetSeq
Test plan:
- Create a subscription which starts a long time ago so that serials
automatically appear in late issues
- Receive the first serial
- Go to claims page and claim the 2nd serial.
- Go back to the subscription page and click on 'Serial collection'
- You should have 2 serials, one 'Arrived' and one 'Claimed'.
- Click on Generate Next. This should fail with a software error message
("can't call method output ...")
- Apply this patch and click again on Generate Next. A new issue must be
created with status 'Expected'.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
----
Bug 7688: Followup FIX perldoc
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When an index does not contain a structure part, the structure "wrdl"
is automatically added, and a structure is mandatory to build the search
query (to convert ':' into '=').
But the code that tests whether the structure is already defined looks at
the entire index string:
$index =~ /(st-|phr|ext|wrdl|nb|ns)/
It should look for a comma followed by a structure and, in the case of
"nb" and "ns", look for an exact match.
The consequence is that an index whose name contains ns, nb, phr, etc.
does not work.
This patch modifies the regexp for this part and for other parts that look
for structures in the index.
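A hedged sketch of the difference (the exact expression in C4::Search may
differ):

    # The old test matches anywhere in the index name, so an index such as
    # "ansa" wrongly looks like it already carries a structure attribute.
    # The new test only accepts a structure after a comma, and requires an
    # exact match for "nb" and "ns".
    for my $index ( 'ansa', 'title,phr', 'nb' ) {
        my $old = ( $index =~ /(st-|phr|ext|wrdl|nb|ns)/ )                     ? 1 : 0;
        my $new = ( $index =~ /,(st-|phr|ext|wrdl)/ || $index =~ /^(nb|ns)$/ ) ? 1 : 0;
        print "$index: old=$old, new=$new\n";   # 'ansa' is only caught by the old test
    }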
Test plan :
- Deactivate all searching sysprefs.
- Create a new index called "ansa", number 8999, in bib1.att,
ccl.properties and records.abs
- Index a biblio with a value in this index, e.g. "VALUE"
- Perform a search on this index by editing URL:
http://<server>/cgi-bin/koha/catalogue/search.pl?idx=ansa&q=VALUE
=> Without patch, the search does not work. The PQF query is
"@and ansa: VALUE"
=> With patch, the search works. The PQF query is "@attr 1=8999 VALUE";
- Perform the same test with an index whose name contains a structure, e.g. "aphra"
- Set QueryAutoTruncate syspref to automatically
=> Check * is added to operand : PQF query is
"@attr 1=8999 @attr 4=6 @attr 5=1 VALUE"
- You may check stopwords removal but this feature is obsolete
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: as far as I can test, this works. Small tab error reported
by koha-qa, fixed in a followup.
This kind of patch is difficult to test without explicit instructions;
not everyone knows how to check what kind of PQF search is used.
I don't know, but I can test search results.
Test:
1) Deactivate search sysprefs
QueryAutoTruncate = only if * is added
QueryFuzzy = Don't try
QueryStemming = Don't try
QueryWeightFields = Disable
UseQueryParser = Do not try
2) Create new index 'ansa'
bib1.att : att 8999 ansa
ccl.properties : ansa 1=8999
records.abs : melm 999 ansa:w,ansa:p
1) and 2) from comment 3 on Bug
3) In the understanding that the index refers to field 999,
edited the default framework and made 999a visible in the editor
4) Edit sample record, add 'VALUE' to 999a, save, reindex
5) Search with /cgi-bin/koha/catalogue/search.pl?idx=ansa&q=VALUE
No results
6) Apply patch, repeat search
Got results
That's all I can test. If not enough for QA, then this
must wait for further, explicit test instructions
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
There is (for MARC21, at least) an existing index that this patch
fixes: Code-institution.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Accounts::fixcredit and C4::Accounts::refund are marked as
deprecated and are not used. They can be removed.
Use:
git grep fixcredit
git grep refund
and verify these routines are not currently in use.
Bonus: The module exports reconcileaccount which is not defined. The
export is removed too.
MLT: Ran qa tool on this patch with no issues either.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
[1] Fix regression on bug 10663
Looks like the regression was introduced by a glitch during rebasing.
[2] Fix errors in Circulation_issue.t
The change in AddRenewal() turned up an issue with how the test
script issued one of the test items.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds a renewal tool that functions similarly to returns, where a
librarian can continuously scan items for renewal. This script blocks
renewals that are impossible and allows the same renewal overrides
as circulation.pl
Test plan:
1) Apply the patches for bug 8798
2) Apply this patch
3) Browse to /cgi-bin/koha/circ/renew.pl
4) Enter an invalid barcode, you should get an error message
5) Enter a valid, but not checked out barcode, you should get an error
message.
6) Enter a valid barcode that is checked out and should be renewable,
you should get a success message.
7) Enable AllowRenewalLimitOverride
8) Enter a barcode for an item that has been renewed too many times
9) You should get a warning which you can override.
10) Disable AllowRenewalLimitOverride
11) Repeat step 8
12) You should get a blocking error message
13) Enter a barcode for an item with unfilled holds on it;
you should get an overridable warning
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Passes all tests and QA script, some issues have been
addressed in follow-ups.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch uses understandable codes instead of magical numbers for the
aqorders.orderstatus field.
It also executes the SQL queries in the unit tests inside a transaction.
Signed-off-by: Pierre Angot <tredok.pierre@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
Check that deleted orders are not listed in the late orders search
results.
Signed-off-by: Cedric Vita <cedric.vita@dracenie.com>
Signed-off-by: Pierre Angot <tredok.pierre@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
You can now search orders by
- order status
- fund
The patch series also adds a new field, aqorders.orderstatus, which can
contain the following values (see the sketch after the list):
new
ordered
partial (for partially received orders)
complete
cancelled
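As an illustrative sketch only (it assumes an already connected DBI handle
$dbh), a script or report can now filter on a readable status string
instead of a magic number:

    # Fetch all partially received orders; 'partial' replaces a magic number.
    my $partial_orders = $dbh->selectall_arrayref(
        q{SELECT ordernumber, orderstatus FROM aqorders WHERE orderstatus = ?},
        { Slice => {} },
        'partial',
    );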
To test: Search and check if results are consistent in histsearch.pl
Signed-off-by: Cedric Vita <cedric.vita@dracenie.com>
Signed-off-by: Pierre Angot <tredok.pierre@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on the last patch.
Note: statuses are no longer numeric, but strings.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Right now there is no way to change the budget or fund when receiving an
item, which is annoying, particularly at the end of the fiscal year when
every item not already received has to be switched to the following
year's budget. This patch adds the ability to change the budget and fund
when receiving.
To test:
1) Apply patch.
2) Create an order for a vendor, choosing a fund to use for that order.
3) Receive the order, leaving the fund unchanged. Make sure the fund
did not change.
4) Create another order for a vendor, choosing a fund to use for that
order.
5) Receive the order, this time changing the fund. Make sure the fund
is changed.
6) Run the unit test:
> prove t/db_dependent/Acquisition.t
7) Sign off.
(Notes: this patch depends on the Acquisitions.t unit test improvements
in bug 10274; the seemingly-unrelated change in SQLHelper quiets an
irritating warning caused by the NewOrder call in ModReceiveOrder)
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A routine declares two lexical variables named $stylesheet.
This patch renames the second one to keep the code clearer and to
avoid propagating compile-time warnings.
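For illustration only (the file names here are made up, and this is not the
actual routine), this is the kind of construct that triggers the warning
and the rename that avoids it:

    use strict;
    use warnings;

    # Two lexicals with the same name in one scope make perl warn:
    # '"my" variable $stylesheet masks earlier declaration in same scope'.
    my $stylesheet = 'default.xsl';
    my $stylesheet = 'custom.xsl';

    # Renaming the second variable (as this patch does) keeps the code
    # clear and silences the warning:
    my $custom_stylesheet = 'custom.xsl';
    print "$custom_stylesheet\n";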
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, works as advertised!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Given how easy it is to accidentally receive items from one invoice on
multiple invoices, the ability to merge invoices can be quite handy.
This patch adds that ability to Koha's Acquisitions module.
To test:
1) Apply patch.
2) Run unit test:
> prove t/db_dependent/Acquisition/Invoices.t
3) Create two invoices from the same vendor for merging, and receive at
least one order on each.
4) Do a search on the Invoices page that brings up both the invoices you
created.
5) Check the boxes next to the two invoices.
6) Click "Merge selected invoices."
7) Choose which invoice you want to keep (the default will be the first).
8) Click "Merge."
9) Confirm that the resulting invoice has all the orders you received
listed on it.
10) Sign off.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Merged several invoices successfully - with and without received
orders, open and closed. Works nicely.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In the C4::Acquisition module, 2 routines do the same work. This patch
merges these 2 routines.
Test plan:
test the acqui/orderreceive.pl, acqui/uncertainprice.pl
and serials/acqui-search-result.pl, acqui/parcel.pl scripts.
Note: on acqui/parcel the basket filter is a search on basket name (was
on basket id, which was not relevant).
Signed-off-by: Pierre Angot <tredok.pierre@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pm, no adverse behaviors noted. All sub calls updated.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the web-based self-check module pages
specify that no browser (or proxy) caching occurs at all.
This prevents a security issue where letting the SCO session time out
and then hitting the back button allowed one to view the previous
patron's session.
This patch adds an optional fifth parameter to output_with_http_headers()
and output_html_with_http_headers(): a hashref for miscellaneous
options. One key is defined at the moment: force_no_caching, which,
if present and set to a true value, sets HTTP headers to specify no
browser caching of the page at all.
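A minimal sketch of a call site using the new option (the surrounding
$query, $cookie and $template objects are the usual ones passed around in
Koha scripts and are assumed here):

    use C4::Output;    # exports output_html_with_http_headers

    output_html_with_http_headers(
        $query, $cookie, $template->output,
        undef,                          # status: keep the default
        { force_no_caching => 1 },      # send headers that forbid any caching
    );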
To test:
[1] Start a web-based self-check session and optionally perform
some transactions.
[2] Allow the session to time out (it may be helpful to set
SelfCheckTimeout to a low value such as 10 seconds).
[3] Hit the back button. You should not see the previous patron's
self-check session.
[4] Verify that prove -v t/Output.t passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Ed Veal <ed.veal@bywatersolutions.com>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
No code implements the routines Get and TransformHtmlToMarc2,
so don't export them into users' namespace
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Recent changes to LDAP broke auth_by_bind in many situations. This patch
resets the behaviour to what it used to be; however, it also allows the new
behaviour by adding the 'anonymous_bind' parameter to the LDAP config.
Testing:
1) Find an LDAP configuration that was broken recently that uses
auth_by_bind
2) Apply this patch
3) See if it works again.
Additionally, testing the original path in the case of 'anonymous_bind'
being set should probably be done too, but I have no idea about the LDAP
server config for that.
Signed-off-by: Ulrich Kleiber <ulrich.kleiber@bsz-bw.de>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
1) set an empty string for the ReservesMaxPickUpDelay pref
2) place a hold on an item
3) check in the item
4) click on "Print and confirm"
5) an error occurs
> The 'days' parameter (undef) to DateTime::Duration::new was an 'undef'
6) apply the patch
7) repeat steps 1 to 4
8) the error does not occur anymore.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
An empty string didn't do it for me; I had to set the
variable for the system preference to NULL. I am not sure
if this can happen when editing from the interface, but
this change should not have any ill side effects and it has
unit tests!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Koha::DateUtils::output_pref took 4 positional parameters, the last of
which is a boolean, so some calls looked like:
output_pref($dt, undef, undef, 1)
This patch changes its prototype to
output_pref({
dt => $dt,
dateformat => $dateformat,
timeformat => $timeformat,
dateonly => $boolean
});
An alternative is to call the output_pref routine with just a datetime
object, without using a hashref:
output_pref($dt);
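A short before/after sketch of a typical call site:

    use Koha::DateUtils qw( output_pref );
    use DateTime;

    my $dt = DateTime->now;

    # old style, positional and hard to read:
    # my $pretty = output_pref( $dt, undef, undef, 1 );

    # new style, self-documenting hashref:
    my $pretty = output_pref( { dt => $dt, dateonly => 1 } );

    # or, for the common case, just pass the DateTime object:
    my $full = output_pref($dt);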
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
These variables still need to be exported to the template by default for
the 'prog' OPAC template to work correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The goal of this theme is to provide a fully-responsive OPAC which
offers a high level of functionality across multiple devices with varied
viewport sizes. Its style is based on the CCSR theme, with elements of
the Bootstrap framework providing default styling of buttons, menus,
modals, etc.
The Bootstrap grid is used everywhere, but Bootstrap's default
responsive breakpoints have been expanded to allow for better
flexibility for our needs.
All non-translation-dependent files are in the root directory of this new
theme:
css, images, itemtypeimg, js, less, and lib. Languages.pm has been
modified to ignore the new directories when parsing the theme language
directories.
This theme introduces the use of LESS (http://lesscss.org/) to build
CSS. Three LESS files can be found in the "less" directory: mixins.less,
opac.less, and responsive.less. These three files are compiled into one
CSS file for production: opac.css. "Base" theme styles are found in
opac.less. A few "mixins" (http://lesscss.org/#-mixins) are found in
mixins.less. Any CSS which is conditional on specific media queries is
found in responsive.less.
At the template level some general structural changes have been made.
For the most part JavaScript is now at the end of each template as is
recommended for performance reasons. JavaScript formerly in
doc-head-close.inc is now in opac-bottom.inc.
In order to maintain this structure and accommodate page-specific scripts
at the same time, the use of BLOCK and PROCESS has been added. By default
opac-bottom.inc will PROCESS a "jsinclude" block:
[% PROCESS jsinclude %]
Each page template in the theme must contain this block, even if it is
empty:
[% BLOCK jsinclude %][% END %]
Pages which require that page-specific JavaScript be inserted can add it
to the jsinclude block and it will appear correctly at the bottom of the
rendered page.
The same is true for page-specific CSS. Each page contains a cssinclude
block:
[% BLOCK cssinclude %][% END %]
...which is processed in doc-head-close.inc:
[% PROCESS cssinclude %]
Using these methods helps us maintain a strict separation of CSS links
and blocks (at the top of each page) and JavaScript (at the bottom). A
few exceptions are made for some JavaScript which must be processed
sooner: respond.js (https://github.com/scottjehl/Respond, conditionally
applied to Internet Explorer versions < 9 to allow for layout
responsiveness), the _() function required for JS translatability, and
Modernizr (http://modernizr.com/, a script which detects browser
features and allows us to conditionally load JavaScript based on
available features--or lack thereof).
Another new JavaScript dependency in this theme is enquire.js
(http://wicky.nillia.ms/enquire.js/), which lets us trigger JavaScript
events based on viewport size.
I have made an effort to re-indent the templates in a sane way,
eliminating trailing spaces and tabs. However, I have not wrapped lines
at a specific line length. In order to improve template legibility I
have also tried to insert comments indicating the origin of closing tags
like <div> or template directives like [% END %]:
</div> <!-- / .container-fluid -->
[% END # / IF ( OpacBrowseResults && busc ) %]
TESTING
Proper testing of this theme is no easy task: Every template has been
touched. Each page should work reasonably well at a variety of screen
dimensions. Pages should be tested under many conditions which are
controlled by toggling OPAC system preferences on and off. A variety of
devices, platforms, and browsers should be tested.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Apply this patch
2) Run updatedatabase.pl
3) Enable patronimages
4) Verify patron images are still displaying correctly
5) Test deleting a patron image
6) Test adding a patron image from moremember.pl
7) Test adding a patron image from tools/picture-upload.pl
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
* Added base class files for all tables in koha using
DBIx::Class::Schema::Loader.
* Added a (very basic) test file for C4::Context
* Also added dependencies in required files.
To Test:
[1] Install patch
[2] Make sure you can still connect to Koha
[3] You may optionally run this test script:
use Koha::Database;
use Data::Dumper;
my $db = Koha::Database->new();
my $schema = $db->schema();
print Dumper($schema->resultset("Borrower"));
If you run this file you should get a DBIx dump of the borrowers table.
Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In the serials module, when searching subscriptions, the results table has
a "Notes" column.
In the TT code, you can see that it tries to display the public note
"subscription.notes" and the internal note "subscription.internalnotes".
The internal note is displayed correctly, but not the public note.
You can see the 2 notes in the serial details, on the summary tab.
The problem comes from the SQL query: a join is performed on subscription
and biblio, both of which contain a "notes" column.
This patch solves the problem by using an alias in the query for both
columns (biblio.notes is actually not used in the template but could be).
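A hedged sketch of the aliasing (the column list is trimmed, the real query
in C4::Serials selects many more columns, and $dbh stands for the database
handle):

    my $rows = $dbh->selectall_arrayref( q{
        SELECT subscription.notes         AS notes,
               subscription.internalnotes AS internalnotes,
               biblio.notes               AS biblionotes
          FROM subscription
          JOIN biblio ON biblio.biblionumber = subscription.biblionumber
    }, { Slice => {} } );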
Test plan :
- Edit a subscription
- Add public and internal notes. For example : "too busy" and "on holiday"
- Perform a subscription search that returns this subscription
=> "Notes" column contains both notes. For example : "too busy (on holiday)"
- Test with only public note
- Test with only internal note
Works as described.
Signed-off-by: Mathieu Saby <mathieu.saby@uhb.fr>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Works as described, fixes a bug as the templates show that
the intention was to display both notes in the column.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
Verify that existing CSV lists list MARC CSV profiles and not SQL CSV
profiles.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch (see the sketch after this list):
- adds a new column 'type' to the export_format table.
- renames the field export_format.marcfields to
export_format.content.
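A hedged sketch of what the corresponding updatedatabase.pl step could look
like (the column definitions here are illustrative, not copied from the
patch; $dbh is the database handle):

    # add the new discriminating column, defaulting existing rows to 'marc'
    $dbh->do(q{
        ALTER TABLE export_format
        ADD COLUMN type VARCHAR(255) NOT NULL DEFAULT 'marc'
    });
    # rename marcfields to the more generic 'content'
    $dbh->do(q{
        ALTER TABLE export_format
        CHANGE marcfields content MEDIUMTEXT NOT NULL
    });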
Test plan:
- Check that existing profiles have the type "marc" selected by default
- Create a new profile with a type "sql"
- Save and verify the profile is correctly displayed when you select it.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. koha-qa reports small tab errors,
corrected in a followup
Test:
1) go to Tools > CSV profiles, Create profile, current
2) Apply patch, run updatedatabase
3) Go to Tools > CSV profiles, new option present
old profile with type MARC
4) Create new profile MARC, save and show correct
5) Create new profile SQL, save and show correct
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass with all 3 patches applied.
Works as described. Functionality for SQL profiles will be
added by another patch. For now it's possible to add/edit/delete
them.
Existing CSV profiles can still be exported correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch allows defining default values in the authorities framework.
Some code already existed but the feature did not work.
Test plan:
1/ Choose a framework, field and subfields.
2/ Define a default value.
3/ Create a new authority and check that the subfield is
automatically filled with the default value.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. koha-qa reports some tabs, fixed in followup
Test
1) Apply patch, run updatedatabase.pl
2) Edit an auth framework, put a default value somewhere, save
3) Add new auth, default value present
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Verified database update is done correctly.
Controlfields 0xx
- Edited an existing field (001)
- Set a default value for subfield @
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Fields
- Edited an existing field (100)
- Set a default value for subfield e
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In OAI set mappings, the value "is equal to" is hardcoded. This
enhancement changes it to a dropdown menu to choose between "is equal
to" and "not equal to".
To test:
* define a set
* define a mapping for said set with "is equal to"
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
* change mapping to 'not equal to', save
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Amended patch: Fix bug id in updatedb.pl
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adjusts the auto-completion on the authority record
finder (accessed from the bib editor) so that if you do
start typing in the "Main entry ($a only)" input field, it will
return only the $a of the main heading for matching authority
records.
This fixes a problem where typing "shakes", choosing
"Shakespeare, William, 1564-1616" from the auto-completion
result list, then hitting the search button fails to bring
up results, as the dates come from the $d of the 100 field
(in MARC21).
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised.
Tested with an authority where I added my search term in $b.
The modified authority came up in main entry, not in mainmainentry.
That was the desired result.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To reproduce and test:
1) Create an authority record with main heading (100) in Latin script
(e.g. Oppenheimer, Aharon -- subfields $a and $b) and parallel form
(700) in Hebrew (אופנהיימר, אהרן -- subfields $a and $b).
Mark it correctly in $8 with freheb (or engheb if you like);
2) Reindex and search;
3) You will see:
Oppenheimer Aharon
freheb: אופנהיימר
Whereas you would rather like to see (mind language and lack of $b above):
Oppenheimer, Aharon
Hebrew: אופנהיימר, אהרן
The patch corrects the issue and should not harm those who (improperly)
put only one triple in $8
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. No koha-qa errors.
Same result on OPAC and STAFF
Turns out that the test plan is wrong:
you need to fill tag 200ab, not 100ab, for the main heading.
I filled 100a with some example data from the UNIMARC authorities manual.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Took me a bit to figure it out, works according to test plan.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Before fixing UNIMARC DOM indexing, we must fix GRS-1 indexing
1) In advanced search, some coded fields indexes are not working: Print,
Illustration, Content
2) The Country-heading index is not working
3) Some subfields are indexed in the wrong indexes:
102$a should be in Country-publication instead of Country-heading
(not defined in bib1.att)
106$a, filled only for printed works, should be in ff88-23 (form of
item) instead of itype. (ff88-23 is made for Marc21 008 pos
23, which contains the same data as 106a)
200$b should be in Material-type instead of (or in addition to) itype
and itemtype: (Material-type :"free-form string, ... that
describes the material type of the item, e.g., cassette, kit,
computer database, computer file.")
100$a pos 22-24 should not be indexed as "ln": it is the language of
the record, not the language of the resource
4) Index names are too long: if we index new positions of coded fields
with existing names it breaks Zebra indexing (there must be a limit
on line length in record.abs?)
5) There are a lot of warnings when rebuilding Zebra.
This patch makes some changes in bib1.att (which could be used later to
improve search):
- fixing wording for att 51 and 1012
- adding comments for attributes based on MARC21 008 field (8800-8841)
- creating 8806 (tpubdate), 8838 (Modified-code), 8818 (ff8-18), 8840
(ff8-18-21), 8819 (ff8-19), 8821 (ff8-21), 8828 (ff8-28), 8830
(ff8-30), 8831 (ff8-31)
- creating attributes specific to UNIMARC : 9701-9707 (Video-mt,
Graphics-type, Graphics-support, Title-page-availability,
Cumulative-index-availability, script-Title, char-encoding)
- setting apart 3 blocks of attributes, so it could be easy to make
further changes :
-- common to Marc21 and UNIMARC : 8806, 8822, 8838
-- slightly different in Marc21 and UNIMARC (different meanings
according to the type of the record => don't match a single
UNIMARC field)
-- specific to UNIMARC : 9701-9707
In ccl.properties :
- creating a new index: Country-publication 1=1053
- suppressing some warnings by mapping these with bib1 attributes:
Date-time-last-modified, Name, rtype, Music-number
- defining indexes using the 3 blocks attributes defined in bib1
(common to Marc21 and UNIMARC, slightly different, specific to UNIMARC)
In record.abs :
- renaming some indexes for the 100, 105 and 110 fields
- correcting the indexing of:
102$a (country of publication)
106$a (ff88-23)
100$a pos 22-24 (language of the record, no longer indexed)
105$a pos. 0-3 (illustration code)
200$b (for the moment, kept indexed in itype and itemtype,
but also in Material-Type)
In C4/Search.pm :
- adding "Country-publication" index
In OPAC and staff interface template subtypes_unimarc.in :
- renaming indexes to take into account the changes made to Zebra
config files
To test (this cannot be done with a sandbox) :
1) Apply the patch in a UNIMARC GRS-1 Koha instance
2) Copy the following files from the etc/zebradb of your source
directory into the etc/zebradb of your main Koha directory:
-- etc/zebradb/biblios/etc/bib1.att
-- etc/zebradb/ccl.properties
-- etc/zebradb/marc_defs/unimarc/biblios/record.abs
3) Reindex your data (rebuild_zebra -x -b -r -v)
4) Try to use those Coded fields indexes in Advanced search, in OPAC
and Staff interface (available after clicking on "More options",
then on "Coded information filters"):
Audience, Print, Literary genre, Biography, Illustration, Content,
Video Types, Serials, Serial Type, Periodicity, Regularity
5) Try to search "Country-publication=FR" in simple search
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Tests for GRS-1
Followed test plan
Search by coded fields works, but only in the OPAC;
on the staff side there are only a few options
Search by Country-publication works after the patch
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Modified Record::marc2bibtex to validate fields 100, 110 and 111 in
non-UNIMARC flavours.
Test plan:
1) Search for any book in the OPAC with a main entry (1XX in MARC21, 700-720 in UNIMARC)
2) Export the record in the bibtex format
==> The output won't contain the main entry.
3) Apply the patch
4) Export the record
==> The record will contain the main entry
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes a long standing bug.
Passes all tests and QA script.
Tested with multiple records, seems to work well.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes some inconsistencies in the routine
GetBooksellerWithLateOrders().
Now it considers the field $estimateddeliverydateto and replaces it
with now() only if it is undef.
Also, it no longer tests whether aqbookseller.deliverytime is not NULL;
if $deliverytime is NULL or undef, it replaces it with 0.
It also verifies that $delay is >= 0 and returns undef if it is a negative
value.
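A minimal sketch of the new input handling (the variable and sub names here
are illustrative, not lifted from C4::Bookseller):

    use strict;
    use warnings;

    sub _normalize_late_order_params {
        my ( $delay, $deliverytime ) = @_;
        if ( defined $delay && $delay < 0 ) {
            warn "called with a negative delay";
            return;                 # undef: negative delays are refused
        }
        $deliverytime //= 0;        # an unspecified vendor delivery time counts as 0 days
        return ( $delay, $deliverytime );
    }

    my @ok  = _normalize_late_order_params( 5, undef );    # (5, 0)
    my @bad = _normalize_late_order_params( -1, 3 );       # empty list, with a warning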
To Test:
Before, this routine sorted out the booksellers with late orders. If a
bookseller did not specify a delivery time, it would never appear in
the list of late orders. Moreover, if the field "Estimated delivery
date to" was specified, it did not take the value into account and
returned the late orders up to today's date.
Now, the returned list considers all the fields given, and if the
delivery time of the bookseller is not specified, it calculates the
late orders as if the delivery time were 0. By default, all booksellers
which have orders late up to today are listed unless "estimated
delivery date to" is specified.
prove t/db_dependent/Bookseller.t
t/db_dependent/Bookseller.t ..
[Some warnings about uninitialized values]
WARNING: GetBooksellerWithLateOrders is called with a negative value at C4/Bookseller.pm line 135.
t/db_dependent/Bookseller.t .. ok
All tests successful.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds some improvements to the routine GetOpenIssue().
Now it verifies that the parameter is given (if not, it returns undef)
and it returns $sth->fetchrow_hashref() directly instead of an
intermediate $issue variable.
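A hedged sketch of the resulting shape of the routine (simplified; $dbh
stands for the database handle obtained from C4::Context):

    sub GetOpenIssue {
        my ($itemnumber) = @_;
        return unless $itemnumber;    # undef when no item number is supplied

        my $sth = $dbh->prepare(
            q{SELECT * FROM issues WHERE itemnumber = ? AND returndate IS NULL}
        );
        $sth->execute($itemnumber);
        return $sth->fetchrow_hashref();   # a hashref (or undef), no intermediate variable
    }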
To test:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 2 wallclock secs ( 0.06 usr 0.01 sys + 1.09 cusr 0.07 csys = 1.23 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Same situation as the one noted in a comment on
Bug 10683: the test fails unless there is an issuing rule
All/All with 1 renewal allowed.
With that condition, it succeeds
No koha-qa errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds return values to DeleteTransfer (see the sketch below):
- undef if no parameters are given
- 1 if a transfer is deleted
- '0E0' if a wrong parameter is given
It also fixes some unit tests in t/db_dependent/Circulation_transfers.t
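A sketch of how a caller can interpret these values (DeleteTransfer is
imported from C4::Circulation, $itemnumber is assumed to be defined, and
'0E0' is the usual DBI "zero rows affected" value: numerically 0 but true):

    my $result = DeleteTransfer($itemnumber);
    if ( !defined $result ) {
        warn "no itemnumber was passed";
    }
    elsif ( $result == 0 ) {            # '0E0': the query ran but deleted nothing
        warn "no transfer found for item $itemnumber";
    }
    else {
        print "transfer deleted\n";     # 1 row deleted
    }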
To test:
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=14, 20 wallclock secs ( 0.03 usr 0.00 sys + 0.39 cusr 0.02 csys = 0.44 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Applied 10681 and 10692 before 10698
Run prove t/db_dependent/Circulation_transfers.t without errors
No koha-qa errors on all 3 patches
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish().
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
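An illustrative DBI pattern (generic, not tied to a particular Koha module;
$dbh is an already connected handle): when a loop fetches every row, the
statement handle is finished automatically, so an explicit finish() adds
nothing and can hide fetch errors.

    my $sth = $dbh->prepare(q{SELECT biblionumber, title FROM biblio});
    $sth->execute();
    while ( my $row = $sth->fetchrow_hashref ) {
        print "$row->{biblionumber}: $row->{title}\n";
    }
    # no $sth->finish() here: the loop already exhausted the handle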
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>