This patch adds a new method, mock_userenv, to t::lib::Mocks in order to
simplify mocking the userenv.
Test plan:
prove all the test files modified by this patch
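For illustration, a minimal usage sketch (the exact parameters accepted
by mock_userenv are an assumption here; check t::lib::Mocks for the
supported keys):

    use t::lib::Mocks;
    use t::lib::TestBuilder;

    my $builder = t::lib::TestBuilder->new;
    my $patron  = $builder->build_object({ class => 'Koha::Patrons' });

    # Mock the logged-in user in one call instead of setting the userenv by hand
    t::lib::Mocks::mock_userenv({ patron => $patron });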
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This is an alternative to bug 21732: transfers are automatically
cancelled on marking an item lost, and the item's holdingbranch is set
to the transfer's source ('from') branch.
When an item is marked as lost, the routine should also clean up any
outstanding transfers.
Also added tests to t/db_dependent/Circulation.t which check:
* If the transfer is automatically deleted when an item is marked as lost
* If the item's holdingbranch automatically changes when an item with a
transfer on it is marked as lost.
Test plan:
1. Find an item which is in transfer, i.e. an item whose 'Status' field
in the table on detail.pl indicates it is in transfer
2. Set the item to 'Lost' either by clicking on Edit->Edit items from
the detail.pl page
OR
clicking on the Items tab on the left side of the detail.pl page
3. Notice that the transfer is now cancelled for the item and the item's
holdingbranch is the transfer's source ('from') branch
4. Run t/db_dependent/Circulation.t
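For reference, a rough sketch of the kind of assertion the new tests
make (names and setup abbreviated; the real tests are in
t/db_dependent/Circulation.t):

    # Given an item with an open transfer from $frombranch to $tobranch,
    # mark the item as lost, then:
    my $transfers = Koha::Item::Transfers->search({ itemnumber => $item->itemnumber });
    is( $transfers->count, 0, 'Transfer removed when item is marked as lost' );
    is( $item->get_from_storage->holdingbranch, $frombranch,
        "Item's holdingbranch set to the transfer's source branch" );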
Sponsored-by: Brimbank Library, Australia
Signed-off-by: Andreas Hedström Mace <andreas.hedstrom.mace@sub.su.se>
(fixed the introduction of a whitespace line and removed a double
declare warning from the new tests as part of QA)
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Just add "use utf8".
Test plan:
Run Circulation.t; verify there are no warnings.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test plan:
Set AudioAlerts to 0
Prove t/db_dependent/selenium/regressions.t
=> Without this patch the value of the pref is not reset to 0
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
And use a DBIx transaction instead.
Test plan:
prove that the test files modified by this patch are passing
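For reference, the idiom the tests move to looks roughly like this
(a sketch of the usual Koha test boilerplate):

    use Koha::Database;

    my $schema = Koha::Database->new->schema;
    $schema->storage->txn_begin;

    # ... tests that touch the database ...

    $schema->storage->txn_rollback;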
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch introduces regression tests for Koha::Account::outstanding_*
methods so they don't pick the wrong lines just because amountoutstanding
has the sign we are looking for (i.e. negative for credits and positive
for debits).
To test:
- Apply this patch
- Run:
$ kshell
k$ prove t/db_dependent/Koha/Account.t
=> FAIL: Tests fail because pathological account lines are wrongly
picked.
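As an illustration, the kind of pathological line involved is a debit
line that ended up with a negative amountoutstanding; it must not be
returned by outstanding_credits even though its sign matches. A hedged
sketch (accounttype and values are made up):

    my $line = $builder->build_object({
        class => 'Koha::Account::Lines',
        value => {
            borrowernumber    => $patron->borrowernumber,
            accounttype       => 'OVERDUE',   # a debit type, illustrative only
            amountoutstanding => -3,
        },
    });

    my $credits = $patron->account->outstanding_credits;
    is( $credits->count, 0, 'Pathological debit line not picked as a credit' );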
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch tests for a new behaviour in the _FixAccountForLostAndFound
method.
The method will now add the amountoutstanding value for the lost item
fee to the CR credit to be generated. This means that:
- If there's some remaining debt, the same amount will be added to the
CR credit and used to cancel that debt. The final amountoutstanding
will be the same as before, but an offset will be generated as
required.
- If the line was written off, the behaviour remains unchanged, so no
offset.
- If the line was paid and/or written off in full, only the payments are
refunded, preserving the current behaviour.
Only changes to the 'remaining debt' use cases in these tests are
expected.
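An illustrative example (made-up numbers): for a lost item fee of 10
with 4 paid and 6 still outstanding, the CR credit is now created for 10
instead of 4; 6 of it cancels the remaining debt and is recorded as an
offset, while 4 remains as the refund of the payments, so the end state
matches the previous behaviour but the cancellation is now tracked
through an explicit offset.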
To test:
- Apply this patch
- Run:
$ kshell
k$ prove t/db_dependent/Circulation.t
=> FAIL: Tests fail because the behaviour is not correct.
Note: some test order changes are introduced to avoid calling
discard_changes twice.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
To test:
1 - Export your patrons
a - Create a report 'SELECT * FROM borrowers'
b - Run and save the report as csv (check your delimiter)
c - Delete the borrowernumber column
2 - Use the Patron Import tool to import the csv from above
3 - Set matching to 'cardnumber'
4 - Set 'If matching record is already in the borrowers table:' to
Overwrite
5 - Import
6 - None are imported because of a matching userid (their own)
7 - Apply patch
8 - Repeat
9 - Patrons are successfully overwritten
10 - prove -v t/db_dependent/Koha/Patrons/Import.t
11 - prove -v t/db_dependent/Koha/Patrons.t
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Pierre-Marc Thibault <pierre-marc.thibault@inLibro.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This tests the usual cases. Note that an administrator could create a
branch with a code such as "%%%", which this test does not cover; some
functionality will not work in that case: OPAC limit override,
holdingbranch facet.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
To test:
1 - Add a library group (Admin->Library groups)
2 - Enable use as an opac and staff search limit
3 - Add some libraries to the group
4 - Visit advanced search on staff and opac
5 - Note the dropdown has as many empty rows as there are libraries
in the group
6 - Apply patch, restart all the things
7 - Visit staff and opac advanced search
8 - Confirm the group dropdowns are correct
9 - Enable OpacMastheadLibraryPulldown
10 - Ensure the dropdown on opac shows groups correctly
11 - Confirm searching groups works from all three locations
12 - prove -v t/db_dependent/LibraryGroups.t
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
A short discussion led to the decision to make it explicitly clear that
this method will implicitly apply credits against debits in a 'First In,
First Out' manner, meaning the oldest outstanding debits will be paid
off first.
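A hedged sketch of what that FIFO application amounts to (illustrative
only, not the actual implementation in Koha::Account):

    # Oldest outstanding debits are consumed first
    my @debits    = sort { $a->date cmp $b->date } @outstanding_debits;
    my $remaining = $credit_amount;
    for my $debit (@debits) {
        last if $remaining <= 0;
        my $applied = $remaining < $debit->amountoutstanding
            ? $remaining
            : $debit->amountoutstanding;
        $debit->amountoutstanding( $debit->amountoutstanding - $applied );
        $debit->store;
        $remaining -= $applied;
    }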
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This was requested on the QA review and I agree.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Christopher Brannon <cbrannon@cdalibrary.org>
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
The string extraction process was not taking into account the fact that
standard/package installs have a completely different directory structure
than dev installs.
This patch tries to keep the exact same behaviour for dev installs,
while making it work for standard installs by using opachtdocs,
intrahtdocs, opacdir and intranetdir from $KOHA_CONF.
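For reference, those values are the directory roots from $KOHA_CONF,
readable through C4::Context (a sketch; the patch itself may obtain them
differently):

    use C4::Context;

    my $opachtdocs  = C4::Context->config('opachtdocs');
    my $intrahtdocs = C4::Context->config('intrahtdocs');
    my $opacdir     = C4::Context->config('opacdir');
    my $intranetdir = C4::Context->config('intranetdir');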
Test plan:
1. Follow test plan in
d708255c7a
2. Do a standard install and repeat step 1 on this new install
3. If you know how to build the Debian package, build it, install it and
verify that koha-translate works as expected
4. prove t/LangInstaller.t
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Moved db_dependent tests to the proper directory in the hierarchy, fixed
tests to check correct results now that the authority query builder works
better, and added several new tests to cover the changed functionality.
Sponsored-by: National Library of Finland
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 10306 changed behavior when cloning item subfields by no longer
splitting constructions like 'A | B' in item fields like ccode.
I am not so sure whether cloning item subfields is really recommended,
but this patch at least restores the possibility to do so while we
discuss whether we should ;)
Test plan:
[1] Run Items.t
[2] Make an item subfield repeatable in the framework, and test editing items.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Works as expected. Also fixes the display of collections on the items
table (on editing items).
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
The illrequest API response should always be augmented with an id_prefix
field, which is not part of the core illrequest object.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch removes this file. The included tests are similar to those in
the db_dependent section, with the difference that these have some things
mocked. But the tests still require name resolution and connecting to
the RB servers. This situation breaks package building without any
noticeable gain.
The feature could be better tested by mocking WebService::ILS calls (so
no external world contact), but it belongs to a separate bug report.
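If someone picks that up, the usual shape of such mocking would be
something like the sketch below (Test::MockModule shown; the class and
method names are placeholders, not the actual WebService::ILS API):

    use Test::MockModule;

    my $mock = Test::MockModule->new('WebService::ILS');
    # 'search' stands in for whichever remote calls the tests exercise
    $mock->mock( 'search', sub { return { items => [] } } );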
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
These tests pass at the build step only because they are skipped, due to
the absence of the Test::DBIx::Class library.
It makes sense for them to be in the db_dependent directory, as they are
not completely mocked (re: external world interactions).
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Kind of funny that we did not touch this test here. But it passed!
Trivial fixes:
[1] Typo precessfee
[2] Typo the linked
[3] Add rollback
Test plan:
Run t/db_dependent/Circulation/Chargelostitem.t
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
We have the itemnumber, so there is no need to pass the issue_id; we can
retrieve it from chargelostitem.
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Michal Denar <black23@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Test Plan:
1) Apply this patch
2) prove t/db_dependent/Accounts.t
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Michal Denar <black23@gmail.com>
[EDIT:]
Patch should have increased the number of tests obviously.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
https://bugs.koha-community.org/show_bug.cgi?id=20244
Test plan:
1. Add a record with alternate script fields in 880 (sample attached in the bug).
2. Make sure the record can be found e.g. with the alternate script title.
3. Add a record with multiple authors.
4. Check that in the index there is only a single author__sort field.
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
https://bugs.koha-community.org/show_bug.cgi?id=20244
Test plan:
1. Add a record with an ISBN-10 or ISBN-13 that can be converted to ISBN-10 (e.g. 1-56619-909-3).
2. Verify that the record can be found by searching for "1-56619-909-3",
"1566199093", "978-1-56619-909-4" and "9781566199094".
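For illustration, the ISBN-10/ISBN-13 equivalence in step 2 can be
produced with Business::ISBN (a sketch; the actual indexing code may
normalize differently):

    use Business::ISBN;

    my $isbn   = Business::ISBN->new('978-1-56619-909-4');
    my $isbn13 = $isbn->as_isbn13->as_string([]);   # 9781566199094
    my $isbn10 = $isbn->as_isbn10->as_string([]);   # 1566199093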
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Add missing PODs, remove an obsolete syspref and add a test for the
serialization format for records exceeding the maximum record size.
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
https://bugs.koha-community.org/show_bug.cgi?id=19893
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Implement optimized indexing for Elasticsearch
How to test:
1) Time a full elasticsearch re-index without this patch by running the
rebuild_elastic_search.pl with the -d flag:
`koha-shell <instance_name> -c "time rebuild_elastic_search.pl -d"`.
2) Apply this patch.
3) Time a full re-index again; it should be about twice as fast (for a
couple of thousand biblios; with fewer biblios results may be more
unpredictable).
Sponsored-by: Gothenburg University Library
Signed-off-by: Ere Maijala <ere.maijala@helsinki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
This patch adds tests for:
- Output of ImportFramework on fields removal and error conditions (file
not found)
- Different file format behaviour (csv, xml and ods are tested).
To test:
- Run
$ kshell
k$ prove t/db_dependent/ImportExportFramework.t
=> SUCCESS: All green :-D
- Sign off :-D
Note: the biblio_framework* files are based on the original one in the
patchset, and exported in 17.11 in the different formats.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
The following error is raised if TestBuilder generates numbers that are too big:
Out of range value for column 'tax_value_on_ordering'
UPDATE aqorders
SET
tax_value_on_ordering = quantity * ecost_tax_excluded * tax_rate_on_ordering,
tax_value_on_receiving = quantity * unitprice_tax_excluded * tax_rate_on_receiving
WHERE ordernumber = ?
"] at /home/vagrant/kohaclone/C4/Acquisition.pm line 1484.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
For the subscription we would like to keep the original internal note
(from the first order), to display it unmodified each time we receive
issues.
Sponsored-by: BULAC - http://www.bulac.fr/
Signed-off-by: Séverine QUEUNE <severine.queune@bulac.fr>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
It would take too much time to fix the failures here. Given the
simplicity of the subroutines, I would be in favor of deleting them and
continuing the move to Koha::Object-based classes on a separate bug
report.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
I've checked that everything removed is no longer in use.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
For other types of split rules
Sponsored-by: Goethe-Institut
Signed-off-by: Christian Stelzenmüller <christian.stelzenmueller@bsz-bw.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Sponsored-by: Goethe-Institut
Signed-off-by: Christian Stelzenmüller <christian.stelzenmueller@bsz-bw.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Currently call number splitting seems to be mostly implemented for
DDC and LC classifications.
Those are both not very common in some countries.
A lot of libraries use their own custom classification schemes, so call
number splitting is something that should be individually configurable.
This enhancement adds the ability to define custom splitting rules based
on regular expressions.
How does it work so far?
In C4/Labels/Label.pm there are 3 different splitting methods defined,
depending on items.cn_source:
If cn_source is 'lcc' or 'nlm' we split using Library::CallNumber::LC.
If cn_source is 'ddc' we split using an in-house method.
Finally there is a fallback method that splits on spaces.
Nothing else is done for other cn_source values.
The idea of this patch is to mimic what was done for the "filing rules" and add
the ability to define "splitting rules" that will be used by the "Classification sources".
A classification source will then have:
* a filing rule used to sort items by callnumbers
* a splitting rule used to print labels
To achieve this goal this enhancement will do the following
modifications at DB level:
* New table class_split_rules
* New column class_sources.class_split_rule
Test plan:
* Execute the update database entry to create the new table and
column.
I. UI Changes
a) Create/modify/delete a filing rule
b) Create/modify/delete a splitting rule
c) Create/modify/delete a classification source
=> A filing rule or splitting rule cannot be removed if used by a
classification source
II. Splitting rule using regular expressions
a) Create a splitting rule using the "Splitting routine" "RegEx"
b) Define several regular expressions; they will be applied one after
the other, in the same order you define them.
Something like:
s/\s/\n/g # Break on spaces
s/(\s?=)/\n=/g # Break on = (unless it's done already)
s/^(J|K)\n/$1 / # Remove the first break if callnumber starts with J or K
c) You can test the regular expressions by filling the textarea with
a list of callnumbers. Then click "Test" and confirm the callnumbers are
split how you expected.
d) Finally create a new classification source that will use this new
splitting rule.
III. Print the label!
a) Create a layout. It should have the "Split call numbers" checkbox
ticked, and display itemcallnumber
b) Use this layout to export labels; use items with different
classification sources ('lcc', 'ddc', but also the new one you have
created)
=> The callnumbers should have been split according to the regex you
defined earlier!
Notes:
* The update database entry fills class_sources.class_split_rule
with the value of class_sources.class_sort_rule.
If default rules exist it will not work; we should add a note in the
release notes (would that be enough?)
* C4::ClassSplitRoutine::* should be moved to Koha::ClassSplitRule,
but it sounded better to keep the same pattern as ClassSortRoutines
* Shouldn't we use a LONGTEXT for class_split_rules.split_regex instead
of VARCHAR(255)?
* class_sources.sql should be filled in for other languages before being
pushed to master!
IMPORTANT NOTES: The regular expressions are stored as is, and eval is
used to evaluate them (perlcritic raises a warning about this: Expression
form of "eval"). This can lead to serious security issues (execution of
arbitrary code on the server), especially if the modifier 'e' is used.
We could then remedy the situation with one of the following points:
- Assume that this DB data is safe (We can add a new permission?)
- Assume that the data is not safe and deal with possible attacks.
Cons: how can we be sure we are exhaustive? Would making sure it matches
^s///[^e/]*$ be enough?
- Use Template Toolkit syntax instead (Really safer?)
[% callnumber.replace('\s', '\n').replace ... %]
- Cut the regex into parts: find, replace, modifiers,
like we already do for MARC modification templates. Cons: we are going to
have escaping problems; the "find" and "replace" parts should not be
handled the same way (think "\n", "\\n", "\1", "\s", etc.)
I did not manage to implement this one easily.
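For context, a minimal sketch of how a stored s/// rule ends up being
applied (simplified and hypothetical; see the C4::ClassSplitRoutine::*
modules for the real code):

    my $callnumber = 'QA 76.73 P22 2018';
    my @rules = ( 's/\s/\n/g', 's/(\s?=)/\n=/g' );   # rules as stored in the DB

    for my $rule (@rules) {
        eval "\$callnumber =~ $rule";   # expression form of eval, as noted above
    }
    my @parts = split /\n/, $callnumber;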
Sponsored-by: Goethe-Institut
Signed-off-by: Christian Stelzenmüller <christian.stelzenmueller@bsz-bw.de>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>