Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch adjusts the code that uses GetOpenIssue to use/find a Koha::Checkout object
instead.
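A hedged before/after sketch of the kind of substitution involved (the exact
lookup used in the patch may differ; Koha::Checkouts is the real module):

    # Before: plain hashref from the old C4 helper
    # my $issue = C4::Circulation::GetOpenIssue($itemnumber);
    # After: a Koha::Checkout object found through Koha::Checkouts
    use Koha::Checkouts;
    my $checkout = Koha::Checkouts->find( { itemnumber => $itemnumber } );
    my $date_due = $checkout ? $checkout->date_due : undef;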
To test:
1 - Add a course to course reserves
2 - Create an item with barcode TESTKOC
3 - Add the item to a course
4 - Checkout the item
5 - View course details on staff and OPAC and confirm the item shows as checked out and the due date displays
6 - prove t/db_dependent/Circulation/issue.t t/db_dependent/Circulation.t t/db_dependent/CourseReserves.t
7 - Browse to Circulation->Upload offline circulation
8 - Upload a file to return the item: https://wiki.koha-community.org/wiki/Koha_offline_circulation_file_format
9 - Confirm item is returned
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
It's possible that there could be 0 possible reserves, for example
when the hold has already been filled; the check would then fail, as
the item count can never be less than 0.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Before the changes from bug 31112, when CheckReserves returned a
non-priority hold we did not return the "on_reserve" status right away;
we additionally checked whether there were any priority holds, and only
if there were did we return the "on_reserve" error.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Previously we fetched them all in a single call using biblionumbers.
Fetching each individually could be a performance hit on systems
with large numbers of holds.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
We now count all holds for all patrons; we can still bail out early if we have more
holds than we do items.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If a single patron had more than one hold on a biblio and there was only one
available item, we incorrectly allowed renewing the checkout when
AllowRenewalIfOtherItemsAvailable was set to "Allow". This
changes CanBookBeRenewed so that it makes sure all the holds can be
filled, and not just one per patron.
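A hedged illustration of the stricter condition (not the literal patch;
$biblio and $available_items_count stand in for values CanBookBeRenewed
already has at hand):

    use Koha::Holds;
    # Count every unfilled hold on the biblio, across all patrons,
    # rather than at most one hold per patron.
    my $unfilled_holds = Koha::Holds->search(
        {
            biblionumber => $biblio->biblionumber,
            found        => undef,
        }
    )->count;
    # Only allow the renewal when the other available items can fill
    # every one of those holds.
    my $renewal_ok = $available_items_count >= $unfilled_holds;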
To test:
1) prove t/db_dependent/Circulation.t
2) (Optional, as unit test is provided)
- Set AllowRenewalIfOtherItemsAvailable = Allow
- Create biblio with three items
- Checkout one item to patron A
- Add two biblio-level holds for patron B
- Try to renew patron A's checkout with and without this patch.
- Notice that without this patch the renewal succeeds even though we
have one unfilled hold left. After applying the patch the renewal
should fail.
Signed-off-by: Sally <sally.healey@cheshiresharedservices.gov.uk>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
To test:
1) Please check manually that the logic stays the same; use git's -w command line parameter to
ignore whitespace changes in the diff output.
2) prove t/db_dependent/Circulation.t
Signed-off-by: Sally <sally.healey@cheshiresharedservices.gov.uk>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch handles the final removal of GetTransfers from
C4::Circulation.
Test plan
1/ Check that there is no mention of the GetTransfers method anywhere in the
codebase now
2/ Run the circulation and transfers tests and check nothing fails;
perhaps even run the full test suite in k-t-d
3/ Signoff
Rebased-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes the final use of GetTransfers from C4::Search.
Test plan
1/ Perform a search that will include results for some items that have
transfers of various states assigned to them
2/ Check the results match expectations (before and after applying the
patch should look the same)
3/ Signoff
Rebased-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes the GetTransfers call from
GetItemsAvailableToFillHoldRequestsForBib instead replacing it with an
inline JOIN in the initial query.
Test plan
1/ Run the holds queue
2/ Check the results
3/ Put one of the items in the holds queue into transit
4/ Run the holds queue again
5/ Check that the results do not contain the item that is in transit
6/ Apply the patch
7/ Run the holds queue again
8/ Check that the results still do not contain the item that is in
transit
Rebased-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Test Plan ( assuming Koha Testing Docker or kohadevbox ):
1) Check out master
2) Start the SIP server ( edit the SIP config koha institution to be
"CPL" instead of "kohalibrary" )
3) Telnet to 6001
4) Send 9300CNkoha|COkoha|CPCPL|
5) Send 11YY20220711 16350220250711 163502AOCPL|AA23529000035676|AB39999000001396|ACkoha|BON|BIN|
6) Note the due date for the checkout in Koha is not in the year 2025:
Henry Acevedo (23529000035676) checked out Philippics. by Cicero, Marcus Tullius. 39999000001396
7) Apply this patch set
8) Restart all the things!
9) Check in the checkout
10) Repeat steps 3 through 5
11) Note the due date is now in 2025!
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If a renewal via SIP would produce an error due to being on reserve, or
exceeding maximum renewals, Koha's SIP2 implementation will refuse to
renew the item even if the "no block" flag is set to Y.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes two noisy warnings from C4::Templates
0) Apply patch and koha-plack --restart kohadev
1) Go to http://localhost:8081/
2) Note no warnings in /var/log/koha/kohadev/plack-intranet-error.log
3) Go to http://localhost:8080/
4) Note no warnings in /var/log/koha/kohadev/plack-opac-error.log
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Fridolin Somers <fridolin.somers@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Looks like most of the existing code expects a comma as the default value.
This also impacts installer/data/mysql/mandatory/sysprefs.sql.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
System preference 'CSVdelimiter' has a special case for tabulation.
The preference value contains the string 'tabulation', but the string '\t' must be used in the CSV file.
This is handled correctly in many places, for example Bug 17590.
This patch adds C4::Context->csv_delimiter as a single method dealing
with this behavior.
It also creates Koha::Template::Plugin::Koha->CSVDelimiter for calls from
Template Toolkit templates.
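A minimal sketch of what such a helper can look like (the 'tabulation' => "\t"
mapping is the behaviour described above; the exact implementation in
C4::Context may differ):

    sub csv_delimiter {
        my ( $class, $value ) = @_;
        my $delimiter = $value // C4::Context->preference('CSVdelimiter') // ',';
        # The preference stores the word 'tabulation', but CSV output needs a real tab
        return $delimiter eq 'tabulation' ? "\t" : $delimiter;
    }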
Test plan :
1) Set system preference 'CSVdelimiter' = 'tabs'.
2) Create CSV export in impacted pages
3) Check columns are separated by the tab character and not the string 'tabulation'
4) Check with another delimiter
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
and split the subroutine into 2 smaller subroutines (one for ordering,
the other for receiving)
Test plan:
1. Create a vendor and an acquisition basket
2. In this basket, create new orders using all the different methods
(from an existing record, from a suggestion, from a new record, ...)
then close the basket and receive these orders.
Make sure it works the same with and without the patch
3. Run tests in t/Prices.t,
t/db_dependent/Acquisition/populate_order_with_prices.t, and
t/db_dependent/Budgets.t
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When creating authorities via SRU or linking existing authorities
(via cgi-bin/koha/authorities/auth_finder.pl) the search results are
generated using the hardcoded mappings $bib_heading_fields /
$auth_heading_field from C4::Heading::MARC21. For TOPIC_TERM / field 150
these mappings currently include the subfields `abvxyz68`.
But: We are using the GND provided by Deutsche Nationalbibliothek.
We imported some authorities from there, for example:
http://d-nb.info/gnd/4114171-4 "Kind <0-3 Jahre>" (kid 0-3 years)
http://d-nb.info/gnd/4196417-2 "Kind <0-4 Jahre>" (kid 0-4 years)
When searching for these terms, Koha only displays "Kind", which is not
very helpful, as there are a lot of different authorities for different
age bands.
GND stores "Kind" in 150a, and "0-3 Jahre" in 150g.
But in the hardcoded mappings used by Koha, subfield g is not included.
This patch adds subfield g to these mappings, thus making it possible to
easily select the correct authority.
Test plan:
* Create an authority (or edit an existing one) and set 150g to "foo"
* Create a new biblio (or edit an existing one), go to field 650 and click
on the search-icon on the right.
* A popup should open, where you can search for "Authority type: TOPIC_TERM",
enter the name of the authority (150a!) in the search box
* In the resulting list, you will only see the value of 150a.
* Apply the patch
* Search again, now you should see "foo" in the result list
Sponsored-by: Steiermärkische Landesbibliothek
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This was a test added by the following commit:
commit 99eccc18ed
Date: Thu Jun 16 10:10:09 2011 +0100
Bug 5549 : Handle datetimes on return
It's no longer needed now; we can pass a DateTime or an ISO-formatted
date.
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The idea is to rely on the KohaDates TT plugin for the date formatting. We
should not have any output_pref calls in pl or pm files (there are some
exceptions, for ILSDI for instance).
Also, flatpickr will deal with the places where dates are input. We
will pass the raw SQL value (what we call 'iso' in Koha::DateUtils), and
the controller will receive the same value, so there is no need for additional
conversion.
Note that DBIC has the capability to auto-deflate DateTime objects,
which makes things way easier. We can either pass the value we receive
from the controller, or pass a DT object to our methods.
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes the now unused _update_import_record_marc.
Additionally, as items are already present in the import biblio we no longer need to embed
them, so that parameter is removed from the sub, the POD, and everywhere
it was used.
In all cases we were embedding, so we don't need a way to get the record without items.
Tests updated.
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
We are stripping the MARC item tags at import - we save them when not importing items, but
strip them when importing items.
I think we can save on writes by leaving them in the record. This also allows comparing what was staged
versus the items created.
To test:
1 - Stage a marc record with items, but do not look for items
2 - Confirm the item tags remain in staged marc
3 - Import the record
4 - Confirm items are not in the imported marc record
5 - Stage the record again, but look for items
6 - Confirm the item tags are stripped from the imported record
7 - Import and confirm imported record has no item tags
8 - Apply patch and repeat 1-5
9 - Confirm item tags remain in record
10 - Import and confirm item tags not in imported marc
Signed-off-by: Andrew <andrewfh@dubcolib.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch restores the param, while still leaving the check against invalid
login credentials to ensure we don't leak information.
To test:
1 - enable EnableExpiredPasswordReset
2 - Edit a patron to set password to expire in the past
3 - Attempt opac login as patron
4 - It fails, but you are redirected to login screen with no info
5 - Apply patch
6 - Attempt login
7 - You are notified password expired and given reset link
8 - Go back to login screen
9 - Login with correct username, wrong password
10 - You are notified of incorrect credentials, not password expiration
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
AddBiblioToBatch and AddAuthToBatch were both passing a random number for the
parameter $update_counts.
This simply removes that value, as should have been done on bug 22532.
To test:
1 - Stage a marc file
2 - Confirm it works before and after the patch
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
We must not pass $dbh but retrieve it when needed instead
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
We don't need to build allowed_scripts_for_private_opac for staff
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Test plan:
You might want to add a simple SQL statement to your English
custom.sql in order to verify the execution.
Run a Koha install (in English).
Check if the data shows your custom sql action.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
In Koha, any report that uses C4::Reports::Guided will be limited to 999,999 rows. This is causing problems for larger libraries where some reports may have over a million results.
Test Plan:
1) Create a report "SELECT * FROM borrowers" and run it, note the number
of results
2) Apply this patch
3) Add the line `<report_results_limit>3</report_results_limit>`
within the <config> block of your koha-conf.xml
4) Restart all the things!
5) Run the report, download the results as a CSV
6) Note your CSV only has 4 lines, the header and 3 patrons
Signed-off-by: Rachael Laritz <rachael.laritz@inlibro.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Sponsored-by: Rijksmuseum, Netherlands
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Sponsored-by: Rijksmuseum, Netherlands
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Bug 28786 let librarians enable two-factor authentication but forced them to use
an application to generate the TOTP token.
This new enhancement adds the ability to send an email containing the token to the
patron once they are authenticated.
The new notice template has the code '2FA_OTP_TOKEN'
Test plan:
- Setup the two-factor authentication (you need the config entry and the
syspref ON)
- Enable it for your logged in patron
- Logout
- Login and notice the new link "Send the code by email"
- Click on it and confirm that you received an email with the code
- Use the code to be fully logged in
QA question: Is 400 the correct error code to indicate the email has not
been sent?
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Sponsored-by: Rijksmuseum, Netherlands
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
- add search form field for ISSN number
- add search logic including ISSN variations search
if SearchWithISSNVariations preference is set
Signed-off-by: KIT Library Germany <michaela.sieber@kit.edu>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Some libraries have certain item types that can only do in-house checkouts via SIP self-check machines. In these cases, the items should not be demagnetized, since the items cannot leave the library.
Test Plan:
1) Apply this patch
2) prove t/db_dependent/SIP/Message.t
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch updates the Local-Number indexing by adding a zeropad option
to Zebra indexing and adding it to the mapping files.
It also updates C4/Search.pm to allow biblionumber as a sort option.
To test:
1 - Apply patches
2 - copy etc/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl to /etc/koha/zebradb/marc_defs/marc21/biblios/biblio-zebra-indexdefs.xsl
3 - Restart all, reindex zebra
4 - Browse to: http://localhost:8081/cgi-bin/koha/catalogue/search.pl?idx=kw&q=a&sort_by=biblionumber_dsc&count=20
5 - Confirm records sorted correctly
6 - Browse to http://localhost:8081/cgi-bin/koha/catalogue/search.pl?idx=kw&q=a&sort_by=biblionumber_asc&count=20
7 - Confirm records sorted correctly
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This is really opinionated, but I found this to be much cleaner to read
and thought it was worth pushing as well.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When something is changed by a cronjob, and that entity is logged via action logs, we can know what changed and that the change was made via a cronjob, but we cannot necessarily know which cronjob made it. The closest we can come is to find the action logs for the cronjob module which ran before the change, which is by no means reliable, and assumes the CronLog is even enabled.
We should add a new column to action logs to store the name of the script that was run, for any action logs where the interface is "cron".
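A hedged sketch of how the new column could be populated, deriving the name
from $0 when the interface is 'cron' (the actual patch may wire this up
differently):

    use File::Basename qw( basename );
    use C4::Context;
    # Only record a script name for cronjob-originated actions
    my $script = ( C4::Context->interface // q{} ) eq 'cron' ? basename($0) : undef;
    # ... $script is then stored in the new action_logs.script column ...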
Test plan:
1) Apply this patch
2) Run updatedatabase.pl
3) Enable all the Log/Logging sysprefs
4) Run some cronjobs that will generate action logs
5) Note the new action_logs column "script" contains the filename of the
cronjob that caused the change.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes a duplicated stanza left from moving a routine.
It changes the routines to use inbound_library_address,
improves the display of the system preferences,
updates the update file,
and moves the sample notice.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Right now, if a library automatically cancels expired waiting holds, a
librarian must still re-check in an item to trap the next available hold
for that item. It would be better if the next hold was automatically
trapped and the librarians receive an email notification so they can
make any changes to the item if need be ( hold area, hold slip in item,
etc ).
Test Plan:
1) Apply this patch
2) Run updatedatabase.pl
3) Create a record with one item
4) Place two holds on that record
5) Check in the item and set it to waiting for the first patron
6) Set ReservesMaxPickUpDelay to 1
7) Enable ExpireReservesMaxPickUpDelay
8) Enable ExpireReservesAutoFill
9) Set an email address in ExpireReservesAutoFillEmail
10) Modify the holds waitingdate to be in the past
11) Run misc/cronjobs/holds/cancel_expired_holds.pl
12) Note the hold is now waiting for the next patron
13) Note a waiting hold notification email was sent to that patron
14) Note a hold changed notification email was sent to the library
Signed-off-by: Victoria Faafia <vfaafia29@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
There are 5 fields that are not set if no value is provided when saving/editing a rule in Administration->Circulation and fines rules
- issuelength
- hardduedate
- unseenrenewalsallowed
- rentaldiscount
- decreaseloanholds
This is problematic because it gives the impression these rules are set as blank, but in reality they don't exist and the rule will fall back to the higher level.
To test:
1 - Set a rule for
Patron category: Teacher
Itemtype: All
Hard due date: (Today)
Loan period: 10
2 - Set a rule for
Patron category: Teacher
Itemtype: Books
Hard due date: (leave blank)
Loan period: 10
3 - Expected behaviour is the Book item will check out to the teacher for 10 days; all other types will be due yesterday at 23:59:00
4 - Checkout a non-book item type to teacher
5 - Hard due date applies
6 - Checkout a 'book' item type to teacher
7 - Hard due date applies - FAIL
Signed-off-by: Caroline Cyr La Rose <caroline.cyr-la-rose@inlibro.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Test plan:
(A) Reproduce the bug:
A1 In Administration > System preferences > Web services:
- Enable OAI-PMH
- Enable OAI-PMH:AutoUpdateSets
- Enable OAI-PMH:AutoUpdateSetsEmbedItemData
A2 In Tools > Export data > Export bibliographic records, export 1 (one)
biblio record. Be sure that this biblio record has at least one item.
Don't tick "Don't export items".
A3 Delete the exported biblio record, and its items.
A4 In command line, on the server load the exported file:
./bulkmarcimport.pl -b -v -file /path/to/koha.mrc
A5 Retrieve the biblio record in Koha. Note the absence of the item(s). This is
the bug.
(B) Apply the patch
B1 Delete the record (without item) loaded at A4.
B2 In command line, on the server load the exported file:
./bulkmarcimport.pl -b -v -file /path/to/koha.mrc
B3 Retrieve the biblio record in Koha. Note the presence of the item(s).
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Silly mistake from bug 28786, the $type should be compared to "opac"
instead of "OPAC", erk!
Test plan:
Turn 2FA on
Set it up for an user
Login at the OPAC
=> Without this patch you keep being redirected to the auth form screen
=> With this patch applied you are able to successfully login
Signed-off-by: Caroline Cyr La Rose <caroline.cyr-la-rose@inlibro.com>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Can't use an undefined value as a HASH reference at /kohadevbox/koha/C4/Auth.pm line 985
C4::Auth::checkauth('CGI=HASH(0x5603b7dc4300)', 0, 'HASH(0x5603b2633238)', 'intranet', undef, 'intranet-main.tt') called at /kohadevbox/koha/C4/Auth.pm line 186
C4::Auth::get_template_and_user('HASH(0x5603b7b83d08)') called at /kohadevbox/koha/mainpage.pl line 40
Test plan:
Open a private window
Hit /cgi-bin/koha/mainpage.pl?logout.x=1
Signed-off-by: Sally <sally.healey@cheshiresharedservices.gov.uk>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If the rule renewalperiod is the empty string, instead of being null/undefined or non-existent, Koha will interpret the renewal period as being zero days instead of falling back to the issuelength rule.
It makes sense to me that a literal 0 here should make it renew for zero days, even though that is nonsensical.
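A hedged sketch of the intended fallback (rule values assumed to live in
$rule; not the literal patch):

    my $renewalperiod = $rule->{renewalperiod};
    my $loan_length =
        ( defined $renewalperiod && $renewalperiod ne q{} )
        ? $renewalperiod            # a literal 0 still means "renew for zero days"
        : $rule->{issuelength};     # empty string / missing value falls back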
Test Plan:
1) Delete all your rules
2) Create an all/all/all rule with an empty string for the renewal base
period
3) Note that renewing an item does nothing
4) Apply this patch
5) Restart all the things!
6) Renew again
7) Note the renewal uses the issuelength rule as intended
Signed-off-by: Sally <sally.healey@cheshiresharedservices.gov.uk>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
# Failed test 'No tests run for subtest "CancelHold"'
# at t/db_dependent/ILSDI_Services.t line 806.
Undefined subroutine &C4::ILSDI::Services::CanReserveBeCanceledFromOpac called at /kohadevbox/koha/C4/ILSDI/Services.pm line 941.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Not sure about the warn, we shouldn't need it as we are raising an
exception. But better (for now) than introducing regressions.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
In detail.pl we must provide a degraded view with an error message about
an invalid MARC::Record.
We are then forced to reproduce the GetMarcBiblio behaviour and call
StripNonXmlChars on the MARCXML.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Rebased-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Includes:
Bug 29697: (follow-up) Use flag embed_items
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
JD Amended patch:
-# FIXME Special case here
- print "Biblio not found\n,";
+ print "Biblio not found\n";
- my $biblio = Koha::Biblio->find($hostbiblionumber);
+ my $biblio = Koha::Biblios->find($hostbiblionumber);
Rebased-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
It may be helpful to know exactly what number was used for the sms alert
that was sent. As such, we should ensure it's set at the time of
sending.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch removes the fallback handling for smsalertnumber as the
to_address in notices. We ignore the to_address field in the message
queue at send time for SMS anyway and use smsalertnumber exclusively, so
having this field populated is just confusing to the end user.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If the value of a SIP field is "0", that evaluates to false, so any calls to maybe_add with a value of "0" will not get added to the SIP response message.
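A hedged sketch of the distinction (the real maybe_add lives in C4::SIP and
its internals differ; this only illustrates the truthiness problem):

    sub maybe_add {
        my ( $fid, $value ) = @_;
        # Before: 'return q{} unless $value;' silently drops a legitimate "0"
        # After: only skip values that are undefined or empty
        return q{} unless defined $value && $value ne q{};
        return $fid . $value . '|';
    }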
Test Plan:
1) Apply this patch
2) prove t/db_dependent/SIP/Message.t
Signed-off-by: Michal Urban <michalurban177@gmail.com>
JK: Adjust commit title
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The header rows still showed \t because the newly defined
variable wasn't used there.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch corrects the export of the 2 other reports
using CSV profiles:
* Late issues (serials)
* Basket (acquisitions)
To test:
1) Late issues
* Update the late issues sample report to use tab as separator
* Create a subscription
* Go to serial collection and 'generate next' to get some late issues
* Go to Claims
* Export the late issues and verify format is correct
* Verify exported file has tabs
2) Basket summary
* Create an order with several order lines
* Create an SQL type CSV profile for basket export using tab as separator
Example: aqorders.quantity|aqorders.listprice|Title=biblio.title
* Export the basket using your configured CSV profile
* Verify exported file has tabs
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
To test:
1. Have a vendor setup
2. Go to serials and add a new serial w/ that vendor.
3. When creating a serial make this first issues sometime in the past.
4. Go to Claims, choose your vendor and load the table.
5. No published on column.
6. Apply patch
7. Try step 4 again and now you should see a published on column.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
C4::Items::GetAnalyticsCount is part of the easy analytics feature.
Like Bug 20702, return early when the system preference EasyAnalyticalRecords is disabled.
Currently it may block an item deletion for the wrong reason.
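A hedged sketch of the early return (the body of GetAnalyticsCount is only
outlined here):

    sub GetAnalyticsCount {
        my ($itemnumber) = @_;
        # Easy analytics disabled: nothing can be linked, so don't block deletion
        return 0 unless C4::Context->preference('EasyAnalyticalRecords');
        # ... existing search for records linked via 773$9 follows ...
    }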
Test plan :
1) Don't apply the patch
2) Build an item and a linked analytical record with 773$0 and $9
3) Enable EasyAnalyticalRecords
4) Try to delete the item
5) You have an alert because linked to analytics
6) Disable EasyAnalyticalRecords
7) Try to delete the item
8) You have an alert because linked to analytics
9) Apply patch
10) Try to delete the item
11) No alert, it works :D
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch adds "complete field" to the authority "starts with"
search so that it uses the untokenized "p" register.
Test plan:
1. Apply the patch
2. koha-plack --restart kohadev
3. Go to http://localhost:8081/cgi-bin/koha/authorities/authorities-home.pl
4. Type in "Espen" into the search box and hit "Submit"
5. Note that there are 3 results
6. Change "contains" to "starts with" and hit "Submit"
7. Note that no results are returned
8. Change the search from "Espen" to "Sandberg" and hit "Submit"
9. Note that 3 results are returned
10. Experiment to your heart's content and rejoice at your new found power
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When placing holds via SIP2, there is no holdability check. This seems very incorrect.
Test Plan:
1) Apply this patch
2) prove -r t/db_dependent/SIP
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch records the bundle issue from which an item is marked as lost
so that we may use that to infer who lost the item (for later charges
and display).
Test plan
0) Apply all patches up to this point
1) Checkout a bundle to a user
2) Checkin the bundle and do not scan one of the barcodes at
confirmation
* Note that the item not scanned is marked as lost
3) Navigate to the biblio for the lost item and note that it is marked
as lost.
4) Navigate to the biblio for the collection and expand the collection
item that contains the lost item. Note the item is marked as lost and
checkout details are listed.
5) Checkin the lost item
* The item should be marked as found and the return_claims line should
be marked as resolved.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch updates the circulation system to account for bundle
checkins. We add a content verification step to ensure bundle content is
all present at checkin and we use this comparison to mark missing items
as lost.
Test plan
0) Apply patches up to this point
1) Checkin an item that belongs to a bundle
* An alert should be triggered noting that the item belongs to a
bundle
* The option to remove the item from the bundle should be clear
* Clicking remove should result in the alert disappearing and the item
having been removed from the bundle.
2) Checkin an item bundle
* A modal confirmation dialog should appear requesting each item
barcode be scanned
* As items are scanned they should be highlighted in yellow in the
bundle content table
* Upon submission:
* The user will be alerted to any unexpected items that were
scanned and told to put them to one side.
* The user will be alerted that any missing items in the validation
will have been marked as lost.
* The bundle item will be marked as checked in.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
MARC::Record and MARC::File::* modules sometimes use the position 09 of
the leader to detect encoding. A blank character means 'MARC-8' while an
'a' means 'UTF-8'.
In a UNIMARC authority this position is used to store the authority type
(see https://www.transition-bibliographique.fr/wp-content/uploads/2021/02/AIntroLabel-2004.pdf [FR]).
In this case, 'a' means 'Personal Name'.
The result is that the import will succeed for a Personal Name
authority, but it will fail for all other authority types.
Steps to reproduce:
0. Be sure to have a Koha UNIMARC instance.
1. Download the MARCXML for "Honoré de Balzac"
curl -o balzac.marcxml https://www.idref.fr/02670305X.xml
2. Verify that it's encoded in UTF-8
file balzac.marcxml
(should output "balzac.marcxml: XML 1.0 document, UTF-8 Unicode
text")
3. Go to Tools » Stage MARC for import and import balzac.marcxml with
the following settings:
Record type: Authority
Character encoding: UTF-8
Format: MARCXML
Do not touch the other settings
4. Once imported, go to the staged MARC management tool and find your
batch. Click on the authority title "Balzac Honoré de 1799-1850" to
show the MARC inside a modal window. There should be no encoding
issue.
5. Write down the imported record id (the number in column '#') and go
to the MARC authority editor. Replace all URL parameters by
'breedingid=THE_ID_YOU_WROTE_DOWN'
The URL should look like this:
/cgi-bin/koha/authorities/authorities.pl?breedingid=198
You should see no encoding issues. Do not save the record.
6. Import the batch into the catalog. Verify that the authority record
has no encoding issue.
7. Now download the MARCXML for "Athènes (Grèce)"
curl -o athènes.marcxml https://www.idref.fr/027290530.xml
8. Repeat steps 2 to 6 using athènes.marcxml file. At steps 4 and 5 you
should see encoding issues and that the position 9 of the leader was
rewritten from 'c' to 'a'. Strangely, importing this batch fixes the
encoding issue, but we still lose the information in position 09 of
the leader.
This patch makes use of the MARCXML representation of the record instead
of the ISO2709 representation because, unlike
MARC::Record::new_from_usmarc, MARC::Record::new_from_xml allows us to
pass the encoding and the format directly, which prevents the data from being
double encoded when position 09 of the leader is different from 'a'.
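A hedged sketch of the difference (MARC::File::XML provides new_from_xml; the
surrounding import code is not shown):

    use MARC::Record;
    use MARC::File::XML ( BinaryEncoding => 'UTF-8' );
    # new_from_usmarc() guesses the encoding from leader position 09, which
    # UNIMARC authorities use for the authority type instead:
    # my $record = MARC::Record->new_from_usmarc($iso2709_blob);
    # new_from_xml() lets us state the encoding and the format explicitly:
    my $record = MARC::Record->new_from_xml( $marcxml, 'UTF-8', 'UNIMARCAUTH' );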
Test plan:
- Follow the "steps to reproduce" above and verify that you have no
encoding issues.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Koha spends an incredible amount of time on parsing and processing parameters
passed in to slips and notices. It would be immensely more efficient to be able
to pass objects directly to GetPreparedLetter so it doesn't need to do any
fetching / processing on them.
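A hedged sketch of the idea (the 'objects' parameter name and its keys are
assumptions here, not a confirmed API; the 'tables' style is the pre-existing
usage):

    my $letter = C4::Letters::GetPreparedLetter(
        module      => 'circulation',
        letter_code => 'CHECKOUT',
        # Before: pass ids and let GetPreparedLetter re-fetch the rows
        # tables  => { borrowers => $patron->borrowernumber, items => $item->itemnumber },
        # After: hand over the already-fetched objects so nothing is re-fetched
        objects     => { patron => $patron, item => $item },
    );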
Test plan:
1) Apply this patch
2) prove t/db_dependent/Letters/TemplateToolkit.t
Signed-off-by: Josef Moravec <josef.moravec@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Can't locate object method "subclasses" via package "C4::ClassSplitRoutine" at /kohadevbox/koha/C4/ClassSplitRoutine.pm line 53
Certainly from bug 17600.
Test plan:
Home -> Administration -> Classification sources -> New splitting rule
And create classification sources and filing rules.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
If the letter has been removed, fall back to itemnumber/due date. (Title is
no longer fetched.) We may assume that the notice is present.
Note: The option to 'protect' a notice may need some more thought. Perhaps
it needs to be an attribute of its own.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested by deleting notice, running fines again.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
It would be great if we could customize what information was added to the "Description of charges" field when a fine was made so data could be stored even when the item is deleted.
Test Plan:
1) Create an overdue checkout that will get a fine
2) Run fines.pl
3) Note the description for the fine
4) Delete the fine from the database
5) Apply this patch
6) Run updatedatabase.pl
7) Restart all the things!
8) Run fines.pl
9) Note the description of the fine is unchanged
10) Delete the fine again
11) Browse to Slips & Notices
12) Edit the new notice OVERDUE_FINE_DESC
You will have access to the objects checkout, item, and borrower
13) Run fines.pl
14) Note your new description was used
Signed-off-by: Christopher Brannon <cbrannon@cdalibrary.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch fixes some unit tests by ensuring we set a valid userid for
the mock userenv setting so that the foreign key constraint doesn't fail, and
it also removes the exception class and the check for renewer_id from the
store method as, for example with autorenewals, the renewal may not have
been triggered by an actual user.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch performs the following column renames:
* id => renewal_id
* issue_id => checkout_id
The idea is that no translation is needed for the API, and also, being a
new table, we can educate the users into the 'to be' terminology we are
leaning towards, instead of having them learn one naming to create
reports and then need to translate them once we normalize things in the
future.
That said, this is simple to review.
Apply this patch and repeat the test plan.
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Rename the issues.renewals field to renewals_count to prevent a method
name collision with the new relation accessor introduced by this
patchset.
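A hedged illustration of why the rename matters ($checkout is assumed to be a
Koha::Checkout; the relation accessor is the one this patch set introduces):

    my $count    = $checkout->renewals_count;   # the plain column, renamed
    my $renewals = $checkout->renewals;         # the relation accessor is now free
                                                # to return the renewal objects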
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The C4::Suggestions::SearchSuggestion subroutine is badly written and
can be replaced by calls to Koha::Suggestions->search.
The hard part in this patch is suggestion.pl; the other occurrences have
been replaced easily.
Test plan:
The idea is to test the whole suggestion workflow.
1. Create a suggestion on OPAC
2. Create a suggestion on the staff interface
3. Edit suggestions
4. Filter suggestions (use the different filters and "organize by"
values)
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: Remove SearchSuggestion tests
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: (QA follow-up) Save some DB queries
This patch makes the suggestion-related pages rely on array size instead
of querying the DB each time they need to. In the case of
suggestion/suggestion.pl it goes from 4 COUNT(*) to 1.
To test, with KTD:
1. Run on the host machine:
$ docker exec -ti koha_db_1 bash
$ mysql -ppassword
> SET GLOBAL general_log_file='/var/log/mysql/mycustom.log';
> SET GLOBAL log_output = 'FILE';
> SET GLOBAL general_log = 'ON';
> \q
$ tail -f /var/log/mysql/mycustom.log | grep suggestions
2. Visit the different pages changed on this bug
=> SUCCESS: Some queries
3. Apply this patch
4. Repeat 2
=> SUCCESS: Less queries!
5. Sign off :-D
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: Fix branchcode and budgetid filtering
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: Fix conflict with bug 28941
Well, this patchset fixed the security bug...
Redoing on top of bug 28941
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: (follow-up) Missing semicolon
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: Fix 'all' libraries
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Bug 23991: (follow-up) Add value to filter_archived
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Corrected the variable name on update to match everywhere else.
Added a default value for limit in buildQuery and only append the limit if it has content.
Signed-off-by: Katrin Fischer <katrin.fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
It no longer exists.
Also fix a spelling (emtpy ==> empty)
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When a patron enters an ISBN/ISSN when suggesting a new purchase, the
ISBN is used to find duplicates. Title/Author are ignored (as patrons
might misspell them, and the ISBN provides a better way to find
duplicates)
Test Plan:
* in the OPAC, go to /cgi-bin/koha/opac-suggestions.pl
* Click "new purchase suggestion"
* Enter any title
* Enter an ISBN that exists in your library
* Click "Submit your suggestion"
-> A new purchase suggestion has been submitted
* Apply the patch!
* Again start a new purchase suggestion, enter any title and the
duplicate ISBN, and Submit
* You should see the note "A similar document already exists: ..."
Please note that the title should not be required when entering an ISBN,
but as the list of required fields is managed via OPACSuggestionMandatoryFields
I fear that it's rather complicated to make title an optional required
field if isbn is provided.
I also added a simple non-DB test case to t/db_dependent/Suggestions.t
(in a subtest at the end), but could not actually run it as I haven't
gotten around to setting up / trying a Koha dev testing environment...
Sponsored-by: Steiermärkische Landesbibliothek
Signed-off-by: Paul Derscheid <paul.derscheid@lmscloud.de>
Signed-off-by: Jonathan Druart <jonathan.druart@bugs.koha-community.org>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This patch fixes the broken commit_file.pl script.
It doesn't deal with committing the import from the UI.
To test:
1. Pick a file for staging:
$ kshell
k$ misc/stage_file.pl --file TestDataImportKoha.mrc
=> SUCCESS: All good
2. Commit!
k$ misc/commit_file.pl --batch-number 1
=> FAIL: You see
DBIx::Class::Storage::DBI::_exec_txn_begin(): DBI Exception: DBD::mysql::db begin_work failed: Already in a transaction at /kohadevbox/koha/C4/Biblio.pm line 303
3. Apply this patch
4. Repeat 2
=> SUCCESS: Commit succeeds
5. Sign off :-D
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Bug 29325: (QA follow-up) Remove unexisting parameters of BatchRevertRecords
There is no interval and callback as in BatchCommitRecords.
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Bug 29325: Call progress callback one last time to confirm comppletion
Previously, after finishing the loop we were still in a transaction that never completed - we should report progress when done,
one final time, to commit the last records.
To test:
1 - Stage a file with > 100 records
2 - Commit file
3 - Confirm batch is imported and no records left as staged
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Bug 29325: Fix import from staff client
same test as before, but via the staff client
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Bug 29325: Handle the transaction in BatchCommitRecords
Requiring the callback to commit was breaking reversion, and likely elsewhere.
Let's simplify and say that the routine itself will handle the txn and commit.
To test:
1 - Stage a file
2 - Import a file
3 - Revert a file
4 - Test staff client and command line
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The Swedish Libris ILL backend lets librarians store a specific due date
when an ILL loan is received.
This patch set adds a new date_due column to the illrequests table that can be
used by the different backends to store a due date. If an illrequest has the
date due set, it will be used when the item is checked out instead of the calculation
using the circulation conditions.
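A hedged sketch of the checkout-time behaviour (lookup and argument order are
illustrative, not the literal patch; CalcDateDue and the checkout variables are
assumed to be in scope):

    use Koha::Illrequests;
    use Koha::DateUtils qw( dt_from_string );
    my $ill_request =
        Koha::Illrequests->search( { biblio_id => $item->biblionumber } )->next;
    my $datedue =
        ( $ill_request && $ill_request->date_due )
        ? dt_from_string( $ill_request->date_due )    # backend-provided due date wins
        : CalcDateDue( $issuedate, $itemtype, $branchcode, $borrower );    # otherwise circ rules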
To test:
- Apply the patch and make sure the atomic database update is run
- Use the FreeForm backend to add one ILL request. Take note of the
illrequest_id of the request you created. We refer to this as
"x" below.
- Connect a biblio (with biblionumber y), that has an item with a
barcode, to the ILL request directly in the database:
UPDATE illrequests SET biblio_id = y WHERE illrequest_id = x;
- Next we set the due date, this would normally be done by or from the backend.
UPDATE illrequests SET date_due = "2023-01-01" WHERE illrequest_id = x;
- Go to circulation and issue the barcode of the item to the
patron associated with the FreeForm ILL request. Verify that the
loan gets a due date of 2023-01-01.
- Ideally: return the item and issue it again through SIP2 and SCO,
and verify that the due date is still 2023-01-01.
- Verify that there are no regressions, so that regular calculation
of due dates still work.
- prove t/db_dependent/Circulation.t
(Patch description, test plan and partial code credits to Magnus Enger)
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
(Patch description and test plan rewritten to reflect changes in development)
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
CalcDateDue() works on its own copy of the $startdate parameter
so the cloning on the calling end is not necessary.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
1) This removes support for passing string dates to CanBookBeIssued. The
function didn't publicly even document support for string dates, only
DateTime objects.
2) We always get a $duedate, at least from CalcDateDue, so having
$issuingimpossible{INVALID_DATE} = output_pref($duedate);
was unnecessary and thus removed.
3) The check "duedate cannot be before now" was needlessly complex: if
the due date really cannot be before now we should check seconds too
and warn the librarian! Thus the truncation to minutes can be dropped
safely.
To test:
1) prove t/db_dependent/Circulation.t
2) prove t/db_dependent/Illrequests.t
3) Enable OnSiteCheckouts and disable SpecifyDueDate syspref. Create
on-site checkout for any patron and verify the due date is your
current date at 23:59, you can check the exact minute with sql:
> select * from issues
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
BatchCommitItems is only being used within this module and isn't
mentioned in EXPORT_OK. This patch simply renames it to
_batchCommitItems to follow the _ convention for private functions and also
adds a little hint to the POD of the function to clarify that the caller
must trigger a re-index.
JK: Amended patch to also rename the function in t/db_dependent/ImportBatch.t
and fix typo "commiting" => "committing"
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
When committing staged marc imports to the catalogue we will often be
importing a batch of records. We don't want to send one index request
per biblio affected, we want to index them all after the records have
been modified otherwise we will end up with multiple tasks per record
(when items are also affected).
Test plan:
1) Use the stage marc record tool to stage and commit a set of records and
confirm the behaviour remains correct.
2) If using Elastic, check that only one indexing job is queued to take
place resulting from the committed import.
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
The barcode was trimmed of leading/trailing whitespace in many places
before the barcodedecode sub was called. This patch instead makes the
barcodedecode sub trim it itself and removes the unnecessary,
repetitive code that was used before barcodedecode was called.
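A hedged sketch of the centralised trimming (the rest of barcodedecode's
filter handling is only hinted at):

    sub barcodedecode {
        my ( $barcode, $filter ) = @_;
        $barcode //= q{};
        # Strip leading/trailing whitespace once, here, instead of at every call site
        $barcode =~ s/^\s+|\s+$//g;
        # ... existing itemBarcodeInputFilter handling follows ...
        return $barcode;
    }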
Steps to test:
1. Edit an item with any barcode, add a bunch of whitespace at the start
and at the end of it. Save the item. Ensure that this action ruins
the barcode and ensure that the spaces are still there by editing the
same item again.
2. Apply the patch.
3. Edit the same item again in the same fashion. Ensure that now all
whitespaces are getting trimmed and it doesn't affect the barcode in
any negative way.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
This makes two simple changes:
- Limit TransformMarcToKoha to the fields we need
- Pass forward the biblioitemnumber when adding items to a new biblionumber
Profiling with NYTProf, I saved ~8-9 seconds importing around 400 bibs/1000 items.
Reducing calls in the item store by using a passed biblionumber was the largest gain.
To test:
1 - Import some records and items
2 - Verify values etc., revert
3 - Apply patch
4 - Import again
5 - Verify values etc. same as before
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Joonas Kylmälä <joonas.kylmala@iki.fi>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>