Like Bug 19372, selecting a MARC framework currently doesn't work when adding to a basket from an external source.
Strangely, I can reproduce it on koha-testing-docker, but we have this issue with an Ubuntu Focal install.
It looks like it comes from bad syntax that needs to be replaced in any case.
Test plan:
1) Add an order to a basket from an external source
2) On the search results view, select a framework other than the default one
before clicking 'Add order' on the chosen search result line.
3) Check that the framework code you picked is used in the created bibliographic record
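As a quick check (a sketch assuming the default kohadev instance; adjust the instance name as needed), the framework of the newest biblio can be inspected directly:
   echo "SELECT biblionumber, frameworkcode FROM biblio ORDER BY biblionumber DESC LIMIT 1" | sudo koha-mysql kohadev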
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Victor Grousset/tuxayo <victor@tuxayo.net>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
ILS-DI GetRecords generates bad encoding of MARCXML for UNIMARC, like OAI in Bug 34467
Enable ILS-DI and display a record with:
<opac url>/cgi-bin/koha/ilsdi.pl?service=GetRecords&id=<biblionumber>
Well-known issue, fixed
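As an additional check (a sketch assuming curl and xmllint are available), the response can be verified to be well-formed, correctly encoded XML with:
   curl -s '<opac url>/cgi-bin/koha/ilsdi.pl?service=GetRecords&id=<biblionumber>' | xmllint --noout -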
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
holds_table.inc depends on form-submit.js, but importing the asset
directly breaks the script. Add a comment stating that any script that
imports holds_table.inc must import form-submit.js too.
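As a hypothetical check (not part of the patch), templates that include holds_table.inc without also loading form-submit.js could be listed with:
   git grep -l 'holds_table.inc' koha-tmpl | xargs grep -L 'form-submit.js'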
Also change uri filters to html
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Update the revert hold button to use the new include for submitting
forms from link data with a POST request
To test:
1. Place a hold on a biblio record
2. Check an item in to fill the hold
3. On the holds tab for the biblio record, click the "Revert waiting
status" button for that hold
--> The page reloads but the hold is still waiting
4. Apply patchset
5. Navigate to another page and then return to the holds tab (we don't want
to refresh the page and resend the request)
6. Click the "Revert waiting status" button for that hold
--> The hold should be reverted to pending status
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Having double dashes inside a commented block is not valid XML. This
patch restores it, with an added message explaining it.
To test:
1. Run:
$ xmllint etc/z3950/config.xml
=> FAIL: You get:
etc/z3950/config.xml:5: parser error : Double hyphen within comment: <!--
<config>
<z3950_responder_options>
<z3950_responder_options>--add-item-status k -t 5</z3950_responder_options
2. Apply this patch
3. Repeat 1
=> SUCCESS: All good!
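The underlying XML rule can be demonstrated on its own (illustration only, not part of the patch):
$ echo '<x><!-- --add-item-status is not allowed inside a comment --></x>' | xmllint --noout -
=> fails with the same 'Double hyphen within comment' parser error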
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds the <config> node that the z3950 responder starter script looks for to the example configuration in etc/z3950/config.xml.
To test:
- verify that the <config> </config> is around the commented z3950_additional_options suggestion in the etc/z3950/config.xml file
- copy the config stanza to the live file: /etc/koha/sites/kohadev/z3950/config.xml
- restart_all
- ps aux | grep z3950
- confirm the script has restarted
- confirm the options: --add-item-status k -t 5 have been passed through
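One way to confirm the last point (assuming the default kohadev setup) is to check the responder's command line directly:
   ps aux | grep '[z]3950' | grep -- '--add-item-status'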
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
1) Visit <staff_url>/cgi-bin/koha/tools/holidays.pl
2) Select "Holiday only on this day" (or whatever other option to add an holiday). Click 'Save.
3) Now click on that day in the calendar, pick "delete this holiday". Click 'Save'. Notice the holiday is still there.
4) Apply patch. Repeat. Notice its deleted.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch resolves the undefined bookings_table variable issue reported
and also further fixes a subsequent undefined timeline bug that was
exposed by the original fix.
Signed-off-by: Esther Melander <esther@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new 'bookings' tab to the bottom of the members
details pages. When a patron has any future or current bookings against
their record the tab will display the number of bookings in the tab name
and on clicking the tab a bookings table will display the current and
upcoming bookings.
Signed-off-by: Esther Melander <esther@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Also wraps statuses in <b> instead of using the SWITCH statement,
which should hopefully make translation a little nicer.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
A follow-up changed orders->cancel a bit. This test assumed it could
cancel completed lines here, so we now do that with a direct update
to keep the original result.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
Based on the described criteria, check a few biblio records.
Look at Acquisitions tab on the intranet detail page.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This allows you to see quickly whether a biblio has linked orders or not,
and whether they are all cancelled, some are still in processing, or some
are complete (and the rest cancelled).
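For example (a sketch using a direct query against the default kohadev instance), the mix of order statuses behind the new display can be checked with:
   echo "SELECT ordernumber, orderstatus FROM aqorders WHERE biblionumber = <biblionumber>" | sudo koha-mysql kohadev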
Test plan:
Run t/db_dependent/Koha/Biblio.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new heading field containing the display form of the authority record
NOTE: If trying to save an authority in the 'DEFAULT' framework, you will get an error. You should not be
using DEFAULT for authorities, and we should remove it from the list in another bug.
To test:
1 - Apply patches
2 - Update database
3 - Restart all
4 - Create a new authority, save.
5 - Do this for various types
6 - View the db records:
SELECT * FROM auth_header\G
7 - Note new heading field is populated correctly
8 - Edit your new authorities
9 - Confirm the heading field is updated correctly
10 - Import some authorities and confirm heading generated correctly
11 - Import auth via Z39.50 and confirm heading generated correctly
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds the option to specify a framework to be used when overlaying records from webservices/connexion
To test:
1 - vim /etc/koha/sites/kohadev/connexion.cnf
2 - Set content:
host:
port: 8888
koha:http://localhost:8081
log:/var/log/koha/kohadev/connexion.log
match:ISBN
user:kohauser
password:kohapass
overlay_action:replace
nomatch_action:create_new
item_action:always_add
import_mode:direct
framework:BKS
overlay_framework:
debug:1
3 - Save the sample file from this bug into your kohaclone (or copy and paste into a file your koha test site can reach)
4 - On the command line:
perl misc/bin/connexion_import_daemon.pl -c /etc/koha/sites/kohadev/connexion.cnf
5 - In another terminal:
cat bug_33418.test | nc -v localhost 8888
6 - It should report success and a biblionumber
7 - In Koha: Cataloguing->Manage staged MARC record
8 - View the most recent batch with file name (webservice)
9 - Confirm it was imported, no match
10 - Click 'View' under the 'Record' column
11 - Confirm record loads correctly and 'MARC framework' detail is 'Books, Booklets, Workbooks'
12 - On the terminal repeat the command:
cat bug_33418.test | nc -v localhost 8888
13 - It should succeed
14 - View the new batch, confirm the record matched this time
15 - View the record details, confirm framework is now 'default'
16 - On the first terminal, hit Ctrl+C to stop the daemon, then edit connexion.cnf and change:
import_mode:stage
framework:ACQ
overlay_framework:IR
17 - Restart daemon:
perl misc/bin/connexion_import_daemon.pl -c /etc/koha/sites/kohadev/connexion.cnf
18 - Delete the record created above
19 - On the second terminal repeat the command:
cat bug_33418.test | nc -v localhost 8888
20 - Confirm the batch is created, but not imported
21 - In terminal:
perl misc/cronjobs/import_webservice_batch.pl --framework=ACQ --overlay_framework=BKS
22 - Confirm batch imported, and record in ACQ framework
23 - In terminal:
cat bug_33418.test | nc -v localhost 8888
perl misc/cronjobs/import_webservice_batch.pl --framework=ACQ --overlay_framework=BKS
24 - Confirm batch added, record matched, record imported, and record now in Books framework
25 - Stop the daemon, edit connexion.cnf:
import_mode:direct
26 - Start the daemon, and on other terminal repeat:
cat bug_33418.test | nc -v localhost 8888
27 - Confirm record in Binders framework
28 - Set record framework to Books
29 - Stop daemon, edit cnf and remove 'overlay_framework' setting
30 - Start daemon and cat the file again
31 - Confirm the record remains in Books framework
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Matt Blenkinsop <matt.blenkinsop@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
One of the embeds refers to suggestions. This table also has
a quantity column.
Test plan:
Verify that the Pending orders table now loads instead of
triggering an internal server error on an ambiguous column.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Now that filter_by_active checks quantities, we should make sure that
our test data is correct, not leaving it to random values from
TestBuilder.
Test plan:
Run t/db_dependent/Koha/Acquisition/Order.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
An active order that has no items to receive any more is actually
not very active anymore :)
Test plan:
Run t/db_dependent/Koha/Acquisition/Orders.t
Enable OPACAcquisitionDetails.
Add order on basket for new biblio. Check that opac-detail does not
yet count the item as on order (since status is still new).
Close basket. Check opac-detail again. Should count it now.
Reopen basket. Cancel line. Check opac-detail again.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Active orders are more than just not cancelled. See filter_by_active
in Koha::Acquisition::Orders. They are still in the acq process; we
still need to receive items on those orders.
When we cancel and want to delete a biblio, we should check for orders
that are not cancelled (a broader set than active orders as in Orders.pm).
Test plan:
Git grep active_orders.
Run t/db_dependent/Koha/Biblio.t
Run t/db_dependent/Koha/Acquisition/Order.t
Add two order lines in a basket referring to the same biblio. Try to
cancel a line with 'delete biblio'. This should not be possible.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
'Current' is a bit misleading here: it means the orders that are not
cancelled.
We already use the filter_out_* naming scheme, so in harmony with
that, we switch it to filter_out_cancelled.
Test plan:
Run t/db_dependent/Koha/Acquisition/Order.t
Run git grep -l '::Orders' | xargs grep filter_by_current
* All occurrences should be related to Recalls, not Orders.
Go to acqui/basket.pl. Delete a whole basket that has some active
and some cancelled lines.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Because of the way Koha::Logger has been used to log to different categories based on the interface and caller, it can be extremely hard to log all of a particular log statement to one place.
For custom report runs, the category is plack-intranet.C4::Reports::Guided when run from the web interface, cron.C4::Reports::Guided when run from runreport.pl, and plack-intranet.C4::Auth when run from svc/report.
We should add a more standardized report run log, both with and without the full query, so that administrators can log all report runs to a centralized location. If an administrator were to need the "point of entry" for reports, it is easy to include via parameters in PatternLayout.
Test Plan:
1) Apply this patch
2) Modify your log4perl file, add the following:
log4perl.logger.reports.execute.time = INFO, REPORTTIME
log4perl.appender.REPORTTIME=Log::Log4perl::Appender::File
log4perl.appender.REPORTTIME.filename=/tmp/report-time.log
log4perl.appender.REPORTTIME.mode=append
log4perl.appender.REPORTTIME.layout=PatternLayout
log4perl.appender.REPORTTIME.layout.ConversionPattern=[%d] [%p] [%P] %m%n
log4perl.appender.REPORTTIME.utf8=1
log4perl.logger.reports.execute.query = INFO, REPORTQUERY
log4perl.appender.REPORTQUERY=Log::Log4perl::Appender::File
log4perl.appender.REPORTQUERY.filename=/tmp/report-query.log
log4perl.appender.REPORTQUERY.mode=append
log4perl.appender.REPORTQUERY.layout=PatternLayout
log4perl.appender.REPORTQUERY.layout.ConversionPattern=[%d] [%p] [%P] %m%n
log4perl.appender.REPORTQUERY.utf8=1
3) Restart all the things!
4) Run a report somehow:
CLI: ./misc/cronjobs/runreport.pl 1
API: /cgi-bin/koha/svc/report?id=1
Web: /cgi-bin/koha/reports/guided_reports.pl?reports=1&phase=Run this report
5) Note the report runs are logged to /tmp/report-time.log and /tmp/report-query.log
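For instance, both log files can be watched while running the reports:
   tail -f /tmp/report-time.log /tmp/report-query.log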
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
The system preference RoutingSerials controls if the routing list
feature is available in the staff interface or not. If routing lists
are deactivated, the search option should not show.
Also updated the label to read: "Has routing list"
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Adds a search option to the advanced search in the
serials module that allows limiting the search to subscriptions
with routing lists.
Test plan:
1. Apply this patch
2. Create two subscriptions, one with a routing list and one without
3. Navigate to Serials home and tick the checkbox labeled "Search routing lists only:"
4. Confirm that the only search result to appear is the subscription you added the routing list to
5. Run unit tests: prove t/db_dependent/Serials.t
Sponsored by: Bibliotheksservice-Zentrum Baden-Wuerttemberg
Signed-off-by: Christian Stelzenmüller <christian.stelzenmueller@bsz-bw.de>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Perltidied changes to make QA test tools pass.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch updates the 'title' and 'comment count' links to trigger the
same ticket detail/update modal as the 'details' button.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Paul Derscheid <paulderscheid@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
As we're already updating whitespace over almost the entire file, we may
as well go and tidy the whole thing too and add the /* keep tidy */
flag. (I checked for conflicts with existing bugs.. there's bug 20930
which will need a rebase, but it will already conflict and need a rebase
and is currently FQA)
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Removing this JS code from document.ready.
The script tag is at the end of the DOM and there is no good reason (at
least I didn't find any) to wait for the whole document to be ready
before executing the JS code.
It made the selenium test selenium/system_preferences_search.t fail
randomly with:
randomly with:
# Failed test 'The first "Policy" section (under "Accounting") is currently expanded'
# at t/db_dependent/selenium/system_preferences_search.t line 63.
# got: undef
# expected: 'expanded'
# Looks like you failed 1 test of 6.
Because we set the class in this JS code and selenium won't wait for it
to finish before it starts running the tests.
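To confirm the fix (assuming a Selenium server is configured for the test suite), the flaky test can be re-run a few times:
   prove t/db_dependent/selenium/system_preferences_search.t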
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
1) Go to reports and select "Patrons"
2) Select some filters (patron category, library,..) and run it
3) Notice that the selected filters are displayed, but with blank values, like
"branch code = "
4) Apply this patch
5) Refresh and check that the filter values are now shown
Sponsored by: BibLibre
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch fixes a few things:
1) The blue dialog box now clears when navigating away from the page
2) The background job now uses skip_record_index to avoid queuing indexing jobs for every new biblio and instead queues one job at the end
3) Large files that get chunked now successfully create linked biblios if requested
4) Title matching rules have been expanded to check the package ID so that we can have duplicate titles in different packages
5) A link to the package is now included on the job report page
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch allows a file with additional columns to be imported. When the file is submitted, the system will enqueue the background job and send back information to the UI that certain columns have been ignored. The data will still import as normal, but only the standard KBART columns will be parsed and imported.
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch rebases in the changes from bug 36618 to make biblio creation optional
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This commit allows CSV files to be imported alongside TSV files.
It also adds some performance improvements relating to the max_allowed_packet and the matching of titles, as well as some small bugfixes and unit test changes
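To inspect the value involved (a read-only check against the default kohadev instance; adjust the instance name as needed):
   echo "SHOW VARIABLES LIKE 'max_allowed_packet'" | sudo koha-mysql kohadev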
Test plan:
1) Enable the ERM module
2) Navigate to E-resource management > eHoldings > Local > Packages
3) Create at least one package
4) Navigate to E-resource management > eHoldings > Local > Titles
5) There should be a button for "Import from KBART file"
6) Click this button
7) Select the package that you created from the dropdown and then choose your KBART file using the "Choose file" button. I have attached some example files to the bug but feel free to use your own if you have them.
8) Click Submit
9) If your file is valid, a background job will be queued; if not, a warning will display showing what is incorrect in your file
10) To test the file format warning, edit your file and add a random column heading into the file e.g. test_column. When you upload it, the warning should show that an invalid column "test_column" has been detected
11) Click on the background job. (If you have uploaded a very large file, the system will chunk the file into smaller pieces and create multiple background jobs)
12) It should display a progress bar followed by a report and any error messages
13) Navigate to E-resource management > eHoldings > Local > Titles and you should see your new titles.
14) Run the unit test: prove t/db_dependent/Koha/BackgroundJob/ImportKBARTFile.t
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a background job that will import a KBART file
Sponsored-by: UKHSA
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new component to handle the file import, a route to that component and the API client route needed to access the API
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds the endpoint needed to queue an import_from_kbart_file background job
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>