1) Visit <staff_url>/cgi-bin/koha/tools/holidays.pl
2) Select "Holiday only on this day" (or whatever other option to add an holiday). Click 'Save.
3) Now click on that day in the calendar, pick "delete this holiday". Click 'Save'. Notice the holiday is still there.
4) Apply patch. Repeat. Notice it's deleted.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch resolves the undefined bookings_table variable issue reported
and also further fixes a subsequent undefined timeline bug that was
exposed by the original fix.
Signed-off-by: Esther Melander <esther@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new 'bookings' tab to the bottom of the members
details pages. When a patron has any future or current bookings against
their record, the tab will display the number of bookings in the tab
name, and clicking the tab will display a table of the current and
upcoming bookings.
Signed-off-by: Esther Melander <esther@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Also wraps statuses in b instead of using the SWITCH statement,
which should hopefully make translation a little nicer.
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
A follow-up changed orders->cancel a bit. This test assumed that
completed lines could be cancelled here, so we now do that with a
direct update to keep the original result.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
Based on the described criteria, check a few biblio records.
Look at Acquisitions tab on the intranet detail page.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This allows you to see quickly if a biblio has linked orders or not.
And if they are all cancelled, or some still in processing, or some
are complete (and the rest cancelled).
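To illustrate the kind of roll-up this describes, here is a small plain-Perl
sketch of deriving one summary status from a biblio's order line statuses.
It is only an approximation of the idea, not the actual Koha::Biblio code
(the real method name and return values may differ):

# 'cancelled' only if every line is cancelled; 'complete' if all the
# remaining lines are complete; otherwise the biblio is still in processing.
sub summarize_order_statuses {
    my (@statuses) = @_;
    return unless @statuses;    # no linked orders at all
    my @not_cancelled = grep { $_ ne 'cancelled' } @statuses;
    return 'cancelled' unless @not_cancelled;
    return 'complete' unless grep { $_ ne 'complete' } @not_cancelled;
    return 'processing';
}

# e.g. summarize_order_statuses('cancelled', 'complete') gives 'complete',
#      summarize_order_statuses('cancelled', 'ordered')  gives 'processing'.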
Test plan:
Run t/db_dependent/Koha/Biblio.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new heading field containing the display form of the authority record
NOTE: If you try to save an authority in the 'DEFAULT' framework, you will get an error. You should not be
using DEFAULT for authorities, and we should remove it from the list on another bug.
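For context, a minimal sketch of what 'display form' means here, using plain
MARC::Record. The patch itself builds the heading through Koha's own
authority/heading code, so treat this only as an approximation:

use MARC::Record;

# Render the authority's 1XX field (100/110/111/130/150/151, ...) as a
# single display string.
sub display_heading {
    my ($record) = @_;
    my ($field) = $record->field('1..');
    return '' unless $field;
    return $field->as_string();    # joins the subfield values with spaces
}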
To test:
1 - Apply patches
2 - Update database
3 - Restart all
4 - Create a new authority, save.
5 - Do this for various types
6 - View the db records:
SELECT * FROM auth_header\G
7 - Note new heading field is populated correctly
8 - Edit your new authorities
9 - Confirm the heading field is updated correctly
10 - Import some authorities and confirm heading generated correctly
11 - Import auth via Z39.50 and confirm heading generated correctly
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Andrew Fuerste-Henry <andrewfh@dubcolib.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds the option to specify a framework to be used when overlaying records from webservices/connexion
To test:
1 - vim /etc/koha/sites/kohadev/connexion.cnf
2 - Set content:
host:
port: 8888
koha:http://localhost:8081
log:/var/log/koha/kohadev/connexion.log
match:ISBN
user:kohauser
password:kohapass
overlay_action:replace
nomatch_action:create_new
item_action:always_add
import_mode:direct
framework:BKS
overlay_framework:
debug:1
3 - Save the sample file from this bug into your kohaclone (or copy and paste into a file your koha test site can reach)
4 - On the command line:
perl misc/bin/connexion_import_daemon.pl -c /etc/koha/sites/kohadev/connexion.cnf
5 - In another terminal:
cat bug_33418.test | nc -v localhost 8888
6 - It should report success and a biblionumber
7 - In Koha: Cataloguing->Manage staged MARC record
8 - View the most recent batch with file name (webservice)
9 - Confirm it was imported, no match
10 - Click 'View' under the 'Record' column
11 - Confirm record loads correctly and 'MARC framework' detail is 'Books, Booklets, Workbooks'
12 - On the terminal repeat the command:
cat bug_33418.test | nc -v localhost 8888
13 - It should succeed
14 - View the new batch, confirm the record matched this time
15 - View the record details, confirm framework is now 'default'
16 - On the first terminal hit Ctrl+C to stop the daemon
17 - Edit connexion.cnf and change:
import_mode:stage
framework:ACQ
overlay_framework:IR
18 - Restart daemon:
perl misc/bin/connexion_import_daemon.pl -c /etc/koha/sites/kohadev/connexion.cnf
19 - Delete the record created above
20 - On the second terminal repeat the command:
cat bug_33418.test | nc -v localhost 8888
21 - Confirm the batch is created, but not imported
22 - In terminal:
perl misc/cronjobs/import_webservice_batch.pl --framework=ACQ --overlay_framework=BKS
23 - Confirm batch imported, and record in ACQ framework
24 - In terminal:
cat bug_33418.test | nc -v localhost 8888
perl misc/cronjobs/import_webservice_batch.pl --framework=ACQ --overlay_framework=BKS
25 - Confirm batch added, record matched, record imported, and record now in Books framework
26 - Stop the daemon, edit connexion.cnf:
import_mode:direct
27 - Start the daemon, and on the other terminal repeat:
cat bug_33418.test | nc -v localhost 8888
28 - Confirm record in Binders framework
29 - Set record framework to Books
30 - Stop daemon, edit cnf and remove 'overlay_framework' setting
31 - Start daemon and cat the file again
32 - Confirm the record remains in Books framework
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Matt Blenkinsop <matt.blenkinsop@ptfs-europe.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
One of the embeds refers to suggestions. This table also has
a quantity column.
Test plan:
Verify that the Pending Orders table now loads instead of
triggering an internal server error on an ambiguous column.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Now that filter_by_active checks quantities, we should make sure that
our test data is correct instead of leaving it to random values from
TestBuilder.
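Roughly, the idea is to pin the columns filter_by_active now looks at when
building the test fixtures. The exact values below are an assumption for
illustration, not a copy of the test:

use t::lib::TestBuilder;

my $builder = t::lib::TestBuilder->new;

# An order that should count as active: not cancelled and with
# quantity still left to receive.
my $order = $builder->build_object(
    {
        class => 'Koha::Acquisition::Orders',
        value => {
            orderstatus             => 'ordered',
            quantity                => 2,
            quantityreceived        => 0,
            datecancellationprinted => undef,
        },
    }
);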
Test plan:
Run t/db_dependent/Koha/Acquisition/Order.t
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
An active order that has no items to receive any more is actually
not very active anymore :)
Test plan:
Run t/db_dependent/Koha/Acquisition/Orders.t
Enable OPACAcquisitionDetails.
Add order on basket for new biblio. Check that opac-detail does not
yet count the item as on order (since status is still new).
Close basket. Check opac-detail again. Should count it now.
Reopen basket. Cancel line. Check opac-detail again.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Active orders are more than just not cancelled. See filter_by_active
in Koha::Acquisition::Orders. They are still in the acq process; we
still need to receive items on those orders.
When we cancel and want to delete a biblio, we should check for not
cancelled orders (broader than active orders as in Orders.pm).
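To make the distinction concrete, a rough plain-Perl sketch of the two
notions. The authoritative conditions live in filter_by_active and the
cancellation check mentioned above; the column names below are simply the
usual aqorders columns:

# "Not cancelled": the line has not been cancelled, whatever its progress.
sub is_not_cancelled {
    my ($order) = @_;
    return !defined $order->{datecancellationprinted};
}

# "Active": not cancelled AND we still expect to receive items on it.
sub is_active {
    my ($order) = @_;
    return is_not_cancelled($order)
      && ( $order->{quantityreceived} // 0 ) < $order->{quantity};
}

# Deleting a biblio should be blocked by any not-cancelled line, even one
# that is no longer "active" because it has been fully received.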
Test plan:
Git grep active_orders.
Run t/db_dependent/Koha/Biblio.t
Run t/db_dependent/Koha/Acquisition/Order.t
Add two order lines in a basket referring to the same biblio. Try to
cancel with delete biblio. Should not be possible.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
'Current' is a bit misleading here: it means the orders that are not
cancelled.
We already use the filter_out_* naming scheme, so in harmony with
that, we switch it to filter_out_cancelled.
Test plan:
Run t/db_dependent/Koha/Acquisition/Order.t
Run git grep -l '::Orders' | xargs grep filter_by_current
* All occurrences should be related to Recalls, not Orders.
Go to acqui/basket.pl. Delete a whole basket that has some active
and some cancelled lines.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Janusz Kaczmarek <januszop@gmail.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Because of the way Koha::Logger has been used to log to different categories based on the interface and caller, it can be extremely hard to send every occurrence of a particular log statement to one place.
For custom report runs, the category is plack-intranet.C4::Reports::Guided when run from the web interface, cron.C4::Reports::Guided when run from runreport.pl, and plack-intranet.C4::Auth when run from svc/report.
We should add a more standardized report run log, both with and without the full query, so that administrators can log all report runs to a centralized location. If an administrator were to need the "point of entry" for reports, it is easy to include via parameters in PatternLayout.
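In practice that means the report runner logs to two fixed Log::Log4perl
categories regardless of the entry point, roughly like this (a sketch of the
idea, not the literal Koha code; the config path is only an example):

use Log::Log4perl;

Log::Log4perl->init('/etc/koha/sites/kohadev/log4perl.conf');

# Fixed categories, so the staff client, svc/report and runreport.pl
# all end up in the same appenders.
my $time_logger  = Log::Log4perl->get_logger('reports.execute.time');
my $query_logger = Log::Log4perl->get_logger('reports.execute.query');

my ( $report_id, $sql ) = ( 1, 'SELECT COUNT(*) FROM borrowers' );

my $start = time;
# ... execute the report here ...
my $elapsed = time - $start;

$time_logger->info("Report $report_id completed in ${elapsed}s");
$query_logger->info("Report $report_id: $sql");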
Test Plan:
1) Apply this patch
2) Modify your log4perl file, add the following:
log4perl.logger.reports.execute.time = INFO, REPORTTIME
log4perl.appender.REPORTTIME=Log::Log4perl::Appender::File
log4perl.appender.REPORTTIME.filename=/tmp/report-time.log
log4perl.appender.REPORTTIME.mode=append
log4perl.appender.REPORTTIME.layout=PatternLayout
log4perl.appender.REPORTTIME.layout.ConversionPattern=[%d] [%p] [%P] %m%n
log4perl.appender.REPORTTIME.utf8=1
log4perl.logger.reports.execute.query = INFO, REPORTQUERY
log4perl.appender.REPORTQUERY=Log::Log4perl::Appender::File
log4perl.appender.REPORTQUERY.filename=/tmp/report-query.log
log4perl.appender.REPORTQUERY.mode=append
log4perl.appender.REPORTQUERY.layout=PatternLayout
log4perl.appender.REPORTQUERY.layout.ConversionPattern=[%d] [%p] [%P] %m%n
log4perl.appender.REPORTQUERY.utf8=1
3) Restart all the things!
4) Run a report somehow:
CLI: ./misc/cronjobs/runreport.pl 1
API: /cgi-bin/koha/svc/report?id=1
Web: /cgi-bin/koha/reports/guided_reports.pl?reports=1&phase=Run this report
5) Note the report runs are logged to /tmp/report-time.log and /tmp/report-query.log
Signed-off-by: Brendan Lawlor <blawlor@clamsnet.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
The system preference RoutingSerials controls if the routing list
feature is available in the staff interface or not. If routing lists
are deactivated, the search option should not show.
Also updated the label to read: "Has routing list"
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Adds a search option to the advanced search in the
serials module that allows limiting the search to subscriptions
with routing lists.
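Conceptually, the new limit boils down to an EXISTS check against the
routing list table. A hedged sketch of that filter using the standard Koha
schema (not the patch's actual subscription search code):

use C4::Context;

my $dbh = C4::Context->dbh;
my $sth = $dbh->prepare(q{
    SELECT s.subscriptionid, b.title
      FROM subscription s
      JOIN biblio b USING (biblionumber)
     WHERE EXISTS (
               SELECT 1
                 FROM subscriptionroutinglist srl
                WHERE srl.subscriptionid = s.subscriptionid
           )
});
$sth->execute;
while ( my ( $subscriptionid, $title ) = $sth->fetchrow_array ) {
    print "$subscriptionid\t$title\n";    # subscriptions that have a routing list
}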
Test plan:
1. Apply this patch
2. Create two subscriptions, one with a routing list and one without
3. Navigate to Serials home and tick the checkbox labeled "Search routing lists only:"
4. Confirm that the only search result to appear is the subscription you added the routing list to
5. Run unit tests: prove t/db_dependent/Serials.t
Sponsored by: Bibliotheksservice-Zentrum Baden-Wuerttemberg
Signed-off-by: Christian Stelzenmüller <christian.stelzenmueller@bsz-bw.de>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Perltidied changes to make QA test tools pass.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch updates the 'title' and 'comment count' links to open the
same ticket detail/update modal as the 'details' button.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Paul Derscheid <paulderscheid@gmail.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
As we're already updating whitespace over almost the entire file, we may
as well tidy the whole thing too and add the /* keep tidy */
flag. (I checked for conflicts with existing bugs: bug 20930 will need a
rebase, but it already conflicts and needs a rebase anyway, and is
currently FQA.)
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Removing this JS code from document.ready.
The script tag is at the end of the DOM and there is no good reason (at
least I didn't find any) to wait for the whole document to be ready
before executing the JS code.
It made the selenium test selenium/system_preferences_search.t fail
randomly with:
# Failed test 'The first "Policy" section (under "Accounting") is currently expanded'
# at t/db_dependent/selenium/system_preferences_search.t line 63.
# got: undef
# expected: 'expanded'
# Looks like you failed 1 test of 6.
Because we set the class in this JS code and selenium won't wait for it
to finish before it starts running the tests.
Signed-off-by: David Nind <david@davidnind.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
1) Go to reports and select "Patrons"
2) Select some filters (patron category, library,..) and run it
3) Normally you will see the selected filters but with blank values, like
"branch code = "
4) Apply this patch
5) Refresh
6) Confirm the selected filter values are now displayed
Sponsored by: BibLibre
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Emily Lamancusa <emily.lamancusa@montgomerycountymd.gov>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch fixes a few things:
1) The blue dialog box now clears when navigating away from the page
2) The background job now uses skip_record_index to avoid queuing indexing jobs for every new biblio and instead queues one job at the end (see the sketch after this list)
3) Large files that get chunked now successfully create linked biblios if requested
4) Title matching rules have been expanded to check the package ID so that we can have duplicate titles in different packages
5) A link to the package is now included on the job report page
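For the indexing change mentioned in item 2, the pattern is roughly the one
sketched below: suppress per-record indexing inside the loop and queue a
single indexing job at the end. This assumes AddBiblio accepts a
skip_record_index option and that Koha::SearchEngine::Indexer is available,
as in recent Koha versions; it is not the literal patch code.

use C4::Biblio qw( AddBiblio );
use Koha::SearchEngine::Indexer;

my @marc_records;    # MARC::Record objects built from the imported titles
my @biblionumbers;

for my $record (@marc_records) {
    # skip_record_index: do not queue an indexing job per biblio
    my ($biblionumber) = AddBiblio( $record, '', { skip_record_index => 1 } );
    push @biblionumbers, $biblionumber;
}

# One indexing job for the whole batch instead of one per biblio.
Koha::SearchEngine::Indexer->new( { index => 'biblios' } )
    ->index_records( \@biblionumbers, 'specialUpdate', 'biblioserver' );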
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch allows a file with additional columns to be imported. When the file is submitted, the system will enqueue the background job and send back information to the UI that certain columns have been ignored. The data will still import as normal but only the standard KBART columns will be parsed and imported.
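A small plain-Perl illustration of the idea; the standard column list is
abbreviated and the extra column names are made up, the real handling lives
in the KBART import code:

# Standard KBART fields we know how to parse (abbreviated list).
my %standard = map { $_ => 1 } qw(
    publication_title print_identifier online_identifier
    date_first_issue_online title_url first_author title_id
    coverage_depth publisher_name
);

# Header row as found in the uploaded file.
my @file_columns = qw( publication_title title_url shelf_location internal_note );

# Columns that will be ignored, reported back to the UI when the job is enqueued.
my @ignored = grep { !$standard{$_} } @file_columns;
print "Ignored columns: @ignored\n" if @ignored;

# Keep only the standard columns from a parsed row (hash of column => value).
sub filter_row {
    my ($row) = @_;
    return { map { $_ => $row->{$_} } grep { $standard{$_} } keys %$row };
}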
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch rebases in the changes from bug 36618 to make biblio creation optional
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This commit allows CSV files to be imported alongside TSV files.
It also adds some performance improvements relating to the max_allowed_packet and the matching of titles, as well as some small bugfixes and unit test changes
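One plausible way to accept both delimiters, shown with Text::CSV; this
illustrates the approach rather than the patch's exact separator handling:

use Text::CSV;

# Guess the separator from the header line: KBART files are tab separated
# by default, but a comma separated export should work too.
sub separator_for {
    my ($header_line) = @_;
    my $tabs   = () = $header_line =~ /\t/g;
    my $commas = () = $header_line =~ /,/g;
    return $tabs >= $commas ? "\t" : ',';
}

my $header = "publication_title,print_identifier,online_identifier";
my $csv    = Text::CSV->new( { sep_char => separator_for($header), binary => 1 } );
$csv->parse($header) or die "Could not parse header line";
my @columns = $csv->fields;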
Test plan:
1) Enable the ERM module
2) Navigate to E-resource management > eHoldings > Local > Packages
3) Create at least one package
4) Navigate to E-resource management > eHoldings > Local > Titles
5) There should be a button for "Import from KBART file"
6) Click this button
7) Select the package that you created from the dropdown and then choose your KBART file using the "Choose file" button. I have attached some example files to the bug but feel free to use your own if you have them.
8) Click Submit
9) If your file is valid, a background job will be queued; if not, a warning will display showing what is incorrect in your file
10) To test the file format warning, edit your file and add a random column heading into the file e.g. test_column. When you upload it, the warning should show that an invalid column "test_column" has been detected
11) Click on the background job. (If you have uploaded a very large file, the system will chunk the file into smaller pieces and create multiple background jobs)
12) It should display a progress bar followed by a report and any error messages
13) Navigate to E-resource management > eHoldings > Local > Titles and you should see your new titles.
14) Run the unit test: prove t/db_dependent/Koha/BackgroundJob/ImportKBARTFile.t
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a background job that will import a KBART file
Sponsored-by: UKHSA
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds a new component to handle the file import, a route to that component and the API client route needed to access the API
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
This patch adds the endpoint needed to queue an import_from_kbart_file background job
Signed-off-by: Clemens Tubach <clemens.tubach@kit.edu>
Signed-off-by: Nick Clemens <nick@bywatersolutions.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Set syspref EmailPurchaseSuggestions to "email address of library"
4. Under Administration > Libraries, enter an email address for at least
one library
5. Go to "my account", and enter a Primary email and Phone number
6. Go to the Acquisitions module > Suggestions page
7. Click "New purchase suggestion", and fill in values for, at minimum:
Bibliographic information:
Title
Author
Copyright date
ISBN/ISSN
Publisher
Publication place
Collection title
Document type
Reason for suggestion
Notes
Acquisition information:
Library (set to the library you entered an email for in step 4)
Copies
8. Submit the suggestion
9. Set the suggestion status to "Rejected"
a. Check the checkbox next to the suggestion
b. Set the "Mark selected as" drop-down (below the table) to Rejected
c. Select a value for the "Reason" drop-down
d. Click the Submit button
10. Set the suggestion status to "Accepted" (as above)
11. Set the suggestion status to "Ordered" (as above)
12. Return to "my account" and open the Notices tab
--> There should be notices for suggestion declined, suggestion accepted,
suggestion ordered, and (depending on settings) new suggestion*
13. Open each notice, and confirm that all information was filled in
correctly
* New suggestion will be there if you're using default KTD settings/data
and logged in as the root user. If it is not there, query the database
(by command line or SQL report) to see the generated notice text:
SELECT * FROM message_queue WHERE letter_code = 'NEW_SUGGESTION'
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
To test:
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Enable SMS Notices
a) Find the system preference SMSSendDriver
b) Enter "Email" as the value for SMSSendDriver and save
4. Go to a patron account, and set the following messaging preferences:
Item due:
- Check SMS
- Check Email
- Leave Digests only unchecked
Advance notice
- Select 1 day in advance
- Check SMS
- Check Email
- Leave Digests only unchecked
5. Go to the checkout tab for that patron
6. Set a custom due date for today and check out an item
7. Set a custom due date for tomorrow and check out another item
8. Run the cron job to generate notices:
misc/cronjobs/advance_notices.pl -v -c
9. Open the Notices tab on that patron's account
--> The patron should have four notices:
Item due reminder (one for SMS and one for email)
Advance notice of item due (one for SMS and one for email)
10. Open each notice and confirm that all information is correct
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
To test:
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Enable SMS Notices
a) Find the system preference SMSSendDriver
b) Enter "Email" as the value for SMSSendDriver and save
4. Go to a patron account, and set the following messaging preferences:
Hold filled:
- Check SMS
- Check Email
- Leave Digests only unchecked
5. Leave SMS number and all email address fields blank for that patron
6. Place a hold for that patron
7. Check in an item to fill the hold
8. Open the Notices tab on that patron's account
--> The patron should have a Print "hold filled" notice, since they
don't have an email address or SMS number in the system
9. Add a Primary email and SMS number to the patron account
10. Place another hold for that patron
11. Check in an item to fill the hold
12. Go back to the Notices tab on that patron's account
--> The patron should now have Email and SMS "hold filled" notices as well
13. Open each notice and confirm that all information is correct
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Enable the system preference RenewalSendNotice
4. Check an item out to a patron
5. Edit that patron:
a. Enter a value for Primary email
b. Set messaging preference to enable email notices for Item
checkout and renewal
6. Renew the checked-out item
7. Check the patron's Notices tab for the Item renewal notice, and
confirm that the notice is correct
Signed-off-by: Matt Blenkinsop <matt.blenkinsop@ptfs-europe.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
To test:
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Find a patron and make sure their account contains values for
all of the following fields:
First name
Surname
Card number
Phone
Address
Address 2
City
Zipcode
Email
4. Place a hold for this patron, specifying a hold note
5. Check in an item to fill the hold and click "Confirm and print slip"
6. Confirm that all information on the hold slip is correct
Signed-off-by: Barbara Johnson <barbara.johnson@bedfordtx.gov>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
To test:
1. Apply patch
2. reset_all (or start your testing environment in the first place)
3. Go to "my account", and enter a Primary email
4. Under Administration > Libraries, enter an email address for at least
one library
5. Add the demo user (koha) as the owner of the Main Fund
a) Go to the Acquisitions module
b) Click on Main Fund
c) Click the Actions button at the end of Main Fund row > click Edit
d) Click "Select owner"
e) Search for "koha"
f) Click the Select button
g) Click Submit
6. Click "Suggestions" on the left side of the page, and click "New
purchase suggestion"
7. Fill in values for, at minimum:
Bibliographic information:
Title
Author
Acquisition information:
Library (set to the library you entered an email for in step 4)
Fund (set to Main Fund)
8. Submit the suggestion
9. Run the cron job to generate TO_PROCESS notification email:
misc/cronjobs/notice_unprocessed_suggestions.pl -v -c --days=0
10. Set the new suggestion to "Accepted"
a) Check the checkbox next to the new suggestion
b) Below the table, set "Mark selected as" drop-down to "Accepted"
c) Click Submit
11. Create an order from the purchase suggestion
a) Return to the Acquisitions homepage
b) Leave the Vendor field blank and click Search
c) Next to the sample basket (My Basket), click "Add to basket"
d) Click "From a suggestion"
e) Click "+ Order" next to the suggestion
f) Make sure there are values for all required item fields
g) Click "Add item"
h) Click Save
12. Close the basket and receive the order
a) Click "Close basket" and click "Yes, close"
b) Click "Receive shipments"
c) Enter a number in "Vendor invoice" and click Next
d) Check the checkbox next to "My Basket" and click "Receive
selected" button
e) Check the checkbox under "Receive?" in the Items table
f) Click Confirm
13. Return to "my account" and open the Notices tab
--> There should be (among others) notices for "A suggestion is ready to
be processed" and "Suggested purchase available"
14. Open each of the above two notices, and confirm all the information
was filled in correctly
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>
Test plan:
1. Reports > click Database schema link, verify it gets a 404 not found
2. Reports > Use saved > click Database schema link, verify 404
3. Apply patch
4. Reports > click Database schema link, verify it loads as main
5. Reports > Use saved > click Database schema link, verify it loads
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Tomas Cohen Arazi <tomascohen@theke.io>
Signed-off-by: Katrin Fischer <katrin.fischer@bsz-bw.de>