No code implements the routines Get and TransformHtmlToMarc2,
so don't export them into users' namespace
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Recent changes to LDAP broke auth_by_bind in many situations. This patch
resets the behaviour to what it used to be, while also allowing the new
behaviour via an 'anonymous_bind' parameter in the LDAP config.
Testing:
1) Find an LDAP configuration that was broken recently that uses
auth_by_bind
2) Apply this patch
3) See if it works again.
Additionally, testing the original path in the case of 'anonymous_bind'
being set should probably be done too, but I have no idea about the LDAP
server config for that.
Signed-off-by: Ulrich Kleiber <ulrich.kleiber@bsz-bw.de>
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
1) set an empty string for the ReservesMaxPickUpDelay pref
2) place a hold on an item
3) check in the item
4) click on "Print and confirm"
5) an error occurs
> The 'days' parameter (undef) to DateTime::Duration::new was an 'undef'
6) apply the patch
7) repeat steps 1 to 4
8) the error does not occur anymore.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
An empty string didn't do it for me, I had to set the
variable for the systempreference to NULL. I am not sure
if this can happen when editing from the interface, but
this change should not have any ill side effects and it has
unit tests!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Koha::DateUtils::output_pref took 4 parameters and the last one is a
boolean, so some calls were:
output_pref($dt, undef, undef, 1)
This patch changes its prototype to
output_pref({
dt => $dt,
dateformat => $dateformat,
timeformat => $timeformat,
dateonly => $boolean
});
An alternative is to call the output_pref routine with a datetime
object, without using an hashref:
output_pref($dt);
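For illustration, a minimal sketch of converting one of the old calls to the new form (the variable names are placeholders, not from the patch):
my $old_style = output_pref( $dt, undef, undef, 1 );        # before: trailing boolean means "date only"
my $new_style = output_pref({ dt => $dt, dateonly => 1 });  # after: named keys make the intent explicit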
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
These variables still need to be exported to the template by default for
the 'prog' OPAC template to work correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The goal of this theme is to provide a fully-responsive OPAC which
offers a high level of functionality across multiple devices with varied
viewport sizes. Its style is based on the CCSR theme, with elements of
the Bootstrap framework providing default styling of buttons, menus,
modals, etc.
The Bootstrap grid is used everywhere, but Bootstrap's default
responsive breakpoints have been expanded to allow for better
flexibility for our needs.
All non-translation-dependent files are in the root directory of this new
theme:
css, images, itemtypeimg, js, less, and lib. Languages.pm has been
modified to ignore the new directories when parsing the theme language
directories.
This theme introduces the use of LESS (http://lesscss.org/) to build
CSS. Three LESS files can be found in the "less" directory: mixins.less,
opac.less, and responsive.less. These three files are compiled into one
CSS file for production: opac.css. "Base" theme styles are found in
opac.less. A few "mixins" (http://lesscss.org/#-mixins) are found in
mixins.less. Any CSS which is conditional on specific media queries is
found in responsive.less.
At the template level some general structural changes have been made.
For the most part JavaScript is now at the end of each template as is
recommended for performance reasons. JavaScript formerly in
doc-head-close.inc is now in opac-bottom.inc.
In order to be able to maintain this structure and accommodate
page-specific scripts at the same time the use of BLOCK and PROCESS are
added. By default opac-bottom.inc will PROCESS a "jsinclude" block:
[% PROCESS jsinclude %]
Each page template in the theme must contain this block, even if it is
empty:
[% BLOCK jsinclude %][% END %]
Pages which require that page-specific JavaScript be inserted can add it
to the jsinclude block and it will appear correctly at the bottom of the
rendered page.
The same is true for page-specific CSS. Each page contains a cssinclude
block:
[% BLOCK cssinclude %][% END %]
...which is processed in doc-head-close.inc:
[% PROCESS cssinclude %]
Using these methods helps us maintain a strict separation of CSS links
and blocks (at the top of each page) and JavaScript (at the bottom). A
few exceptions are made for some JavaScript which must be processed
sooner: respond.js (https://github.com/scottjehl/Respond, conditionally
applied to Internet Explorer versions < 9 to allow for layout
responsiveness), the _() function required for JS translatability, and
Modernizr (http://modernizr.com/, a script which detects browser
features and allows us to conditionally load JavaScript based on
available features--or lack thereof).
Another new JavaScript dependency in this theme is enquire.js
(http://wicky.nillia.ms/enquire.js/), which lets us trigger JavaScript
events based on viewport size.
I have made an effort to re-indent the templates in a sane way,
eliminating trailing spaces and tabs. However, I have not wrapped lines
at a specific line length. In order to improve template legibility I
have also tried to insert comments indicating the origin of closing tags
like <div> or template directives like [% END %]:
</div> <!-- / .container-fluid -->
[% END # / IF ( OpacBrowseResults && busc ) %]
TESTING
Proper testing of this theme is no easy task: Every template has been
touched. Each page should work reasonably well at a variety of screen
dimensions. Pages should be tested under many conditions which are
controlled by toggling OPAC system preferences on and off. A variety of
devices, platforms, and browsers should be tested.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Apply this patch
2) Run updatedatabase.pl
3) Enable patronimages
4) Verify patron images are still displaying correctly
5) Test deleting a patron image
6) Test adding a patron image from moremember.pl
7) Test adding a patron image from tools/picture-upload.pl
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
* Added base class files for all tables in koha using
DBIx::Class::Schema::Loader.
* Added a (very basic) test file for C4::Context
* Also added dependencies in required files.
To Test:
[1] Install patch
[2] Make sure you can still connect to Koha
[3] You may optionally run this test script:
use Koha::Database;
use Data::Dumper;
my $db = Koha::Database->new();
my $schema = $db->schema();
print Dumper($schema->resultset("Borrower"));
If you run this file you should get a DBIx dump of the borrowers table.
Signed-off-by: wajasu <matted-34813@mypacks.net>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In the serials module, when searching subscriptions, the results table has
a "Notes" column.
In TT code, you see that it tries to display public note
"subscription.notes" and internal note "subscription.internalnotes".
The internal note is displayed well but not the public note.
You can see the 2 notes in serial details in summary tab.
The problem comes from the SQL query: a join is performed on subscription
and biblio, both of which contain a "notes" column.
This patch solves the problem by using an alias in the query for both columns
(biblio.notes is actually not used in the template but could be).
Test plan :
- Edit a subscription
- Add public and internal notes. For example : "too busy" and "on holiday"
- Perform a subscription search that returns this subscription
=> "Notes" column contains both notes. For example : "too busy (on holiday)"
- Test with only public note
- Test with only internal note
Works as described.
Signed-off-by: Mathieu Saby <mathieu.saby@uhb.fr>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Works as described, fixes a bug as the templates show that
the intention was to display both notes in the column.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
Verify that existing CSV lists list MARC CSV profiles and not SQL CSV
profiles.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch:
- adds a new column 'type' to the export_format table.
- renames the field name export_format.marcfields with
export_format.content.
Test plan:
- Check that existing profiles have the type "marc" selected by default
- Create a new profile with a type "sql"
- Save and verify that the profile is correctly displayed when you select it.
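As a hedged sketch only (not the actual updatedatabase.pl entry; the column definitions are assumptions), the schema change described above amounts to something like:
use C4::Context;
my $dbh = C4::Context->dbh;
# add the new 'type' column, defaulting existing profiles to "marc"
$dbh->do("ALTER TABLE export_format ADD COLUMN type VARCHAR(255) NOT NULL DEFAULT 'marc'");
# rename marcfields to the more generic 'content'
$dbh->do("ALTER TABLE export_format CHANGE marcfields content MEDIUMTEXT NOT NULL");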
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Work as described. koha-qa reports Small tabs errors,
corrected in followup
Test:
1) go to Tools > CSV profiles, Create profile, current
2) Apply patch, run updatedatabase
3) Go to Tools > CSV profiles, new option present
old profile with type MARC
4) Create new profile MARC, save and show correct
5) Create new profile SQL, save and show correct
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass with all 3 patches applied.
Works as described. Functionality for SQL profiles will be
added by another patch. For now it's possible to add/edit/delete
them.
Existing CSV profiles can still be exported correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes it possible to define default values in the authorities framework.
Some code already existed, but the feature did not work.
Test plan:
1/ Choose a framework, field and subfields.
2/ Define a default value.
3/ Create a new authority and check that the subfield is
automatically filled with the default value.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. koha-qa reports some tabs, fixed in followup
Test
1) Apply patch, run updatedatabase.pl
2) Edit an auth framework, put a default value somewhere, save
3) Add new auth, default value present
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Verified database update is done correctly.
Controlfields 0xx
- Edited an existing field (001)
- Set a default value for subfield @
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Fields
- Edited an existing field (100)
- Set a default value for subfield e
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In OAI set mappings, the value "is equal to" is hardcoded. This
enhancement changes it to a dropdown menu to choose between "is equal
to" and "not equal to".
To test:
* define a set
* define a mapping for said set with "is equal to"
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
* change mapping to 'not equal to', save
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Amended patch: Fix bug id in updatedb.pl
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adjusts the auto-completion on the authority record
finder (accessed from the bib editor) so that if you do
start typing in the "Main entry ($a only)" input field, it will
return only the $a of the main heading for matching authority
records.
This fixes a problem where typing "shakes", choosing
"Shakespeare, William, 1564-1616" from the auto-completion
result list, then hitting the search button fails to bring
up results, as the dates come from the $d of the 100 field
(in MARC21).
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised.
Tested with an authority where I added my search term in $b.
The modified authority came up in main entry, not in mainmainentry.
That was the desired result.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To reproduce and test:
1) Create an authority record with main heading (100) in Latin script
(e.g. Oppenheimer, Aharon -- subfields $a and $b) and parallel form
(700) in Hebrew (אופנהיימר, אהרן -- subfields $a and $b).
Mark it correctly in $8 with freheb (or engheb if you like);
2) Reindex and search;
3) You will see:
Oppenheimer Aharon
freheb: אופנהיימר
Whereas you would rather like to see (mind language and lack of $b above):
Oppenheimer, Aharon
Hebrew: אופנהיימר, אהרן
The patch corrects the issue and should not harm those who (improperly)
put only one triple in $8
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. No koha-qa errors.
Same result on OPAC and STAFF
Turns out that the test plan is wrong:
you need to fill tag 200ab, not 100ab, for the main heading.
I filled 100a with some example data from UNIMARC auth manual.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Took me a bit to figure it out, works according to test plan.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Before fixing UNIMARC DOM indexing, we must fix GRS-1 indexing
1) In advanced search, some Coded fields index are not working: Print,
Illustration, Content
2) Country-heading index is not working
3) Some subfields are indexed in wrong indexes :
102$a should be in Country-publication instead of Country-heading
(non defined in bib1.att)
106$a, filled only for printed works, should be in ff88-23 (form of
item) instead of itype. (ff88-23 is made for Marc21 008 pos
23, which contains the same data as 106a)
200$b should be in Material-type instead of (or in addition to) itype
and itemtype: (Material-type :"free-form string, ... that
describes the material type of the item, e.g., cassette, kit,
computer database, computer file.")
100$a pos 22-24 should not be indexed as "ln" : it is the language of
the record, not the language of the resource
4) Index names are too long: if we index new positions of coded fields
with the existing names, it breaks Zebra indexing (there must be a limit
on line length in record.abs?)
5) There are a lot of warnings when rebuilding Zebra.
This patch makes some changes in bib1.att (which could be used later to improve
search):
- fixing wording for att 51 and 1012
- adding comments for attributes based on MARC21 008 field (8800-8841)
- creating 8806 (tpubdate), 8838 (Modified-code), 8818 (ff8-18), 8840
(ff8-18-21), 8819 (ff8-19), 8821 (ff8-21), 8828 (ff8-28), 8830
(ff8-30), 8831 (ff8-31)
- creating attributes specific to UNIMARC : 9701-9707 (Video-mt,
Graphics-type, Graphics-support, Title-page-availability,
Cumulative-index-availability, script-Title, char-encoding)
- setting apart 3 blocks of attributes, so it could be easy to make
further changes :
-- common to Marc21 and UNIMARC : 8806, 8822, 8838
-- slightly different in Marc21 and UNIMARC (different meanings
according to the type of the record => don't match a single
UNIMARC field)
-- specific to UNIMARC : 9701-9707
In ccl.properties :
- creating a new index: Country-publication 1=1053
- suppressing some warns by mapping with bib1 att:
Date-time-last-modified, Name, rtype, Music-number
- defining indexes using the 3 blocks attributes defined in bib1
(common to Marc21 and UNIMARC, slightly different, specific to UNIMARC)
In record.abs :
- renaming some indexes for the 100, 105 and 110 fields
- correcting indexing of 102$a (country of publication)
106$a (ff88-23)
100$a pos 22-24 (language of record, no more
indexed)
105$a pos. 0-3 (illustration code)
200$b (for the moment, I keep it indexed in
itype and itemtype, but also Material-Type)
In C4/Search.pm :
- adding "Country-publication" index
In OPAC and staff interface template subtypes_unimarc.in :
- renaming indexes to take into account the changes made to Zebra
config files
To test (this cannot be done with a sandbox) :
1) Apply the patch in a UNIMARC GRS-1 Koha instance
2) Copy the following files from the etc/zebradb of your source
directory into the etc/zebradb of your main Koha directory:
-- etc/zebradb/biblios/etc/bib1.att
-- etc/zebradb/ccl.properties
-- etc/zebradb/marc_defs/unimarc/biblios/record.abs
3) Reindex your data (rebuild_zebra -x -b -r -v)
4) Try to use those Coded fields indexes in Advanced search, in OPAC
and Staff interface (available after clicking on "More options",
then on "Coded information filters"):
Audience, Print, Literary genre, Biography, Illustration, Content,
Video Types, Serials, Serial Type, Periodicity, Regularity
5) Try to search "Country-publication=FR" in simple search
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Tests for GRS-1
Followed test plan
Search by coded fields works, but only on OPAC,
on staff there are few options
Search by Country-publication works after patch
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Modified Record::marc2bibtex to validate fields 100, 110 and 111 in
non-UNIMARC flavours.
Test plan:
1) Search any book in the OPAC with a main entry (1XX in MARC21, 700-720 in UNIMARC)
2) Export the record in the bibtex format
==> The output won't contain the main entry.
3) Apply the patch
4) Export the record again
==> The record will contain the main entry.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes a long standing bug.
Passes all tests and QA script.
Tested with multiple records, seems to work well.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes some inconsistencies in the routine
GetBooksellerWithLateOrders().
Now it considers the field $estimateddeliverydateto and replaces it
with now() only if it is undef.
Also, it no longer tests whether $aqbookseller.deliverytime is not NULL;
if $deliverytime is NULL or undef, it replaces it with 0.
It also verifies that $delay is >= 0 and returns undef if it is a negative
value.
To Test:
Before this patch, this routine sorted out the booksellers with late orders. If a
bookseller did not specify a deliverytime, it would never appear in
the list of late orders. Moreover, if the field "Estimated delivery
date to" was specified, the routine did not take its value into account and
returned the late orders up to today's date.
Now, the returned list considers all the fields given, and if the
delivery time of the bookseller is not specified, the late orders are
calculated as if the deliverytime were 0. By default, all booksellers
which have late orders as of today are listed unless "estimated
delivery date to" is specified.
prove t/db_dependent/Bookseller.t
t/db_dependent/Bookseller.t ..
[Some warnings about uninitialized values]
WARNING: GetBooksellerWithLateOrders is called with a negative value at C4/Bookseller.pm line 135.
t/db_dependent/Bookseller.t .. ok
All tests successful.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds some improvements to the routine GetOpenIssue().
Now it verifies that the parameter is given (if not, it returns undef)
and it returns $sth->fetchrow_hashref() directly instead of an intermediate $issue variable.
To test:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 2 wallclock secs ( 0.06 usr 0.01 sys + 1.09 cusr 0.07 csys = 1.23 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Same situation as the one noted in comment of
Bug 10683, test fails unless there is an issuingrule
All, All with 1 as renewals allowed.
With that condition, it succeeds
No koha-qa errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds return values to DeleteTransfer:
Undef if no parameters are given
1 if a Transfer is deleted
0E0 if a wrong parameter is given
It also fixes some unit tests in t/db_dependent/Circulation_transfers.t
To test:
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=14, 20 wallclock secs ( 0.03 usr 0.00 sys + 0.39 cusr 0.02 csys = 0.44 CPU)
Result: PASS
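For illustration, a hedged sketch of how a caller could interpret the new return values (the surrounding code is made up; it assumes DeleteTransfer is exported by C4::Circulation):
use C4::Circulation;
my $deleted = DeleteTransfer($itemnumber);
if ( !defined $deleted ) {
    warn "DeleteTransfer was called without an itemnumber";
}
elsif ( $deleted == 0 ) {
    # '0E0' is true as a string but zero as a number: no matching transfer row
    warn "No transfer found for item $itemnumber";
}
else {
    # exactly one transfer row was deleted
}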
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Applied 10681 and 10692 before 10698
Run prove t/db_dependent/Circulation_transfers.t without errors
No koha-qa errors on all 3 patches
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish().
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
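To make the man page's point concrete, a small sketch of the pattern these patches remove (table and column names are only examples):
my $sth = $dbh->prepare("SELECT borrowernumber FROM borrowers");
$sth->execute();
while ( my ($borrowernumber) = $sth->fetchrow_array ) {
    # ... work with each row ...
}
# No $sth->finish() needed here: the loop has already fetched every row,
# and a stray finish() can hide genuine problems such as uncaught fetch errors.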
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish.
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author
I run prove t/db_dependent/Reserves.t without errors
don't know if more tests are needed.
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish.
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author.
Run prove t/db_dependent/Accounts.t without errors
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The move avoids a problem where many modules would gain
a dependency on C4::Auth just because C4::Members needs access
to hash_password().
This patch also adds a couple unit tests for the new password
hashing code.
To test:
[1] Verify that there are no regressions on the test plan for bug
9611.
[2] Verify that t/AuthUtils.t and t/db_dependent/Auth.t pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Right now overdues come from the branch, but the
others come from the admin email address - this
is a problem in multi-branch systems because they
have to come up with one email address that all
branches have access to.
C4::Letters::_send_message_by_email currently sets
the from address in the following order:
1) Address specified in message
2) Koha admin email address
The order will now be:
1) Address specified in message
2) Borrower's home library email address
3) Koha admin email address
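A hedged sketch of that order of precedence (the hash keys and the $branch lookup are illustrative, not necessarily the names used in the patch):
use C4::Context;
# 1) address set on the message itself, 2) home library's email, 3) admin address
my $from_address =
       $message->{from_address}
    || $branch->{branchemail}
    || C4::Context->preference('KohaAdminEmailAddress');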
Test Plan:
1) Set your library email addresses, and the KohaAdminEmailAddress
Make sure each of them is unique
2) Choose a borrower, enable the enhanced messaging and enable the
checkout and checkin email notices. Use your email address for
the borrower's email so you can receive the emails.
3) Check out an item, check the from address of the email,
it should be the email address set in KohaAdminEmailAddress
4) Apply the patch
5) Return the item, check the from address of the email,
it should match the email address set for the borrower's
home library.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If a lot of items have the same callnumber, the order should be on the
itemnumbers. Otherwise the left side is always filled with the same
items.
+ Fix a bad c/p for the next link (when js is disabled).
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The next and previous links should completely refresh the shelf.
For example:
[<] [1] [2] [3] [4] [5] [6] [>]
Before this patch, the next and previous links were the same as the 1
and 6 links.
With this patch, after clicking on next, we will get:
[<] [7] [8] [9] [10] [11] [12] [13] [>]
This patch adds a new AJAX script to get the shelf browser block.
Test plan:
- On a detail biblio page, click on a "Browse shelf" link.
- Play with the next and previous links.
- Deactivate Javascript (using NoScript for example) and check that you
get the same behavior (but the page is reloaded).
- Launch the unit tests: prove t/db_dependent/ShelfBrowser.t
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- improve POD
- remove extraneous comments
- correct license statement in new files
- remove backticks in database update SQL
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch introduces a new Z39.50 interface for searching Z39.50
compliant databases for MARC authority records.
These databases aren't as common as their bibliographic equivalents,
but they're out there and very useful. I have included info at the
bottom of this message for sample authority databases you can try.
To test this patch:
1) Set up Z39.50 client targets for authority databases. (I've included
information at the bottom of this message for LibrariesAustralia's
test server for authorities, as well as instructions on how to use
your Koha's Z39.50 authority server. The Library of Congress
also has authority databases available (unsure if these are test or
prod), and you might have access to others through OCLC or RLIN. OCLC
provides login credentials for their test databases.)
2) Go to the Authorities module
3) Click on the new "Z39.50 search button"
4) Select your authority search targets from the list.
5) Do a search for an authority you would like using either the "Raw"
input box or the more specific input boxes for names, subjects, subject
sub divisions, or titles. (I like searching Name (personal): Eric on
the LibrariesAustralia test DB.)
6) You should see a table listing the server, heading, authority type,
and two other columns (MARC and a nameless column). "Authority type"
is the type of authority it will become when imported in to Koha. In
the Eric example, "PERSO_NAME".
7) Click on "MARC" next to the results of interest to review the MARC
authority record.
8) When you're satisfied with a record, click on "Import".
9) The pop-up window will close and your original Koha window will
change to the "Adding authority Personal Name" screen (in the Eric
example).
10) All the relevant fields should be filled out for the record. Review
them and make any changes as necessary. (N.B. The 001 will be cleared
when saved, so if you have a use for the imported control number, move
it to the 010, 016, or 035 as appropriate. If you have a default value
for the 003, this will also likely be overwritten. Move it if necessary.
The 005 will also be updated when saved, so do not worry about that.)
11) When you're satisfied, click save.
12) Presto! You've imported your first authority record via Z39.50!
--
Here is the info for the LibrariesAustralia test Z39.50 authority
database:
Z39.50 server: LibrariesAustralia Authorities
Hostname: z3950-test.librariesaustralia.nla.gov.au
Port: 210
Database: AuthTraining
Userid: ANLEZ
Password: z39.50
Syntax: MARC21/USMARC
Encoding: utf8
-
The U.S.A. Library of Congress also provides Z39.50 access to its Name
and Subject Authorities (http://www.loc.gov/z3950/lcserver.html).
Name Authority:
Z39.50 server: Library of Congress Name Authority File
Hostname: lx2.loc.gov
Port: 210
Database: NAF
Syntax: MARC21/USMARC
Encoding: utf8
Subject Authority:
Z39.50 server: Library of Congress Subject Authority File
Hostname: lx2.loc.gov
Port: 210
Database: SAF
Syntax: MARC21/USMARC
Encoding: utf8
(N.B. Both of these databases also include title authorities.)
-
For testing purposes, you can also set up a Z39.50 client target,
which points at your own Koha instance's Z39.50 authority server.
To find the hostname, go to /etc/koha-conf.xml and find the value for
the <listen id="authorityserver"> element. Depending on your
configuration, this could be something like the following:
unix:/zebra/koha/var/run/zebradb/authoritysocket
(N.B. You might be using a different scheme than unix sockets...)
To find the database, scroll down to the bottom of koha-conf.xml until
you reach the <config> element. Within this, look for the value of the
element <authorityserver>. It should probably be "authorities".
To set up this Z39.50 client target in Koha...
Z39.50 server: my koha authorities
Hostname: unix:/zebra/koha/var/run/zebradb/authoritysocket
Port:
Database: authorities
Userid:
Password:
Syntax: MARC21/USMARC (or whichever flavour you need)
Encoding: utf8
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - Add a z39.50 interface for authority searching
This patch adds the "recordtype" column to the "z3950servers" table.
The value in this column (biblio or authority) then controls whether
the z3950 server shows up in a bibliographic search (through the
Acq and Cataloguing modules) or in an authority search (through
the Authorities module).
I also edited the z3950 management console to show this value
and allow users to edit it. The default value is "biblio", since
the vast majority of z3950 targets will be bibliographic. However,
there is an option to add/edit a z3950 target as a source of
authority records.
Test Plan:
1) Apply both patches
2) Run updatedatabase.pl (after setting your KOHA_CONF and PERL5
environment variables)
3) Use the test plan from the 1st patch
N.B. Make sure that your Z39.50 client target has a Record Type
of Authority, otherwise it won't display when you're doing a
Z3950 search for authorities.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - fix tabs/whitespace errors to pass QA
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The current implementation didn't build the cookie array correctly,
yielding login problems in some scenarios.
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Passing language=<valid_language_code> as a parameter in any Koha URL
can be used to set the desired language.
This patch touches
- C4::Templates
- C4::Auth
Adds a new method getlanguagecookie that does exactly that, for use in
get_template_and_user.
Also modifies getlanguage so it checks (a) whether there is a 'language'
parameter in the CGI object and (b) whether it is valid and enabled for
the desired interface.
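A hedged sketch of the kind of check getlanguage can now perform (the syspref lookup shown is an assumption; only the 'language' CGI parameter comes from the description above):
# prefer an explicit, valid 'language' CGI parameter over the cookie
my $requested = $cgi->param('language');
if ($requested) {
    my @enabled = split /\s*,\s*/, ( C4::Context->preference('opaclanguages') // '' );
    if ( grep { $_ eq $requested } @enabled ) {
        $language = $requested;    # will end up in the language cookie
    }
}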
To test:
* Without the patch
- access any koha page
- add ?language=code to the end of the URL (replace code with a valid language code;
it needs to be installed using 'perl translate install code', and enabled either for
the staff or OPAC interface, depending on where you are testing)
- Nothing happens with the language parameter
* With the patch
- access any koha page
- add ?language=code (the same as before) and hit enter
- the language should be changed to the one you chose
- if you browse through some links, you will see that
Koha 'remembers' the language you passed as a parameter
(i.e. the language cookie has been updated).
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Brendan <brendan@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works very well. No errors.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
More comments on last patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Small patch to make koha-qa happy.
Fixes a small POD error.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test:
* SIP: Have an old user and create a new user
- use either the telnet SIP test or
C4/SIP/interactive_patron_check_password.pl to check old
userid/password
- do the same for the new user
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described
Test
1) using perl C4/SIP/interactive_patron_check_password.pl
can check current (short) and new (long) passwords
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test:
* LDAP:
- Turn on LDAP auth in koha-conf.xml. Set "update" in your server config to 1
- Change user's password on LDAP
- Login to Koha using LDAP - the Koha password should be updated; to check:
- Turn off LDAP auth in koha-conf.xml
- You should be able to log in with the new password
I do not have an LDAP facility, so I cheated. I ran
perl -e 'use C4::Auth_with_ldap; C4::Auth_with_ldap::_do_changepassword("srdjan", 1000022259, "srdjan");'
and was able to change the password.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described.
Test
1) change <useldapserver> to 1
2) copy/paste sample <ldapserver> config from perldoc C4/Auth_with_ldap
3) using sample script was able to change password,
use (userid, borrowernumber, newpass) as arguments
4) checked with OPAC and in database
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
RM note: Digest::MD5 is used in C4::ImportExportFramework as part
of an unnecessary reimplementation of functionality supplied by
File::Temp. See bug 10991 for a proposal to remove it.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
What this patch aims to accomplish:
* All new passwords are stored as Bcrypt-hashes
* For password verification:
- If the user was created before this patch was applied then use
MD5 to hash the entered password <-- backwards compatibility
- If the user was created after this patch was applied then use
Bcrypt to hash the entered password
* Any password change made via the staff interface or the OPAC will
be automatically Bcrypt-hashed; this applies to old users whose
passwords were stored as MD5 hashes previously
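A minimal sketch of the backwards-compatible check described above, assuming Crypt::Eksblowfish::Bcrypt for the new hashes (illustrative, not necessarily the exact code in the patch):
use Crypt::Eksblowfish::Bcrypt qw(bcrypt);
use Digest::MD5 qw(md5_base64);
sub password_matches {
    my ( $entered, $stored ) = @_;
    if ( $stored =~ /^\$2/ ) {
        # Bcrypt hash: re-hash the entered password with the stored settings
        return bcrypt( $entered, $stored ) eq $stored;
    }
    # legacy MD5 hash created before this patch
    return md5_base64($entered) eq $stored;
}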
Test plan:
1) Add new users and check whether their passwords are stored as
Bcrypt hashes or not.
2) To test that authentication works for both old as well as new
users:
a) Login as an existing user whose password is stored as a
MD5 hash
b) Login as an existing user whose password is stored as a
Bcrypt hash
3) In the staff interface, change the password of an existing user
whose password is stored as an MD5 hash
a) Check the new password is stored as a Bcrypt-hash in the database
b) Try to login with the new password
4) In the OPAC, verify that
a) Old user with old pass can change password, new format
b) New user with new pass can change password
c) Old and new user with self-updated pass can login
Whitespace cleanup was contributed by Bernardo Gonzalez Kriegel.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10925 removes the last call to C4::Utils.
The module becomes useless and can be deleted.
Verify that t/db_dependent/Context.t still passes successfully, and that
these searches return no results:
git grep hashdump
git grep maxwidth
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, no subs from the module are used anywhere
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This fixes a regression introduced by the patch for bug
9394 -- when printing a hold slip using the 'print and confirm'
button, the slip would contain only the text 'reserve not found',
not a full hold slip.
This patch also adds a regression test.
To test:
[1] Check in an item that would fill a hold. Use the 'print
and confirm button' to confirm the hold.
[2] The printout will only contain text to the effect of
'reserve not found'.
[3] Apply the patch.
[4] Repeat step 1. This time, a full hold slip should be printed.
[5] Verify that prove -v t/db_dependent/Reserves.t passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Pass all tests, new and old, and QA script.
Verified wrong and corrected behaviour.
Note: Sometimes the message 'reserve not found' will not show up, but
hold information for a different record will. This happens when a
reserve_id matching the borrowernumber of the patron in question
already exists in your database.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To reproduce:
1/ Edit your apache virtual host and set the DEBUG environment variable
(SetEnv DEBUG 1).
2/ Try to login with an ldap user
3/ You will be redirected to the 500 error page.
The Koha logs contains:
malformed header from script. Bad header=------------------------------: mainpage.pl
The hashdump routine directly prints to STDOUT (!) and breaks the
headers.
It appears Net::LDAP::?->dump does the same thing.
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Maybe we can kill C4::Utils after getting rid of this
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
- Configure an LDAP server and $KOHA_CONF, etc.
- Make sure ExtendedPatronAttributes is defined and that
there is no attribute defined that is specified to be
a unique ID.
- Try to log in using an account originating from the
LDAP directory.
- You will get a software error:
Can't use an undefined value as an ARRAY reference at
/home/koha/src/C4/Auth_with_ldap.pm line 183.
- Apply the patch.
- Try to log in again; this time it should work.
Signed-off-by: Nuño López Ansótegui <nunyo@masmedios.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Description:
A new pref ConfirmFutureHolds is added. When confirming a hold at checkin time,
the number of days in this pref is taken into account when looking for reserves.
Note that this pref does not interfere with renewing, issuing or transferring
a book. For the 'Holds to pull' report, the default end date is calculated with this
new preference.
ConfirmFutureHolds is only useful when future holds are allowed.
Test plan:
1) Enable future holds. Add a number of days into ConfirmFutureHolds.
2) Place a future hold within this number of days.
3) Run holds to pull report. Check default startdate and enddate.
4) Check this book in. Can you confirm the hold? Do not confirm.
5) Issue the book to another patron. You should not see a warning.
6) Renew the book for this patron via opac or staff. No warning either.
7) Check in again. Warning pops up again.
8) Transfer book. Switch branch. Check in. Hold found pops up. Do not confirm.
9) Back to first branch. Check in (with popup). Remove the hold. Add new future
hold past the number of days. Check in (no warn).
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch improves the POD for C4::BackgroundJob->get(). It also
fixes ->set() so that it cannot scribble over values that are properly
internal to the object.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
After executing a batch item modification, a dialog is displayed above
the result table. It contains the number of items (and total fields)
which have been modified.
Note that items that are selected for modification but which do not
end up actually being changed are not reported in the final counts.
This patch adds two methods to C4::BackgroundJob, ->set() and ->get(),
that allow background jobs to pass arbitrary data back to the client.
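As a hedged illustration of the idea (the exact signatures of ->set() and ->get() and the key names are assumptions):
# in the background job worker, record the counts as extra data
$job->set({ modified_items => $item_count, modified_fields => $field_count });
# later, when building the report for the client, read one value back
my $modified_items = $job->get('modified_items');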
Test plan:
1/ Go to tools/batchMod.pl
2/ Enter a barcodes list
3/ Check/uncheck items and fill some values to apply
4/ Save
5/ The table summary will be displayed with a dialog box on top:
XX item(s) modified (with YY fields modified)
Check that XX and YY correspond with what you expected.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
batch modification still seems to work correctly, with the helpful addition of the counter. Thanks!
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Sometimes when using the batch item modification tool, we would like to
automatically uncheck on-loan items.
This patch also adds a new routine in C4::Circulation, IsItemIssued(),
which, when passed an itemnumber, returns whether the item is
currently on loan.
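For illustration, a minimal usage sketch of the new routine (assuming it is exported by C4::Circulation like the other circulation helpers):
use C4::Circulation;
if ( IsItemIssued($itemnumber) ) {
    # the item is currently on loan, so leave its checkbox unticked
}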
Test plan:
1/ Go to tools/batchMod.pl.
2/ Enter some barcode (at least 1 should be on loan).
3/ Click on the Continue button.
4/ Click on the "Clear on loan" link.
5/ Check that on loan items are unchecked.
Launch the unit test file:
prove t/db_dependent/Circulation/IsItemIssued.t
http://bugs.koha-community.org/show_bug.cgi?id=10572
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Works as expected, only modifies items that are checked (still). No regression noted.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, works as advertised.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds several unit tests for GetHiddenItemnumbers and fixes the POD for it.
It also wraps the tests in a transaction so they are rolled back, modernizes the test file, and adds a license text to it.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
There should be a:
"require YAML;"
or
"use YAML qw/Load/;"
as the GetHiddenItems routine has a reference to YAML::Load.
This was discovered while adding a GetHiddenItems() call into
opac/opac-MARCdetail.pl. I believe this problem dates back to
bug 6488 or bug 5984.
I also added an optimization to GetHiddenItems to prevent
processing if there is nothing in the system preference. Test
by searching for a biblio which has some or all of its items
hidden.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Maybe
return () unless $yaml =~ /\S/;
or
return () if $yaml =~ /^\s*$/;
would have been easier to read.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
No need to create a variable just to send it as a parameter on the next line.
Signed-off-by: Alex Hatley <alexh@cctexas.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds functionality to forgive overdue fines when an item is
set to lost status. Fines are forgiven only when the syspref
WhenLostForgiveFine is set to yes. An item can be set to lost status from:
- catalogue/moredetail.pl
- cataloguing/additem.pl
- tools/batchMod.pl
- misc/cronjobs/longoverdue.pl
Changed subroutine C4::Circulation::LostItem to forgive fines on the
item depending on the value of syspref WhenLostForgiveFine. This
routine is currently used to return an item and charge a replacement
cost.
Also added a new syspref in C4::Circulation::LostItem -
WhenLostChargeReplacementFee. The replacement fee will now be charged
only if this syspref is set to yes. The default value of
WhenLostChargeReplacementFee is yes, meaning that the current behavior
will not change during upgrade.
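A rough, hedged sketch of the resulting behaviour inside LostItem (the helper names are hypothetical; only the syspref names come from the description above):
if ( C4::Context->preference('WhenLostForgiveFine') ) {
    forgive_overdue_fine( $borrowernumber, $itemnumber );      # hypothetical helper
}
if ( C4::Context->preference('WhenLostChargeReplacementFee') ) {
    charge_replacement_fee( $borrowernumber, $itemnumber );    # hypothetical helper
}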
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Alex Hatley <alexh@cctexas.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Testing notes on last patch in series.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test authentication via ILS-DI:
- with userid and password
- with userid and wrong password
- with cardnumber and password
- with cardnumber and wrong password
...
Before the patch, only the userid will authenticate the patron.
After the patch is applied, both userid and cardnumber will work.
To test:
- Run t/db_dependent/ILSDI_Services.t - all tests should pass.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors
Test:
Enable ILS-DI
access opac with /cgi-bin/koha/ilsdi.pl?service=AuthenticatePatron&username=XXX&password=YYY
With userid/cardnumber & password returns borrowernumber
With userid/cardnumber & wrong password returns PatronNotFound
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds unit tests for Authenticate patron.
To test:
- Run perl t/db_dependent/ILSDI_Services.t
- Verify all tests pass
Note: there are some tests marked as TODO.
The rewrite of AuthenticatePatron to make both cardnumber and userid
work for authenticating a patron will be implemented in the
next patch. Tests related to this are currently marked as TODO and show
as 'not ok', but the test suite still passes.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors
With both patches applied, all tests pass.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds a new column to item types. Text in this column is
displayed as a warning when an item of the given type is checked in.
The type of message can also be chosen, affecting how the message is
displayed.
Use case: Items that are on inter-library loan can have a separate
item type, and when items of this type are checked in a message
saying something like "ILL! Remember to return it to the owning
library!" can be displayed.
To test:
- Apply the patch
- Go to Home > Administration > Item types administration
- Check that there is a new column, called "Check in message"
- Edit an item type and add a check in message
- Check that the check in message you added is displayed in the table
- Check in an item with an item type that has a check in message
- Check that the message is displayed
- Repeat the steps above, but select "Alert" instead of the default
"Message" as the "Check in message type". Check that the message
is displayed in a yellow alert box, not a blue message box.
- Check in an item with an item type that does *not* have a check
in message, and make sure no false messages are displayed
- Create a new item type from scratch and check that it works
the way it is supposed to
- Run the tests in t/ItemType.t, which are updated by this patch
This patch also removes backticks around column names in the
itemtypes table in installer/data/mysql/kohastructure.sql
UPDATE 2013-07-22
- Rebased on current master (no changes)
- Added "AFTER summary" to the SQL statement in updatedatabase.pl
- Added another placeholder on line 170 of admin/itemtypes.pl
Thanks Katrin!
UPDATE 2013-07-29
- Make this message independent of all other messages - thanks Owen!
- Make it possible to choose the type of message ("alert" or
"message")
Sponsored-by: Kultur i Halland - Regionbibliotek
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed some tabs to make the QA script happy.
All old and new tests pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This allows keeping transfer information without having untranslatable
strings in the database.
Signed-off-by: sonia <koha@univ-lyon3.fr>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
On basket.pl and parcel.pl there is a 'Transfer' link which allows you to
transfer order lines from one basket to another.
The link leads to a new page which allows you to search for a bookseller,
then displays this bookseller's baskets. Then you can pick a basket and
the transfer will be done.
Signed-off-by: Marc Veron <veron@veron.ch>
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: sonia <koha@univ-lyon3.fr>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Before, ModBookseller always returned undef. This patch modifies it in
order to be more explicit.
Now it returns:
1 -> If a modification has been done
0E0 -> If the given ID doesn't exist
undef -> If no ID given
It also fixes one of the tests which didn't pass before
in t/db_dependent/Bookseller.t
To Test:
prove t/db_dependent/Bookseller.t
Bookseller.t .. 1/54
[Some warnings about uninitialized values]
t/db_dependent/Bookseller.t .. ok
All tests successful.
Files=1, Tests=54, 1 wallclock secs ( 0.03 usr 0.00 sys + 0.46 cusr 0.04 csys = 0.53 CPU)
Result: PASS
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Change is logical and passes new and old tests.
Fixed the author line to have Kenza's correct email address.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes DelBookseller return the number of suppliers it has
deleted, or undef if an error has occurred.
It also fixes a test which did not pass before in t/db_dependent/Bookseller.t
To test:
prove t/db_dependent/Bookseller.t
t/db_dependent/Bookseller.t .. 1/54
All tests successful.
Files=1, Tests=54, 1 wallclock secs ( 0.02 usr 0.00 sys + 0.48 cusr 0.02 csys = 0.52 CPU)
Result: PASS
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Logical change and makes another test pass :)
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When requested by the summary flags, the SIP server
should return the barcodes of the relevant titles in the
patron information response.
For available holds this is the barcode of the captured items.
For unavailable holds ( i.e. current unsatisfied holds ),
we need to send a barcode so that the unit can use this to
request the title info. The barcode could be any one
belonging to the title.
This patch also corrects the erroneous return of empty items
in the patron information response. If the unit supplies a
range of 1 - 100, it expects all copies unless the title has a hundred
or more copies. The server was erroneously stuffing
the returned arrays with null copies, so that all summary requests
returned 100 copies (mainly without barcodes).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Testing notes:
Used the test script provided on the bug report, but changed it
to match a SIP user and patron existing in my database.
Before applying the patch, the SIP responses show the behaviour
pointed out above regarding the 100 items. After applying the
patch and restarting the SIP server, responses are much cleaner,
not returning empty IDs.
64 Patron information response
AS = hold items
hold items count is correct.
AS contains barcodes of waiting holds.
Before patch, all AS were empty.
AT = overdue items
overdue items count is correct.
AT contains barcodes of overdue items.
AU = charged items
charged items count is correct.
AU contains barcodes of charged items.
AV = fine items
Judging from the behaviour seen and a comment in
Patrons.pm, this is currently not implemented.
BU = recall items
Recalls are not implemented in Koha yet.
CD = unavailable hold items
unavailable items count is correct.
CD contains barcode for item level holds and is empty
for title level holds where no item can be determined.
Before patch, all CD were empty.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes a copy-pasted copyright statement, and some incorrect
POD and indentation.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Switch to the new method of showing star ratings. Also, fix some
translation bugs, an error that occurred when caching was disabled and
add a stub unit test.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Show any relevant results from the OverDrive ebook/audiobook service
on the OPAC search. This is done by showing a link with "Found xx
results in the library's OverDrive collection" at the top of search
results and linking to a page that shows the full results.
This requires an OverDrive developer account, and is enabled by
setting the OverDriveClientKey and OverDriveClientSecret
system preferences. In addition, this patch adds the
OverDriveLibraryID system preference.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Henry Bankhead <hbankhead@losgatosca.gov>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gives you the option of sending a patron's home branch code
in an AF field for patron status requests. It is controlled at the account
login level, so it can be enabled on a per-SIP-login basis.
Test Plan:
1) Apply patch
2) Edit SIPconfig.xml, add the parameter 'send_patron_home_library_in_af="1"'
to the login you will be using to test.
3) Start your SIP2 server.
4) Connect to it via telnet ( something like: '9300CNterm1|COterm1|CPCPL|' )
5) Send a patron status request ( like: '2300120121110 82925AOCPL|AA23529000035676|ACterm1|ADletmein' )
6) Examine the response; you should see something like this:
"24 00120121210 085332AEHenry Acevedo|AA23529000035676|BLY|CQN|AFGreetings from Koha. |AFMPL|AO|"
Note the second AF field with the value MPL.
Signed-off-by: George Williams <georgew@latahlibrary.org>
Signed-off-by: Christopher Brannon <cbrannon@cdalibrary.org>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The tests were too restrictive. It should be possible to pass
a checkin/checkout test with different values defined for
magnetic media.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add a terminator option to SIPconfig.xml; the choices for 'terminator' are
'CR' or 'CRLF'. The default continues to be 'CRLF' if 'terminator' is
undefined.
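A minimal sketch of how the option could be honoured when writing responses (the $account hash is an assumption based on the per-login configuration described above):
# 'CR' or 'CRLF' comes from the account entry in SIPconfig.xml; default to CRLF
my $terminator = ( $account->{terminator} // 'CRLF' ) eq 'CR' ? "\r" : "\r\n";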
Test Plan:
1) Apply patch
2) Start SIP server
3) Run C4/SIP/t/04patron_status.t
4) Stop SIP server
5) Add terminator="CR" for account login 'term1'
6) Run 04patron_status.t again, you should see no change
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Adrien Saurat <adrien.saurat@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Case-insensitivity of the system preference cache was introduced in
the patch for bug 6132. This patch corrects some breakage that
occurred. Longer-term, IMO a hard look needs to be taken at
using a case-insensitive collation for syspref codes, and coded
values in general.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For Koha installations with multiple OPAC URLs, it would be nice to be
able to override sysprefs from the http conf file. Case in point:
a library wants to have two separate OPACs, one that is only viewable
from within the library and allows patrons to place holds, and a second,
public one that does not. In this case, overriding the system preference
RequestOnOpac would accomplish this simply, and with no ill effects.
This feature should of course only be used to override
cosmetic behaviour, and should not be used for system
preferences such as CircControl, but it would be great for preferences
such as OpacStarRatings, opacuserjs, OpacHighlightedWords and many
others!
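A hedged sketch of the kind of lookup this implies (illustrative only; the environment variable name pattern comes from the test plan below):
# let a SetEnv value from the Apache config win over the stored syspref
sub preference_with_override {
    my ($name) = @_;
    my $override = $ENV{ "OVERRIDE_SYSPREF_" . $name };
    return defined $override ? $override : C4::Context->preference($name);
}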
Test Plan:
1) Apply this patch
2) Disable the system pref OpacHighlightedWords
3) Do a search in the OPAC; note the term is not highlighted
4) Edit your koha-http.conf file, add the line
SetEnv OVERRIDE_SYSPREF_OpacHighlightedWords "1"
to your koha-http.conf file's OPAC section.
Also add the line
SetEnv OVERRIDE_SYSPREF_NAMES "OpacHighlightedWords"
to the Intranet section
5) Restart your web server, or just reload its config
6) Do a search; now your search term should be highlighted!
7) From the intranet preference editor, view the pref.
You should see a warning that this preference has been overridden.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the wthdrawn field in items and deleteditems to be
withdrawn instead. No functional changes are made.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Save for translation files (that will be fixed on next release),
only occurrence of wthdrawn is on updatedatabase.pl
No koha-qa errors.
This touches many files, and I did not test everything,
but all seems normal. I think that any problem could
be fixed later.
Perhaps both entries in updatedatabase.pl could be joined
into one, but that's for QA.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The patch corrects the issue -- the content of field 225 is now
displayed under Series.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Work as described, no koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comparing the XSLT with the normal view the patch seems to
work correctly. Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
CheckReserves was using the CircControl system preference to determine what
patrons an item can fill a hold for. It should be using ReservesControlBranch
instead.
Test Plan:
1) Set ReservesControlBranch to "item's home library".
2) Create an item at Library A, place holds for it for patrons at
Library B, Library C, and Library A in that order,
for pickup at the patrons home library.
3) Make sure the holds policy for Library A is set to
Hold Policy = "From home library" and
Return Policy = "Item returns home".
Make sure the holds policies for the other libraries are set to
Hold Policy = "From any library".
4) Check the item in at Library C, the hold for the patron at Library B
should pop up, even though it's in violation of the circulation rules.
Don't click the confirm button!
5) Apply this patch, and reload the page,
now the hold listed should be for the last hold,
the hold for the patron at Library A, which is correct.
This patch adds the subroutine C4::Reserves::GetReservesControlBranch as
an equivalent to C4::Circulation::_GetCircControlBranch.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed POD so that arguments and explanation match (C<$item>).
Also tested opac-reserves.pl for regressions.
Passes all tests, QA script, and Reserves.t.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Remove the uncalled sub GetCcodes.
Also remove a comment in opac-search.pl which is the remaining
reference to it and serves no useful purpose.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The OPAC description for an authorized value is not required to be
populated. In particular, if it is NULL, the staff description is
displayed instead.
This patch makes sure that the sort order (in OPAC mode) uses either
the staff description or the OPAC description as needed for each
value.
To test:
[1] Make sure that AdvancedSearchTypes includes "ccode"
[2] Arrange your CCODE values so the sort order for staff labels
is different from the sort order for OPAC descriptions. Also,
ensure that one of the OPAC descriptions is NULL. For example,
authorised_value | lib | lib_opac
--------------------------------------
ZZZ | A_STAFF | Z_PUBLIC
DDD | D_STAFF | NULL
AAA | Z_STAFF | A_PUBLIC
[3] Prior to the patch, any CCODE values where the OPAC description
is NULL will sort first in the OPAC advanced search page, even
if the displayed label shouldn't come first.
[4] Apply the patch.
[5] Verify that the collection list on the OPAC advanced search page
is now correct.
[6] Verify that the sort order on the staff advanced search page
has not changed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works nicely, tested in staff and OPAC.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Collection codes and shelving locations are displayed in the OPAC and
staff client via GetAuthorisedValues which currently sorts results by
"lib, lib_opac." Consequently if lib (the description for the staff
client) doesn't match lib_opac (the description for the OPAC) sorting
will appear to be nonsensical in the OPAC. GetAuthorisedValues can be
passed an $opac parameter, so this should be used to switch how results
are sorted. This patch implements such a switch.
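In other words, the switch amounts to something like this (a sketch, not
the committed code; column names as described above):
my $order = $opac ? 'lib_opac' : 'lib';
my $sth   = $dbh->prepare(
    "SELECT * FROM authorised_values WHERE category = ? ORDER BY $order");
$sth->execute($category);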
To test, modify your collection code or shelving location authorized
values so that lib and lib_opac do not match. Set your
AdvancedSearchTypes system preference to display the modified authorized
values and view the advanced search page in the OPAC and staff client.
Sorting should be correct in each case according to the correct value
(lib in the staff client, lib_opac in the OPAC).
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Tested in staff and opac and it works perfectly!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add an option to cleanup_database.pl to purge the search_history
entries older than X days.
Test plan:
- Apply patch
- Check that your test DB has some entries a little older than 30 days
and a few even older than that in search_history:
SELECT * FROM search_history WHERE time < DATE_SUB( NOW(), INTERVAL 30 DAY );
If not, modify some existing entries.
- Run cleanup_database with a fixed number of days (replace XX with
something higher than 30)
/misc/cronjobs/cleanup_database.pl --searchhistory XX
- Check that entries older than XX days got deleted from search_history
- Run without the day parameter
/misc/cronjobs/cleanup_database.pl --searchhistory
- Check that entries older than 30 days got deleted from search_history
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Need to check for definedness, not Perl truth.
Also adds description of the return value to the POD.
To test:
Run prove -v t/db_dependent/Circulation_transfers.t and verify that
the tests pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds return values to DeleteBranchTransferLimits:
1 if a Transfer Limit is deleted
undef if no parameter is given
0E0 if a wrong parameter is given
Additionally, it fixes and adds some tests in t/db_dependent/Circulation_transfers.t
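A minimal sketch of the return-value convention (parameter name assumed;
DBI's execute returns the number of deleted rows, or '0E0' when nothing
matched):
sub DeleteBranchTransferLimits {
    my ($branch) = @_;
    return unless defined $branch;    # undef when no parameter is given
    my $sth = C4::Context->dbh->prepare(
        "DELETE FROM branch_transfer_limits WHERE fromBranch = ?");
    return $sth->execute($branch);    # rows deleted, or '0E0' if none matched
}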
To test :
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=14, 19 wallclock secs ( 0.02 usr 0.01 sys + 0.39 cusr 0.02 csys = 0.44 CPU)
Result: PASS
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested with patch for bug 10692 applied.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
There is nothing preventing '0' from being used as a library code.
To test:
Run prove -v t/db_dependent/Circulation_transfers.t and verify that
the tests pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch tests whether the parameters $toBranch and $fromBranch are given.
If not, CreateBranchTransferLimit now returns undef.
This patch also fixes and adds some regression tests in
t/db_dependent/Circulation_transfers.t
NOTE:
Currently, we can add a transferlimit to nonexistent branches because
in the database branch_transfer_limits.toBranch
and branch_transfer_limits.fromBranch aren't foreign keys.
To test:
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=15, 18 wallclock secs ( 0.02 usr 0.01 sys + 0.42 cusr 0.00 csys = 0.45 CPU)
Result: PASS
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Test plan :
Check that the regression tests still work
prove t/db_dependent/Branch.t
t/db_dependent/Branch.t .. 1/36 Using a hash as a reference is deprecated at t/db_dependent/Branch.t line 207.
t/db_dependent/Branch.t .. ok
All tests successful.
Files=1, Tests=36, 0 wallclock secs ( 0.03 usr 0.01 sys + 0.12 cusr 0.00 csys = 0.16 CPU)
Result: PASS
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
[1] Turn on the syspref for enabling OPAC holds.
[2] Create an item and bring it up on the OPAC search
results. Run through the following possibilities,
by changing the item, and verify that the place hold
link in OPAC search results appears only when the item is
- not lost AND
- not withdrawn AND
- not damaged (or is damaged and AllowHoldsOnDamagedItems is ON) AND
- the item is not marked not-for-loan OR
the item has a negative notforloan value (e.g., it is on order);
a boolean sketch of this check follows the test plan.
Note that it is necessary to reindex the test bib after making
each change to the test item.
[3] Also verify that whether or not the item is in transit does
NOT affect whether the place hold link appears.
[4] Verify that there is no regression on bug 8975 (i.e., if an
item is on order, that status should be displayed in staff client
search results).
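The holdability condition in step [2] boils down to a check along these
lines (a sketch only; field names taken from the items table, not the
actual template logic that was patched):
my $holdable =
       !$item->{itemlost}
    && !$item->{withdrawn}
    && ( !$item->{damaged}
         || C4::Context->preference('AllowHoldsOnDamagedItems') )
    && $item->{notforloan} <= 0;    # 0 = for loan, negative = e.g. on order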
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In search results, one could not place a hold on an item in transit
and for loan (items.notforloan=0). This appears when AllowOnShelfHolds
is allowed.
This patch repairs a regression introduced by the patch for bug 8975.
Test plan :
- Set AllowOnShelfHolds to on
- Create a record with a normal item : not lost, not withdrawn, not
damaged, notforloan=0
- Index this record
- Perform a search on OPAC that returns this record (and others)
=> You see in actions "Place hold"
- Add this item in transit : /cgi-bin/koha/circ/branchtransfers.pl
- Re-perform the search on OPAC
=> You see in actions "Place hold" and item "in transit"
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
1/ delete_report should return undef if no parameter is given.
2/ delete_report returns the number of affected rows.
3/ delete_report should be tested with 1 and more parameters.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The first patch added bad indentation to this routine. This patch fixes
that.
Also, the $sth->finish statement is useless and was removed.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the option to select multiple saved reports for
deletion.
To test you must have two or more saved reports to delete. Deletion
should work properly when:
- Selecting one report for deletion by checking the box.
- Selecting more than one report for deletion by checking boxes.
- Clicking the old "Delete" link
Clicking the delete button should prompt you to confirm. Clicking cancel
should cancel.
Clicking the delete button when no boxes are checked should trigger an
alert asking you to select reports for deletion.
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Functional tests pass, template tests pass.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For some reason, C4::HoldsQueue::MapItemsToHoldRequests used the system
preference AutomaticItemReturn to decide whether to attempt to fill local
holds with local items. No explanation of this behavior was provided.
This patch removes this behavior, and also adjusts the calculation
of the lead-cost library to always return the pickup library if it
is on the list of libraries that could fill the hold -- on the
basis that if the item is already at the pickup library, its
transport cost is inherently zero.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes QA script and adds unit tests.
Tested with some examples and those worked correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the example template syntax in the POD for
C4::Branch::GetBranches() to use Template Toolkit syntax.
To test, view the POD for C4::Branch::GetBranches() and confirm that it
looks correct.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Checked the POD with "perldoc C4/Branch.pm" before and after applying
the patch. The example now uses TT syntax, and looks sensible.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the example template syntax in the POD for
C4::Creators::Lib::html_table() to use Template Toolkit syntax.
To test, view the POD for C4::Creators::Lib::html_table() and confirm
that it looks correct.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Checked the POD with "perldoc C4/Creators/Lib.pm" before and after applying
the patch. The example now uses TT syntax, and looks sensible.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the example template syntax in the POD for
C4::Items::GetItemStatus() to use Template Toolkit syntax.
To test, view the POD for C4::Items::GetItemStatus() and confirm that it
looks correct.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
This patch works as advertised (verified with "perldoc C4::Items"),
for GetItemStatus, but it does not fix a similar example for
GetItemLocation in the same file, which still has the old template
syntax. So a followup or separate bug for that is called for.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
It seems the default option is not used in templates.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the example template syntax in the POD for
C4::Koha::GetSupportList() to use Template Toolkit syntax.
To test, view the POD for C4::Koha::GetSupportList() and confirm that
it looks correct.
Signed-off-by: Magnus Enger <magnus@enger.priv.no>
This patch works as advertised (verified with "perldoc C4::Koha"),
for GetSupportList, but it does not fix a similar example for
GetItemTypes, getauthtypes and getframework in the same file,
which still has the old template syntax. So a followup or separate
bug(s) for those are called for.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
It seems the default option is not used in templates.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
With IndependentBranches turned on, if you try to check out an item
which belongs to another library you will get an error message which is
missing the library name. This patch corrects the problem by passing the
necessary variable to the template and outputting the library name using
the KohaBranchName TT plugin.
To test, turn on IndependentBranches and try to check out an item
belonging to another library (note that you must test with a staff user
who is not a superlibrarian). The error message you see should include
the name of the library to which the item belongs:
"This item belongs to Nelsonville and cannot be checked out from this
location."
Checkouts of items belonging to the library should be unaffected.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The OPAC still uses the old tablesorter plugin which isn't being
actively maintained. We use DataTables in the staff client and should in
the OPAC too. The plugin was added a while ago but never implemented on
any pages. This patch upgrades the plugin to the latest version and
places it in opac-tmpl/lib for cross-theme access. The patch implements
DataTables on all pages which previously used the tablesorter plugin.
The old tablesorter plugin is removed.
The customized DataTable configuration script, datatables.js, has been
trimmed-down from the staff client version in order to limit it to only
that functionality required in the OPAC.
Sorting based on date is done based on the data's enclosing <span> title
attribute as it is in the staff client:
<span title=" [% iso date %]">[% date | $KohaDates %]</span>
Slight modifications to Serials.pm and opac-search-history.pl have been
made to accommodate this change.
To test, view each page in the OPAC which uses JS-based table sorting:
- The bibliographic detail page
- The cart
- The search history page
- The suggestions page
- The tags page (logged in as a user who has entered tags)
- The "most popular" page (opac-topissues.pl)
- The logged in user summary page (opac-user.pl)
- The subscription "full history" page (opac-serial-issues.pl?selectview=full)
- The self-checkout main page (with existing checkouts)
Table sorting should work correctly on all pages in both the prog and
ccsr themes. Sorting should work for dates whatever your dateformat
system preference setting. Tables listing titles should exclude articles
("a," "an," and "the" in English) when sorting.
Also test the serial collection page in the staff client, which is
affected by the change to Serials.pm. Confirm that dates are displayed
and sorted correctly.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, works as advertised!
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works really nicely on all pages.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In C4::Acquisition::ModReceiveOrder, a call to NewOrder is badly used.
NewOrder returns ($basketno, $ordernumber) but in ModReceiveOrder the
ordernumber is obtained with
my $ordernumber = NewOrder( $args );
It works because:
sub t {
    return ("a", "b");
}
my $a = t();
say $a;
Will display 'b'.
But it is not really clear.
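One way to make the intent explicit would be a list assignment that
discards the basket number (a sketch of the idea, not necessarily the
committed change):
my ( undef, $ordernumber ) = NewOrder($args);   # take only the second return value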
Test plan:
Check that there is no regression for partial receives.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
To test:
Verify that prove -v t/db_dependent/RotatingCollections.t passes
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, passes UT provided by bug 10653
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch teaches GetHoldsQueueItems to consult
the item-level_itypes system preference and return
the item-level or bib-level item type accordingly.
To test:
- Arrange so that an item that shows up on the holds queue
report has one item type while its bib has a different one.
- Run the report with item-level_itypes ON. Verify that
the item-level item type is displayed.
- Change item-level_itypes to OFF. Run the report again and
verify that the bib-level item type is displayed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The hold queue report shows collection code but not item type. This
patch adds it. Also added is use of the KohaAuthorisedValues template
plugin to display the collection code description instead of code.
To test, apply the patch and view the holds queue. There should be a new
item type column showing an item type description for each row. The
collection column should now show the collection description instead of
code.
Signed-off-by: Melia Meggs <melia@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
When a patron changes to a category with an enrollment fee, the fee
is not charged.
Test plan:
- Choose a category without fee (e.g. Kid)
- Add an enrollment fee for another category (e.g. Young adult)
- Choose a kid and change his category to "Young adult".
- Note the warning message "Fees & Charges: Patron has Outstanding fees
& charges of XX" on the check out page.
This patch also moves all instances of adding the enrollment fee
to a new routine in C4::Members, AddEnrolmentFeeIfNeeded(), so
additional tests include:
- Register a new patron and give it a category that has
an enrollment fee. Verify that the fee is charged.
- Renew the patron. Verify that the additional fee is charged.
- Register a new patron with a child patron category.
- Use the 'update child to adult' menu option to change the
patron's category to one that is fee-bearing. Verify that the
enrollment fee was charged.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
This reverts changes made to CanBookBeRenewed by
patches from bug 9367.
GetReserveStatus is not suitable for recognizing whether an item
can fill a hold on return, so CheckReserves is restored.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
This patch includes a squash of a follow-up authored by
Katrin Fischer <Katrin.Fischer.83@web.de>:
CheckReserves returns '' when no reserve is found,
so $resfound will always be defined and we need to
check if it's true.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Enable the syspref emailLibrarianWhenHoldIsPlaced
2) Modify the HOLDPLACED notice, add some item level fields
3) Place an item level hold
4) Check the email you receive ( or just look at it from the db )
You should see that the item level fields are now populated
5) Place a title level hold
6) Check the email you receive - item fields are not populated,
but the notice still looks OK.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This makes the POD for the columns() function consistent
with the rest of C4/Members.pm. It also removes a note
that can be relegated to the bug report and the Git
history.
Also, since C4::Members::columns() is not actually a
class method, this patch changes the invocation to
not call it that way.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The initial thought was to remove this function. However,
tools/import_borrowers.pl uses it. So rather than remove
it to solve the problem, it was reworked to a more generic
solution which runs faster.
By accessing $sth->{NAME} directly, the driver becomes
responsible for filling it correctly. This happens when a SELECT
is done on the borrowers table. It does not even have to have
data in the result set!
The columns method could be more generic and used elsewhere too.
Comparison between the old method and the STH method showed a
significant time difference. The old method took 35 seconds
for 40k iterations versus 19 seconds for the same amount of
iterations with the STH method regardless of the size of the
borrowers table.
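Roughly, the STH method works like this (a sketch; the table and variable
names are illustrative):
my $sth = C4::Context->dbh->prepare('SELECT * FROM borrowers LIMIT 0');
$sth->execute;
my @columns = @{ $sth->{NAME} };   # the driver fills NAME even with an empty result set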
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If IndependentBranches is ON, patrons are not allowed to place
hold requests on items whose owning library is different from
the patron's home library, *unless* the canreservefromotherbranches
system preference is also ON.
The patch implements the intended behavior; without it, IndependentBranches
and canreservefromotherbranches were not consulted during the
item holdability check.
To test:
[1] Have IndependentBranches ON and canreservefromotherbranches
OFF. Make sure that the circulation rules are set up to
permit patrons to place hold requests in general.
[2] In the OPAC, log in as a patron from library A, and try placing
a hold on an item from library B. The patron will be able to
place the request.
[3] Cancel the request.
[4] Apply the patch.
[5] Try placing the same hold request. This time, the request should
be forbidden.
[6] Turn on canreservefromotherbranches.
[7] Try placing the hold request. This time, it should go through.
[8] Cancel the request.
[9] Turn off IndependentBranches.
[10] Try placing the hold request and verify that it is permitted.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish() and replaces prepare_cached with prepare.
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add validation of the value of the KohaOpacRecentSearches cookie. In
particular, this patch avoids the generation of an internal server
error when the OPAC is presented with an old cookie that uses the
old Storable-based serialization.
This patch also moves parsing of the cookie value into a
new routine in C4::Auth, ParseSearchHistoryCookie, and adds
a test case.
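A rough sketch of the kind of defensive parsing described (the signature
and the JSON serialization are assumptions, not the exact committed code):
use JSON qw( decode_json );

sub ParseSearchHistoryCookie {
    my ($cookie_value) = @_;
    return [] unless $cookie_value;
    my $parsed = eval { decode_json($cookie_value) };
    # old Storable-serialized cookies (or any garbage) fail to decode and are ignored
    return ( !$@ && ref $parsed eq 'ARRAY' ) ? $parsed : [];
}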
To test (in conjunction with the previous patch):
Exercise the OPAC search history functionality, after
turning on the EnableOpacSearchHistory syspref:
- As an anonymous user, conduct a variety of searches,
including ones that include non-ASCII characters
- Check the search history and verify that all searches
are listed
- Apply this patch and the previous one.
- Do *not* clear the KohaOpacRecentSearches cookie
- Check the search history and verify that no searches
are listed any more
- As an anonymous user, conduct a variety of searches,
including ones that include non-ASCII characters
- Check the search history and verify that all searches
are listed
- Log into the OPAC
- Verify that current and past searches are listed in
search history.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
Exercise the OPAC search history functionality, after
turning on the EnableOpacSearchHistory syspref:
- Clear the KohaOpacRecentSearches cookie
- As an anonymous user, conduct a variety of searches,
including ones that include non-ASCII characters
- Check the search history and verify that all searches
are listed
- Log into the OPAC
- Verify that current and past searches are listed in
search history.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When the Quote of the Day tool selects a new quote, it updates the
timestamp and does not take the timezone into account. Thus the time is
set to +4 hours (e.g. 2013-06-11 13:33:48 when the time is 2013-06-11
09:33:48). It then repeats the same quote every day.
To replicate:
Set Administration >> System preferences >> OPAC preferences >> Features
>> QuoteOfTheDay to Enable
In Home >> Tools >> Quote Editor, add several quotes.
In the opac, refresh the home page. You should get a quote of the day at
the top.
mysql> select * from quotes;
Note the timestamp of the quote selected by the tool. It will not match
the date on the machine (unless your server's timezone is set to UTC).
If you change the date to the previous date and refresh the opac, the
tool will select another quote, which will not change unless forced.
Test Plan:
1) Remove all your quotes and import a fresh set
2) Enable the quote of the day and view the opac
3) Look at your quotes table and note the timestamp is incorrect
4) Repeat steps 1 and 2
5) Look at your quotes table and note the timestamp is now correct
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
From-address and to-address were the same (patron's email) for
subscription alerts. This patch changes 'from' to the branch email
address or kohaadminemailaddress.
To test
- add a subscription in staff/serials in case you don't have any
- enable patron notifications for the subscription
- in the OPAC, subscribe to the serial
- in staff/serial, receive an issue of the serial
Before applying the patch, the email that is supposed to be sent
has the patron's email as 'from' and 'to' (and is likely to fail).
If you follow the steps after applying the patch, the email alert
should have the 'from' address of the patron's branch or
kohaadminemailaddress -- which should also work fine with the MTA/SMTP
you have set up for messaging.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch corrects the mixup for LC call number and control number.
Further, as suggested by Galen, it would be better to not introduce hardcoded
tags in the Z3950Search subs in Breeding.pm.
This patch resolves that by calling TransformMarcToKohaOneField.
Note that this only involves changes to _addrowdata and _isbn_show. These
subs are only used in building the displayed results table.
Additionally, for French UNIMARC installs publicationyear is used to fill
the Date column (copyrightdate is not used in those installs). The edition
statement is only used in unimarc_lecture_pub not in unimarc_complet.
Test plan:
Do some Z3950 searches and look for values in all result columns.
For MARC21 on LOC (and/or others):
Look for isbn 9780415964845 (check LCCN).
Look for author Rowling.
For UNIMARC on BNF2 (and/or others):
On BNF2 look for isbn 2070518426: the result contains a date and multiple ISBNs.
Look for title: Guide des candidats aux emplois de commissaire de police.
The third result shows the edition statement (if you use 205$a with a pub install).
Note that there are no results with LCCN here (just as before).
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested for MARC21 and UNIMARC (French lecture_pub install).
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
As Jonathan correctly noted, the new Z3950 response only showed one isbn
although more isbn numbers could be in the record and would be imported.
To resolve this display problem, I traverse them all now in the updated
routine _isbn_show. There is no change in the imported records.
Note that before this patch TransformMarcToKoha did put all isbn numbers in
one field, separated by pipes (for display only). This behavior is restored
now. The three regexes on the individual isbn numbers now seem to be
overkill, but I left them there for completeness.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested this on a fresh French install under UNIMARC with BNF server.
Tested it too for MARC21.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Refactors Z3950Search.
Disable batch record counts for z3950 records.
Test plan:
Do various Z3950 searches on multiple targets from Cataloging and Acquisition.
Behavior should not have changed.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- fix indentation in one line
- remove a commented-out warn
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A search-and-replace went a tick too far. This patch
adjusts the column alias in the query run in MergeHolds()
to reflect that the value being returned is the number of
hold requests, not an ID.
To test:
[1] This patch should have no visible changes to behavior. To
verify, pick two bib records that have hold requests on them,
then merge them together. Verify that the merged bib
contains all of the hold requests on it.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
* C4::Reserves::_FixPriority
- The previous code checked the cancellationdate. I think it is never
passed in with bad parameters, but in order to be sure I added a check
on this value.
- The reservedates array was never used.
* circ/circulation.tt
There was a bug: it was not possible to remove a hold from the
circulation page. Passing reserve_id fixes the issue.
* C4::Reserves::GetReserveId
This subroutine did not have a unit test.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch switches from using a combination of
biblionumber/borrowernumber to using reserve_id where possible.
Test Plan:
1) Apply patch
2) Run t/db_dependent/Holds.t
Signed-off-by: Maxime Pelletier <maxime.pelletier@libeo.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Booksellers::GetBooksellersWithLateOrders has an unused parameter.
The $branch variable is never used in the routine.
Test plan:
Check that no behavior changes on the late orders page.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I couldn't find any use of the branch parameter apart from
the one corrected by this patch. Also tested late orders,
couldn't find any problems.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Several system preference variables are unavailable to the OPAC login
template because they are not explicitly enabled for that page. Instead
of adding them to Auth.pm using the old method this patch uses the new
system preference check syntax using the Koha TT plugin.
The following preferences are now checked using this syntax in
masthead.inc:
OpacAddMastheadLibraryPulldown
UseCourseReserves
reviewson
OpacShowRecentComments
In order for the call in masthead.inc to the new plugin to work on all
OPAC pages "[% USE Koha %]" must be added to any template which
includes it (most of them).
Also in this patch: A change to Auth.pm to enable correct display of the
LibraryName in the title of the OPAC login page.
To test, turn on the above system preferences and confirm that the
relevant links appear under the OPAC's main search bar on all pages
including the login page.
Confirm that the text specified in the LibraryName system preference is
shown as the title of the login page.
Confirm that course reserves and comments are displayed correctly on the
biblio detail page.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
I checked both prog and ccsr - all seems well and the links are appearing and disappearing in accordance with the appropriate sysprefs.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works as described.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If you are not logged in to the OPAC, looking at the login page, and you
click the Lists button to see public lists it says there are none. This
patch corrects Auth.pm so that it loads the list of public lists in this
situation.
To test you must have at least one public list. Make sure you are logged
out of the OPAC and visit the login page (/cgi-bin/koha/opac-user.pl).
Clicking the "Lists" button should show you a list of public shelves.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Works as described, and the list button is not shown when OpacPublic is disabled.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Generating (e.g.) overdue notices can result in spurious warnings in
the cronjob logs:
$ ./misc/cronjobs/overdue_notices.pl -t -library CPL
prepare_cached(SELECT * FROM issues WHERE itemnumber = ?) statement handle DBI::st=HASH(0x54a7828) still Active at C4/Letters.pm line 589
This patch removes the warning by making sure that the relevant statement
handle is finished after fetching its first row of results.
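The pattern is roughly the following (a sketch; this is the one legitimate
use of finish(), fetching only the first row of a larger result set):
my $sth = $dbh->prepare_cached('SELECT * FROM issues WHERE itemnumber = ?');
$sth->execute($itemnumber);
my $issue = $sth->fetchrow_hashref;
$sth->finish;   # more rows may remain; finish() clears the still-active handle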
To test:
[1] Set up an overdue loan such that running overdue_notices.pl will
trigger the generation of a notice.
[2] Run overdue_notices.pl -t and note the warning message.
[3] Apply the patch.
[4] Run overdue_notices.pl -t again and note that the warning message
is no longer displayed.
[5] Check the message_queue table and verify that the overdue
notices generated in steps 2 and 4 have the same text.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch changes a few occurences of ISO-8859-1 to UTF-8
within the XML generation of the ILS-DI module.
To test:
- Activate ILS-DI system preference
- Go to [youropac]/cgi-bin/koha/ilsdi.pl
- Check all examples in the documentation for the correct
encoding
- Check GetAvailability gives you the correct encoding and
check the source for the correct encoding
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adjusting to reflect the removal of the branchcode parameter
to GetBranchCategories; also filter on the 'searchdomain'
library group type, as appears to have been intended.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The prototypes are not consistent: GetBranchCategory should return only 1 result
and GetBranchCategories should not have a categorycode argument.
This patch fixes that.
Test plan:
1/ Try to add/remove/modify a library.
2/ Add some groups
3/ Add these groups to a library
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::Circulation::GetUpcomingDueIssues is used in the advance_notices.pl
script. This patch corrects an error in its handling of the maxdays
parameter that resulted in it picking up *all* upcoming due loans and
recently overdue loans.
Test plan :
- Create an issue with a date due in the past
- Create an issue with a date due in two days
- Launch advance notices with due date in max 2 days : perl misc/cronjobs/advance_notices.pl -c -n -v -m=2
=> You get a warn "found 0 issues"
- Launch advance notices with due date in max 3 days : perl misc/cronjobs/advance_notices.pl -c -n -v -m=3
=> You get a warn "found 1 issues"
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
I did the following test :
- 1 book to check in 2 days
- 2 books to check in in the past
before applying the patch :
$perl ../misc/cronjobs/advance_notices.pl -c -n -v -m=2
getting upcoming due issues at ../misc/cronjobs/advance_notices.pl line 203.
found 1 issues at ../misc/cronjobs/advance_notices.pl line 205.
I changed the value of "-m" : 0, 1, 2, 3, 4
=> always 1 issue found (the book to check in in 2 days)
after applying the patch :
$perl ../misc/cronjobs/advance_notices.pl -c -n -v -m=2
found 0 issues
for m = 0, 1, 2 => 0 issues
$perl ../misc/cronjobs/advance_notices.pl -c -n -v -m=3
found 1 issues
for m = 3,4,5 => 1 issues (the book to check in in 2 days)
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, works as advertised.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adds a new system preference AllowTooManyOverride to control whether
a librarian can override the 'Too many checked out' message which is
currently always overridable.
Test Plan:
1) Apply patch
2) Run updatedatabase.pl
3) Attempt to check out 1 more item to a patron than the max issues
4) You should be allowed to override by default ( current behavior )
5) Set AllowTooManyOverride to "Don't allow"
6) Repeat step 3
7) You should be blocked from being able to issue the item
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
The new system preference is activated by default, which means there
will be no change in behaviour on update.
The system preference is correctly added to the database and .pref
files.
Test plan and QA script passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
Add/edit a supplier and check that the delivery time is set in DB.
Note: This patch cleans up the code (SQL query) to make it easier to see
if a problem occurred.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
C4::ClassSortRoutine::Dewey can pad the wrong part of a call number internally.
The subroutine get_class_sort_key tokenizes a call number string (splitting on
periods and whitespace) and counts the number of tokens that solely contain
digits. If there is only one such digit group, a comment in the code states
that it will pad said digit group. However, the bug is that the code assumes
said digit group is the first token, when this may not be the case.
In practice, this can cause poor sorting when a call number is in the form
of PREFIX _space_ 3DIGITS.
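The fix amounts to padding the digit group wherever it occurs instead of
assuming it is the first token, roughly like this (a sketch; the padding
width and style are assumptions):
my @tokens = split /[\s.]+/, $call_number;
my @digit_positions = grep { $tokens[$_] =~ /^\d+$/ } 0 .. $#tokens;
if ( @digit_positions == 1 ) {
    my $i = $digit_positions[0];                   # may not be token 0
    $tokens[$i] = sprintf '%-15s', $tokens[$i];    # pad that group, not $tokens[0]
}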
To test:
[1] Create two item records whose class scheme is set to
'ddc' (Dewey) and whose call numbers contain prefixes, e.g.,
J DVD 700.1 ABC and J DVD 850 DEF.
[2] Use the inventory tool to produce a list of items that includes
the two created in step 1. Observe that the items are sorted
in the incorrect order, with "J DVD 850 DEF" coming before
"J DVD 700.1 ABC". Alternatively, run the following SQL
to see the incorrect sort order:
SELECT cn_sort, itemcallnumber
FROM items
WHERE itemcallnumber LIKE 'J DVD%'
ORDER BY cn_sort;
[3] Apply this patch.
[4] Run misc/maintenance/touch_all_items.pl to force cn_sort to be
recalculated.
[5] Repeat step 2 and verify that the call numbers are now sorted
correctly.
Signed-off-by: Jason Etheridge <jason@esilibrary.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This routine has been introduced by commit 2d90fb22d4.
The only call has been removed by commit 9eba7dc594.
So now, this routine is useless.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The message fields which are returned in the SIP
Screen message field in a Patron Information response
had the dollar symbol hardcoded.
It would be possible to get the symbol from currency
but omitting any symbol would be consistent with the UI
and avoid problems with devices using weird encodings
for local currency symbols (e.g. the many variations
of the UK pound sign).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This routine is not in use and does not make sense. It should not be
used later.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Passes koha-qa.pl, no references to get_branch_code_from_name found.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If you enable another translation, and disable English, then if you don't
have a cookie set, or your browser is not set to that language, you will
get English. So you cannot disable English in either the staff client
or the OPAC.
This patch fixes the language selection to do the right thing.
To test you must have at least one other language installed besides
English. Apply the patch and disable the en translation. Koha should
fall back to one of the enabled translations.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
I added a patch description and test plan, missing from the
original patch.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I have tested with various combinations of activated languages
and have found no regression. If the cookie is set, the right
language is shown accordingly. Else the first language in the
list seems to be picked. It did never fall back to English
in my tests, when English was explicitly deactivated.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In parcels.pl, and other acquisition pages, the funds are not sorted
by name in the combo box. With a great number of funds, it is difficult
to find one.
This patch adds a default value to $orderby arg of C4::Budgets::GetBudgets.
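The change is essentially a fallback on the $orderby argument, roughly
(a sketch; the delegation to a generic search helper and its argument
list are simplified):
sub GetBudgets {
    my ( $filters, $orderby ) = @_;
    $orderby ||= 'budget_name';    # default: sort funds by name
    return SearchInTable( 'aqbudgets', $filters, $orderby );
}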
Test plan :
- Create a new fund with a name beginning with 'z' and set you as owner.
- Create a new fund with a name beginning with 'a' and set you as owner.
- Go to a vendor and click "Receive shipments"
- Look at fund combobox
=> Funds are sorted by name
Signed-off-by: Silvia Simonetti <s.simonetti@cineca.it>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
When testing make sure your funds would sort the other way
around when sorting by code instead of description.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the date published to the subscriptions tab in the staff
interface bib display and renames the former "Date" column to
"Date arrived".
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
There is currently no way to delete unused invoices (for example,
invoices created by mistake), and there really should be, since errors
and absent-mindedness can result in numerous empty invoices over the
course of years.
To test:
1) Apply patch.
2) Create three invoices in the Acquisitions module. For one of them,
receive at least one item. For the other two, do not receive any
items.
3) View one of the invoices that does not have any items on it.
4) Try to delete it. This should succeed.
5) View the invoice that has an item. There should not be any option
to delete it.
6) Do an invoice search that brings up the other invoice with no items
on it. Try to delete it from the results page. This should succeed.
7) Run the unit test:
> prove t/Acquisition/Invoice.t
8) Sign off.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass. I also did another test:
I cancelled all receipts from an existing invoice and then could
successfully delete it in the last step.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This fixes a bug where, assuming LDAP authentication is enabled, if a user
tries to log in while the LDAP server is down, the following fatal error
is displayed:
Can't call method "bind" on an undefined value at C4/Auth_with_ldap.pm line 134, <DATA> line 558.
This patch catches this error to allow normal authentication when the LDAP connection fails.
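A sketch of the guard inside the LDAP authentication routine (variable
names assumed; Net::LDAP->new returns undef when it cannot reach the
server):
my $db = Net::LDAP->new($hostname);
unless ($db) {
    warn "LDAP connection to $hostname failed";
    return 0;    # fall through to normal Koha authentication instead of dying on bind
}
my $res = $db->bind( $user, password => $pass );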
Test plan :
- Configure an LDAP connection with a host not having LDAP, i.e.:
<useldapserver>1</useldapserver>
<ldapserver id="ldapserver">
<hostname>localhost</hostname>
<base>dc=test,dc=com</base>
<user>cn=Manager,dc=test,dc=com</user>
<pass>passwd</pass>
<replicate>0</replicate>
<update>0</update>
<auth_by_bind>0</auth_by_bind>
<mapping>
<firstname is="givenname" ></firstname>
<surname is="sn" ></surname>
<branchcode is="branch" >MAIN</branchcode>
<userid is="uid" ></userid>
<password is="userpassword" ></password>
<email is="mail" ></email>
<categorycode is="employeetype" >PT</categorycode>
</mapping>
</ldapserver>
- Try to connect with mysql user (defined in koha-conf.xml)
- Try to connect with a user defined in borrowers
You may also try to connect with a working LDAP connection
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The DB field aqorders.biblioitemnumber seems to be unused except to get
the itype on the spent.pl page.
This information can be retrieved using another SQL join.
Test plan:
Try a complete workflow in the acquisition module: create an order,
receive it, play with the syspref AcqCreateItem.
Check that no regression is found and that the data for existing
orders don't change.
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If the OPAC reserve page is accessed without being logged in, the login form
is displayed along with a CAS authentication link (if enabled). A click on
this link leads to the CAS server, but on coming back to Koha the page shows
an error: "ERROR: No biblionumber received".
This is because the CAS link only contains the query path
"/cgi-bin/koha/opac-reserve.pl", not the query parameters.
This patch adds the query parameters to the URI sent to CAS.
Test plan :
- Enable CAS
- Go to the OPAC without being logged in
- Try to place hold on a record
=> You get to /cgi-bin/koha/opac-reserve.pl?biblionumber=XXX showing the authentication page
=> Check that the CAS link contains the query param "biblionumber"
- Click on the CAS link and log in
=> Check that you return to the reserve page logged in, with the biblionumber param
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I have followed the test plan as far as I could and the links
contain the biblionumber now, which they didn't before.
I couldn't check the CAS login, but my normal login worked
as expected.
All tests and the QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Most important: it no longer deletes all shelves!
Checks if there are ten borrowers for testing, but even works without them :)
When creating or modifying lists, takes name clashes into consideration.
Small change to _CheckShelfName in the VirtualShelves module, making it
possible to check a name for a list whose owner has been set to NULL. Note
that a test like field=? with an undef placeholder will not work in MySQL.
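The MySQL caveat means the query needs an explicit IS NULL branch, roughly
(a sketch; table and column names as described above):
my ( $where, @binds ) = defined $owner
    ? ( 'owner = ?', $owner )
    : ( 'owner IS NULL' );
my $sth = $dbh->prepare(
    "SELECT COUNT(*) FROM virtualshelves WHERE shelfname = ? AND $where");
$sth->execute( $name, @binds );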
Test plan:
How do you test a test? Well, you could run it on various databases..
But for real hacking, you could also add some debug lines.
I tested this by forcing 10 undefs in @borrowernumbers.
And by overwriting the return value of randomname with an existing name.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch revises the staff client list contents view to better match
staff client search results, showing more information and offering more
ways to interact with the contents than before.
- List contents output has been modified so that the staff client can
use XSLT-formatted data just as the OPAC can. As in the OPAC it
depends on XSLTResultsDisplay being enabled.
- A "toolbar" has been added which is similar to that in search results,
offering the option to add items to a Cart, add them to a different
list, place multiple holds, remove items from the list, or merge
records.
- This toolbar has been made to float on scroll like the one on the
MARC edit page.
- Library and shelving location have been added to the display of call
numbers. Call numbers are linked to a search as they are in search
results.
- Edit links are included just as they are in search results.
- Automatic focus on the add by barcode form has been removed so that
the page doesn't jump to the bottom unnecessarily.
- basket.js's "addMultiple" function has been modified so that it
receives an array of checkboxes rather than looking for checkboxes in
a specific form. This helps abstract its functionality for use on both
search results and lists. results.tt is modified accordingly.
- The page layout has been widened to make room for the increased amount
of information on the page.
- A new "merge" icon has been added to the default Bootstrap sprite.
To test:
- View both public and private lists in the staff client.
- View lists with and without contents.
- Test the functionality of options in the toolbar: Add to cart, add to
lists, place multiple holds, remove items, merge items.
- Test with users with and without cataloging privileges to confirm that
catalog-related controls are correctly shown or hidden.
- Test with XSLTResultsDisplay set both to "default" and empty.
- Since the staff client and OPAC use some of the same code, test that
lists in the OPAC have not broken.
- Since JavaScript was modified which affects both lists and search
results, confirm that adding items to the Cart and Lists from search
results hasn't been broken by this patch.
Revision corrects conditional display of hold link, hiding it in cases
where there are no items or the record's itemtype is not for loan.
Also corrected is the behavior of the Cart/List "save" button in order
to prevent it from submitting the "remove items" action which is the
default for the form.
Signed-off-by: jmbroust <jean-manuel.broust@univ-lyon2.fr>
Edit: Patch rebased against current master and hard-coded paths to
/prog/ corrected.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When clicking the login link for opac-user.pl in a multiple branch
scenario, the OPAC_CSS_OVERRIDE environment variable from the
koha-conf.xml file was ignored. It seems like it is working on every
page in the OPAC except for the login page.
Test Plan:
1) Set up a Koha server with 2 separate catalog configurations
( e.g. opac1.kohatest, opac2.kohatest )
2) Set the OPAC_CSS_OVERRIDE directive for separate css files
in each opac
3) Browse to the opac login page, note the css is not applied
4) Apply this patch
5) Reload the page, note the css is now applied
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The XISBN API uses the normalized ISBN of a biblio to get a list of ISBNs
of related editions, then searches via SQL in the database for biblios
with those ISBNs.
I noticed that if input ISBN has hyphens, the ISBNs returned by the
OCLC XISBN service also have hyphens; otherwise, if the input ISBN
doesn't have hyphens, the returned ISBNs don't either.
Consequently, an SQL query on biblioitems.isbn may not turn up
the right biblios.
Also, if a biblio has several ISBNs, only the first one can be found with
the original SQL query (isbn LIKE '$xisbn%').
This patch replaces the SQL query with a simple search "nb=$xisbn". This will
find the biblio from an ISBN with or without hyphens.
Test plan :
- Activate FRBRizeEditions and XISBN sysprefs
- Go to a biblio which has several editions
- Note its normalized ISBN (you may look in amazon links)
- Replace [ISBN] by biblio normalized ISBN in this URL : http://xisbn.worldcat.org/webservices/xid/isbn/[ISBN]?method=getEditions&format=xml&fl=form,year,lang,ed
- Go to this URL and see which ISBNs are returned
- Perform a simple search on those ISBNs : nb:1234567890
- Look at "Editions" tab
=> Check that the displayed biblios are the same ones you found by simple search
Signed-off-by: jmbroust <jean-manuel.broust@univ-lyon2.fr>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Removes the following exported but unused subs from Overdues.pm:
CreateItemAccountLine
UpdateAccountLines
CheckAccountLineLevelInfo
CheckAccountLineItemInfo
CheckExistantNotifyid
GetNextIdNotify
GetNotifyId
ReplacementCost
ReplacementCost2
GetOverdueDelays
GetOverduerules
Test plan:
It is hard to test the removal of something that was not used :) Try this:
Do a recursive grep on these routine names in the Koha codebase.
Compile some scripts that use the Overdues module.
And just to be sure we do not break something:
Go to Circulation: Do a checkout, checkin, place and confirm a hold.
Go to Patrons: go to Check out and go to Fines.
Run the command line scripts: fines.pl and overdue_notices.pl.
Go to opac-user.pl.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When displaying (in the OPAC) the set of records saved to a list,
an apparently random number (typically 33 or 34) is sometimes
displayed at the end of the publication description. In particular,
this can occur when XSLT is *not* being used to display search results;
this patch corrects the problem.
Now for the technical details:
This patch checks to see if "size" is undefined. If it is, we add a
blank (i.e. "") value to it in place of undef.
If we do not do this, calling "itemloo.size" will return the size
of the "itemloo" hash, rather than the value for the "size" key.
This is because "size" is a virtual method in Template Toolkit. It's
uncertain why the value is retrieved for the "size" key when there is
a defined value and why TT doesn't use the method instead, and that
it uses "size" as a method only if there is either no "size" key or
if the value tied to the "size" key is null/undef. This might be a
feature or it might be a bug in TT...
In the meantime, we will check to see if it's undefined. If it is,
we'll give it a value.
This bug has been identified in the opac-search.pl, search.pl and
addbooks.pl pages before. To address it, we're currently checking
if there is a "size" key, and if not...we're adding one with a blank
value.
This patch takes up that same idea, although I think it might be better
to rename the variable before passing it to TT in case the behaviour
of TT changes in the future in regards to how it handles virtual
methods.
N.B. Obviously, this only affects users not using XSLTs.
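As a minimal sketch of that idea (loop, variable names and data are
illustrative, not the exact patch):
my @itemloop = ( { title => 'Example', size => undef } );   # illustrative data

# ensure the "size" key always holds a defined value, so that
# [% itemloo.size %] reads the hash key instead of falling back to
# Template Toolkit's virtual size method (the hash element count)
for my $itemloo (@itemloop) {
    $itemloo->{size} = "" unless defined $itemloo->{size};
}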
--
Test Plan:
Before applying the patch:
0) Make sure you have opac search result XSLT turned off
1) Find bib records that do not have a 300$c (Dimensions) value.
2) Find bib records that do have a 300$c (Dimensions) value.
(N.B. These values should be stored in the `size` column of biblioitems).
3) Add items from both sets of records to a List
4) Note that records without a 300$c will display a number at the end
of the "Publication" description/string. It should be something like
33 or 34 in most cases.
5) Note that records with a 300$c don't display this number. They just
show the value from 300$c.
Apply the patch.
6) Clear your cache, refresh the page, etc.
7) Note that the number (e.g. 33 or 34) has disappeared from the end
of the "Publication" description/string.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Added a small comment at the end of this one line.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Are not used. Contain several FIXMEs.
Removing them makes life easier.
Test plan:
Actually, you cannot test this.
But for confidence: do a Z3950 search in cataloguing and acquisition.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Use the Perl module Library::CallNumber::LC to parse and split
LC call numbers when generating spine labels.
For example, QH541.15.C6 C25 2012 should be split as follows:
QH
541.15
.C6
C25
2012
To test, create an item with call number QH541.15.C6 C25 2012
and classification source LC, then create a spine label for that
item using a layout of type 'biblio' that has the split call numbers
option enabled. The call number should be split as indicated above.
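A rough sketch of what the splitting amounts to, assuming the module's
components() method (the actual label code wraps this in its own splitting
routine):
use Library::CallNumber::LC;

my $callnumber = 'QH541.15.C6 C25 2012';
my @lines = Library::CallNumber::LC->new($callnumber)->components();
# @lines should now hold the pieces printed on the label,
# e.g. 'QH', '541.15', '.C6', 'C25', '2012'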
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If a record has only one item, and that item has one item-level hold on
it, that hold will not show in the holds queue.
Test Plan:
1) Create 1 record with 1 item at BranchA
2) Create an item-level hold for that item, for pickup at BranchA by a
patron of BranchA
3) Run build_holds_queue.pl
4) View the holds queue for BranchA
5) Note the hold is not in there
6) Apply this patch
7) Re-run build_holds_queue.pl
8) View the holds queue again
9) Note that the hold is now there
Signed-off-by: George Williams <georgew@latahlibrary.org>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This module is currently packaged by Debian for Wheezy and by
Ubuntu for Precise and Quantal.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This replaces the previous hand-coded normalizer. Because
Library::CallNumber::LC appears to reject strings that aren't valid
LC call numbers, significant changes to the test cases were
made as well -- however, the one that really counts is the
last one, which verifies the sorting.
To recalculate the call number sort key for each item, it is necessary
to run misc/maintenance/touch_all_items.pl
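A minimal sketch of the normalization itself, assuming the module's
normalize() method; strings that cannot be parsed as LC call numbers may
come back undefined, which is why the test cases changed:
use Library::CallNumber::LC;

my $sort_key = Library::CallNumber::LC->new('QC145 .A57 V.12 1980')->normalize();
# $sort_key is a padded, sortable form of the call number
# (undef if the string is not a valid LC call number)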
To test, create item records with the following call numbers, setting
the classification sort to 'lcc':
QC100 .U57 NO. 555 1986
QC145 .A57 V.12 1980
QC145.45 .H4 D65 1998
QC995 .E29 1997
Next, make a report of them in the inventory tool. The items should be sorted
in the above order.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes a bug whereby XSLT files from the
prog theme would be used (for English OPACs and staff
interfaces) even if the user had created and enabled a
custom theme that provided override XSLT files.
This patch provides a clearer implementation of the fallback
logic and adds test cases.
To reproduce the bug:
[1] Set OPACXSLTDetailsDisplay to 'default' and English as the OPAC
language.
[2] Create a new OPAC theme, including copying the XSLT files.
[3] Set opactheme to the new theme.
[4] Make a change to koha-tmpl/opac-tmpl/NEWTHEME/en/xslt/MARC21slim2OPACDetail.xsl
[5] View a bib record in the OPAC. The change made in the previous step
is not reflected.
To test after applying the patch:
[6] Reload the bib record in the OPAC. The change made in step 4 should
now be reflected.
[7] (To be thorough) Go through the test plan for bug 8947
and verify that there is no regression.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The fines.pl script uses the system preference CircControl to decide
what branches circ rules to use for fine generation.
Recently, code was added to the returns system to recalculate the fine
at checkin time ( to support hourly loans ). The problem is that this
code does not respect CircControl.
Test Plan:
1) Set circ control to "the library you are logged in at"
2) Set different fines rules for two different libraries
3) Check an item out at library A, backdate the due date so it's overdue
and will have fines.
4) Check the item in at library B
5) Observe that the fines should be generated based on library A's rules,
but the fines will be based on library B's rules instead!
6) Apply the patch
7) Repeat steps 3 and 4.
8) Observe now that the fines should reflect the fines rules for Library A
Note: it seems counter-intuitive for the fines system to behave this way
based on the preference being set to "the library you are logged in at"
but it does make sense. The rules used are from "the library you are
logged in at" when the item is first checked out.
If the fines system really did use the rules for the library the item was
returned to, it would be easy to exploit the library system. Some systems
using Koha have branches that charge fines, and others that don't, so
a patron could just return any overdue items to a non-charging branch
to avoid ever paying fines!
Furthermore, it would mean that the fines.pl script would be using one
set of rules to charge fines, and the returns system could possibly be
using another. Since fines.pl has been around far longer, it makes sense
to assume the fines.pl behavior is canonical.
Signed-off-by: Mickey Coalwell <mcoalwell@nekls.org>
Signed-off-by: George Williams <georgew@latahlibrary.org>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Merged with reservations; see comment on bug report for details.
On by default.
To Test
1/ Create an overdue item, that should get fines
2/ Return the item
3/ Check the borrowers record to see if the fine has been added/updated
Apply patch
1/ Make sure preference is set to do
Repeat steps 1-3 above
2/ Switch the preference to don't
Repeat steps 1-2
3/ Check the fine hasn't been added/updated
Signed-off-by: David Cook <dcook@prosentient.com.au>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass, works as described.
I would categorize this a bug fix for libraries that don't want
the new changed behaviour that was introduced by recalculating
fines on return.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
We need to have the invoiceid and pass it to retrieve the selected
invoicenumber. The wrong data was being passed, causing incorrect
records to be displayed.
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Enable IndependantBranches
2) Apply this patch
3) Run updatedatabase.pl
4) Verify that the system preference still functions correctly
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Code will also respect notes when using the "Writeoff All" button but WILL NOT when using either the "Pay Amount" or "Pay Selected" buttons. Fixed URI encoding of arguments.
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Renamed that routine to GetItemCourseReservesInfo in
order to avoid any potential confusion with reserves
qua hold requests.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
ModItem currently will attempt to update an item
even if no field updates are specified. This patch
avoids (harmless) error messages in the Apache
logs if an item is not actually being changed when it
is placed or taken off reserve.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
New modules should not export any symbols by default
without a very good reason.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Adds a course reserves system for academic libraries.
The course reserves system allows libraries to create courses
and put items on reserves for those courses.
Each item with at least one reserve can have some of its attributes
modified while it is on reserve for at least one active course.
These attributes include item type, collection code, shelving location,
and holding library. If there are no active courses with this item
on reserve, its attributes will revert to the original attributes
it had before going on reserve.
Test Plan:
1) Create new authorised value categories DEPARTMENT and TERM
2) Create a new course, add instructors to that course.
3) Reserve items for that course, verify item attributes have changed.
4) Disable course, verify item attributes have reverted.
5) Enable course again, verify item attributes again.
6) Delete course, verify item attributes again.
7) Create two new courses, add the same item(s) to both courses.
8) Disable one course, verify item attributes have not reverted.
9) Disable both courses, verify item attributes have reverted.
10) Enable one course, verify item attributes are again set to the
new values.
11) Edit reserve item attributes, verify.
12) Disable all courses, edit reserve item attributes, verify
the item itself still has its original attributes, verify
the reserve item attributes have been updated.
13) Verify the ability to remove instructors from a course.
14) Verify new permissions, top level coursereserves, with
subpermissions add_reserves and delete_reserves.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Corinne Bulac <corinne.hayet@bulac.fr>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
http://bugs.koha-community.org/show_bug.cgi?id=8125
- the dateformat value is sent to all templates (from
C4::Auth::get_template_and_user)
- remove all assignment of dateformat in all .pl files
- Remove "all" occurrences (those I found!) of dateformat_*
From now on, the only way to get the date format is a string comparison
(dateformat == "metric")
Checked with the command:
git grep "\(dateformat_us\|dateformat_metric\|dateformat_iso\)" | grep
-v translator
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Tested all the datepickers I could find, looks good.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adds an allbaskets parameter to GetBasketsInfosByBookseller (only used in booksellers.pl for now).
Normally, all 'active' baskets are shown. With allbaskets=1, all baskets are :)
In the template I had to rename a loop var supplier to supplier1 to resolve a
name conflict between template vars.
In the template I added the string: Cancel filter.
Note that this string is already translated:
msgid "Cancel filter"
msgstr ""
Hope this helps.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Undoing the filter works and I checked that the string gets
translated with the po files in current master.
So this is almost perfect, only we can't apply the filters
again and the link remains 'cancel' when we already did.
Sending a follow-up trying to fix this.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
The patch for bug 9523 added a JOIN to the biblio table when identifying
the best match so that if a matched record had been deleted it would
not hold up the import process. Unfortunately, this broke all authority
matching, since of course authorities don't appear in the biblio table.
This patch adds a join to auth_header as well, and decides which to
check based on the record type.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comment on third patch of this series.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
When introducing QueryParser, I introduced a check for QueryParser at
too high a level, causing authority matching to try to use SimpleSearch
for authorities prematurely, when SearchAuthorities should be handling
it. This patch corrects the level of the check. This patch only moves
three lines, but thanks to the change in if level, it adjusts the
indentation quite a bit.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on third patch of this series.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Prior to this patch, at more-or-less random intervals pages working
with notices will cease to function. To test:
1) Apply patch.
2) Try to edit some notices.
3) Trigger some notices.
4) If you were able to edit the notices and trigger the notices, sign
off.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
I did a regression script without Plack:
- edit, add, delete and copy notice
- trigger checkout/checkin notice
- print issueslip
No problems found.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Before the patch you will see in acqui/booksellers.pl all the baskets ever
created.
After the patch you will see in acqui/booksellers.pl only the baskets
with expected items.
Test plan :
* Create a basket with some orders lines
You should see this basket in acqui/booksellers.pl
* receive or cancel all the lines in this basket
This basket shouldn't appear any more in acqui/booksellers.pl
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Simply reverses the order of 2 lines in Items.pm. The previous line wrote to a
variable used in the next line as a function argument.
To Test:
Give an item a special restricted value, define one if you have to
in the authorised values.
Observe that, without this patch, statuses are not shown in the
OPAC in parentheses. My example was an item that had a restricted
value of "Library Staff Only"
It should have been shown under status on the detail page of the
OPAC, but was not.
Apply the patch, observe that restricted values are now shown
for your item, for example:
Available (Library Staff Only) in the status column.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and test plan.
Simple change fixing a display problem, no string changes.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Moving the warn line below the line that returns if amount<=0.
If amount<=0, a false warning was raised even though the return right after
it meant nothing was actually changed. We should only warn here if we do not return.
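In outline the change is just this (subroutine and variable names are
illustrative, not the exact fines code):
sub _cap_fine {
    my ( $amount, $new_amount, $itemnumber, $borrowernumber ) = @_;
    # bail out first; only warn when the reduction really happens
    return if $amount <= 0;
    warn "Reducing fine for item $itemnumber borrower $borrowernumber"
       . " from $amount to $new_amount - MaxFine reached";
    # ... the actual reduction would follow here ...
}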
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested with fines.pl on overdue.
Before this patch:
Reducing fine for item 199709 borrower 23 from 44 to -1 - MaxFine reached.
This did not happen however because of the return.
After this change: no false warning.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
If a patron has over time accumulated fines greater than the amount
defined in MaxFine, the patron will never get more fines even if they
have previously paid off those fines.
This bug was introduced by the patch for Bug 7420.
Test Plan:
1) Create a patron
2) Create a fine of 10.00 for that patron
3) Pay off the fine
4) Set MaxFines to 5.00
5) Check out an item to the patron, backdate the due date
so the item should generate fines.
6) Run fines.pl, observe that no fine was created
7) Apply the patch
8) Rerun fines.pl
9) Observe that the fine was created correctly
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Enable IndependantBranches
2) Set HomeOrHoldingBranch to holding branch
3) Delete an item whose holding branch is your logged in branch, and
whose home branch is not
4) Apply this patch
5) Repeat step 3, it should fail
6) Try to delete another item whose home branch is your logged in
branch, and whose holding branch is a different branch. This
deletion should succeed.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Change is logical, only homebranch should determine if the item
can be deleted.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch restores the line exporting GetOrderNumber that I accidentally suppressed.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Restores a line that was deleted by the first patch.
Was not sure if patches should be squashed.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Revised patch according to QA comments. No longer dependent on bz 9780.
At present, merging records breaks the link between order and record, except
if an item of the deleted record is used in the order.
This is a serious issue for libraries creating items on receipt.
This patch moves existing orders from the deleted record to the destination record.
It creates a new function Acquisitions::GetOrdersByBiblionumber,
that could be used by other patches later.
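A hedged sketch of how the merge step might use the new helper; the function
name comes from the description above, everything else (argument order, the
ModOrder call, the example numbers) is illustrative:
use C4::Acquisition qw( GetOrdersByBiblionumber ModOrder );

my ( $from_biblionumber, $to_biblionumber ) = ( 123, 456 );   # example numbers
for my $order ( GetOrdersByBiblionumber($from_biblionumber) ) {
    $order->{biblionumber} = $to_biblionumber;   # re-point the order at the kept record
    ModOrder($order);
}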
To test :
Check the problem :
1. Set syspref AcqCreateItem = Create an item when receiving an order
2. Create a basket with one order
3. Put the record used by this order in a list
4. Put another record in the list
5. Merge the 2 records, keeping as a reference the record NOT used in the order
6. In the order, you will see for that order "Deleted bibliographic information..."
7. Apply the patch
8. Repeat steps 2-5
9. In the order, you will see the title/author of the kept record.
10. Set syspref AcqCreateItem = Create an item when placing an order
11. Repeat steps 2-5 (an item will be created)
12. In the order, you will see the title/author of the kept record
(it is already the case at present; the patch should not alter this behavior)
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Test plan, test suite and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
The original implementation of QueryParser did not handle truncation
based on the QueryAutoTruncate system preference. This patch adds support.
To test:
1) Apply patch.
2) Turn on UseQueryParser.
3) Set QueryAutoTruncate to "automatically."
4) Search for "har". Note that it returns results with words
like "Harry" (i.e. with right truncation).
5) Search for "har*". Note that it still returns results with right
truncation.
6) Set QueryAutoTruncate to "only when * is added."
7) Search for "har". Note that it returns only records that have the
exact word "har" in them (most likely there will be none unless you
have Hebrew items).
8) Search for "har*". Note that once again it returns results for "Harry"
(i.e. right truncated results).
9) Sign off.
This patch also reindents a hash in Koha/QueryParser/Driver/PQF.pm
because it was hard to read before.
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Thx for fixing this Jared!
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Making sure that the regex does not kill more than it should.
Amended: it now only looks at separating semicolons (;), not commas (,).
Amended: two index expressions in direct context replaced by the same regex for
consistency.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
When manual history is disabled in the subscription history section,
if a serial has previously been set as missing and is then received
or set as expected, late or claimed, it will be deleted from the missinglist.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Regex needs a followup. More comments on Bugzilla.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
I just added use utf8; to Search.pm and the problem
was solved.
Test plan :
1- Add bib records with non-latin characters
2- search for some of these records
3- try to refine your search using Subject / Author
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works, fixing the URLs in facets. Now they work correctly.
No errors.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
I tested facets with the 22 Arabic records provided on
bug 9579 successfully. Before the patch the links are not
correct, after applying the patch the links work as
expected.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Remedied by:
- in Circulation.pm changing AnonymiseIssueHistory so that it returns ($rows, $err_history_not_deleted) instead of $rows
- consequential change to misc/cronjobs/batch_anonymise.pl to handle updated return value, and fail if there is an error
- consequential change to tools/cleanborrowers.pl although this still fails silently (raised as bug 9944)
- update of opac-privacy.pl to check return value and pass on error
- update of opac-privacy.tt to display error if appropriate
Note bug 9942 remains unfixed, which is a similar issue upon issue return.
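A sketch of the new calling convention for callers, following the description
above (the argument list is assumed from existing usage; check the patch for
the exact interface):
use C4::Circulation qw( AnonymiseIssueHistory );

my ( $date, $borrowernumber ) = ( '2013-01-01', 42 );   # example values
my ( $rows, $err_history_not_deleted ) =
    AnonymiseIssueHistory( $date, $borrowernumber );
if ($err_history_not_deleted) {
    warn 'Reading history was not anonymised; check the AnonymousPatron preference';
}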
To test:
1. OPAC
- enable privacy mode (preference OpacPrivacy)
- leave anonymous patron set to zero (preference AnonymousPatron)
- attempt to delete user history
- observe error
- check history - still there
- change anonymous patron to a valid user
- attempt to delete user history
- observe success message
- check history - gone
2. cleanborrowers.pl
- test it functions as before. bug 9944 has been raised for it continuing to silently fail.
3. batch_anonymise.pl
- enable privacy mode (preference OpacPrivacy)
- leave anonymous patron set to zero (preference AnonymousPatron)
- run script (I use --days -1 for testing)
- script should fail with a Carp message
- change anonymous patron to a valid user
- run script as before
- script returns quietly
- check history - gone
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
The SQL query built in C4::Items::_koha_modify_item performs an update on a row of the items table identified by itemnumber.
Actually the query is built using a hash of data:
for my $key ( keys %$item ) {
$query.="$key=?,";
push @bind, $item->{$key};
}
But this hash contains the 'itemnumber' key, so you get an update including the primary key.
It is actually harmless but may be dangerous.
This patch simply skips the itemnumber key in the above loop.
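A small self-contained sketch of the fix (illustrative data, not the real
routine):
my $item  = { itemnumber => 42, barcode => '0001', itemnotes => 'example' };
my $query = 'UPDATE items SET ';
my @bind;
for my $key ( keys %$item ) {
    next if $key eq 'itemnumber';   # never put the primary key in the SET clause
    $query .= "$key=?,";
    push @bind, $item->{$key};
}
# the real routine would then trim the trailing comma and append
# "WHERE itemnumber=?", binding $item->{itemnumber} last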
Test plan :
Check you can create and modify items.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This bug was reintroduced by the patch for bug 5911: Transport Cost Matrix
Test Plan:
1) Place a hold on a record
2) Run build_holds_queue.pl
3) Verify the hold is showing in the holds queue
4) Suspend the hold
5) Re-run build_holds_queue.pl
6) Note the hold is still in the holds queue
7) Apply patch
8) Re-run build_holds_queue.pl
9) Note the hold is no longer in the holds queue
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes test plan and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch replaces an earlier patch by Marcel de Rooy, which
had become outdated because lots of new dependencies were
added since the patch was made.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Bug 9902 introduced an issue in the C4::Items::PrepareItemrecordDisplay
routine. The existence of the $defaultvalues hashref should be tested before
getting to the branchcode key.
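In essence the guard looks like this (names are illustrative, not necessarily
the exact patch):
my $defaultvalues;   # may be undef when no defaults were passed to the routine
my $branch;
if ( $defaultvalues && $defaultvalues->{branchcode} ) {
    $branch = $defaultvalues->{branchcode};
}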
Test plan:
Before applying the patch, an error occurred when you try to create an
order from a staged file.
After applying the patch, the error does not appear anymore.
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Adding a FIXME at a line that uses $sth->{NAME} for possible utf8 problems.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Adds a comment, no danger from that.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch makes ParseLetter somewhat more restrictive in removing
punctuation characters from the end of a table field.
Based on the assumption that we want to remove punctuation from fields in
biblio and biblioitems (like ISBD).
ParseLetter should not remove e.g. a parenthesis in itemcallnumber, but still
removes e.g. a colon (:) at the end of a title.
Removed an unneeded global and lookahead from the regex.
Test plan:
1) Add a colon (:) to the end of a title.
2) Add a colon to the end of item copynumber.
3) Place a hold on that item. Check it in. Confirm hold.
4) Check the email or print notice generated. There should be no colon at the
end of the title, but the colon in the copynumber should still be there.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I compared checkout notices with lots of different fields before
and after applying the patch. For example the ) at the end of a
field in branches is no longer removed. Other fields looked ok
before and after.
Passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
If you do not have ccode or location governed by an authorized value
(you can remove this default link in the MARC structure),
these item values are not passed through in the items section
created by buildKohaItemsNamespace for XSLTParse4Display.
This simple patch checks if the authorized value hash on ccode or
location returns something and passes the original value in otherwise.
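The fallback amounts to something like this (illustrative hash and item data,
not the exact buildKohaItemsNamespace code):
my %av_description = ( FIC => 'Fiction' );                 # descriptions from authorised values
my $item = { ccode => 'LOCALHIST', location => 'STACKS' };

my $ccode = defined $av_description{ $item->{ccode} }
    ? $av_description{ $item->{ccode} }
    : $item->{ccode};    # no authorised value defined: keep the raw item value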
Test plan:
Temporarily disconnect ccode and location from authorized values
in MARC structure.
Edit an item, put some values in location and ccode.
Look at this record via opac search (XSLT enabled). Toggle the value of
OPACItemLocation to show ccode or location before call number.
Restore authorized values-connection when applicable.
Note: Since bug 9995 adjusts OPAC XSLT Results, it may be helpful
to apply these
patches when testing this.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
This works as described for the XSLT result list.
The text is shown when OpacItemLocation is set to show collection
or location.
Note: Displaying location and collection without using authorised
values doesn't work in other places like the detail page item table.
So this will need more work to be fully functional.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
A user might create a SQL report that relies on non-existent authorised value categories,
because of a typo, or just because they copy&pasted the report from the Wiki.
Use cases are:
- The user creates a report from SQL
a) Uses bad authorised values
b) Clicks 'Save Report'
c) Koha lists the problematic authorised values
d) The user decides to
e-1) Save it anyway, it gets saved
e-2) Edit the report, it gets back to where it chose 'Save Report'
- The user edits an already saved report (Update SQL)
a) Uses bad authorised values
b) Clicks 'Update SQL'
c) Koha lists the problematic authorised values
d) The user decides to
e-1) Save it anyway, it gets saved
e-2) Edit the report, it gets back to where it chose 'Update SQL'
- The user tries to run a saved report that contains bad authorised values, Koha advertises the problem and provides the user with a button 'Edit SQL' to fix things.
To test, just create a report from SQL using invalid authorised values like this (misspelled 'branch'):
SELECT *
FROM itemtypes
WHERE hola=<<Test branch1|branchee>> AND
hola2=<<Test branch2|brancha>>
Regards
To+
Notes:
- I added several comments on the code.
- Fixed an annoying warning of uninitialised variable also (refactored some tiny bits to do it).
- Added the following methods
- C4::Reports::Guided::GetReservedAuthorisedValues
- C4::Reports::Guided::GetParametersFromSQL
- C4::Reports::Guided::IsAuthorisedValueValid
- C4::Reports::Guided::ValidateSQLParameters
- C4::Koha::IsAuthorisedValueCategory
- Those methods could have been used to refactor this guided reports code as its *a bit messy*. I chose to do it in a new bug of course :-D.
- Fixed some trivial perlcritic -5 errors
- Removed some debugging stuff left by mistake
- Fixed some POD problems
- Optimal SQL-driven IsAuthorisedValueCategory method
- Thanks to Owen and Jared for their patience heh.
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Work as described. No koha-qa errors.
Test:
Tried with examples (from help and test plan) reports, correctly
identifies invalid authorized values, and no problem with
authorized ones.
NOTE: The online help for this does not state that partial values
need to be between '%' in a SQLish way. Perhaps this could be
addressed by inserting % in the values or adding a checkbox (partial|exact),
or by changing the help.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
To test:
Use attached XSLT stylesheet for OPAC Results
* set your OPACXSLTResultsDisplay to use the attached stylesheet. The path is the FULL PATH, from /, to the file.
* be sure to copy MARC21slimUtils.xsl to the same folder, or change the path in the attached one to point to the correct path on your filesystem.
Verify that the OPAC results now show the holding branch instead of the home branch.
Possible fail states:
* no branch is shown (only call numbers, if given)
* the wrong branch is shown
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Stage a MARC record file that will have matches with existing records
2) Delete the bib from Koha that was matched on
3) Attempt to import the records into Koha, the import will hang
4) Apply the patch
5) Reload manage-marc-import.pl and attempt to import again, this time it should succeed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
When QueryWeightFields is enabled, the searching query is created with several options.
In C4::Search::_build_weighted_query, when no index is defined, the query is built with fuzzy and stemming options. When an index is defined, these options are missing; only unconditional right truncation is used.
The consequence is that when QueryStemming is disabled, a search with an index can give more results (due to right truncation) than a search without one.
This patch adds stemming and fuzzy on searches with an index, conditioned on the QueryFuzzy and QueryStemming sysprefs.
Also changes word list search (wrdl) weight to r6 in order to set fuzzy search to r8 and stemming search to r9 (like search without index).
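Schematically, the weighted query now only gains the extra clauses when the
sysprefs allow it. A simplified sketch: the r6/r8/r9 weights follow the
description above, but the exact CCL pieces are illustrative:
use C4::Context;

my ( $index, $operand ) = ( 'ti', 'history' );   # example index and term
my @weighted;
push @weighted, "$index,wrdl,rt,r6=\"$operand\"";       # right truncation, as before
push @weighted, "$index,wrdl,fuzzy,r8=\"$operand\""
    if C4::Context->preference('QueryFuzzy');           # fuzzy clause only when enabled
push @weighted, "$index,wrdl,rt,r9=\"$operand\""
    if C4::Context->preference('QueryStemming');        # stemming clause only when enabled
                                                        # (the real code stems $operand first)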
Test plan :
- Go to searching preferences (admin/preferences.pl?tab=searching)
- Set QueryAutoTruncate to "only if * is added"
- Set QueryFuzzy and QueryStemming to "Don't try"
- Set QueryWeightFields to "Enable"
- Go to advanced search page
- Select an index (i.e. Title) and perform a search on a short word
=> Look at the zebrasrv log and see that the query does not contain right truncation : @attr 5=1
- Set QueryFuzzy to "Try"
- Perform same search
=> Look at the zebrasrv log and see that the query contains fuzzy : @attr 5=103
- Set QueryFuzzy to "Don't try" and QueryStemming to "Try"
- Perform same search
=> Look at the zebrasrv log and see that the query contains right truncation on the stemmed word : @attr 5=1
Signed-off-by: koha.aixmarseille <koha.aixmarseille@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
This patch makes Fuzzy and Stemming influence search results on weighted
queries when using an index. Side-effect is however that the results for a
search like index=term* (add truncation manually too) could be LOWER than the
number of hits for index=term. Further comments on Bugzilla.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Catalog a new record with an ISBN
2) Add some items to the record
3) Download the record as MARCXML
4) Delete the itemnumbers from the 952 fields in the record,
Change the barcode fields to unused barcodes
5) Use xml2marc to save as a standard MARC file
6) Import the record using the 'Stage MARC for import' tool
Use the settings:
Record matching rule: ISBN
Action if matching record found: ignore
Action if no match found: ignore
Item processing: always_add
Check for embedded item record data?: Yes
How to process items: Always add items
7) Import, note the bib is ignored, and the items are not processed
8) Undo import into catalog
9) Apply this patch
10) Import this batch into the catalog
11) Note the items were processed and are now added to the matching
record
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Tested with UNIMARC record. I followed the test plan, just changing 952 by 995
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Switch off the IndependantBranches syspref
2) Log into the OPAC
3) Place a suggestion
4) Instead of seeing your suggestion, you will see "There are no pending
purchase suggestions."
5) Apply this patch
6) Reload the page
7) You should now see your suggestions
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Decodes userid on two places in checkauth of C4/Auth.pm
Test plan:
Include some non-Latin characters in your userid (loginname). Arabic, Chinese?
Login into opac and check user page.
Go to staff (no new login), check your login name at various places.
Logout, login via staff.
Do the same.
Go to opac again (no new login), check user page.
Optionally: Remove all your sessions from table. Do a login. Check sessions.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. No errors.
This patch fixes this problem, but I wonder if
there is a general solution that handles everything as utf8.
Tested in opac and staff.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Bug 6554 patched output_html_with_http_headers to encode utf8 data, and Templates.pm to expect utf8 data to be encoded.
(At least) the staff login screen outputs directly to STDOUT (Auth.pm does, WHICH IS WRONG!) and wasn't fixed to do the encoding first.
This patch makes it use output_html_with_http_headers and solves the problem.
Changed 'use' for 'require' as jcamins and marcelr suggested.
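The core of the change has this shape (assuming the $query, $cookie and
$template objects already built in checkauth; not the full patch):
require C4::Output;   # 'require' rather than 'use', as noted above

# instead of printing headers and HTML straight to STDOUT:
C4::Output::output_html_with_http_headers( $query, $cookie, $template->output );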
Regards
To+
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Error 'Modification of a read-only value attempted' triggered
on login because of manipulation of $_ in the map
Moved the mod to a loop as recommended in the doc for map
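A generic illustration of the pitfall and the fix (not the exact Auth.pm code):
my @values = ( ' circulate ', ' catalogue ' );

# problematic: assigning to $_ inside map writes back into @values and
# dies with "Modification of a read-only value attempted" if an element
# happens to be a read-only literal
# my @trimmed = map { $_ =~ s/^\s+|\s+$//g; $_ } @values;

# safer: loop and modify an explicit copy
my @trimmed;
for my $value (@values) {
    ( my $copy = $value ) =~ s/^\s+|\s+$//g;
    push @trimmed, $copy;
}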
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Searching for stdid: Standard ID, srchany: RAW (any) somehow did not work
anymore.
Probably my fault :) Note that these two fields are in Cataloging Z3950 search
and not in Acquisition.
Fixing encoding problems: When adding -utf flag for CGI in acqui/z3950 and
cataloging/z3950, the decoding statements in C4/Breeding, Z3950Search should be
removed.
Test plan:
Search in Cataloging with:
Standard ID: 9782358670043 on LOC
RAW (any): musee [add an accent aigu on first e] on LOC -- Add diacritic!!!
Search in Acquisition
Somewhere, does not matter, but use a diacritic.
A note: My git version still has a hard time with utf8. Need to upgrade to version 1.7.10 to resolve this..
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. No errors.
Without patch z39.50 search for example Std ID OR musee gives no results,
with patch there are.
No problems in acq search.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Good catch, passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
If a lost item fee is owed and partially paid off when an item is
returned and a refund is processed, Koha tries to pay off existing
fees before adding any leftover balance as a credit on the account.
However, those fee payments aren't actually processed, due to a bug
where the accountnumber was quoted as a string literal, and thus the update
for the fee payment would fail. This did not result in a DB error,
as the query was still valid SQL. Checking the return value of the
query would have revealed that the accountline had not been updated.
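A generic DBI illustration of the kind of bug described, with the return
value check that would have caught it; the table and column names and the
$dbh, $accountno and $borrowernumber variables are only indicative:
# broken:  WHERE accountno = '?'   (the quoted placeholder is a literal string)
# fixed:   WHERE accountno = ?     (a real bind parameter)
my $rows = $dbh->do(
    'UPDATE accountlines
        SET amountoutstanding = 0
      WHERE accountno = ? AND borrowernumber = ?',
    undef, $accountno, $borrowernumber,
);
warn 'accountline was not updated' unless $rows && $rows > 0;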
History:
This bug was introduced on April 23, 2007 with the commit
'reintroducing fixaccountforlostandreturned as requested by rosalie'.
Commit id 111d590e9c
On July 30, 2009 the error was flagged with a FIXME and remained
in that state until now.
Commit id 51e8fc2cb6
Test plan:
1) Create a test patron
2) Check out an item to that patron and give it a due date in the past
3) Run fines.pl to generate the fine for the item
4) Mark the item long overdue on the item tab (not in edit items)
5) Pay for the lost item
6) Check the item back in
7) Note the incorrect fines. Only the lost fee balance due is refunded,
not the entire lost fee, but no other fees are paid off.
8) Apply the patch
9) Repeat steps 1-6, then note the fines are paid correctly
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Total due is correct after applying the patch.
All tests and QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Create a new serial with a default location, call number, and library
( the library will need to be any but the one you are logged in as )
2) Click the "Recieve" button for this serial
3) Click "Click to add item"
4) Note those values are not populated
5) Apply the patch
6) Reload the page
7) Click "Click to add item"
8) Note those values are now populated
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
All tests pass!
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
This seems to restore the former behaviour.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
billingplace and freedeliveryplace are missing in C4::Acquisition::NewBasketgroup.
Test plan :
- Go to a vendor basket groups
- Create a new basket group
- Enter a name
- Choose a billing place
- Do not choose a delivery place in combobox but enter a text in delivery place textarea
- Enter a comment
- Save
- Edit created basket group
=> Check that billing place and free delivery place are ok
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Works according to test plan, delivery place is now
correctly saved into the database; before, it was lost.
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This bug was due to a difference in field names used in the item data
for items versus patrons. This patch adds a ternary to discern between
the two.
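The ternary boils down to something like this (the field names only illustrate
the item/patron difference, they are not necessarily the exact batch keys):
my $batch_type = 'patron';                                   # or 'item'
my $record     = { borrower_number => 7, item_number => undef };

my $id = $batch_type eq 'patron'
    ? $record->{borrower_number}
    : $record->{item_number};
# de-duplicate on $id, so only the duplicate entry is removed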
To test:
Before applying patch:
1. Create a batch of patroncards with one duplicate.
2. Run the de-duplication on the batch.
3. Note that all patrons beyond the first in the batch are now
deleted.
After applying patch:
4. Repeat steps 1-2.
5. Note that only the duplicate patron is removed.
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Tested successfully with both patron card batches and label batches.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Add indentation for readability
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Some patron infos were hard coded instead
of using the variables defined in SIPtest.pm
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Removing binmode, now encoding data in output_with_http_headers.
Replaced output_string by output_as_chars in XSLTParse4Display.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No errors.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
The current implementation (mostly commented out in this patch)
uses a heuristic to guess which strings need decoding from utf-8
to a binary representation; it doesn't support utf-8 characters
in templates and has problems with utf-8 data from the database.
With these changes, Koha perl code always uses utf-8 encoding
correctly. All incoming data from the database is already
correctly marked as utf-8, and decoding of utf8 is required
only for Zebra and XSLT transfers, which don't set the utf-8 flag
correctly.
For output, the standard perl :encoding(utf8) handler is used,
which also removes various "wide character" warnings as a side-effect.
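On the output side that boils down to standard Perl (a generic example, not a
specific Koha call site):
binmode STDOUT, ':encoding(UTF-8)';
print "Bibliothèque de l'Université\n";   # serialised correctly, no "Wide character" warning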
Test scenario:
1. make sure that you have utf-8 characters in your biblio
records, patrons, categories etc.
2. try to search records on intranet and opac which contain
utf-8 characters
3. install language which has utf-8 characters, e.g. uk-UA
dpavlin@koha-dev:/srv/koha/misc/translator(bug_6554) $
PERL5LIB=/srv/koha/ perl translate install uk-UA
4. switch language to uk-UA and verify that templates
display correctly
5. test search and Z39.50 search and verify that characters
are correct
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
I followed the test plan, adding utf-8 characters to library names,
patron categories, titles, and authorized values. I tried the uk-UA
translation and everything looked good.
When performing Z39.50 searches for titles containing utf-8 characters I
got results which were still occasionally contaminated with dummy
characters [?] but I assume this is Z39.50's fault not the patch's.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Already signed, add mine.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Currently, if you install the Norwegian translations and run
through the web installer in Norwegian, choosing NORMARC as your
marcflavour, the marcflavour syspref is set to MARC21.
To test:
- Apply the patch
- Install nb-NO
- Run through the web installer, choosing nb-NO as the language
- Choose NORMARC as the MARC dialect
- When the web installer is done, check the value of the
marcflavour syspref. It should be NORMARC, not MARC21.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Did a regression test installing UNIMARC too.
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
For limiting XISBN API use, the XISBNDailyLimit syspref is compared
with the daily use count of the API. This count is stored in the
services_throttle table. But this table's content is never initialized,
neither in the installer nor in updatedatabase. So the count is never
increased and the API is used without limit.
This patch adds an insert of the service type into services_throttle if not
already there, so the service throttle will be initialized.
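A hedged sketch of the initialization, assuming a DBI handle $dbh and the
services_throttle columns named above (the exact SQL in the installer or
updatedatabase may differ):
my ($exists) = $dbh->selectrow_array(
    q{SELECT 1 FROM services_throttle WHERE service_type = 'xisbn'}
);
$dbh->do(
    q{INSERT INTO services_throttle (service_type, service_count) VALUES ('xisbn', 0)}
) unless $exists;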
Test plan :
- Check that you don't have a line in services_throttle for
service_type=xisbn
- Activate FRBRizeEditions and XISBN sysprefs
- Set a small number in XISBNDailyLimit (ie 5)
- Go to a biblio page (with ISBN)
- Look at services_throttle table
=> you should have a line for service_type=xisbn with service_count=1.
- Refresh the biblio page until the limit is reached
=> service_count should be equal to limit for service_type=xisbn
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Passes test plan correctly.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Fixes problem, tested according to test plan.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Exporting to Bibtex from the OPAC returns a software error.
This is because the call to C4::Biblio::GetMarcAuthors does
not return only authors but also the authority link.
This patch replaces this call with a direct read of the
MARC::Record, as for the other Bibtex data.
C4::Biblio::GetMarcAuthors is really intended for
direct use in a template.
Also, currently all author subfields are joined with
'and'. According to the Bibtex format, authors should be
"firstname surname and ..." or "surname, firstname and
...". I have chosen the second one because in non-UNIMARC
flavours it corresponds to the $a content.
For example UNIMARC :
700 $aDoe $bJohn
700 $aDoe $bJanne
Gives : Doe, John and Doe, Janne
For example MARC21 :
700 $aDoe, John
700 $aDoe, Janne
Gives : Doe, John and Doe, Janne
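A hedged sketch of the assembly for the UNIMARC case, assuming a MARC::Record
in $record (the MARC21 case would take the whole 700$a instead):
use MARC::Record;

my @authors;
for my $field ( $record->field('700') ) {
    my $surname   = $field->subfield('a');
    my $firstname = $field->subfield('b');
    push @authors, defined $firstname ? "$surname, $firstname" : $surname;
}
my $bibtex_author = join ' and ', @authors;   # "Doe, John and Doe, Janne"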
Test plan :
Without patch :
Exporting to Bibtex from OPAC returns a software error.
With patch :
Exporting to Bibtex from OPAC succeeds.
Authors are composed using : $a, $b and ... for UNIMARC, $a ... for other marc flavours.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. All record exports that produced
errors pre-patch now export without error.
No koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes error and output of additional authors.
Main entry in 100 is still missing.
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
If XSLTResultsDisplay is enabled and items in your search results lack a
shelving location or a ccode, errors will appear in the log complaining
of "uninitialized value in hash element." This patch adds a check on
these values to quiet the errors.
To test, find or create a record with items which have no shelving
location and/or no collection code. Perform a search whose results
will include your record. Check for errors in the log.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Cleans up logs for result list quite a bit.
Passes all tests and QA script.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Apply patch
2) Create suggestions for multiple libraries
3) Select 'Any' for the 'For' field under 'Acquisition information'
4) Note you are seeing the suggestions for all branches
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Works as described
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch fixes wrong behaviour.
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Test Plan:
1) Enable AutomaticItemReturn
2) Place a reserve on a record where the holding and home branches differ
for the available items
3) Rebuild the holds queue
4) Check the holds queue, verify the item is listed in the items to pull for the item's home branch
5) Disable AutomaticItemReturn
6) Rebuild the holds queue
7) Verify the item is listed in the items to pull for the item's holding branch
8) Enable AutomaticItemReturn
9) Apply patch
10) Rebuild the holds queue
11) Verify the item is listed in the items to pull for the item's holding branch
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Tested per plan, and the patch seems sane. Functionality of the hold queue is restored to previous behaviour.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Amended test plan to make clear it has to be a record level
hold for the test plan to work.
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
To test:
1) check perl syntax on file
$ perl -cw ./misc/migration_tools/koha-svc.pl
you should *not* get 'syntax OK' returned from command
2) apply patch, and install File::Slurp module
$ sudo cpanm File::Slurp
3) check perl syntax on file
$ perl -cw ./misc/migration_tools/koha-svc.pl
you should now get 'syntax OK' returned from command
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Without File::Slurp, 1) complains of a missing module;
with the module, syntax OK.
No errors.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Changed module from required to optional. The two files in which
it is used are extraneous from the viewpoint of the average user.
This reverts commit 60508cb03d, reversing
changes made to 8579d07f14.
The patches for bug 7688 caused a failure in t/db_dependent/Serials.t:
not ok 8 - test getting history from sub-scription
Conflicts:
installer/data/mysql/kohastructure.sql
installer/data/mysql/updatedatabase.pl
kohaversion.pl
If a supplier is defined for a subscription, you cannot order this
subscription from another supplier. If no supplier is defined, you can.
FIX: If a cancelled order is linked to a subscription, you can order it.
Signed-off-by: Leila Arkab <koha.aixmarseille@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
DB changes:
- Adds 2 fields: subscription.reneweddate and aqorders.subscriptionid.
- Removes 2 unused fields: aqorders.serialid and aqorders.subscription.
Main test plan:
1) Create a subscription
2) Create a bookseller and a basket
3) Add a new order 'from a subscription'
4) Search your subscription and check if results are correct
5) Click on the "order" link
6) Check that the biblio information is filled in the form
7) Select a budget and fill some price information.
8) retry steps 3 and 4. Verify you cannot order the same subscription.
Message:Outstanding order (only one order per subscription is allowed).
9) click on your subscription (already added) and check you have a new
table "Acquisition details" with your price information in the "Ordered
amount" line.
10) receive this order
11) On your subscription detail page, the "Spent amount" line must be
filled with your price information.
12) Reorder the same subscription. Now you are allowed to. Price
information has to be filled in with the previous information.
13) Retry some orders and click on a maximum of links in order to find a
bug :)
Signed-off-by: Leila Arkab <koha.aixmarseille@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on last patch.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
It can happen when the Expected issue is claimed. In this case the
status of the last serial is 'Claimed'
This patch changes the API of GetNextSeq and GetSeq
Test plan:
- Create a subscription which starts a long time ago so that serials
automatically appear in late issues
- Receive the first serial
- Go to claims page and claim the 2nd serial.
- Go back to the subscription page and click on 'Serial collection'
- You should have 2 serials, one 'Arrived' and one 'Claimed'.
- Click on Generate Next. This should fail with a software error message
("can't call method output ...")
- Apply this patch and click again on Generate Next. A new issue must be
created with status 'Expected'.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
- # Subroutines::ProhibitExplicitReturnUndef: Got 1 violation(s) in
C4::Serials::GetSubscriptionIrregularities
- Bad template constructions fixed in serials/subscription-add.tt
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Serials numbering patterns and frequencies are no longer hard-coded. Now
it is possible to create, edit and delete numbering patterns (and
frequencies). This implies new sql tables (subscription_numberpatterns
and subscription_frequencies)
Numbering patterns behave almost as before, there are still the same
values to configure (addX, everyX, settoX, whenmorethanX). lastvalueX
and innerloopX remain in subscription tables.
There is a new value in numbering patterns: numberingX. For each
"column" (X, Y or Z) you can tell how to format the number. Actually
numberingX can be set to:
- 'dayname' (name of the day) (0-6 or 1-7 depending on which day is the
first of the week)
- 'monthname' (name of the month) (0-11)
- 'season' (name of the season) (0-3) (0 is Spring)
These names are localized by using POSIX::setlocale and POSIX::strftime
and setting a 'locale' value to the subscription. Locale have to be
installed on the system.
Note that season names are not localized using POSIX::strftime (it can't
do this), so names are hardcoded into the code (available languages: en,
fr). This could be fixed in the future by using a Perl localization
framework.
Frequencies can be configured using 3 parameters:
- 'unit': one of 'day', 'week', 'month', 'year'
- 'issuesperunit': integer >= 1, the number of received issues per
'unit'
- 'unitsperissue': integer >= 1, the number of 'unit' between two
issues
One of 'issuesperunit' and 'unitsperissue' must be equal to 1.
Examples:
unit = 'day', issuesperunit=3, unitsperissue=1 => 3 issues per day
unit = 'week', issuesperunit=1, unitsperissue=3 => 1 issue each 3
weeks
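To make the two fields concrete, here is a simplified, hypothetical sketch
of the date arithmetic they imply (the real logic in Serials.pm is more
involved): with issuesperunit > 1 the publication date only advances once
the unit is full, otherwise it advances by unitsperissue units per issue.

use DateTime;

# Simplified illustration, not the actual Koha code.
# $freq: hashref with unit / issuesperunit / unitsperissue
# $dt: current publication date (a DateTime object)
# $received_in_unit: issues already received in the current unit
sub guess_next_date {
    my ( $freq, $dt, $received_in_unit ) = @_;

    my %add_unit = ( day => 'days', week => 'weeks', month => 'months', year => 'years' );
    my $unit = $add_unit{ $freq->{unit} };

    if ( $freq->{issuesperunit} > 1 ) {
        # Several issues share the same unit: only move the date forward
        # once the unit is full.
        $received_in_unit++;
        if ( $received_in_unit >= $freq->{issuesperunit} ) {
            $dt->add( $unit => 1 );
            $received_in_unit = 0;
        }
    }
    else {
        # One issue every 'unitsperissue' units.
        $dt->add( $unit => $freq->{unitsperissue} );
    }
    return ( $dt, $received_in_unit );
}

# '1 issue every 3 weeks' would be:
# guess_next_date( { unit => 'week', issuesperunit => 1, unitsperissue => 3 }, $dt, 0 );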
The prediction pattern is now computed server-side and is more consistent
with what Koha will actually do. The publication date is displayed alongside
the serial number.
Irregularities can now be checked one by one in the prediction pattern
table; if the frequency is day-based (unit is 'day'), all issues for a
given weekday can also be checked at once.
When an irregularity is found, the serial number can either be kept
unchanged or skipped. This is configured at subscription creation or
modification.
For instance, with a daily subscription you can have:
skip serial number | keep serial number
----------------------+----------------------
2012-01-01 ¦ No 1 | 2012-01-01 ¦ No 1
2012-01-03 ¦ No 3 | 2012-01-03 ¦ No 2
To lighten the subscription modification page, manual history has been
moved to its own page, subscription-history.pl, which is accessible from
subscription-detail.pl in the 'Planning' tab.
Important note: the updatedatabase.pl script takes existing subscriptions
into account and creates appropriate numbering patterns for them (it tries
to create as few patterns as possible). Each frequency is mapped to the
correct entry in the subscription_frequencies table.
This patch includes the kohastructure.sql and updatedatabase.pl changes,
plus sample frequency and number pattern data for fresh installs (the
sample data is also included in updatedatabase.pl).
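For orientation, the frequencies table sketched from the description above
might look roughly like this (column names are inferred; see
kohastructure.sql in the patch for the authoritative DDL):

# Indicative only; the real definition lives in kohastructure.sql /
# updatedatabase.pl. $dbh is the database handle used by updatedatabase.pl.
$dbh->do(q{
    CREATE TABLE subscription_frequencies (
        id            INT(11)  NOT NULL AUTO_INCREMENT,
        description   TEXT     NOT NULL,
        displayorder  INT(11)  DEFAULT 0,
        unit          ENUM('day','week','month','year') DEFAULT NULL,
        issuesperunit INT(11)  NOT NULL DEFAULT 1,
        unitsperissue INT(11)  NOT NULL DEFAULT 1,
        PRIMARY KEY (id)
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8
});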
=== TEST PLAN: ===
Create a new subscription:
- Go to Serials module and click "New subscription" button
- On the first page, choose a biblio and click next to go to the
second page
- Pick a first issue publication date
- Choose frequency '1/day'
- Choose a subscription length of 15 issues
- Choose a subscription start date
- Choose numbering pattern 'Volume, Number'
- A table appears, fill 'Begins with' cells with '1'
- Click on 'Test prediction pattern' button
The prediction pattern is displayed on the right of the page. It shows
the serial number, the publication date and a checkbox that lets you
choose which serials will not be received (irregularities).
You can see that the serial number starts at "Vol 1, No 1", continues to
"Vol 1, No 12", and then restarts with "Vol 2, No 1".
The frequency is '1/day', so the publication date is incremented by one
day from line to line.
- Now you can play a little with frequencies and numbering patterns,
change one of them (or both) and click again on 'Test prediction
pattern'
- For example, choose frequency '3/weeks' and click on the 'Test
prediction pattern' button.
There is a small behaviour change compared with current master: the
publication date will not be guessed within the week, since Koha can't know
when you will receive issues. So the publication date stays the same
(Monday of each week) for 3 consecutive issues and then jumps to the next
week.
- Now choose frequency '1/3 months' and numbering pattern 'Seasonal'
- Fill 'Begins with' cells with '2012' for Year and '0' for Season
- Click on 'Test prediction pattern'
- You should have something like 'Spring 2012', 'Summer 2012', ...,
'Winter 2012', 'Spring 2013'
- Note that you can get southern-hemisphere seasons by entering '2'
in 'Year/Inner counter'
- 2nd note: if you have locales installed on your system, you can
type one of their names in the 'Locale' field (currently this does not
work for season names, only for month names and day names)
If you want to modify the numbering pattern you can still do it here:
- Click on the 'Show/Hide advanced pattern' link. The advanced pattern
table is shown, but all fields are read-only.
- Click on the 'Modify pattern' button. All read-only fields are now
editable. Note that the 'Begins with' and 'Inner counter' lines are
repeated here, and any modification in the small table will be
replicated in the big table, and vice versa.
- The pattern name is emptied; if you type a new name, a new pattern will
be created, and if you type the name of an existing numbering
pattern, that one will be modified (after a confirmation message).
- There are two new lines in this table:
- Label: what is displayed in the headers of the smaller table above
- Numbering: used to format numbers in different ways. Can be
'seasons', 'monthname' or 'dayname'. Month names and day names can be
localized using the 'Locale' field. Seasons can't (values for
English and French are hard-coded in Serials.pm)
- You can modify what you want in the table and click on the 'Test
prediction pattern' button each time you want to see your
modifications. (Note that checkboxes for irregularities aren't displayed
in this mode, and you can't save the subscription until you have saved
or cancelled your changes.)
- To cancel your modifications, just click on the 'Cancel modifications'
button.
- To save them, click on 'Save as new pattern'. If the pattern name
already exists, a confirmation box will ask you whether you want to
modify the existing numbering pattern. Otherwise a new pattern will be
created and automatically selected.
Once you have finished modifying the numbering pattern, you can click again
on 'Test prediction pattern' to define irregularities, and then click on
'Save subscription'.
Now you can check that the serials module still works correctly:
- Check the subscription detail page to confirm that nothing is
missing, especially the 'Frequency' and 'Number pattern' information.
- Try to receive some issues. Check that the serial number is correctly
generated and that any irregularities you have defined are taken into
account.
- Check that receiving is blocked once you have reached the number of
issues defined in the subscription length (or once you have
reached the subscription end date).
In the serials menu (shown to the left of almost every serials page) you
have two new links: 'Manage frequencies' and 'Manage numbering
patterns'.
'Manage numbering patterns' leads to a page which lists all numbering
patterns and allows you to create, edit or delete them. The interface is
almost the same as the numbering pattern modification in subscription-add.pl.
'Manage frequencies' leads to a page which lists all frequencies and allows
you to create, edit or delete them.
Try to create a new frequency:
- Click on 'Manage frequencies' link in the serials menu and then click
on 'New frequency':
- Fill in the description (mandatory).
- Unit is one of 'day', 'week', 'month', 'year' or 'None' ('None' is for
an irregular subscription)
- If the unit is different from 'None' you have to fill in the two
following fields (Issues per unit and Units per issue)
- Note that at least one of those must be equal to 1
- Issues per unit is the number of issues received per 'unit', and Units
per issue is the number of 'unit' between two issues
- Display order is used to build the drop-down list. Leave empty and it
will be set to 0 (top of the list)
- Then click on 'Save'
- Check that this new frequency appears in the frequencies table and in
the drop-down list in subscription-add.pl
Subscription history has been moved to its own page. To test that it still
works, choose a subscription with manual history enabled (or modify an
existing subscription to turn on manual history).
- On the detail page, in the 'Planning' tab, you should have an 'Edit history' link.
- Click on it
- Modify the history and click on Save
- In the 'Summary' tab you should see the information you just entered
And finally, you can check that old subscriptions (by old I mean
subscriptions that existed before the update) are correctly linked to an
existing numbering pattern and an existing frequency. Numbering patterns
should be named 'Backup pattern X' where X is a number.
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Great development! Works as described. No koha-qa errors
(with all patches applied). Please QA this fast.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes problem found by QA scripts.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch aims to solve the LDAP bind authentication method. Here are
some considerations:
- This is a standalone patch, so all the previously submitted ones are
rendered obsolete;
- LDAP bind authentication is now done in 3 steps (see the sketch below):
1 - LDAP anonymous bind;
2 - LDAP search entry for the given username;
3 - LDAP bind with the DN of the found entry + the given password.
- The process fails if no entry, or more than one entry, is found for the
given username;
- The <principal_name> setting in koha-conf.xml isn't used anymore;
- The patch is backwards compatible, so users already relying on the
previously implemented LDAP bind authentication should be able to use
it in the same way.
http://bugs.koha-community.org/show_bug.cgi?id=7973
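A minimal sketch of the three steps with Net::LDAP (illustration only: the
host, base DN and filter attribute are placeholders; in Koha they come from
the LDAP section of koha-conf.xml, and error handling is simplified):

use Net::LDAP;

# Credentials come from the login form in practice; hard-coded here only
# to keep the sketch self-contained.
my ( $username, $password ) = ( 'someuser', 'secret' );

my $ldap = Net::LDAP->new('ldap.example.org') or die "Cannot connect: $@";

# 1 - anonymous bind
my $res = $ldap->bind;
die 'Anonymous bind failed: ' . $res->error if $res->code;

# 2 - search for the entry matching the given username
#     ('uid' and the base DN are placeholders taken from the LDAP config)
my $search = $ldap->search(
    base   => 'dc=example,dc=org',
    filter => "(uid=$username)",
);

# The process fails if no entry, or more than one entry, is found
die 'Expected exactly one matching entry' unless $search->count == 1;

# 3 - bind again with the DN of the found entry and the supplied password
my $dn   = $search->entry(0)->dn;
my $auth = $ldap->bind( $dn, password => $password );
my $ok   = $auth->code ? 0 : 1;    # 0 = authentication failed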
Signed-off-by: Vitor Fernandes
Signed-off-by: Dobrica Pavlinusic <dpavlin@rot13.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script and has 2 solid sign offs.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch adds the ability to add groups to the library select
pulldown on the OPAC, if that pulldown is enabled.
Test Plan:
1) Apply patch
2) Run updatedatabase.pl
3) Go to Administration › Libraries and groups
4) Create a new group, or edit an existing one
5) Ensure the 'Show in search pulldown' checkbox is checked
6) Save the group
7) Enable OpacAddMastheadLibraryPulldown if it is not already enabled
8) Load the OPAC, try the group search from the libraries pulldown menu
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Yes! Now this works, and well.
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Before this patch, the GetItemIssue routine returns items.renewals
instead of issues.renewals
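A hedged illustration of the corrected intent, i.e. reading the renewals
column from the issues table rather than from items (this is not the actual
query used in the routine; $dbh and $itemnumber stand in for the calling
code's variables):

# Illustration only: fetch the renewal count of the current checkout
# (issues.renewals) rather than the item-level column (items.renewals).
my $sth = $dbh->prepare(q{
    SELECT issues.renewals
    FROM   issues
    JOIN   items USING (itemnumber)
    WHERE  issues.itemnumber = ?
});
$sth->execute($itemnumber);
my ($renewals) = $sth->fetchrow_array;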
Signed-off-by: Broust <jean-manuel.broust@univ-lyon2.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Broust <jean-manuel.broust@univ-lyon2.fr>
I tested one more time on a sandbox, and I confirm the problem:
the due date doesn't change when you renew more than once
with the syspref renewalperiodbase set to "the old due date of
the checkout". The due date should change every time.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Added signed-off line and problem description from bugzilla.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch adds some unit tests for CalcDateDue and GetLoanLength
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Tests done:
- Checked update works correctly for existing circulation rules.
- Adding, deleting and overwriting circulation rules works.
- Renewals work for different circulation rules and changes
to the holiday calendar.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Renew an issue for a number of days (as set in the issuing rules).
Test that the rules work for any item types and that there is no regression.
- new column issuingrules.renewalperiod
- remove all occurrences of an already removed syspref (globalDueDate)
- remove an unused routine (Overdues::GetIssuingRules)
How it works:
- On existing installations, issuingrules.renewalperiod is set equal to
issuingrules.loanlength, so the behaviour is the same before and after
this patch (see the sketch below).
- When you add a rule, you can choose a renewal period (the unit value
is issuingrules.unit), so you can have a renewal period in hours
or days.
- The default value for the renewal period is 21 days (same as
loanlength).
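A sketch of what the corresponding database update presumably does (the
exact SQL and column type are assumptions; the point is that renewalperiod
is seeded from loanlength so upgraded installations behave as before):

# Indicative updatedatabase.pl-style step, not the literal one in the patch.
$dbh->do(q{
    ALTER TABLE issuingrules
    ADD COLUMN renewalperiod INT(11) DEFAULT 21 AFTER loanlength
});
# Keep existing behaviour: renew for the same period as the loan length.
$dbh->do(q{UPDATE issuingrules SET renewalperiod = loanlength});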
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Test comments on second patch.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This reverts commit af89f98a7a, reversing
changes made to cc49dc70fb.
The changes made by bug 8089 caused problems with the
t/db_dependent/Context.t unit test. Given the proximity of feature
freeze and release I am exercising my RM privilege and reverting rather
than seeking to fix the problems right this moment.
The primary advantage of the Firefox offline circulation plugin, when compared
to the offline circulation desktop application, is the ability to add offline
circulation actions to a queue so that multiple machines running offline
circ can have their circ actions combined and ordered chronologically before
being executed. This commit adds the ability to put actions from uploaded
KOC files into this queue. In this way, both the FF plugin and the desktop
application can be run side by side with no ill effects.
Signed-off-by: Bob Birchall <bob@calyx.net.au>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
See the script's documentation for more details
New parameters are:
- authtypes
- filter
- insert
- update
- all
Signed-off-by: Pascale Nalon <pascale.nalon@gmail.com>
This patch has been live at Mines ParisTech since 2012-07-24.
Signing off.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
- Moved the sign-off from bugzilla to the commit message.
- All tests and QA script pass.
- Amended commit message to list new parameters.
- Verified this patch works on a UNIMARC installation.
- Verified normal import still works correctly on a MARC21
installation.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>