This patch expands t/db_dependent/Search.t to run
the same tests using both the GRS-1 and DOM indexing
modes. It also adds hooks in zebra_config.pl to make
it easier to stage test cases for non-MARC21 Zebra
indexing.
Note that in DOM mode one of the tests is currently a
TODO, as relevance ranking for weighted queries differs
between GRS-1 and DOM.
To test:
[1] Verify that prove -v t/db_dependent/Search.t passes.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
[1] When running t/db_dependent/Search.t, verify that no warnings like
this are shown:
15:52:07-10/10 zebraidx(2006) [warn] Index 'Number-music-publisher' not found in attset(s)
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes the biblio-zebra-indexdefs.xsl files.
They were generated from biblio-koha-indexdefs.xml with the new
koha-indexdefs-to-zebra.xsl amended by F. Démians' patch.
To test:
- Take a DOM UNIMARC Koha
- Apply all the patches of bug 8252, including this one
- Copy src/etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
to your etc/zebradb/marc_defs/unimarc/biblios/ located in your
installation directory
- Run rebuild_zebra -b -x -r -v
- Make advanced searches in the staff interface and OPAC, on coded field
indexes (Audience, Literary genre, Biography, Illustration, Content,
Video Types, Serial Type, Periodicity, Regularity, Picture)
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
OK for me. This patch brings the index XSL definitions in sync with the
authoritative XML definition. Subsequently, it won't be difficult to
amend the DOM UNIMARC index definitions if necessary. And, as it is, I
don't see any regression, whereas I can see huge improvements. Thanks Mathieu!
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch modifies koha-indexdefs-to-zebra.xsl in order to add the
ability to populate indexes with subfield substrings.
It can now handle constructions such as:
<index_subfields xmlns="http://..." tag="100" subfields="a" offset="7" length="1">
<target_index>tpubdate:s</target_index>
</index_subfields>
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr>
I applied the patch and ran
xsltproc koha-indexdefs-to-zebra.xsl ../marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml \
> ../marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
I looked at the generated file. It looks nice.
Then I copied the file into my INSTALLDIR/etc/zebra.... and reindexed my
records with rebuild_zebra.pl
I made some searches on coded position indexes and non-coded position
indexes; everything works.
http://bugs.koha-community.org/show_bug.cgi?id=8252
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This followup restores the original wording of the "Date/time-last-modified"
index, and changes the name of the "Music-number" index to
"Number-music-publisher".
To test:
1. In a UNIMARC Koha instance
2. Apply patches #1, #2 and this followup
3. Copy from src/etc/zebradb directory to the etc/zebradb/ in your main
Koha directory the following files:
-- zebradb/biblios/etc/bib1.att
-- zebradb/ccl.properties
-- zebradb/marc_defs/unimarc/biblios/record.abs
-- zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml
-- zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
4. Rebuild zebra with -b -x -v -r options
5. Write a value like "test071a" in 071$a field in a record
6. Check if you can find this record with this search:
"ccl=Number-music-publisher:test071a"
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Test:
Copied files
Did a full reindex
Modified a couple of records to add a 071$a with a test value
Reindexed with -v -z -b -x
Searched for the test value as described and found the modified records.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the same changes in UNIMARC DOM configuration as patch
1 made for GRS-1.
Positions of subfields are indexed this way:
In biblio-koha-indexdefs.xml:
tag="100" subfields="a" offset="17" length="1"
In biblio-zebra-indexdefs.xsl:
xslo:value-of select="substring(., 17, 1)"
I had to edit biblio-zebra-indexdefs.xsl by hand, because
etc/zebradb/xml/koha-indexdefs-to-zebra.xsl only supports
"substring" in the handle-one-index-control-field template.
That is fine for MARC21, but not for UNIMARC: in MARC21, indexing
substrings is needed for control fields (001-009, with no subfields),
but in UNIMARC it is needed for subfields of 1XX fields.
So if DOM indexing works with these new files, we may need to
change koha-indexdefs-to-zebra.xsl.
Test plan (not possible in a sandbox):
1) In a Koha instance using UNIMARC and DOM indexing
2) Apply Patch 1 and Patch 2 (this one)
3) Copy the following files from the etc/zebradb directory of your
source into the etc/zebradb directory of your main Koha directory:
-- etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml
-- etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
-- etc/zebradb/ccl.properties
-- etc/zebradb/biblios/etc/bib1.att
4) rebuild zebra with -x -b -r -v options
5) Check that the coded information filters in advanced search are usable in the OPAC and
Staff interface
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works. No koha-qa errors.
Test for DOM:
Apply the patches
Don't forget to copy the files
Reindex
Search by coded fields works, also Country-publication
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Before fixing UNIMARC DOM indexing, we must fix GRS-1 indexing:
1) In advanced search, some coded field indexes are not working: Print,
Illustration, Content
2) Country-heading index is not working
3) Some subfields are indexed in the wrong indexes:
102$a should be in Country-publication instead of Country-heading
(not defined in bib1.att)
106$a, filled only for printed works, should be in ff88-23 (form of
item) instead of itype. (ff88-23 is made for MARC21 008 pos
23, which contains the same data as 106$a)
200$b should be in Material-type instead of (or in addition to) itype
and itemtype (Material-type: "free-form string, ... that
describes the material type of the item, e.g., cassette, kit,
computer database, computer file.")
100$a pos 22-24 should not be indexed as "ln": it is the language of
the record, not the language of the resource
4) Index names are too long: if we index new positions of coded fields
with the existing names, it breaks Zebra indexing (there must be a limit
on line length in record.abs?)
5) There are a lot of warnings when rebuilding Zebra.
This patch makes some changes in bib1.att (which could be used later to
improve search):
- fixing wording for att 51 and 1012
- adding comments for attributes based on MARC21 008 field (8800-8841)
- creating 8806 (tpubdate), 8838 (Modified-code), 8818 (ff8-18), 8840
(ff8-18-21), 8819 (ff8-19), 8821 (ff8-21), 8828 (ff8-28), 8830
(ff8-30), 8831 (ff8-31)
- creating attributes specific to UNIMARC : 9701-9707 (Video-mt,
Graphics-type, Graphics-support, Title-page-availability,
Cumulative-index-availability, script-Title, char-encoding)
- setting apart 3 blocks of attributes, so it will be easy to make
further changes:
-- common to MARC21 and UNIMARC: 8806, 8822, 8838
-- slightly different in MARC21 and UNIMARC (different meanings
according to the type of the record, so they don't match a single
UNIMARC field)
-- specific to UNIMARC: 9701-9707
In ccl.properties:
- creating a new index: Country-publication 1=1053
- suppressing some warnings by mapping to bib1 attributes:
Date-time-last-modified, Name, rtype, Music-number
- defining indexes using the 3 blocks of attributes defined in bib1
(common to MARC21 and UNIMARC, slightly different, specific to UNIMARC)
In record.abs:
- renaming some indexes for the 100-105-110 fields
- correcting the indexing of 102$a (country of publication)
106$a (ff88-23)
100$a pos 22-24 (language of the record, no longer
indexed)
105$a pos. 0-3 (illustration code)
200$b (for the moment, I keep it indexed in
itype and itemtype, but also Material-Type)
In C4/Search.pm:
- adding a "Country-publication" index
In the OPAC and staff interface template subtypes_unimarc.in:
- renaming indexes to take into account the changes made to the Zebra
config files
To test (this cannot be done with a sandbox):
1) Apply the patch in a UNIMARC GRS-1 Koha instance
2) Copy the following files from the etc/zebradb of your source
directory into the etc/zebradb of your main Koha directory:
-- etc/zebradb/biblios/etc/bib1.att
-- etc/zebradb/ccl.properties
-- etc/zebradb/marc_defs/unimarc/biblios/record.abs
3) Reindex your data (rebuild_zebra -x -b -r -v)
4) Try to use those coded field indexes in advanced search, in the OPAC
and Staff interface (available after clicking on "More options",
then on "Coded information filters"):
Audience, Print, Literary genre, Biography, Illustration, Content,
Video Types, Serials, Serial Type, Periodicity, Regularity
5) Try to search "Country-publication=FR" in simple search
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Tests for GRS-1:
Followed the test plan.
Search by coded fields works, but only in the OPAC;
in the staff interface there are only a few options.
Search by Country-publication works after the patch.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch updates the unit tests for the BibTeX export
to add a regression test for supplying the author for
non-UNIMARC records. It also adjusts the test to reflect
the change in quote character from "" to {}.
To test:
[1] Verify that prove -v t/db_dependent/Record.t passes
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Modified Record::marc2bibtex to validate fields 100, 110 and 111 in
non-UNIMARC flavours.
Test plan:
1) Search for any book in the OPAC with a main entry (1XX in MARC21, 700-720 in UNIMARC)
2) Export the record in the BibTeX format
==> The output won't contain the main entry.
3) Apply the patch
4) Export the record
==> The record will contain the main entry
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes a long-standing bug.
Passes all tests and QA script.
Tested with multiple records, seems to work well.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The term also occurs on the "spent" page accessible from the funds
table on the acquisitions module start page.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the init script return the status of the relevant
processes.
To test:
- Apply the patch, build the package and install it
- Run
$ service koha-common status
Note: it can be tested by just copying the debian/koha-common.init script
to a server running koha-common instances and calling it:
$ ./koha-common.init status
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Robin Sheat <robin@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
With a translated intranet (e.g. fr-FR) the status filter does not work
for the "Late" status. This is because the status in the combobox filter
is translated as "Retard" while the status in the table is translated as
"En retard".
This patch changes the JavaScript filter to work on a status code instead
of the status name.
The new classes may be used to change the CSS depending on the status.
Test plan:
- Use a translated intranet (e.g. fr-FR)
- Go to the serials claims page of a vendor with issues in multiple statuses
- Check that the status filter works
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch passes all tests and QA script.
The status filter should probably be changed to only
allow filtering on statuses that can appear on the page.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Adds NSB/NSE-aware sorting on Z39.50 search results in the acquisitions module.
Also adds default sorting on the title column like in the cataloguing module.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
At least the BNF server returns results containing non-sorting
characters (NSB/NSE).
In order to sort results correctly despite these characters, this patch
adds a new DataTables sorting function.
Test plan:
- search 'tintin' on the z3950 search (cataloguing/z3950_search.pl)
- sort on title (default sort) and check that results are not well
sorted.
- apply this patch
- do the same search and check that the first result is "Hergé. Les
Aventures de Tintin..."
The value of the cell is:
<td>Hergé. Les Aventures de Tintin...</td>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Works as advertised and doesn't break existing searching
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the "title-string" sorting option to the table of
invoices. This allows column data to be sorted based on the
ISO-formatted date rather than the formatted-for-display date. A "blank"
(0000-00-00) date is added to cells which contain no date data.
To test, view the Acquisitions Invoices page (acqui/invoices.pl) and
confirm that the "Billing date" column is sorted correctly regardless of
the dateformat system preference.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Template only change, passes all tests and QA script.
Sorting of the billing column now works as expected.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes some inconsistencies in the routine
GetBooksellerWithLateOrders().
Now it takes the field $estimateddeliverydateto into account and replaces
it with now() only if it is undef.
Also, it no longer tests whether $aqbookseller.deliverytime is not NULL;
instead, if $deliverytime is NULL or undef, it replaces it with 0.
It also verifies that $delay is >= 0 and returns undef if it is a negative
value.
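A minimal sketch of the new parameter handling described above (the subroutine and variable names are illustrative, not the actual C4::Bookseller code):
use DateTime;

sub _late_order_params {
    my ( $delay, $estimateddeliverydateto, $deliverytime ) = @_;
    # a negative delay is invalid: warn and return undef
    if ( defined $delay && $delay < 0 ) {
        warn "GetBooksellerWithLateOrders called with a negative value";
        return;
    }
    # default the upper bound to today only when it was not supplied
    $estimateddeliverydateto //= DateTime->now->ymd;
    # a vendor without a delivery time is treated as delivering in 0 days
    $deliverytime //= 0;
    return ( $delay, $estimateddeliverydateto, $deliverytime );
}

my @params = _late_order_params( 3, undef, undef );   # (3, today, 0)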
To test:
Before, this routine returned the booksellers with late orders. If a
bookseller did not specify a deliverytime, it would never appear in
the list of late orders. Moreover, if the field "Estimated delivery
date to" was specified, the value was ignored and the late orders were
returned up to today's date.
Now, the returned list considers all the fields given, and if the
delivery time of the bookseller is not specified, it calculates the
late orders as if the deliverytime were 0. By default, all booksellers
which have late orders up to today are listed unless "Estimated
delivery date to" is specified.
prove t/db_dependent/Bookseller.t
t/db_dependent/Bookseller.t ..
[Some warnings about uninitialized values]
WARNING: GetBooksellerWithLateOrders is called with a negative value at C4/Bookseller.pm line 135.
t/db_dependent/Bookseller.t .. ok
All tests successful.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds some improvements to the routine GetOpenIssue().
Now, it verifies that the parameter is given (if not, it returns undef)
and it returns $sth->fetchrow_hashref() directly instead of an
intermediate $issue variable.
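A minimal sketch of what the reworked routine could look like (the SQL is assumed from the issues table layout, not copied from the patch):
use C4::Context;

sub GetOpenIssue {
    my ($itemnumber) = @_;
    return unless $itemnumber;    # new guard: undef when no parameter is given
    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare(
        'SELECT * FROM issues WHERE itemnumber = ? AND returndate IS NULL');
    $sth->execute($itemnumber);
    return $sth->fetchrow_hashref();    # returned directly, no $issue variable
}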
To test:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 2 wallclock secs ( 0.06 usr 0.01 sys + 1.09 cusr 0.07 csys = 1.23 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Same situation as the one noted in the comment on
bug 10683: the test fails unless there is an issuingrule for
All, All with 1 as renewals allowed.
With that condition, it succeeds.
No koha-qa errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The borrowers table needs to be cleared after the items table
(because of the last_returned_by column).
Some checks were missing for GetRenewCount and AddRenewal.
Now the tests simulate a renewal for an item and check that the number
of renewals left is decremented.
Moreover, the issuingrules table should be cleared and filled with known
values.
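A hedged sketch of the renewal check described above, assuming GetRenewCount returns (renewals made, renewals allowed, renewals left) and using placeholder fixture values:
use Test::More tests => 1;
use C4::Circulation qw( AddRenewal GetRenewCount );

my ( $borrowernumber, $itemnumber, $branchcode ) = ( 1, 1, 'CPL' );   # fixture values
my ( undef, undef, $renewsleft_before ) =
    GetRenewCount( $borrowernumber, $itemnumber );
AddRenewal( $borrowernumber, $itemnumber, $branchcode );
my ( undef, undef, $renewsleft_after ) =
    GetRenewCount( $borrowernumber, $itemnumber );
is( $renewsleft_after, $renewsleft_before - 1,
    'renewals left is decremented after AddRenewal' );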
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds some unit tests wrapped in a transaction for
C4::Circulation.pm.
Circulation_Branch.t adds tests for routines which deal with
branch_item_rules, branch_borrower_circ_rules,
default_branch_circ_rules, default_circ_rules, and
default_branch_item_rules in the database.
Circulation_issue.t adds tests for routines which deal with the
accountlines and issues tables in the database.
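For context, a minimal sketch of the transaction wrapper these db_dependent tests typically use so the database is left unchanged afterwards:
use Modern::Perl;
use C4::Context;

my $dbh = C4::Context->dbh;
$dbh->{AutoCommit} = 0;    # run every statement inside one transaction
$dbh->{RaiseError} = 1;
# ... set up fixtures and run the tests here ...
$dbh->rollback;            # undo everything at the end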
NOTE: Some commented tests have to be fixed, and some tests can be added.
Moreover, other routines of Circulation.pm are tested in the following patches:
10692 UT: Routines about transfers in Circulation.pm need unit tests
10710 UT : OfflineOperation's routines in C4/Circulation.t need unit tests
10767 UT: Routines which interact with the table issuingrules in C4/Circulation need unit test
Test plan:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 0 wallclock secs ( 0.02 usr 0.00 sys + 0.40 cusr 0.02 csys = 0.44 CPU)
Result: PASS
prove t/db_dependent/Circulation_Branch.t
t/db_dependent/Circulation_Branch.t .. ok
All tests successful.
Files=1, Tests=10, 2 wallclock secs ( 0.06 usr 0.00 sys + 1.02 cusr 0.06 csys = 1.14 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Well, I don't know exactly what to do with this, so
I left it to QA.
a) prove t/db_dependent/Circulation_Branch.t works well and without errors
b) prove t/db_dependent/Circulation_issue.t works without errors for me
ONLY if I have an issuingrule for All, All with 1 as renewals allowed; in
other cases it fails.
No koha-qa errors
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds return values to DeleteTransfer:
- undef if no parameter is given
- 1 if a transfer is deleted
- 0E0 if a wrong parameter is given
It also fixes some unit tests in t/db_dependent/Circulation_transfers.t.
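A hedged sketch of how these return values fall out of DBI's do() (illustrative only; the SQL is assumed, not copied from C4::Circulation):
use C4::Context;

sub DeleteTransfer {
    my ($itemnumber) = @_;
    return unless defined $itemnumber;    # undef when no parameter is given
    my $dbh = C4::Context->dbh;
    # do() returns the number of deleted rows (1 on success) or the
    # true-but-zero string '0E0' when nothing matched, e.g. a wrong itemnumber
    return $dbh->do(
        'DELETE FROM branchtransfers
         WHERE itemnumber = ? AND datearrived IS NULL',
        undef, $itemnumber
    );
}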
To test:
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=14, 20 wallclock secs ( 0.03 usr 0.00 sys + 0.39 cusr 0.02 csys = 0.44 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Applied 10681 and 10692 before 10698
Ran prove t/db_dependent/Circulation_transfers.t without errors
No koha-qa errors on all 3 patches
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish().
From the man page:
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
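To illustrate the anti-pattern being removed, a generic DBI example (not a specific Koha query):
use C4::Context;

my $dbh = C4::Context->dbh;
my $sth = $dbh->prepare('SELECT borrowernumber FROM borrowers');
$sth->execute;
while ( my ($borrowernumber) = $sth->fetchrow_array ) {
    # ... use $borrowernumber ...
}
# $sth->finish;   # redundant: the loop above already fetched every row,
#                 # and calling finish here can hide uncaught fetch errors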
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish.
From the man page:
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author.
I ran prove t/db_dependent/Reserves.t without errors;
don't know if more tests are needed.
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch gets rid of finish.
From the man page:
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author.
Ran prove t/db_dependent/Accounts.t without errors
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The move avoids a problem where many modules would gain
a dependency on C4::Auth just because C4::Members needs access
to hash_password().
This patch also adds a couple of unit tests for the new password
hashing code.
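A hedged usage sketch, assuming the function now lives in a Koha::AuthUtils module (as the t/AuthUtils.t test suggests) and keeps its existing name:
use Koha::AuthUtils qw( hash_password );   # assumed module and export

my $hashed = hash_password('s3kr1t');      # bcrypt-style hash introduced by bug 9611
# C4::Members can now hash passwords without pulling in C4::Auth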
To test:
[1] Verify that there are no regressions on the test plan for bug
9611.
[2] Verify that t/AuthUtils.t and t/db_dependent/Auth.t pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A librarian cannot modify a hold's pickup location unless he or she has
the permission modify_holds_priority. This appears to be a bug and not
by design. The update fails because the priority is not being
passed to ModReserve: the priority is not displayed when a librarian
does not have the modify_holds_priority permission.
Test Plan:
1) Remove the modify_holds_priority permission from your logged-in user
* The user cannot be a superlibrarian
2) Attempt to change the pickup location for a hold
3) Note the change fails
4) Apply this patch
5) Repeat step 2
6) Note the change succeeds
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Template change only.
Passes all tests and QA script.
Tested:
- Confirmed that changing the pickup location is not
possible in master without the modify_holds_priority
permission
- Confirmed that the patch fixes the problem - pickup
location can now be changed with and without the
permission
- The pickup location still cannot be changed when
IndependentBranches is activated
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Besides some grammar and capitalization corrections, this patch
adds a link to the wiki page about installing Koha on Ubuntu
from packages.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This is a major rework. Key improvements include:
- Removed the confusing multiple versions of the Ubuntu instructions,
leaving only one set.
- The koha-deps and koha-perldeps packages are used.
- The license has been updated to reflect GPL3.
- More wiki reference links have been included.
- It is aimed at installing from source, not just from a tarball or just
from git.
- Sample output has been cut as much as possible.
- It is almost cut-and-paste easy, making it friendlier than INSTALL.debian.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed a tiny typo: seperate
It all makes sense to me - only wondering a bit about the recommendation
of using lynx for the web installer.
Quite an improvement, so passing QA.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Right now overdue notices come from the branch, but the
other notices come from the admin email address. This
is a problem in multi-branch systems because they
have to come up with one email address that all
branches have access to.
C4::Letters::_send_message_by_email currently sets
the from address in the following order:
1) Address specified in message
2) Koha admin email address
The order will now be:
1) Address specified in message
2) Borrower's home library email address
3) Koha admin email address
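A hedged sketch of this new fallback order inside _send_message_by_email (the helper and field names are illustrative, not the exact C4::Letters code):
use C4::Context;
use C4::Branch qw( GetBranchDetail );    # assumed helper returning the branches row

sub _pick_from_address {
    my ( $message, $borrower ) = @_;
    my $branch = GetBranchDetail( $borrower->{branchcode} );
    return $message->{from_address}                             # 1) address in the message
        || $branch->{branchemail}                               # 2) home library address
        || C4::Context->preference('KohaAdminEmailAddress');    # 3) Koha admin address
}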
Test Plan:
1) Set your library email addresses and the KohaAdminEmailAddress.
Make sure each of them is unique.
2) Choose a borrower, enable enhanced messaging, and enable the
checkout and checkin email notices. Use your email address for
the borrower's email so you can receive the emails.
3) Check out an item and check the from address of the email;
it should be the email address set in KohaAdminEmailAddress.
4) Apply the patch
5) Return the item, check the from address of the email,
it should match the email address set for the borrower's
home library.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I was able to reproduce the problem with local covers and
the patch fixes it in my tests.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In Firefox at least, the shelf browser cannot be reopened after
hiding it with the "close shelf browser" link. This followup improves
the behavior of the "close shelf browser" link so that the shelf browser
can be redisplayed.
To test, open a bibliographic detail page in the OPAC and click a
"browse shelf" link. Click the "close shelf browser" link--the shelf
browser should be hidden. Click the original "browse shelf" link and the
shelf browser should reappear without reloading the page.
Test with Firefox and Chrome (at least).
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Testing notes:
- New unit tests in t/db_dependent/ShelfBrowser.t pass.
- System preference OPACShelfBrowser still works as expected.
- Closing and opening the shelf browser works as expected.
- Next and Previous links show new and nicer behaviour.
- Logs are clean.
Tested with Firefox and Chromium under Ubuntu.
Notes: The currently displayed record could maybe be highlighted
a bit better.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If a lot of items have the same callnumber, the order should be based on
the itemnumbers. Otherwise the left side is always filled with the same
items.
+ Fix a bad copy/paste for the next link (when JS is disabled).
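A hedged sketch of the tie-break (column names assumed; not the exact ShelfBrowser query):
use C4::Context;

my $dbh        = C4::Context->dbh;
my $branchcode = 'CPL';    # example branch code
# break ties between identical callnumbers with itemnumber so paging
# does not keep filling the left side with the same items
my $items = $dbh->selectall_arrayref(
    'SELECT itemnumber, cn_sort FROM items
     WHERE homebranch = ?
     ORDER BY cn_sort, itemnumber',
    { Slice => {} },
    $branchcode
);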
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The next and previous links should completely refresh the shelf.
For example:
[<] [1] [2] [3] [4] [5] [6] [>]
Before this patch, the next and previous links were the same as the 1
and 6 links.
With this patch, after clicking on next, we will get:
[<] [7] [8] [9] [10] [11] [12] [13] [>]
This patch adds a new AJAX script to get the shelf browser block.
Test plan:
- On a biblio detail page, click on a "Browse shelf" link.
- Play with the next and previous links.
- Deactivate JavaScript (using NoScript for example) and check that you
get the same behavior (but the page is reloaded).
- Launch the unit tests: prove t/db_dependent/ShelfBrowser.t
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The LC NAF and SAF authority Z39.50 databases are indeed production
services, so we may as well showcase the new authority Z39.50 search
functionality.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
- improve POD
- remove extraneous comments
- correct license statement in new files
- remove backticks in database update SQL
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch introduces a new Z39.50 interface for searching
Z39.50-compliant databases for MARC authority records.
These databases aren't as common as their bibliographic equivalents,
but they're out there and very useful. I have included info at the
bottom of this message for sample authority databases you can try.
To test this patch:
1) Set up Z39.50 client targets for authority databases. (I've included
information at the bottom of this message for LibrariesAustralia's
test server for authorities, as well as instructions on how to use
your own Koha's Z39.50 authority server. The Library of Congress
also has authority databases available (unsure if these are test or
prod), and you might have access to others through OCLC or RLIN. OCLC
provides login credentials for their test databases.)
2) Go to the Authorities module
3) Click on the new "Z39.50 search" button.
4) Select your authority search targets from the list.
5) Do a search for an authority you would like using either the "Raw"
input box or the more specific input boxes for names, subjects, subject
subdivisions, or titles. (I like searching Name (personal): Eric on
the LibrariesAustralia test DB.)
6) You should see a table listing the server, heading, authority type,
and two other columns (MARC and a nameless column). "Authority type"
is the type of authority it will become when imported into Koha. In
the Eric example, "PERSO_NAME".
7) Click on "MARC" next to the results of interest to review the MARC
authority record.
8) When you're satisfied with a record, click on "Import".
9) The pop-up window will close and your original Koha window will
change to the "Adding authority Personal Name" screen (in the Eric
example).
10) All the relevant fields should be filled out for the record. Review
them and make any changes as necessary. (N.B. The 001 will be cleared
when saved, so if you have a use for the imported control number, move
it to the 010, 016, or 035 as appropriate. If you have a default value
for the 003, this will also likely be overwritten. Move it if necessary.
The 005 will also be updated when saved, so do not worry about that.)
11) When you're satisfied, click save.
12) Presto! You've imported your first authority record via Z39.50!
--
Here is the info for the LibrariesAustralia test Z39.50 authority
database:
Z39.50 server: LibrariesAustralia Authorities
Hostname: z3950-test.librariesaustralia.nla.gov.au
Port: 210
Database: AuthTraining
Userid: ANLEZ
Password: z39.50
Syntax: MARC21/USMARC
Encoding: utf8
-
The U.S. Library of Congress also provides Z39.50 access to its Name
and Subject Authorities (http://www.loc.gov/z3950/lcserver.html).
Name Authority:
Z39.50 server: Library of Congress Name Authority File
Hostname: lx2.loc.gov
Port: 210
Database: NAF
Syntax: MARC21/USMARC
Encoding: utf8
Subject Authority:
Z39.50 server: Library of Congress Subject Authority File
Hostname: lx2.loc.gov
Port: 210
Database: SAF
Syntax: MARC21/USMARC
Encoding: utf8
(N.B. Both of these databases also include title authorities.)
-
For testing purposes, you can also set up a Z39.50 client target,
which points at your own Koha instance's Z39.50 authority server.
To find the hostname, go to /etc/koha-conf.xml and find the value for
the <listen id="authorityserver"> element. Depending on your
configuration, this could be something like the following:
unix:/zebra/koha/var/run/zebradb/authoritysocket
(N.B. You might be using a different scheme than unix sockets...)
To find the database, scroll down to the bottom of koha-conf.xml until
you reach the <config> element. Within this, look for the value of the
element <authorityserver>. It should probably be "authorities".
To set up this Z39.50 client target in Koha...
Z39.50 server: my koha authorities
Hostname: unix:/zebra/koha/var/run/zebradb/authoritysocket
Port:
Database: authorities
Userid:
Password:
Syntax: MARC21/USMARC (or whichever flavour you need)
Encoding: utf8
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - Add a z39.50 interface for authority searching
This patch adds the "recordtype" column to the "z3950servers" table.
The value in this column (biblio or authority) then controls whether
the z3950 server shows up in a bibliographic search (through the
Acq and Cataloguing modules) or in an authority search (through
the Authorities module).
I also edited the z3950 management console to show this value
and allow users to edit it. The default value is "biblio", since
the vast majority of z3950 targets will be bibliographic. However,
there is an option to add/edit a z3950 target as a source of
authority records.
Test Plan:
1) Apply both patches
2) Run updatedatabase.pl (after setting your KOHA_CONF and PERL5
environment variables)
3) Use the test plan from the 1st patch
N.B. Make sure that your Z39.50 client target has a Record Type
of Authority, otherwise it won't display when you're doing a
Z3950 search for authorities.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - fix tabs/whitespace errors to pass QA
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Galen found a case where the cookies array was not built flat. I added a
unit test for that (checking that the cookie array is flat) and also test
the cookies output of get_template_and_user so we are sure the &language=
parameter is correctly handled.
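A hedged sketch of what the flatness check could look like (stand-in values; the real test inspects the cookies produced by get_template_and_user):
use Test::More tests => 1;

# "flat" means no element of the cookies array is itself an array reference
my $cookies = [ 'CGISESSID=abc', 'KohaOpacLanguage=en' ];   # stand-ins for CGI::Cookie objects
ok( !( grep { ref $_ eq 'ARRAY' } @$cookies ),
    'cookie array is flat (no nested array references)' );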
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
- Tests in t/db_dependent/Auth.t pass
- Tested in intranet, OPAC logged in, OPAC logged out
* Adding a valid language code to the URL switches the language
as expected
* Adding an invalid language code causes no change
Nice feature!
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The current implementation didn't build the cookie array correctly,
yielding login problems in some scenarios.
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Passing language=<valid_language_code> as a parameter in any Koha URL
can be used to set the desired language.
This patch touches:
- C4::Templates
- C4::Auth
It adds a new method, getlanguagecookie, that does exactly that, for use
in get_template_and_user.
It also modifies getlanguage so it checks (a) whether there is a
'language' parameter in the CGI object and (b) whether it is valid and
enabled for the desired interface.
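A hedged, self-contained sketch of the kind of check added to getlanguage (the enabled-language list and fallback below are illustrative, not the real C4::Languages API):
use CGI;

my $cgi       = CGI->new;
my @enabled   = qw( en fr-FR );    # pretend these are the languages enabled for the interface
my $requested = $cgi->param('language');
my ($language) = grep { defined $requested && $_ eq $requested } @enabled;
$language ||= 'en';    # otherwise fall back to the usual cookie/syspref logic
print "Using language: $language\n";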
To test:
* Without the patch
- access any koha page
- add ?language=code to the end of the URL (replace code with a valid
language code; it needs to be installed using 'perl translate install code',
and enabled either for the staff or OPAC interface, depending on where
you are testing)
- Nothing happens with the language parameter
* With the patch
- access any koha page
- add ?language=code (the same as before) and hit enter
- the language should be changed to the one you chose
- if you browse through some links, you will see that
Koha 'remembers' the language you passed as a parameter
(i.e. the language cookie has been updated).
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Brendan <brendan@bywatersolutions.com>
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works very well. No errors.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
More comments on last patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>