Kyle M Hall [Fri, 21 Sep 2012 16:16:04 +0000 (12:16 -0400)]
Bug 8798: (follow-up) rename schema class updater script
updateDatabase.pl is a bit too close to updatedatabase.pl in installer
and may cause some confusion. I would suggest update_dbix_class_files.pl
as an unambiguous and descriptive name for this file.
Elliott Davis [Sun, 2 Sep 2012 13:45:40 +0000 (08:45 -0500)]
Bug 8798: DBIx::Class base classes for all Koha tables
* Added base class files for all tables in Koha using
DBIx::Class::Schema::Loader.
* Added a (very basic) test file for C4::Context
* Also added dependencies in required files.
To Test:
[1] Install patch
[2] Make sure you can still connect to Koha
[3] You may optionally run this test script:
use Koha::Database;
use Data::Dumper;
my $db = Koha::Database->new();
my $schema = $db->schema();
print Dumper($schema->resultset("Borrower"));
If you run this file you should get a DBIx dump of the borrowers table.
Kyle M Hall [Wed, 4 Sep 2013 16:14:07 +0000 (12:14 -0400)]
Bug 10820: display item status as lost if item is both lost and on loan
In the OPAC, if an item is both lost and checked out, it will show as
lost in the search results, and checked out in the record details. The
lost status should take precedence over the checked out status, as the
checked out status may lead a patron to believe the book may return
soon.
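A minimal sketch of the intended precedence (illustrative only: the actual change lives in the OPAC templates/XSLT, and the hash keys below are simply the standard items columns used for illustration):

use strict;
use warnings;

# Lost takes precedence over checked out, which takes precedence over available.
sub display_status {
    my ($item) = @_;
    return 'Lost'        if $item->{itemlost};
    return 'Checked out' if $item->{onloan};
    return 'Available';
}

print display_status( { itemlost => 1, onloan => '2013-10-01' } ), "\n";    # prints "Lost"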
Test Plan:
1) Check an item out to a patron
2) Set it to lost ( requires itemlost to be revealed in the framework
for the items editor ).
3) Rebuild your zebra indexes
4) Run a search where that item is in the results list
5) Note the item is marked as lost
6) View the record details
7) Note the item is listed as "checked out"
8) Apply this patch
9) Repeat steps 4-6, note the item is now listed as lost
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Fridolyn SOMERS [Tue, 6 Aug 2013 10:13:34 +0000 (12:13 +0200)]
Bug 10689: make public note appear in subscriptions search
In the serials module, when searching subscriptions, the results table has
a "Notes" column.
In the TT code, you can see that it tries to display the public note
"subscription.notes" and the internal note "subscription.internalnotes".
The internal note is displayed correctly, but the public note is not.
You can see both notes in the serial details, in the summary tab.
The problem comes from the SQL query. A join is performed on subscription
and biblio, both of which contain a "notes" column.
This patch solves the problem by using an alias in the query for both columns
(biblio.notes is actually not used in the template but could be).
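A minimal sketch of the aliasing approach (illustrative only: the DSN and credentials are placeholders, and the real query in the patch selects more columns):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:mysql:database=koha", "koha", "password" );

# Alias both ambiguous "notes" columns so the template can tell them apart.
my $sth = $dbh->prepare(q{
    SELECT subscription.notes         AS publicnotes,
           subscription.internalnotes AS internalnotes,
           biblio.notes               AS biblionotes
    FROM subscription
    JOIN biblio ON biblio.biblionumber = subscription.biblionumber
});
$sth->execute;
while ( my $row = $sth->fetchrow_hashref ) {
    print "$row->{publicnotes} ($row->{internalnotes})\n";
}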
Test plan :
- Edit a subscription
- Add public and internal notes. For example: "too busy" and "on holiday"
- Perform a subscription search that returns this subscription
=> The "Notes" column contains both notes. For example: "too busy (on holiday)"
- Test with only public note
- Test with only internal note
Works as described.
Signed-off-by: Mathieu Saby <mathieu.saby@uhb.fr> Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Works as described, fixes a bug as the templates show that
the intention was to display both notes in the column.
Jonathan Druart [Tue, 10 Sep 2013 13:24:07 +0000 (15:24 +0200)]
Bug 10854: add ability to export serial claims using CSV profile.
Test plan:
- Create a CSV profile (or use the default one) with a type 'sql'.
- Go to serials/claims.pl, select the desired CSV profile and click
the "Export selected items data" button.
- Verify the CSV file is correctly generated.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors
On top of 10853 (solving merge conflict)
Needed to do some homework to test: add a subscription and a serial claim
notice.
1) Create a new CSV profile, type SQL, copy sample fields
2) Go to claims, select vendor
3) Go to Export selected items with created profile
4) CSV (in my case I use | as separator) downloads successfully
5) All fields present, file correct
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Works as advertised, keeps existing format by porting it to
a SQL profile.
Jonathan Druart [Tue, 10 Sep 2013 13:20:09 +0000 (15:20 +0200)]
Bug 10853: Add DB field export_format.type ('marc' or 'sql').
This patch:
- adds a new column 'type' to the export_format table.
- renames the field export_format.marcfields to
export_format.content.
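A rough sketch of what the corresponding database update might look like (illustrative only: the column type, default, and exact DDL in the real updatedatabase.pl entry may differ):

use C4::Context;

my $dbh = C4::Context->dbh;
# Add the new 'type' column, defaulting existing profiles to 'marc'.
$dbh->do(q{
    ALTER TABLE export_format
    ADD COLUMN type VARCHAR(255) NOT NULL DEFAULT 'marc'
});
# Rename marcfields to the more generic 'content'.
$dbh->do(q{
    ALTER TABLE export_format
    CHANGE marcfields content MEDIUMTEXT NOT NULL
});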
Test plan:
- Check that existing profiles have the type "marc" selected by default
- Create a new profile with a type "sql"
- Save and verify the profile is correctly displayed when you select it.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described. koha-qa reports small tab errors,
corrected in a follow-up
Test:
1) go to Tools > CSV profiles, Create profile, current
2) Apply patch, run updatedatabase
3) Go to Tools > CSV profiles, new option present
old profile with type MARC
4) Create new profile MARC, save and show correct
5) Create new profile SQL, save and show correct
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass with all 3 patches applied.
Works as described. Functionality for SQL profiles will be
added by another patch. For now it's possible to add/edit/delete
them.
Existing CSV profiles can still be exported correctly.
Galen Charlton [Fri, 11 Oct 2013 01:46:47 +0000 (01:46 +0000)]
Bug 10240: (follow-up) don't display patrons as lost or gone-no-address incorrectly
Because borrowers.lost and borrowers.gonenoaddress are nullable, when
downloading patron information for an offline patron sync, convert null
values to zero.
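A minimal sketch of that normalization (illustrative only: the data structure and loop are assumptions, not the actual patch code):

use strict;
use warnings;

my @patrons = ( { cardnumber => '12345', lost => undef, gonenoaddress => undef } );

# Make sure the offline client never receives undef for these flags.
for my $patron (@patrons) {
    $patron->{lost}          //= 0;
    $patron->{gonenoaddress} //= 0;
}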
To test:
[1] Set borrowers.lost to NULL for a patron.
[2] Do an offline sync, then disconnect network access (or
stop the webserver) and enter the offline interface.
[3] Try checking out to the test patron. A warning will
(incorrectly) display stating that the patron's card is lost.
[4] Apply the patch.
[5] Do steps 2 and 3 again. This time, there will be no lost card
warning.
Bug 10240: (follow-up) correctly record fines and fix label
At some point in rebasing I managed to remove the part of the code
that saved fine payments. This patch re-adds that feature. This patch
also corrects the label on the check out tab so it does not mention partial
names for checkouts when offline, since partial name searches are not
supported in offline circ.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10240: (follow-up) warn when patron's card is expired
This patch improves the alert messages to be slightly better English
and warns the librarian if a patron's card has expired. Like all alerts,
this is non-fatal since in the case of network failure there is no
particular reason to expect that the offline database is current.
To test this particular patch you can try checking something out to an
expired patron, otherwise test plan remains the same as above.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Logging out/setting the library does not work while using offline
mode, so it makes no sense to present those options to the user.
It is much better to show some sort of explanatory message informing
them that those two links don't work.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Fix the following two issues:
1) After paying a fine when offline the fine amount becomes NaN.
2) For previous checkouts for a patron, the title and barcode
fields have the wrong information in them (i.e. they have been swapped)
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10240: (follow-up) don't delete transactions if auth fails
When uploading transactions, we were not checking that authentication
had succeeded before deleting the transactions from the local database.
That was bad. With this patch, we check. That is good.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Address the following issues:
1/ Address minor qa issues with the templates:
FAIL koha-tmpl/intranet-tmpl/prog/en/modules/circ/offline-mf.tt
FAIL forbidden patterns
forbidden pattern: intranet-tmpl should certainly
replaced with [% interface %] (line 24)
[etc.]
OK tt_valid
OK valid_template
FAIL koha-tmpl/intranet-tmpl/prog/en/modules/circ/offline.tt
FAIL forbidden patterns
forbidden pattern: intranet-tmpl should certainly
replaced with [% interface %] (line 509)
[etc.]
FAIL tt_valid
lines 5, 5
2/ Run perltidy on new scripts
3/ download.pl returns data.finished = 1 if number of returned
data < 5000 (avoids 1 ajax call)
4/ Replace qq{} around sql queries with q{}
Also, a race condition existed that resulted in pending transactions
only getting deleted from the local database in certain circumstances
(fast connections under Chrome, mostly). This patch fixes that so that
successfully-uploaded transactions are always deleted.
This patch also addresses Jonathan's suggestion:
3/ add a message on check in (currently the input becomes empty but the
user is not informed).
... and Magnus's suggestion about moving the Synchronize link to the
right on the homepage.
Also, this addresses the further issues Jonathan noted:
- The tab of checkouts always shows "*0* Checkouts"
- If I am not well-educated, I click on the "Check out" link on the
offline home page, I enter a barcode, click on "Check out" and I get a
js error (without user message): "TypeError: curpatron is undefined"
(with chromium I get: Numeric transaction modes are deprecated in
IDBDatabase.transaction. Use "readonly" or "readwrite").
- There is a "border-right" css rule on the h5.patron-title. It is
display when there is no patron selected) [really minor!].
- tables are displayed even if there is no data
- The "Clear screen" link (X) points to an old script:
circ/offline-circulation.pl
- There is a warning when clicking on the "Synchronize" link when the
user is offline, but not for the "Pending offline circulation actions"
link.
- Still exists:
> The "Checked in item." message text never disappear (even if I go on the
> offline home page (circ/offline.pl#offline-home)).
Finally, this patch adds a link to the Pending offline operations page
on the synchronize page for easier navigation.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com> Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10240: Offline circulation using HTML5 and IndexedDB
This patch adds an HTML5-based offline mode to Koha's existing
circulation module, allowing librarians to check out items using a
basically familiar interface. The feature will be implemented using
the Application Cache and IndexedDB features of the HTML5 specification,
both of which are fully supported on Firefox 10+ and Chrome 23+, with
limited support going back to Firefox 4 and Chrome 11. The basic
workflow enabled by this patch is as follows:
Part 1: While connected to the Internet
1. Enable offline functionality by turning on the
"AllowOfflineCirculation" system preference.
2. Sync the offline circulation database on the computer that will be
used for offline circulation by following the "Offline circulation
interface" link on the Circulation home page, choosing "Synchronize (must be online)",
and clicking the "Download records" button. This process may take a while.
3. Bookmark /cgi-bin/koha/circ/offline.pl (the page you are currently
on) for easy access when offline.
Part 2: While disconnected from the Internet
4. Navigate to /cgi-bin/koha/circ/offline.pl using the bookmark you
created while online.
5. Start checking books in by scanning the barcode of a returned item
into the box on the "Check in" tab.
6. Scan the barcodes of any additional items that have been returned.
7. Start checking out books to a patron by scanning the patron's barcode
in the box in the "Check out" tab.
8. Set a due date (the "Remember for session" box will be checked by
default, since circulation rules are not computed during offline
transactions and therefore a due date must be specified by the
librarian).
9. Scan an item barcode (if you did not set a due date, it will prompt
you) to check the item out to the patron.
10. If a patron has a fine you can see the total amount (current to when
the offline module was synced), and record a payment. Unlike when in
online mode, there will be no breakdown of what item(s) fines are
for, and you will only be able to record the payment amount and not
associate it with a particular item.
Part 3: While connected to the Internet
11. Click the "Synchronize" link and choose "Upload transactions" to
upload the transactions recorded during the offline circulation
session.
12. Navigate to /cgi-bin/koha/offline_circ/list.pl (there will be a
link from the Offline circulation page) and review the
transactions, as described in the documentation for the Firefox
Offline circulation plugin:
http://wiki.koha-community.org/wiki/Offline_circulation_firefox_plugin
RM note: the IndexedDB jQuery plugin bundled with this patch is
copyright 2012 by Parashuram Narasimhan and other contributors and is
licensed under the MIT license. The home page for the plugin is
http://nparashuram.com/jquery-indexeddb/.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works very well, no koha-qa errors
Tested with Firefox 24.0
1) did some checkouts pre sync
2) synchronize database (Download)
3) go offline
4) Proceed to check in some items from a patron
5) Proceed to check out items to patrons, setting a date
6) Proceed to check out to an expired patron, warning appears
7) go online
8) Upload records
9) go to review transactions and proceed
10) verified on patrons that checkins/checkouts are done
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Chris Hall [Fri, 22 Mar 2013 02:34:17 +0000 (15:34 +1300)]
bug 10365: change routing slips to use date published rather than planned date
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Jonathan Druart [Wed, 17 Jul 2013 12:38:55 +0000 (14:38 +0200)]
Bug 10602: Set default value for authority fields via the framework
This patch allows defining default values in the authorities frameworks.
Some code already existed, but the feature did not work.
Test plan:
1/ Choose a framework, field and subfields.
2/ Define a default value.
3/ Create a new authority and check that the subfield is
automatically filled with the default value.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. koha-qa reports some tab errors, fixed in a follow-up
Test
1) Apply patch, run updatedatabase.pl
2) Edit an auth framework, put a default value somewhere, save
3) Add new auth, default value present
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Verified database update is done correctly.
Controlfields 0xx
- Edited an existing field (001)
- Set a default value for subfield @
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Fields
- Edited an existing field (100)
- Set a default value for subfield e
- Edited subfield again, checking default was saved correctly
- Verified the default shows up correctly when creating a
new authority using this authority type
Mirko Tietgen [Fri, 14 Dec 2012 17:01:37 +0000 (18:01 +0100)]
Bug 9295: Introduce operator equal/notequal to OAI set mapping instead of hardcoded 'equal' value.
In OAI set mappings, the value "is equal to" is hardcoded. This
enhancement changes it to a dropdown menu to choose between "is equal
to" and "not equal to".
To test:
* define a set
* define a mapping for said set with "is equal to"
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
* change mapping to 'not equal to', save
* run /misc/migration_tools/build_oai_sets.pl -r -v
* confirm that you have correct entries in SQL: select * from
oai_sets_biblios;
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com> Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Amended patch: Fix bug id in updatedb.pl
Galen Charlton [Thu, 10 Oct 2013 22:46:18 +0000 (22:46 +0000)]
Bug 9282: (follow-up) remove log noise caused by authorities/ysearch.pl
This patch (which is incidental to the main patches for bug 9282),
improves the AJAX authority search code by fixing incorrect construction
of the parameters to SearchAuthorities() that led to errors like this
in the Apache log when doing auto-completion searches on "main entry"
and "anywhere" in the authority finder:
ysearch.pl: Use of uninitialized value $i in string eq ...
In the process, this patch also removes a use of the now-deprecated
Perl smartmatch operator.
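For illustration, the usual shape of such a change (a generic sketch only; the variable names are assumptions and not the actual ysearch.pl code):

use strict;
use warnings;

my $searchtype = 'mainentry';

# Smartmatch version (deprecated, warns on recent Perls):
#   if ( $searchtype ~~ [ 'mainentry', 'match' ] ) { ... }

# Equivalent membership test without smartmatch:
if ( grep { $_ eq $searchtype } ( 'mainentry', 'match' ) ) {
    print "matched\n";
}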
To test:
[1] Verify that the main test plan for bug 9282 still works.
Fridolyn SOMERS [Thu, 13 Dec 2012 13:38:11 +0000 (14:38 +0100)]
Bug 9282: improve auto-completion for authority record finder
This patch adjusts the auto-completion on the authority record
finder (accessed from the bib editor) so that if you do
start typing in the "Main entry ($a only)" input field, it will
return only the $a of the main heading for matching authority
records.
This fixes a problem where typing "shakes", choosing
"Shakespeare, William, 1564-1616" from the auto-completion
result list, then hitting the search button fails to bring
up results, as the dates come from the $d of the 100 field
(in MARC21).
Signed-off-by: Mathieu Saby <mathieu.saby@univ-rennes2.fr> Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised.
Tested with an authority where I added my search term in $b.
The modified authority came up in main entry, not in mainmainentry.
That was the desired result.
Galen Charlton [Thu, 10 Oct 2013 21:57:29 +0000 (21:57 +0000)]
bug 5202: (follow-up) tweak display of merge action link on staged batch page
This patch adds a vertical bar between the link to display the
matching authority record and the link to initiate a record
merge; prior to this patch, the two links ran together.
The link on the merge reference had been wrong, and framework types
were not indicated when you were merging records of two types. This
patch fixes those problems.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
When rebasing the patch for this, I accidentally moved the line
setting the mergereference when merging from breeding to *before*
I checked whether we were merging from breeding. Needless to say,
this caused problems. This patch fixes that, and:
* Makes the error translatable.
* Adds a missing authtypesloop argument for the template.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Due to massively incorrect data in the default authority frameworks,
we were getting warnings about undefined values and spaces from checking
which tab subfields/fields were displayed in. This patch eliminates those
warnings.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 5202: merge authorities from the authority file and reservoir
This patch gives Koha the ability to merge authority records using the
same interface used by bibliographic records, though slightly different
methods for selecting which records to merge. The two ways to select
records are as follows:
1) Records can be selected from authority search results by clicking
the "Merge" link for two records.
2) Authority records can be merged from the reservoir by clicking the
merge-related links in the Manage staged MARC batch screen.
To test:
1) Apply patch.
2) Do a search for an authority record that will turn up multiple
identical records (or at least two records that you don't mind
merging).
3) Click the "Merge" link for the first record.
4) Click the "Merge" link for the second record.
5) Choose which fields from which record you want to appear in the
resulting record.
6) Confirm that those are the fields that exist in the resulting record.
7) Stage an authority record (for example, an authority record you
saved from your catalog).
8) Search for a record to merge with it using the "Search for a record
to merge in a new window" link.
9) Merge these records, confirming that the resulting record (after
going through the entire merging process) matches your expectations.
10) Set up a matching rule for authorities, and export an authority from
your catalog that will match based on that rule. For MARC21, the
following is a good choice for a rule:
Matching rule code: AUTHPER
Description: Personal name main entry
Match threshold: 999
Record type: Authority record
[Match point 1:]
Search index: mainmainentry
Score: 1000
Tag: 100
Subfields: a
11) Stage the record you just exported, choosing the matching rule you
just created.
12) Merge the record using the "Merge" link, confirming that the
resulting record (after going through the entire merging process)
matches your expectations.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Testing notes on last patch in series.
Mathieu Saby [Mon, 9 Jan 2012 07:58:20 +0000 (08:58 +0100)]
Bug 7421: support indexing UNIMARC authority records using the DOM Filter
I took as a base the patch of F. Demians, but made a lot of changes,
so I think it is more logical to create a new patch, as the behavior is
not the same as the previous patch.
I tried to define the DOM config files as a "mirror" of record.abs, so
that the behavior would be the same.
If it is OK, we will be able to improve indexing later, for example by
suppressing warnings, managing indicators or subdivisions, etc.
I made some small changes to record.abs:
- comments
- 216 was indexed in Conference-name as well as Trademark. I suppose
that "Conference-name" is an error, so I indexed it only in Trademark
- indexed 2 new notes: 340 / 356
The only difference between record.abs and DOM is that the DOM config
files do not index complete fields, but subfields.
Ex:
melm 200 ===> <kohaidx:index_subfields tag="200" subfields="abcdfgjxyz">
I took all the subfields from the UNIMARC Authorities manual. The only
subfields not indexed are numeric subfields: $7, $8 for language of
record, and $0,2,3,5,6 for 4XX/5XX/7XX
To test :
- index a set of bib and auth records with GRS-1
- make some searches on different kind of authorities
- index the same records with DOM
- make the same searches
- You are not supposed to see differences
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
As I am not a UNIMARC user it's hard for me to test this, but
while testing other authority related patches I noticed that I couldn't
index the UNIMARC authorities of the sample base. The files are obviously
missing and reindex_zebra.pl notes this. With this patch applied,
indexing works and authorities are searchable in my installation.
Galen Charlton [Thu, 10 Oct 2013 20:59:11 +0000 (20:59 +0000)]
Bug 7421: add regression tests for UNIMARC authorities DOM indexing
This patch enhances t/db_dependent/Search.t to test indexing and
searching of authority records. It supplies a sample authority file
for MARC21 (consisting of the record for William Shakespeare from the
Library of Congress) and a sample UNIMARC authority file supplied
by Henri-Damien Laurent.
It also adds tests for both MARC21 and UNIMARC authorities.
When the main patch for bug 7421 is applied, t/db_dependent/Search.t
should pass.
Janusz Kaczmarek [Thu, 14 Mar 2013 15:25:40 +0000 (17:25 +0200)]
Bug 10335: display translated forms of headings in UNIMARC authorities correctly
To reproduce and test:
1) Create an authority record with main heading (100) in Latin script
(e.g. Oppenheimer, Aharon -- subfields $a and $b) and parallel form
(700) in Hebrew (אופנהיימר, אהרן -- subfields $a and $b).
Mark it correctly in $8 with freheb (or engheb if you like);
2) Reindex and search;
3) You will see:
Oppenheimer Aharon
freheb: אופנהיימר
Whereas you would rather like to see (note the language label and the missing $b above):
Oppenheimer, Aharon
Hebrew: אופנהיימר, אהרן
The patch corrects the issue and should not harm those who (improperly)
put only one triple in $8
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. No koha-qa errors.
Same result on OPAC and STAFF
Turns out that the test plan is wrong:
you need to fill tag 200ab, not 100ab, for the main heading.
I filled 100a with some example data from the UNIMARC auth manual.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Took me a bit to figure it out, works according to test plan.
In UNIMARC DOM indexing, the "item" index was working only for the subfields
of the 995 field that are mapped to specific indexes as well as to the item
index (e.g. $a, $b...). It was not working for the other subfields (e.g. $g),
because a comment from record.abs had been carried over into the DOM config
files. This patch removes the comment.
To test, in a DOM UNIMARC environment :
1) In an item, write some value "Test10037" in 995$g
2) Search for this value in simple search, this way : item=Test10037
=> you should have no results
3) Apply the patch. If necessary, copy the modified
etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml and
etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl into
the /etc/... directory in your main Koha directory
4) Reindex Zebra biblios
5) Do the same search as 2) => you should have one result
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Work as described. No koha-qa errors.
Test
NOTE: the default UNIMARC framework doesn't have 995$g,
so I had to add it first.
1) Added test string to 995b on some record
2) Reindex and search as indicated, no results
3) cp files to destination
4) reindex
5) search and result ok !
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Galen Charlton [Thu, 10 Oct 2013 18:50:13 +0000 (18:50 +0000)]
Bug 10037: regression test for searching UNIMARC item index in DOM mode
This patch adds a regression test that verifies that searching the
UNIMARC 'item' index for a value that is indexed by virtue of being
included in the fallback index definition for the 995 field works for
both GRS-1 and DOM.
The main patch will allow t/db_dependent/Search.t to pass.
This patch adds a few basic search and indexing tests for UNIMARC,
using sample data provided by Frédéric Demians. The new tests
considered a starting point for future tests.
To test:
[1] Verify that prove -v t/db_dependent/Search.t passes.
Galen Charlton [Thu, 10 Oct 2013 17:58:30 +0000 (17:58 +0000)]
Bug 8252: (follow-up) make it easier to add UNIMARC indexing tests
This patch refactors t/db_dependent/Search.t so that it can be
subsequently modified to add tests for indexing and searching
UNIMARC records without duplicating the setup code.
This patch also moves the MARC21 records from t/db_dependent/data/zebraexport
to t/db_dependent/data/marc21/zebraexport so that UNIMARC sample
data can be slotted in t/db_dependent/data/unimarc/zebraexport.
To test:
[1] Verify that prove -v t/db_dependent/Search.t passes.
Galen Charlton [Thu, 10 Oct 2013 17:28:51 +0000 (17:28 +0000)]
bug 8252: (follow-up) add search tests on music number
There was some churn in previous patches about the desired
name of the music publisher number index, so this patch
adds tests to confirm that both the old and new names
work as CCL keywords.
Galen Charlton [Thu, 10 Oct 2013 17:18:43 +0000 (17:18 +0000)]
bug 8252: (follow-up) test both GRS1 and DOM indexing
This patch expands t/db_dependent/Search.t to run
the same tests using both the GRS-1 and DOM indexing
modes. It also adds hooks in zebra_config.pl to make
it easier to stage test cases for non-MARC21 Zebra
indexing.
Note that in DOM mode one of the tests is currently a
TODO, as relevance ranking for weighted queries differs
between GRS-1 and DOM.
To test:
[1] Verify that prove -v t/db_dependent/Search.t passes.
This patch fixes the biblio-zebra-indexdefs.xsl file.
It was generated from biblio-koha-indexdefs.xml with the new
koha-indexdefs-to-zebra.xsl amended by F. Demians's patch.
To test:
- Take a DOM UNIMARC Koha
- Apply all the patches of bug 8252, including this one
- Copy src/etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
to your etc/zebradb/marc_defs/unimarc/biblios/ located in your
installation directory
- Run rebuild_zebra -b -x -r -v
- make advanced searches on staff interface and opac, on coded fields
indexes (Audience, Literary genre, Biography, Illustration, Content,
Video Types, Serial Type, Periodicity, Regularity, Picture)
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Ok for me. This patch puts the indexes' XSL definition in sync with the
authoritative XML definition. Subsequently, it won't be difficult to
amend the DOM UNIMARC index definitions if necessary. And, as it is, I don't
see any regression, whereas I can see huge improvements. Thanks Mathieu!
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
I applied the patch and ran
xsltproc koha-indexdefs-to-zebra.xsl ../marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml \
> ../marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
I looked at the generated file. It looks nice.
Then I copied the file into my INSTALLDIR/etc/zebra.... and reindexed my
records with rebuild_zebra.pl
I made some searches on coded position index and non coded position
indexes, everything works.
Mathieu Saby [Mon, 6 May 2013 17:58:09 +0000 (19:58 +0200)]
Bug 8252: Followup for Date/time-last-modified and Music number
This follow-up restores the original wording of the "Date/time-last-modified"
index, and changes the name of the "Music-number" index to
"Number-music-publisher"
To test:
1. In a UNIMARC Koha instance
2. Apply patchs #1, #2 and this followup
3. Copy from src/etc/zebradb directory to the etc/zebradb/ in your main
Koha directory the following files:
-- zebradb/biblios/etc/bib1.att
-- zebradb/ccl.properties
-- zebradb/marc_defs/unimarc/biblios/record.abs
-- zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml
-- zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
4. Rebuild zebra with -b -x -v -r options
5. Write a value like "test071a" in 071$a field in a record
6. Check if you can find this record with this search:
"ccl=Number-music-publisher:test071a"
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Test
Copy files
reindex full
Modify a couple of records to add 071a with a test message
Reindex -v -z -b -x
Search test message as described and found modified records.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Mathieu Saby [Wed, 27 Mar 2013 15:04:28 +0000 (16:04 +0100)]
Bug 8252: Fix indexing of UNIMARC 1xx for DOM
This patch makes the same changes in UNIMARC DOM configuration as patch
1 made for GRS-1.
Positions of subfields are indexed this way:
In biblio-koha-indexdefs.xml :
tag="100" subfields="a" offset="17" length="1"
In biblio-zebra-indexdefs.xsl :
xslo:value-of select="substring(., 17, 1)"
I had to edit biblio-zebra-indexdefs.xsl by hand, because
etc/zebradb/xml/koha-indexdefs-to-zebra.xsl only supports
"substring" in the handle-one-index-control-field template.
That is fine for MARC21, but not for UNIMARC: in MARC21, indexing
substrings is needed for control fields (001-009, with no subfields),
but in UNIMARC it is needed for subfields of 1XX fields.
So if DOM indexing is working with these new files, we may need to
change koha-indexdefs-to-zebra.xsl.
Test plan (not possible in a sandbox) :
1) In a Koha instance using UNIMARC and DOM indexing
2) Apply Patch 1 and Patch 2 (this one)
3) Copy the following files from the etc/zebradb directory of your
source into the etc/zebradb directory of your main Koha directory :
-- etc/zebradb/marc_defs/unimarc/biblios/biblio-koha-indexdefs.xml
-- etc/zebradb/marc_defs/unimarc/biblios/biblio-zebra-indexdefs.xsl
-- etc/zebradb/ccl.properties
-- etc/zebradb/biblios/etc/bib1.att
4) rebuild zebra with -x -b -r -v options
5) check if coded filters in advanced search are usable in OPAC and
Staff interface
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works. No koha-qa errors.
Test for DOM
Apply patches
Don't forget to copy files
reindex
Search by coded fields works, also Country-publication
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Mathieu Saby [Fri, 1 Feb 2013 13:52:05 +0000 (14:52 +0100)]
Bug 8252: Fix indexing of UNIMARC 1xx for GRS-1
Before fixing UNIMARC DOM indexing, we must fix GRS-1 indexing
1) In advanced search, some Coded fields index are not working: Print,
Illustration, Content
2) Country-heading index is not working
3) Some subfields are indexed in the wrong indexes:
102$a should be in Country-publication instead of Country-heading
(non defined in bib1.att)
106$a, filled only for printed works, should be in ff88-23 (form of
item) instead of itype. (ff88-23 is made for Marc21 008 pos
23, which contains the same data as 106a)
200$b should be in Material-type instead of (or in addition to) itype
and itemtype: (Material-type :"free-form string, ... that
describes the material type of the item, e.g., cassette, kit,
computer database, computer file.")
100$a pos 22-24 should not be indexed as "ln": it is the language of
the record, not the language of the resource
4) Index names are too long: if we index new positions of coded fields
with the existing names, it breaks Zebra indexing (there must be a limit
on line length in record.abs?)
5) There are a lot of warnings when rebuilding zebra.
This patch makes some changes in bib1.att (which could be used later to
improve search):
- fixing wording for att 51 and 1012
- adding comments for attributes based on MARC21 008 field (8800-8841)
- creating 8806 (tpubdate), 8838 (Modified-code), 8818 (ff8-18), 8840
(ff8-18-21), 8819 (ff8-19), 8821 (ff8-21), 8828 (ff8-28), 8830
(ff8-30), 8831 (ff8-31)
- creating attributes specific to UNIMARC : 9701-9707 (Video-mt,
Graphics-type, Graphics-support, Title-page-availability,
Cumulative-index-availability, script-Title, char-encoding)
- setting apart 3 blocks of attributes, so it could be easy to make
further changes :
-- common to Marc21 and UNIMARC : 8806, 8822, 8838
-- slightly different in Marc21 and UNIMARC (different meanings
according to the type of the record => don't match a single
UNIMARC field)
-- specific to UNIMARC : 9701-9707
In ccl.properties :
- creating a new index: Country-publication 1=1053
- suppressing some warns by mapping with bib1 att:
Date-time-last-modified, Name, rtype, Music-number
- defining indexes using the 3 blocks attributes defined in bib1
(common to Marc21 and UNIMARC, slightly different, specific to UNIMARC)
In record.abs :
- renaming some index for 100-105-110 fields
- correcting indexing of 102$a (country of publication)
106$a (ff88-23)
100$a pos 22-24 (language of record, no more
indexed)
105$a pos. 0-3 (illustration code)
200$b (for the moment, I keep it indexed in
itype and itemtype, but also Material-Type)
In C4/Search.pm :
- adding "Country-publication" index
In OPAC and staff interface template subtypes_unimarc.in :
- renaming indexes to take into account the changes made to Zebra
config files
To test (this cannot be done with a sandbox) :
1) Apply the patch in a UNIMARC GRS-1 Koha instance
2) Copy the following files from the etc/zebradb of your source
directory into the etc/zebradb of your main Koha directory:
-- etc/zebradb/biblios/etc/bib1.att
-- etc/zebradb/ccl.properties
-- etc/zebradb/marc_defs/unimarc/biblios/record.abs
3) Reindex your data (rebuild_zebra -x -b -r -v)
4) Try to use those Coded fields indexes in Advanced search, in OPAC
and Staff interface (available after clicking on "More options",
then on "Coded information filters"):
Audience, Print, Literary genre, Biography, Illustration, Content,
Video Types, Serials, Serial Type, Periodicity, Regularity
5) Try to search "Country-publication=FR" in simple search
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Tests for GRS-1
Followed test plan
Search by coded fields works, but only on OPAC,
on staff there are few options
Search by Country-publication works after patch
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Galen Charlton [Wed, 9 Oct 2013 05:26:27 +0000 (05:26 +0000)]
bug 6201: (follow-up) update unit tests
This patch updates the unit tests for the BibTeX export
to add a regression test for supplying the author for
non-UNIMARC records. It also adjusts the test to reflect
the change in quote character from "" to {}.
To test:
[1] Verify that prove -v t/db_dependent/Record.t passes
Bug 6201: Add fields 1xx to marc2bibtex (for MARC21 and NORMARC)
Modified Record::marc2bibtex to validate fields 100, 110 and 111 in
non-UNIMARC flavours.
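A rough sketch of how the main entry might be pulled from a MARC21/NORMARC record (illustrative only: the real Record::marc2bibtex logic and the BibTeX assembly differ):

use strict;
use warnings;
use MARC::Record;

# Return the author for BibTeX from the first 1xx main entry field present.
sub main_entry_author {
    my ($record) = @_;
    for my $tag (qw(100 110 111)) {
        my $field = $record->field($tag);
        return $field->subfield('a') if $field;
    }
    return;
}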
Test plan:
1) Search any book in the OPAC with a main entry (1XX in MARC21, 700-720 in UNIMARC)
2) Export the record in the BibTeX format
==> The output won't contain the main entry.
3) Apply the patch
4) Export the record
==> The record will contain the main entry
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixes a long standing bug.
Passes all tests and QA script.
Tested with multiple records, seems to work well.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Fridolyn SOMERS [Fri, 7 Jun 2013 10:14:02 +0000 (12:14 +0200)]
Bug 10430 - status filter not working in serial claims when translated
With a translated intranet (i.e. fr-FR) the status filter does not work
for the "Late" status. This is because the status in the combobox filter is
translated as "Retard" while the status in the table is translated as "En retard".
This patch changes the JavaScript filter to work on a status code instead
of the status name.
The new classes may be used to change CSS depending on status.
Test plan :
- Use a translated intranet (ie fr-FR)
- Go to serials claim of a vendor with issues of multiple status
- Check that status filter does its work
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch passes all tests and QA script.
The status filter should probably be changed to only
allow filtering on status that can appear on the page.
Jonathan Druart [Tue, 5 Mar 2013 12:52:20 +0000 (13:52 +0100)]
Bug 9747: Fix NSB/NSE sorting issues on z3950 search results
At least the BNF server returns results containing non-sorting
characters (NSB/NSE).
In order to sort results correctly in the presence of these characters,
this patch adds a new DataTables sorting function.
Test plan:
- search 'tintin' on the z3950 search (cataloguing/z3950_search.pl)
- sort on title (default sort) and check that results are not well
sorted.
- apply this patch
- do the same search and check that the first result is "Hergé. Les
Aventures de Tintin..."
The value of the cell is:
<td>\88Hergé. Les \89Aventures de Tintin...</td>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Works as advertised and doesn't break existing searching
Owen Leonard [Mon, 9 Sep 2013 18:46:23 +0000 (14:46 -0400)]
Bug 10794: fix sorting on billing date column in invoices table
This patch adds the sorting by title string option to the table of
invoices. This allows column data to be sorted based on the
ISO-formatted date rather than the formatted-for-display date. A "blank"
(0000-00-00) date is added to cells which contain no date data.
To test, view the Acquisitions Invoices page (acqui/invoices.pl) and
confirm that the "Billing date" column is sorted correctly regardless of
the dateformat system preference.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Template only change, passes all tests and QA script.
Sorting of the billing column now works as expected.
Bug 10641 - GetBooksellerWithLateOrders in C4::Bookseller.pm has some incoherences
This patch fixes some incoherences in the routine
GetBooksellerWithLateOrders().
Now it considers the field $estimateddeliverydateto and replaces it
with now() only if it is undef.
Also, it no longer tests whether aqbookseller.deliverytime is not NULL;
instead, if $deliverytime is NULL or undef, it replaces it with 0.
It also verifies that $delay is >= 0 and returns undef if it is a
negative value.
To Test:
Before, this routine sorted out the booksellers with late orders. If a
bookseller did not specify a delivery time, it would never appear in
the list of late orders. Moreover, if the field "Estimated delivery
date to" was specified, it did not take the value into account and
returned the late orders up to today's date.
Now, the returned list considers all the fields given, and if the
delivery time of the bookseller is not specified, it calculates the
late orders as if the delivery time were 0. By default, all booksellers
which have late orders up to today are listed unless "estimated
delivery date to" is specified.
prove t/db_dependent/Bookseller.t
t/db_dependent/Bookseller.t ..
[Some warnings about uninitialized values]
WARNING: GetBooksellerWithLateOrders is called with a negative value at C4/Bookseller.pm line 135.
t/db_dependent/Bookseller.t .. ok
All tests successful.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Kenza Zaki [Wed, 28 Aug 2013 08:46:07 +0000 (10:46 +0200)]
Bug 10795: Improvements for GetOpenIssue in C4::Circulation
This patch adds some improvements to the routine GetOpenIssue().
Now it verifies that the parameter is given (if not, it returns undef)
and it returns $sth->fetchrow_hashref() directly instead of an
intermediate $issue variable.
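A minimal sketch of that shape of change (illustrative only: the parameter and SQL shown are assumptions, not the actual routine):

use C4::Context;

sub GetOpenIssue {
    my ($itemnumber) = @_;
    return unless $itemnumber;    # undef when no parameter is given

    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare(
        "SELECT * FROM issues WHERE itemnumber = ? AND returndate IS NULL"
    );
    $sth->execute($itemnumber);
    return $sth->fetchrow_hashref();
}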
To test:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 2 wallclock secs ( 0.06 usr 0.01 sys + 1.09 cusr 0.07 csys = 1.23 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Same situation as the one noted in a comment on
Bug 10683: the test fails unless there is an issuing rule
All/All with 1 as renewals allowed.
With that condition, it succeeds.
No koha-qa errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kenza Zaki [Wed, 28 Aug 2013 08:08:10 +0000 (10:08 +0200)]
Bug 10683: add unit tests for CRUD routines in C4::Circulation
This patch adds some unit tests wrapped in a transaction for
C4::Circulation.pm.
Circulation_Branch.t adds tests for routines which deal with
branch_item_rules, branch_borrower_circ_rules,
default_branch_circ_rules, default_circ_rules, and
default_branch_item_rules in the database.
Circulation_issue.t adds tests for routines which deal with accountline
and issues in the database.
NOTE: Some commented tests have to be fixed, and some tests can be added.
Moreover, other routines of Circulation.pm are tested in the patches:
10692 UT: Routines about transfers in Circulation.pm need unit tests
10710 UT : OfflineOperation's routines in C4/Circulation.t need unit tests
10767 UT: Routines which interact with the table issuingrules in C4/Circulation need unit test
Test plan:
prove t/db_dependent/Circulation_issue.t
t/db_dependent/Circulation_issue.t .. ok
All tests successful.
Files=1, Tests=16, 0 wallclock secs ( 0.02 usr 0.00 sys + 0.40 cusr 0.02 csys = 0.44 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Well, I don't know exactly what to do with this, so
I left it to QA.
a) prove t/db_dependent/Circulation_Branch.t works well and without errors
b) prove t/db_dependent/Circulation_issue.t works without errors for me
ONLY if I have an issuing rule for All/All with 1 as renewals allowed; in
other cases it fails.
No koha-qa errors
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kenza Zaki [Thu, 8 Aug 2013 08:39:26 +0000 (10:39 +0200)]
Bug 10698: give C4::Circulation::DeleteTransfer() a return value
This patch adds return values to DeleteTransfer:
Undef if no parameters are given
1 if a Transfer is deleted
0E0 if a wrong parameter is given
It also fixes some unit tests in t/db_dependent/Circulation_transfers.t
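Those return values follow naturally from DBI's execute(), which returns undef on error, the affected row count on success, and the special true-but-zero string "0E0" when no rows matched. A sketch of the idea (illustrative only: the SQL and parameter are assumptions, not the actual patch):

use C4::Context;

sub DeleteTransfer {
    my ($itemnumber) = @_;
    return unless $itemnumber;    # undef when called without a parameter

    my $dbh = C4::Context->dbh;
    my $sth = $dbh->prepare(
        "DELETE FROM branchtransfers WHERE itemnumber = ? AND datearrived IS NULL"
    );
    # execute() returns 1 when a row was deleted and "0E0" when nothing matched.
    return $sth->execute($itemnumber);
}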
To test:
prove t/db_dependent/Circulation_transfers.t
t/db_dependent/Circulation_transfers.t .. ok
All tests successful.
Files=1, Tests=14, 20 wallclock secs ( 0.03 usr 0.00 sys + 0.39 cusr 0.02 csys = 0.44 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Applied 10681 and 10692 before 10698
Run prove t/db_dependent/Circulation_transfers.t without errors
No koha-qa errors on all 3 patches
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kenza Zaki [Mon, 12 Aug 2013 12:40:50 +0000 (14:40 +0200)]
Bug 10711: make GetOfflineOperation() return $sth->fetchrow_hashref instead of $result
To test :
prove t/db_dependent/Circulation_OfflineOperation.t
t/db_dependent/Circulation_OfflineOperation.t .. ok
All tests successful.
Files=1, Tests=7, 19 wallclock secs ( 0.02 usr 0.01 sys + 0.24 cusr 0.02 csys = 0.29 CPU)
Result: PASS
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: applied 10681 and 10710 before 10711
prove t/db_dependent/Circulation_OfflineOperation.t run without errors
koha-qa reports only 2 tabulations, fixed in a follow-up
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kenza Zaki [Mon, 5 Aug 2013 09:28:03 +0000 (11:28 +0200)]
Bug 10681: remove inappropriate uses of finish() from C4::Circulation
This patch gets rid of finish().
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
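For illustration, the pattern being removed generally looks like this (a generic sketch with placeholder connection details, not an excerpt from C4::Circulation):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:mysql:database=koha", "koha", "password" );
my $sth = $dbh->prepare("SELECT itemnumber FROM issues WHERE borrowernumber = ?");
$sth->execute(42);
while ( my $row = $sth->fetchrow_hashref ) {
    print "$row->{itemnumber}\n";
}
$sth->finish;    # unnecessary: the loop above has already fetched every row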
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Kenza Zaki [Mon, 5 Aug 2013 09:50:40 +0000 (11:50 +0200)]
Bug 10682: remove inappropriate uses of finish() from C4::Reserves
This patch gets rid of finish.
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author
I run prove t/db_dependent/Reserves.t without errors
don't know if more tests are needed.
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kenza Zaki [Mon, 5 Aug 2013 13:55:40 +0000 (15:55 +0200)]
Bug 10685: remove inappropriate uses of finish in C4::Accounts.pm
This patch gets rid of finish.
From the man page
finish()
Indicate that no more data will be fetched from this statement handle
before it is either executed again or destroyed.
You almost certainly do not need to call this method.
Adding calls to "finish" after loop that fetches all rows is a common
mistake, don't do it, it can mask genuine problems like uncaught fetch errors.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Similar to other patches from the same author.
Run prove t/db_dependent/Accounts.t without errors
No koha-qa errors
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Kyle M Hall [Thu, 19 Sep 2013 16:18:11 +0000 (12:18 -0400)]
Bug 10917: allow staff to update hold pickup location without modify_holds_priority
A librarian cannot modify a hold's pickup location unless he or she has
the modify_holds_priority permission. This appears to be a bug and not
by design. The update fails because the priority is not being passed
to ModReserve; the priority is not displayed when a librarian does not
have the modify_holds_priority permission.
Test Plan:
1) Remove the modify_holds_priority from your logged in user
* User cannot be a superlibrarian
2) Attempt to change the pickup location for a hold
3) Note the change fails
4) Apply this patch
5) Repeat step 2
6) Note the change succeeds
Signed-off-by: Owen Leonard <oleonard@myacpl.org> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Template change only.
Passes all tests and QA script.
Tested:
- Confirmed that changing the pickup location is not
possible in master without the modify_holds_priority
permission
- Confirmed that the patch fixes the problem - pickup
location can now be changed with and without the
permission
- Pickup location can still not be changed, when
IndependentBranches is activated
Mark Tompsett [Tue, 27 Aug 2013 21:29:20 +0000 (17:29 -0400)]
Bug 7764: rework the INSTALL.ubuntu instructions
This is a major rework. Key improvements include:
- Removed confusing multiple versions for Ubuntu leaving only one set
of instructions.
- The packages koha-deps and koha-perldeps are used.
- License has been updated to reflect GPL3.
- More wiki reference links have been included.
- It is aimed at installing from source, whether from a tarball or from git.
- Sample output has been cut as much as possible.
- Almost cut-and-paste easy, making it friendlier than INSTALL.debian.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed a tiny typo: seperate
Makes all sense to me - only wondering a bit about the recommendation
of using lynx for the web installer.
Quite an improvement, so passing QA.
Kyle M Hall [Wed, 9 May 2012 16:47:19 +0000 (12:47 -0400)]
Bug 5544: prefer library email address over admin address as notice sender
Right now overdues come from the branch, but the
others come from the admin email address - this
is a problem in multi-branch systems because they
have to come up with one email address that all
branches have access to.
C4::Letters::_send_message_by_email currently sets
the from address in the following order:
1) Address specified in message
2) Koha admin email address
The order will now be:
1) Address specified in message
2) Borrower's home library email address
3) Koha admin email address
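In other words, the fallback chain is roughly the following (a self-contained sketch; the hash keys and addresses are hypothetical stand-ins, not the actual _send_message_by_email code):

use strict;
use warnings;

# Hypothetical data standing in for the message and the patron's home library.
my $message       = { from_address => undef };
my $library       = { branchemail  => 'circdesk@example.org' };
my $admin_address = 'koha-admin@example.org';    # stands in for KohaAdminEmailAddress

# Message address first, then the home library's email, then the admin address.
my $from_address =
     $message->{from_address}
  || $library->{branchemail}
  || $admin_address;

print "$from_address\n";    # circdesk@example.org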
Test Plan:
1) Set your library email addresses and the KohaAdminEmailAddress.
Make sure each of them is unique
2) Choose a borrower, enable the enhanced messaging and enable the
checkout and checkin email notices. Use your email address for
the borrower's email so you can receive the emails.
3) Check out an item, check the from address of the email,
it should be the email address set in KohaAdminEmailAddress
4) Apply the patch
5) Return the item, check the from address of the email,
it should match the email address set for the borrower's
home library.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz> Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Julian Maurice [Fri, 27 Sep 2013 09:48:01 +0000 (11:48 +0200)]
Bug 10856: Fix cover display in shelf browser
Signed-off-by: Owen Leonard <oleonard@myacpl.org> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
I was able to reproduce the problem with local covers and
the patch fixes it in my tests.
Owen Leonard [Tue, 17 Sep 2013 12:33:01 +0000 (08:33 -0400)]
Bug 10856: (Follow-up) improve behavior of the "close shelf browser" link
In Firefox at least, the shelf browser cannot be reopened after
hiding it with the "close shelf browser" link. This followup improves
the behavior of the "close shelf browser" link so that the shelf browser
can be redisplayed.
To test, open a bibliographic detail page in the OPAC and click a
"browse shelf" link. Click the "close shelf browser" link--the shelf
browser should be hidden. Click the original "browse shelf" link and the
shelf browser should reappear without reloading the page.
Test with Firefox and Chrome (at least).
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
Testing notes:
- New unit tests in t/db_dependent/ShelfBrowser.t pass.
- System preference OPACShelfBrowser still works as expected.
- Closing and opening the shelf browser works as expected.
- Next and Previous links show new and nicer behaviour.
- Logs are clean.
Tested with Firefox and Chromium under Ubuntu.
Notes: The currently displayed record could maybe be highlighted
a bit better.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Jonathan Druart [Tue, 14 May 2013 13:11:11 +0000 (15:11 +0200)]
Bug 10856: Improve the previous and next items on the shelf browser
The next and previous links should completely refresh the shelf.
For example:
[<] [1] [2] [3] [4] [5] [6] [>]
Before this patch, the next and previous links were the same as the 1
and 6.
With this patch, after clicking on next, we will get:
[<] [7] [8] [9] [10] [11] [12] [13] [>]
This patch adds a new AJAX script to get the shelf browser block.
Test plan:
- On a detail biblio page, click on a "Browse shelf" link.
- Play with the next and previous links.
- Deactivate Javascript (using NoScript for example) and check that you
get the same behavior (but the page is reloaded).
- Launch the unit tests: prove t/db_dependent/ShelfBrowser.t
Signed-off-by: Owen Leonard <oleonard@myacpl.org> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
David Cook [Tue, 7 May 2013 07:06:33 +0000 (17:06 +1000)]
Bug 10096 - Add a Z39.50 interface for authority searching
This patch introduces a new Z39.50 interface for searching Z39.50
compliant databases for MARC authority records.
These databases aren't as common as their bibliographic equivalents,
but they're out there and very useful. I have included info at the
bottom of this message for sample authority databases you can try.
To test this patch:
1) Set up Z39.50 client targets for authority databases. (I've included
information at the bottom of this message for LibrariesAustralia's
test server for authorities, as well as instructions on how to use
your Koha's Z39.50 authority server.) The Library of Congress
also has authority databases available (unsure if these are test or
prod), and you might have access to others through OCLC or RLIN. OCLC
provides login credentials for their test databases.
2) Go to the Authorities module
3) Click on the new "Z39.50 search" button
4) Select your authority search targets from the list.
5) Do a search for an authority you would like using either the "Raw"
input box or the more specific input boxes for names, subjects, subject
sub divisions, or titles. (I like searching Name (personal): Eric on
the LibrariesAustralia test DB.)
6) You should see a table listing the server, heading, authority type,
and two other columns (MARC and a nameless column). "Authority type"
is the type of authority it will become when imported in to Koha. In
the Eric example, "PERSO_NAME".
7) Click on "MARC" next to the results of interest to review the MARC
authority record.
8) When you're satisfied with a record, click on "Import".
9) The pop-up window will close and your original Koha window will
change to the "Adding authority Personal Name" screen (in the Eric
example).
10) All the relevant fields should be filled out for the record. Review
them and make any changes as necessary. (N.B. The 001 will be cleared
when saved, so if you have a use for the imported control number, move
it to the 010, 016, or 035 as appropriate. If you have a default value
for the 003, this will also likely be overwritten. Move it if necessary.
The 005 will also be updated when saved, so do not worry about that.)
11) When you're satisfied, click save.
12) Presto! You've imported your first authority record via Z39.50!
--
Here is the info for the LibrariesAustralia test Z39.50 authority
database:
The U.S.A. Library of Congress also provides Z39.50 access to its Name
and Subject Authorities (http://www.loc.gov/z3950/lcserver.html).
Name Authority:
Z39.50 server: Library of Congress Name Authority File
Hostname: lx2.loc.gov
Port: 210
Database: NAF
Syntax: MARC21/USMARC
Encoding: utf8
Subject Authority:
Z39.50 server: Library of Congress Subject Authority File
Hostname: lx2.loc.gov
Port: 210
Database: SAF
Syntax: MARC21/USMARC
Encoding: utf8
(N.B. Both of these databases also include title authorities.)
-
For testing purposes, you can also set up a Z39.50 client target,
which points at your own Koha instance's Z39.50 authority server.
To find the hostname, go to /etc/koha-conf.xml and find the value for
the <listen id="authorityserver"> element. Depending on your
configuration, this could be something like the following:
unix:/zebra/koha/var/run/zebradb/authoritysocket
(N.B. You might be using a different scheme than unix sockets...)
To find the database, scroll down to the bottom of koha-conf.xml until
you reach the <config> element. Within this, look for the value of the
element <authorityserver>. It should probably be "authorities".
To set up this Z39.50 client target in Koha...
Z39.50 server: my koha authorities
Hostname: unix:/zebra/koha/var/run/zebradb/authoritysocket
Port:
Database: authorities
Userid:
Password:
Syntax: MARC21/USMARC (or whichever flavour you need)
Encoding: utf8
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - Add a z39.50 interface for authority searching
This patch adds the "recordtype" column to the "z3950servers" table.
The value in this column (biblio or authority) then controls whether
the z3950 server shows up in a bibliographic search (through the
Acq and Cataloguing modules) or in an authority search (through
the Authorities module).
I also edited the z3950 management console to show this value
and allow users to edit it. The default value is "biblio", since
the vast majority of z3950 targets will be bibliographic. However,
there is an option to add/edit a z3950 target as a source of
authority records.
Test Plan:
1) Apply both patches
2) Run updatedatabase.pl (after setting your KOHA_CONF and PERL5
environmental variables)
3) Use the test plan from the 1st patch
N.B. Make sure that your Z39.50 client target has a Record Type
of Authority, otherwise it won't display when you're doing a
Z3950 search for authorities.
Signed-off-by: Mason James <mtj@kohaaloha.com>
Bug 10096 [FOLLOW-UP] - fix tabs/whitespace errors to pass QA
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com> Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 9735: Unit tests for get_template_and_user (cookies handling)
Galen found a case where the cookies array was not built flat. I added a
unit test for that (checking that the cookie array is flat) and also tested
the cookies output of get_template_and_user so we are sure the &language=
parameter is correctly handled.
Sponsored-by: Universidad Nacional de Cordoba Signed-off-by: Julian Maurice <julian.maurice@biblibre.com> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
- Tests in t/db_dependent/Auth.t pass
- Tested in intranet, OPAC logged in, OPAC logged out
* Adding a valid language code to the URL switches the language
as expected
* Adding an invalid language code causes no change
The current implementation didn't build the cookie array correctly,
yielding login problems in some scenarios.
Sponsored-by: Universidad Nacional de Córdoba Signed-off-by: Julian Maurice <julian.maurice@biblibre.com> Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de> Signed-off-by: Galen Charlton <gmc@esilibrary.com>