This patch adjusts the test so that if an "ISBN" is not
defined (because the source string did not specify a valid
ISBN), it doesn't result in a warning once the warnings
stricture is enabled in C4::Koha.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes a problem where search errors like the following appear in
the logs when running a stage import with the AggressiveMatchOnISBN syspref
turned on:
search failed (isbn,phr=978-0-7517-9745-9 or isbn,phr=0-7517-9745-6 or
isbn,phr=978-0-7517-9745-9 or isbn,phr=0751797456 or
isbn,phr=9780751797459 or isbn,phr=978-0-7517-9759-6 or
isbn,phr=0-7517-9759-6 or isbn,phr=978-0-7517-9759-6 or
isbn,phr=0751797596 or isbn,phr=9780751797596 or isbn,phr= or
isbn,phr= or isbn,phr= or isbn,phr= or isbn,phr=) CCL parsing
error (10014) Search word expected ZOOM at
/usr/share/koha/lib/C4/Matcher.pm line 688.
This is most easily seen when running the script /misc/stage_file.pl from the
command line.
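The fix amounts to skipping ISBN variants that normalize to nothing before
the CCL query is assembled; roughly (a sketch, not the literal patch):

    my @clauses   = map  { "isbn,phr=$_" }
                    grep { defined $_ && $_ ne '' } @isbn_variants;
    my $ccl_query = join ' or ', @clauses;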
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Restore elementSetName to marcxml for DOM indexing in Zconn (Context.pm).
This prevents the need to rebuild the index after restarting the Zebra
server.
Removes the now incorrect reference to marcxml as 'superfluous' in four
dom config files.
Test plan:
[1] Do not yet apply this patch.
[2] Rebuild zebra index with the zebra config of commit
036f2a50e1.
[3] (Go back to master.) Restart your zebra server (no config change).
You will have results without details.
Apply this patch: you see details.
Reset to master: no details again.
[4] Install new zebra config from master.
Search again: you still see no details.
Restart zebra server. Search: you see details.
Apply this patch. Search: still details.
Restart zebra server. Search: still details.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Tested in a non-package environment (manual dev install).
The package environment should work now too (results in step 4c might differ).
Progress on bug 12012 would be appropriate to sync all changes.
Tested the response of the SRU server too.
Signed-off-by: Marc Veron <veron@veron.ch>
I tested starting on a VM with Koha 3.15.00.019 installed.
Did git pull -> Koha 3.15.00.051
Result: No details in search results.
Applied patch.
Result: Search results display fine.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes some sources of warning messages thrown by
C4::Languages, particularly getTranslatedLanguages(), when running
Koha's web installer.
TEST PLAN
---------
1) Apply first patch
2) prove -v t/db_dependent/Languages.t
-- There will be uninitialized string messages, etc.
3) Apply second patch (this one)
4) prove -v t/db_dependent/Languages.t
-- Only one carp message will remain.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Tests pass, no warnings, no koha-qa errors
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Some types of invalid ISBNs, when run through C4::Koha::NormalizeISBN,
can produce ISBN objects that are themselves invalid. This can be
reproduced with an ISBN that has an invalid prefix, group code or
publisher code. An example ISBN would be "0788893777 (2 DVD 45th ed)".
When attempting to look up a record with such an ISBN, you will get an
error along the lines of: Can't call method "as_string" on an undefined
value
Instead of checking for the BAD_ISBN state, we should be checking for
the GOOD_ISBN state via the method is_valid.
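A minimal sketch of that check with Business::ISBN (the example string
is the one from the test plan below):

    use Business::ISBN;

    my $isbn = Business::ISBN->new('0788893777 (2 DVD 45th ed)');
    # Only use the object when parsing succeeded *and* the number is
    # valid; this avoids calling as_string() on an undefined value.
    if ( defined $isbn && $isbn->is_valid ) {
        my $normalized = $isbn->as_string([]);   # unhyphenated form
    }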
Test Plan:
1) Edit a record, add the following ISBN to your record:
0788893777 (2 DVD 45th ed)
2) When Koha redirects to the record, you should see the error message
described
3) Apply this patch
4) Reload the page, you should now see your record
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This sets the version numbers of the modules added by bug 12234 to more
sensible values (in this case, the versions included in Debian Squeeze.)
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
A run of update-control: adds bash-completion as a build-time
dependency, allows update-control to ignore anything that doesn't
have a package but isn't marked as "required" by Koha, and adds
dependencies that we don't use directly but that are needed by
something we do use. All fairly mundane.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When a user runs a report containing an SQL error, no
error is shown to the user. This patch fixes this.
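The shape of the fix, roughly (a sketch, not Koha's actual reports
code; the template parameter name is an assumption):

    my $sth = $dbh->prepare($sql);
    my $ok  = $sth && $sth->execute;
    unless ($ok) {
        # surface the DBI error instead of silently showing nothing
        $template->param( execute_error => $dbh->errstr );
    }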
To test:
1) Run a report with known good SQL.
2) No error is shown.
3) Run a report with bad SQL (e.g. a typo in a field name)
4) No error is shown.
5) Apply patch
6) Repeat 1-4. For the bad SQL report, the database error
should be shown.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test plan:
1/ Create a new subscription, with manual history enabled
2/ Edit history by clicking on "Edit history" under Planning tab (add
some text)
3/ Receive some serials, see that received and missing issues are not
updated in Summary tab
4/ Edit subscription and disable manual history
5/ Receive some serials, see that received and missing issues are
updated, but your notes have been kept.
6/ Edit serials and change status from/to missing or not available.
Check that missing issues are updated correctly.
7/ Edit serials and change status from/to arrived. Check that received
issues are updated correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes an issue caught by the test case where StrWidth()
based its calculations on the internal Adobe font rather than the
TrueType font in use.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Since the addition of search groups to Koha, the branch limiting
parameter in multiple PAC by URL support should also support
limiting by these search groups. This patch adds this ability.
Signed-off-by: Srdjan <srdjan@catalyst.net.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch fixes two problems:
a) Bad PDF when using Helvetica font.
The current label code assigns the 'italic' or 'oblique' variant
to titles. Helvetica-Oblique was not defined, but is present.
b) Bad alignment when using center/right justification
The problem was a bad font parameter passed to the StrWidth
routine.
To test:
1. Try making a batch using Helvetica; the downloaded PDF does not open.
2. Try a batch of mixed scripts with layout alignment center or
right; only Latin scripts align almost correctly.
3. Apply the patch and update your koha-conf.xml to add the Oblique variant
4. Try step 1 again; now the PDF opens
5. Try step 2 again; now the alignment is correct
New problem (for another bug): DejaVuSans has good support
for Arabic, but no Oblique variant. As the selection of
italic/oblique is hardcoded, Arabic titles are now not
displayed. I'll try to add a checkbox to make this feature
optional.
Added a FIXME for the hardcoded forced oblique -chris_n
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Since built-in PDF fonts support just Latin-1 encoding, we have
to switch to TrueType fonts to correctly encode all UTF-8 characters
(which we should be getting from the database anyway).
This approach also nicely sidesteps our encoding kludges, but
requires paths to TrueType fonts, which are included in koha-conf.xml
under a new <ttf> section. Without this directive in koha-conf.xml the
code will still use the Latin-1 built-in PDF fonts.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Chris Nighswonger <cnighswonger@foundations.edu>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes a few adjustments in C4::Context::Zconn.
It no longer passes the (ignored) auth and syntax parameters to _new_Zconn.
Note that auth was not completely ignored in theory, but we never pass
auth=1 [while not having user/password in koha-conf].
Furthermore, it removes the elementSetName for dom indexing. Using marcxml
here does not make a difference. It only adds a constraint on what is in
the dom-config files. (It could probably be removed there now.)
Two cosmetic code changes: removes the unused label 'retry', and
moves 'servername' into the database name option.
Test plan:
When using Zebra with dom indexing, do a biblio and authority search.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The original patch did not correctly construct ISBN phrase
searches when QP is on. Unfortunately, when attempting to fix that,
I discovered that there's a deep bug in QP that makes it generate
incorrect search queries when combining more than two atoms
with the || operator. Consequently, until that can be fixed,
this patch ensures that if UseQueryParser is on, AggressiveMatchOnISBN
has no effect.
To state it another way, AggressiveMatchOnISBN works only when QP
is not in use.
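In effect the guard looks like this (sketch):

    # Only attempt the aggressive ISBN expansion when QP is off.
    if (   C4::Context->preference('AggressiveMatchOnISBN')
        && !C4::Context->preference('UseQueryParser') )
    {
        # ... expand the ISBN into all of its normalized variants ...
    }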
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In UNIMARC, the isbn index is ISBN.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Catalog a record with the ISBN "0394502884 (Random House)"
2) Export the record, edit it so the ISBN is now
"0394502884 (UnRandomHouse)"
3) Using the record import tool, import this record with matching
on ISBN.
4) You should not find a match
5) Apply this patch
6) Run updatedatabase.pl
7) Enable the new system preference AggressiveMatchOnISBN
8) Repeat step 3
9) The tool should now find a match
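For reference, the aggressive match searches on every normalized form
of the record's ISBN; roughly (a sketch, assuming Business::ISBN is
used for the conversions):

    use Business::ISBN;

    my $isbn = Business::ISBN->new($record_isbn);
    my @variants;
    if ( defined $isbn && $isbn->is_valid ) {
        for my $form ( $isbn->as_isbn10, $isbn->as_isbn13 ) {
            next unless defined $form;
            # hyphenated and unhyphenated forms of both lengths
            push @variants, $form->as_string, $form->as_string([]);
        }
    }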
Signed-off-by: Tom McMurdo <thomas.mcmurdo@state.vp.us>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
< and > are incorrectly transformed into HTML entities on the
XSLT result list when using the GRS-1 indexing mode.
Example:
Record: <TEST>
Result list: <TEST>
HTML source: &lt;TEST&gt;
To test:
- Catalog a record that contains > and <
- Reindex, without using the -x option
- Confirm the display is correct
- Reindex again, using the -x option
- Confirm the display is now broken
- Apply patch
- Reindex again with and without -x
- Verify that now the display is always correct
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Note: the problem is only visible in a GRS-1 setup. With the patch it
works as expected; there is no behaviour change in DOM.
I believe we shouldn't be (de)escaping data ad-hoc, but it seems that GRS-1
needs it because it doesn't handle HTML entities properly. This fix is OK for
GRS-1, unneeded for DOM and probably any other modern search engine.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When the item type is defined on the biblio (item-level_itypes syspref),
the method C4::Reserves::CanItemBeReserved uses $item->{itemtype}. But
the item comes from C4::Items::GetItem, which does not return an
'itemtype' key; in this method the item type value is always in the
'itype' key.
This patch corrects it.
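Sketch of the corrected lookup (not the literal patch):

    my $item = C4::Items::GetItem($itemnumber);
    # GetItem() returns the item-level item type in 'itype'; there is
    # no 'itemtype' key, so the old check always saw undef.
    my $itemtype = $item->{itype};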
Test plan:
You should have item types defined on the biblio and the
'item-level_itypes' syspref set to biblio.
This test plan is with ReservesControlBranch on ItemHomeLibrary.
- Choose a branch, a borrower category and an item type, for example
'NYC', 'CHILD' and 'DVD'
- Set an issuing rule for 'NYC', 'CHILD' and 'DVD' with 'Holds allowed'
set to 10
- Set an issuing rule for 'NYC', 'CHILD' and all item types with
'Holds allowed' set to 0
- Choose an item of a biblio with itemtype 'DVD', that can be reserved,
with 'NYC' as homebranch
- Choose a borrower with category 'CHILD'
- Try to request the item for the borrower
=> without the patch, you can
=> with the patch, you can't
You may check that the reserve is allowed when 'Holds allowed' > 0 on the
issuing rule for 'DVD'.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Great test plan - thanks!
Confirmed the bug, and the fix. Looks good to me.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If the template contains dynamic parts, the message won't be
considered a duplicate.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Duplicate messages will be queued, but when sending the queued messages
duplicates are found and are marked as failed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The previous patch checks whether a notice has already been sent at the
time the queued notice is being sent, which is wrong!
We have to check whether a similar notice has already been sent today.
This patch was created after an observation on a production server:
if a user places holds on several items, they will receive 1 SMS per hold.
Here we only want 1 SMS for all the holds.
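Roughly the check that is needed (a sketch; the message_queue column
names are assumed from memory, not copied from the patch):

    my $already_sent_today = $dbh->selectrow_array(q{
        SELECT COUNT(*)
        FROM   message_queue
        WHERE  borrowernumber         = ?
          AND  letter_code            = ?
          AND  message_transport_type = 'sms'
          AND  status                 = 'sent'
          AND  DATE(time_queued)      = CURDATE()
    }, undef, $borrowernumber, $letter_code);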
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For PREDUE messages, one message is sent to the message_queue table for
each item coming due, meaning that the patron could receive duplicate
notices.
The SMS parts for DUE and PREDUE often do not contain dynamic parts, only
a standard message.
Note that this patch *only* affects the SMS transport.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Currently Koha's SIP server ignores the return date part of an 09 (aka
CHECKIN) message. Koha should backdate the return and remove
fines accordingly.
Signed-off-by: Benjamin Rokseth <benjamin.rokseth@kul.oslo.kommune.no>
Works as described: the second date field in the SIP checkin (return date)
is used as the return date. The return shows up in history with the
correct date.
Comments:
- the patron is fined if the return date is before the issue date, but that
is largely irrelevant anyway.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The ZOOM specification defines that a ScanSet should provide a way
to retrieve terms suitable for display and another one for use in
further searches [1].
The Net::Z3950::ZOOM implementation actually provides both [2] but we
were using the wrong one.
Using $scanset->display_term(...) instead of $scanset->term(...) fixes
the problem.
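Sketch of the corrected loop over the scan results (method names as
documented in ZOOM.pod [2]):

    for my $i ( 0 .. $scanset->size() - 1 ) {
        # term() returns the raw searchable form; display_term()
        # returns the form meant to be shown to the user.
        my ( $term, $occurrences ) = $scanset->display_term($i);
        # ... push $term / $occurrences onto the results list ...
    }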
To test:
- Do an index scan search (advanced search > more options > check
'index scan')
- Notice non-Latin characters are replaced by one or more '@' symbols.
- Apply the patch
- Re-do the search, everything shows as it should.
- Try to follow any of the terms (clicking on them) and notice that
it actually gives you relevant results (i.e. is not searching for
@!!!!).
[1] http://zoom.z3950.org/api/zoom-1.4.html#3.6.3
[2] http://search.cpan.org/~mirk/Net-Z3950-ZOOM/lib/ZOOM.pod#term()_/_display_term()
Sponsored-by: Universidad Nacional de Cordoba
Followed test plan. Patch behaves as expected.
Signed-off-by: Marc Véron <veron@veron.ch>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
I reproduced the issue and I confirm this patch fixes it.
I put "Fuß" in a title, reindex the record. Launch a search on Title
checking the "scan index" checkbox. And the non-latin characters are
well displayed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The logout redirection function after a CAS authentication was misused.
This patch fixes it, and allows the CAS server to redirect the user back
to the opac after logout.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
From the Authen::Cas::Client documentation
logout_url [%args]
"logout_url()" returns the CAS server's logout URL which can
be used to redirect users to end
authenticated sessions. %args may contain the following
optional parameter:
* url => $url
If present, the CAS server will present the user
with a link to the given URL once the user has logged out.
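In other words, the logout redirect boils down to something like this
(sketch; variable names are assumptions):

    # Ask the CAS server to offer a link back to the OPAC after logout.
    my $logout_url = $cas->logout_url( url => $opac_url );
    print $query->redirect($logout_url);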
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Change only affects CAS authentication and is correct
according to the module documentation.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This follow-up corrects the fact that when using $query->url(), both
GET and POST params are picked up.
Using $query->url_param() will only get params passed directly in the URL.
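The difference in CGI.pm terms (sketch; field names are only
illustrative):

    # url_param() only reads the query string, so credentials submitted
    # by POST are never copied into the CAS service URL.
    my $biblionumber = $query->url_param('biblionumber');  # URL only
    my $login        = $query->param('userid');            # also sees POST data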
Test plan :
- Enable CAS
- Go to login page : cgi-bin/koha/opac-user.pl
- Try to connect with local login using random login and password
(they will be transmitted by POST)
- You stay on the login page
- Look at CAS login URL
=> Without this patch it will contain the random login and password
as parameters of opac-user.pl
=> With this patch it does not contain any parameter
Signed-off-by: Matthias Meusburger <matthias.meusburger@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10029 tries to fix the use of URL parameters in CAS authentication,
but it does not work.
The full URL must be used in all methods of C4::Auth_with_cas.
Also, in checkpw_cas(), the 'ticket' parameter must be removed to find
the original URL.
This patch removes the 'ticket' parameter from the query before calling
checkpw_cas(), since the ticket is passed as a method argument.
In C4::Auth_with_cas, many methods use the same code to get the CAS
handler and the service URI. This patch adds a private method
_get_cas_and_service() to do the job.
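Sketch of the flow (_get_cas_and_service() is the new private helper;
its exact signature is assumed here):

    my $ticket = $query->param('ticket');
    # remove it so the rebuilt URL matches the original service URL
    $query->delete('ticket');
    my ( $cas, $service ) = _get_cas_and_service($query);
    # ... $cas->service_validate( $service, $ticket ) ...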
Test plan:
- Enable CAS
- Go to the opac without being logged in
- Try to place a hold on a record
=> You get to /cgi-bin/koha/opac-reserve.pl?biblionumber=XXX showing the
authentication page
=> Check that the CAS link contains the query param "biblionumber"
- Click on the CAS link and log in
=> Check that you return, logged in, to the reserve page with the
biblionumber param
- Check CAS logout
- Check proxy CAS auth
Signed-off-by: Koha team AMU <koha.aixmarseille@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests in t, xt, and t/db_dependent/Auth.t.
Also passes QA script.
As I have no working CAS server, I focused on regression testing:
Activated Persona and casAuthentication.
- Verified normal login against database still works.
- Verified Persona login works.
Note: With Persona you are always forwarded to the patron
account - so you have to search for the record again before
you can place a hold.
- Verified that the CAS URL contains the biblionumber when
logging in while placing a hold.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Retested 2014-04-12
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes the use of POSIX::strftime, which is based on the
system locales.
The DateTime module translates month and day names successfully, without
any locale installed.
For the seasons, I use the way it is already done in Koha: write the word
in the templates. This way the translate script will match them and allow
translators to translate them.
This patch adds a regression: the season names are not translated
following the selected locale.
This could be addressed when bug 8044 is pushed.
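For day and month names DateTime does the translation itself; a
minimal sketch (the date matches the test plan below):

    use DateTime;

    my $dt = DateTime->new(
        year   => 2013,
        month  => 11,
        day    => 18,
        locale => 'fr',
    );
    print $dt->day_name, "\n";     # "lundi"  -- no system locale needed
    print $dt->month_name, "\n";   # "novembre"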
Test plan:
0/ Update your po files and translate the season names.
1/ Create a numbering pattern using season.
example:
Name: Seasonal
Numbering formula: {X}
X: Season, Add=1, Every=1, Set back to 0 when more than 3, formatting
"name of season"
And test the prediction pattern with:
frequency: 1/3 month
First issue : 2013-09-21
length: 12 months
X begins with 2 (21st September is Fall)
2/ Click on the test pattern button, you should get:
Fall 21/09/2013
Winter 21/12/2013
Spring 21/03/2014
Summer 21/06/2014
Change the locale and verify the season names are *not* translated.
Change the Koha language and verify the season names are translated.
3/ Create a numbering pattern using day or month name.
example:
Name: day
Numbering formula: {X}
X: day, Add=1, Every=1, Set back to 0 when more than 6, formatting "name
of day"
Frequency: 1/day
First issue: 2013-11-18
length: 1 month
X begins with 0
You should get:
Monday 18/11/2013
Tuesday 19/11/2013
Wednesday 20/11/2013
[...]
Sunday 15/12/2013
Monday 16/12/2013
Tuesday 17/12/2013
Change the locale and verify the day names are translated.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors.
Tested on top of Bug 11265 and Bug 11263,
and resolved a merge conflict.
Updating the PO file gives seasons to translate.
Tested using seasons, day and month.
The only note is a difference in behavior:
1) To use seasons you need to use the staff client in the desired language
2) To use day and month you only need to select the locale
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
No regressions found. Passes koha-qa.pl, t and xt
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested this again on top of 11263 and it works as described.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
1/ Edit a Perl script, for example mainpage.pl
2/ add "use Koha::I18N;" to the top of file
3/ add a translatable message somewhere in the script (this has
to be after the call to get_template_and_user). For example:
warn gettext("This is a translated warning");
4/ Create or update the PO files with
misc/translator/translate create LANGCODE
or
misc/translator/translate update LANGCODE
(LANGCODE should be enabled in the syspref 'languages')
5/ In misc/translator/po/LANGCODE-messages.po you should have
your string, translate it (using a text editor or a PO file
editor, make sure you don't have the "fuzzy" flag for this
string).
6/ Go to mainpage.pl in your browser with the active language set to
English and check your logs. You should see your string "This
is a translated warning".
7/ Now change language to LANGCODE. Check your logs, you should
have the string translated.
Note: I chose to name the sub 'gettext' because it's the default
keyword for xgettext for Perl. We can change it to whatever we want.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Followed the test plan; works as described.
No koha-qa errors.
Tests pass.
Fixed a small merge conflict on t/Context.t
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Copied test plan from bug.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Also store the interface (intranet, opac) in the context so it does not
have to be passed as a parameter.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on last patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
CHARSET is now automatically replaced by UTF-8, and 'update' creates the
PO file if it does not exist.
Also do not try to create PO files if POT file creation failed (when
there are no messages to translate, for example).
+ add some verbosity
+ add Locale::Maketext and Locale::Maketext::Lexicon to Koha
dependencies
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When using Plack, the https method returns 'OFF'.
We have to test for this value before passing it to the templates.
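The test boils down to this (sketch; the template variable name is an
assumption):

    # Under Plack, CGI's https() can return the string 'OFF', which is
    # true in boolean context, so the value must be checked explicitly.
    my $using_https = ( $query->https() && $query->https() ne 'OFF' ) ? 1 : 0;
    $template->param( using_https => $using_https );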
Test plan:
1/ Fill your OPACBaseUrl
2/ Configure apache for using http
3/ Check the social networks links (should be http://OPACBaseUrl)
4/ Launch Plack
5/ Check the social networks link (should be http://OPACBaseUrl)
6/ Stop Plack
7/ Configure apache for using https
sudo openssl req -x509 -nodes -days 365 -newkey rsa:1024 -out
/etc/apache2/server.crt -keyout /etc/apache2/server.key
and add in your virtualhost (with :443)
SSLEngine on
SSLCertificateFile /etc/apache2/server.crt
SSLCertificateKeyFile /etc/apache2/server.key
a2enmod ssl
service apache2 restart
8/ Check the social networks links (should be https://OPACBaseUrl)
FIXME: Under Plack, with SSL activated, the CGI->https() method always
returns 'OFF'.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Like the OPAC, the search history is now available in the intranet. This
is controlled by the EnableSearchHistory system preference.
Test plan:
1/ Switch on the 'EnableSearchHistory' syspref.
3/ Launch some biblio and authority searches.
4/ Go on your search history page (top right, under "Set library").
5/ Check that all your searches are displayed.
6/ Click on some links and check that results are consistent.
7/ Delete your biblio history searches.
8/ Delete your authority history searches.
9/ Launch some biblio and authority searches
10/ Play with the 4 delete links (current / previous and biblio /
authority).
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This ensures that if an anonymous session is converted to a logged-in
session, search history times from the anonymous session get stored
correctly.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Like the biblio one, this feature provides an authority search history.
This history is available for both logged-in and anonymous users.
If the user is not logged in to Koha, the history is stored in an
anonymous user session.
The search history feature is now factorized in a new module.
This patch adds:
- 1 new db field, search_history.type. It distinguishes the
search type (biblio or authority).
- 1 new module, C4::Search::History. It deals with 2 different storages:
DB or cookie
- 2 new UT files: t/Search/History.t and t/db_dependent/Search/History.t
- 1 new behavior: the 'Search history' link (on the top-right corner of
the screen) is always displayed.
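A hypothetical call shape for the new module (the parameter names here
are assumptions, not taken from the patch):

    C4::Search::History::add(
        {
            userid     => $borrowernumber,  # undef for anonymous users
            sessionid  => $session_id,
            query_desc => $query_desc,
            query_cgi  => $query_cgi,
            total      => $hits,
            type       => 'authority',      # new search_history.type column
        }
    );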
Test plan:
1/ Switch on the 'EnableOpacSearchHistory' syspref.
2/ Go on the opac and log out.
3/ Launch some biblio and authority searches.
4/ Go on your search history page.
5/ Check that all your searches are displayed.
6/ Click on some links and check that results are consistent.
7/ Delete your biblio history searches.
8/ Delete your authority history searches.
9/ Launch some biblio and authority searches
10/ Delete all your history (cross on the top-right corner)
11/ Check that your search history is empty.
12/ Launch some biblio and authority searches.
13/ Login to your account.
14/ Check that all previous searches are displayed.
15/ Launch some biblio and authority searches.
16/ Check that these previous searches are displayed under "Current
session".
17/ Play with the 4 delete links (current / previous and biblio /
authority).
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All patches together pass QA script and tests.
Also, new tests in t/db_dependent/ pass.
Tested in all 4 OPAC themes, being logged in and anonymous.
Anonymous search history will be appended to personal search
history after logging in.
Also verified that cleanup_database still purges search history,
now also including the authority searches.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Some corrections:
- opac-reserve.tt : opening <p> instead of closing
- opac-user.tt : warnexpired was in database format; adds the use
of the KohaDates template plugin
- opac-user.tt : duplicated TT test : [% IF ( BORROWER_INF.warnexpired ) %]
and [% ELSIF ( BORROWER_INF.warnexpired ) %], maybe a merge error
- opac-user.tt : <string> instead of <strong>, maybe for HTML 6 :-)
- opac-user.pl : adding dateformat var to template is already done by Auth.pm
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Testing notes:
- Database update
* Changes to kohastructure match changes done by the updatedatabase
statement. Feature is activated by default. Fixing 'yes' to be '1'
in a follow up.
* Ran database update successfully.
* Note: Patrons are now blocked by default in new installations
AND in updated installations.
- System preference
* Verified system preference shows up correctly.
- Category configuration
* Add new patron category
* Edit existing patron category
* Delete patron category
* Check patron category summary table.
=> Verified all actions work as expected.
=> Verified the chosen value for 'BlockExpiredPatronOpacActions'
is always displayed and saved correctly.
* Note: The new value is missing from the summary table.
* Note: The new value is also not shown when deleting a patron category.
- Check functionality
* Renew and place a hold for a NOT EXPIRED patron with
a) category: use syspref (default)
syspref: block (default)
b) category: use syspref (default)
syspref: don't block
c) category: block
syspref: don't block
d) category: block
syspref: block
e) category: don't block
syspref: block
* Verified renewals and placing holds were never blocked.
* Also verified that the warning from NotifyBorrowerDeparture
still shows up correctly.
* Renew and place a hold for an EXPIRED patron with
a) category: use syspref (default)
syspref: block (default)
=> OK, both actions are blocked.
b) category: use syspref (default)
syspref: don't block
=> OK, both actions possible.
c) category: block
syspref: don't block
=> OK, both actions are blocked.
d) category: block
syspref: block
=> OK, both actions are blocked.
e) category: don't block
syspref: block
=> OK, both actions possible.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
2014-04-06 Update: Will repeat and amend above test plan on last patch in this series.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Apply patch
2) Run updatedatabase.pl
3) Pick a patron, note the patron's category
4) Issue an item to this patron
5) Edit that category, set "Block expired patrons" to "Block"
6) Verify the patron cannot renew or place holds in the OPAC
7) Edit the category again, set "Block expired patrons" to
"Don't block"
8) Verify the patron *can* renew and place holds in the OPAC
9) Edit the category again, set "Block expired patrons" to
"Follow system preference BlockExpiredPatronOpacActions"
10) Set the system preference BlockExpiredPatronOpacActions to
"Block"
11) Verify the patron cannot renew or place holds in the OPAC
12) Set the system preference BlockExpiredPatronOpacActions to
"Don't block"
13) Verify the patron *can* renew and place holds in the OPAC
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Melia Meggs <melia@bywatersolutions.com>
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Testing notes on last patch in series.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch restores the display of the authority type summary for
MARC21, where at present only the heading type (i.e., "Topical Term",
"Personal Names") comes over for display in the template.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch corrects a small bug:
an authorities search on all types does not show the summary because it is
computed with the selected type (which is empty) instead of using the found
authority's type.
Test plan :
- Go to the intranet authorities search
- Perform a search on all authority types
- Look at the results
=> Without this patch, result rows do not display the authority summary,
only the authorized headings
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
From a biblio record, if one wants to add 600$a information, a pop-up
appears. In this new window, once search terms are typed and validated, a
result table is displayed, with a "Get It!" column allowing the selection
of an authority. From here, there are different cases:
1) If we have a simple authority with 200$a and 200$b subfields, a link
"choose" is displayed, working correctly.
2) If the authority has different occurrences of 200$a/200$b, numeric links
(1 2 and so on) are displayed, one for each occurrence. In the example of my
screenshot, the line with a "Paul, Korky -- Pauline, Korkette" summary
possesses two links: "1" will add "Paul, Korky" whereas "2" will add
"Pauline, Korkette" (couldn't come up with a better name ;)).
3) If the authority has 200$x or 200$y subfields defined, several links are
also created, which should not be the case. In our example, "Niclausse,
Paul -- Expositions" will create a link "1" for "Niclausse, Paul" and a link
"2" for "Expositions". Clicking on the 2nd link leads to the following
error: Software error: Can't call method "subfields" on an undefined value
at
/home/asaurat/workspace/versions/community/authorities/blinddetail-biblio-search.pl
line 86. Only cases 1 and 2 should be handled. The creation of links
for subfields like 200$x or 200$y should be removed.
This problem is caused by the use of " -- " as the separator between
authorities with several headings, but also within some headings between the
main part and the subdivisions.
This patch corrects this by using an array in the authorities summary so that
the presentation is computed in the template. I've chosen to use the pipe
separator between authorities with several headings. This may be changed to
be configurable.
Test plan :
- Edit an authority type summary : for example subject (heading on 250) :
summary "[250a][ -- 250x]"
- Create an authority A1 with one heading and a subdivision : for example a
subject : 250$a "History" 250$x "20th century"
- Create an authority A2 with several headings. for example a subject : 250$a
"History" 250$a "Legends"
- Rebuild Zebra queue
- Go to OPAC and click on "Authority search" and search on "History"
=> You will find A1 and A2 :
History -- 20th century
History | Legends
- Go to the intranet authorities search and search on "History"
=> You will find A1 and A2 :
History -- 20th century
History | Legends
- Edit a record using this authority type as thesaurus: for example on 606$a
- Click on thesaurus link and search on "History"
=> You will find A1 and A2 :
History -- 20th century ; 0 times ; choose ; Edit authority
History | Legends ; 0 times ; 1 2 ; Edit authority
- Click on link "2" to chosse "Legends"
=> You get "Legends" in heading field : for example 606$a
Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
I can confirm the problem and the solution. I have tested the patch on a large
DB with authorities having multiple headings. There is no regression on bug
4838.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Without the patch I couldn't choose between multiple headings
in the authority plugin, but with the patch it works as described.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>