When searching (at the OPAC or pro), facets can be enabled but never
disabled, so a user is obliged to relaunch the search.
This patch adds a new "[x]" link to the right of each selected facet.
This link relaunches the search without that facet.
Test plan:
- Launch a search (OPAC and pro)
- Enable some facets
- Disable some facets
Signed-off-by: sonia bouis <sonia.bouis@univ-lyon3.fr>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patches pass all tests and QA script. Nice feature!
Tested in Bootstrap and Prog, adding and removing multiple
facets in different sequences, and adding and removing the
availability limit.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes a few adjustments in C4::Context::Zconn.
It no longer passes the (ignored) auth and syntax parameters to _new_Zconn.
Note that auth was not completely ignored in theory, but we never pass
auth=1 [while not having user/password in koha-conf].
Furthermore, it removes the elementSetName for dom indexing. Using marcxml
here does not make a difference. It only adds a constraint on what is in
the dom-config files. (It could probably be removed there now.)
Two cosmetic code changes: the unused label 'retry' is removed,
and 'servername' is moved into the database name option.
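For illustration, a minimal sketch of the shape this leaves the connection
code in; the option names are standard ZOOM options, but the concrete values
and variable names here are assumptions, not lines from the patch:

use ZOOM;

# hypothetical values standing in for entries read from koha-conf.xml
my ( $host, $port )           = ( 'localhost', 9998 );
my ( $servername, $database ) = ( 'biblioserver', 'biblios' );

my $o = ZOOM::Options->new();
$o->option( preferredRecordSyntax => 'usmarc' );
# servername is folded into the database name rather than set separately
$o->option( databaseName => "$servername/$database" );

my $conn = ZOOM::Connection->create($o);
$conn->connect( $host, $port );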
Test plan:
When using Zebra with dom indexing, do a biblio and authority search.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes changes to koha-conf.xml by removing the fallback section
from biblioserver and authserver. The information is in an include file on
the same server (no need to fall back) and, moreover, some of the information
is not up to date and should be moved elsewhere.
The patch also simplifies the DOM retrieval-info files for auth and bib,
and eliminates superfluous F and usmarc entries from the dom-config files.
(I felt the urge to remove marcxml too, but left it for now; see also the
second patch.) For reference, look at the marcxml example files of Zebra.
NOTE: This patch does not deal with the Debian package installs;
koha-conf-site.xml.in and the *-retrieval-info-* files could be adjusted
in the same way.
Test plan:
[1] Run at least a dev install in order to copy the new files to your
Zebra folders. Choose DOM indexing.
Enable the SRU server on port 9998 (small edit in koha-conf.xml).
[2] Restart Zebra and reindex -a -b -x.
[3] Verify if a search from Koha still functions as expected.
Check the SRU output on port 9998. NOTE: If you do not pass recordSchema,
you should get back a marc response now (instead of index schema).
Bonus: Add your server as a Z3950 target to another Koha install, and
perform a Z3950 search from the other server against your new install.
Bonus: Check response from the auth and biblio socket via yaz-client.
[4] Reindex again with -a -b but without -x.
[5] Repeat Koha search, SRU response (Z3950, yaz-client).
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The original patch did not correctly construct ISBN phrase
searches when QP is on. Unfortunately, when attempting to fix that,
I discovered that there's a deep bug in QP that makes it generate
incorrect search queries when combining more than two atoms
with the || operator. Consequently, until that can be fixed,
this patch ensures that if UseQueryParser is on, AggressiveMatchOnISBN
has no effect.
To state it another way, AggressiveMatchOnISBN works only when QP
is not in use.
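A minimal sketch of the guard this describes; the subroutine name is invented
here, only the two syspref names come from the text above:

use C4::Context;

# true only when aggressive ISBN matching may be applied
sub _can_aggressive_match_on_isbn {
    # QP mis-combines more than two atoms with ||, so bail out when it is on
    return 0 if C4::Context->preference('UseQueryParser');
    return C4::Context->preference('AggressiveMatchOnISBN') ? 1 : 0;
}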
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
In UNIMARC, the isbn index is ISBN.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Catalog a record with the ISBN "0394502884 (Random House)"
2) Export the record, edit it so the ISBN is now
"0394502884 (UnRandomHouse)"
3) Using the record import tool, import this record with matching
on ISBN.
4) You should not find a match
5) Apply this patch
6) Run updatedatabase.pl
7) Enable the new system preference AggressiveMatchOnISBN
8) Repeat step 3
9) The tool should now find a match
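For context, the feature conceptually matches on all common variations of the
incoming ISBN rather than the exact stored string. A hedged sketch using
Business::ISBN (which Koha uses for ISBN handling; the exact variation list
built by the patch may differ):

use Business::ISBN;

my $isbn = Business::ISBN->new('0394502884');
my @variations = (
    $isbn->as_string([]),               # ISBN-10, no hyphens
    $isbn->as_string,                   # ISBN-10, hyphenated
    $isbn->as_isbn13->as_string([]),    # ISBN-13, no hyphens
    $isbn->as_isbn13->as_string,        # ISBN-13, hyphenated
);
# the matcher can then OR these together, which ignores trailing
# qualifiers like "(Random House)"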
Signed-off-by: Tom McMurdo <thomas.mcmurdo@state.vp.us>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes a trailing space and removes the issn index deletion
(that index does not exist on deletedbiblioitems).
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Test Plan:
1) Apply this patch
2) Create a record with 5 ISBNs and 5 ISSNs
3) Create a new report from the following SQL, or execute it from the
mysql console:
SELECT isbn, issn FROM biblioitems ORDER BY biblionumber DESC LIMIT 1
4) Note that all your ISBNs and ISSNs are listed, separated by the pipe
character ( | )
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
This might be slow to run on big databases because of the 2 index
rebuilds; however, it changes no functionality, it just increases the
field size, which is safe enough (we already store multiple values).
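The schema change presumably looks something like this sketch (the target
column type and the exact updatedatabase code are assumptions):

use C4::Context;

my $dbh = C4::Context->dbh;
for my $table (qw(biblioitems deletedbiblioitems)) {
    # widen the columns so pipe-separated multiple values fit
    $dbh->do("ALTER TABLE $table MODIFY isbn MEDIUMTEXT");
    $dbh->do("ALTER TABLE $table MODIFY issn MEDIUMTEXT");
}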
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
< and > are incorrectly transformed into HTML entities on the
XSLT result list when using the GRS-1 indexing mode.
Example:
Record:      <TEST>
Result list: &lt;TEST&gt;
HTML source: &amp;lt;TEST&amp;gt;
To test:
- catalog a record that contains > and <
- Reindex, without using the -x option
- Confirm the display is correct
- Reindex again, using the -x option
- Confirm the display is now broken
- Apply patch
- Reindex again with and without -x
- Verify that now the display is always correct
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Note: the problem is only visible in a GRS-1 setup, where the fix works
as expected. There is no behaviour change in DOM.
I believe we shouldn't be (de)escaping data ad-hoc, but it seems that GRS-1
needs it because it doesn't handle HTML entities properly. This fix is OK for
GRS-1, unneeded for DOM and probably any other modern search engine.
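The de-escaping in question amounts to something like this sketch (the helper
name is invented; the patch's actual substitutions may differ):

# undo one level of HTML escaping for GRS-1 records
sub _unescape_entities {
    my ($value) = @_;
    $value =~ s/&lt;/</g;
    $value =~ s/&gt;/>/g;
    $value =~ s/&amp;/&/g;    # last, so "&amp;lt;" unescapes only one level
    return $value;
}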
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
When itemtype is defined on the biblio (item-level_itypes syspref), the
method C4::Reserves::CanItemBeReserved uses $item->{itemtype}. But
the item comes from C4::Items::GetItem, which does not set an
'itemtype' key; in this method the item type value is always in the
'itype' key.
This patch corrects it.
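A minimal sketch of the fix described above (not the verbatim diff; the item
number is a placeholder):

use C4::Items;

my $itemnumber = 42;    # hypothetical
my $item = C4::Items::GetItem($itemnumber);
# GetItem exposes the effective item type under 'itype' (even when
# item-level_itypes is 'biblio'); 'itemtype' is never set on this hash
my $itemtype = $item->{itype};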
Test plan:
You should have itemtype on biblio and 'item-level_itypes' syspref
set to biblio.
This test plan is with ReservesControlBranch on ItemHomeLibrary.
- Choose a branch, a borrower category and an item type, for example
'NYC', 'CHILD' and 'DVD'
- Set an issuing rule for 'NYC', 'CHILD' and 'DVD' with 'Holds allowed'
set to 10
- Set an issuing rule for 'NYC', CHILD' and all item types with
'Holds allowed' set to 0
- Choose an item of a biblio with itemtype 'DVD', that can be reserved,
with 'NYC' as homebranch
- Choose a borrower with category 'CHILD'
- Try to request the item for the borrower
=> without the patch, you can
=> with the patch, you can't
You may also check that the reserve is allowed when 'Holds allowed' > 0
on the issuing rule for 'DVD'.
Signed-off-by: Liz Rea <liz@catalyst.net.nz>
Great test plan - thanks!
Confirmed the bug, and the fix. Looks good to me.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
To test:
[1] Run prove -v t/db_dependent/Holds.t. The last test
should fail.
[2] Apply the main patch.
[3] Run step 1 again. This time the tests should all pass.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch implements bash-completion for the following commands:
- koha-list
- koha-enable
- koha-disable
- koha-email-enable
- koha-email-disable
- koha-enable-sip
- koha-start-sip
- koha-restart-sip
- koha-stop-sip
- koha-start-zebra
- koha-stop-zebra
- koha-restart-zebra
It is implemented so that already-used or mutually exclusive parameters
(instance names, option switches) are removed from the suggestions.
I have already written completion for other (more complex) commands,
but I believe a simpler patch is a better one to start with.
IMPORTANT: this patch relies on having the koha-list command available
in the path.
To test:
- Make sure you have bash-completion installed and enabled (IRC might
help us if you encounter problems).
- Apply the patch.
Option 1:
- Pick the debian/koha-common.bash-completion file and do
$ cp debian/koha-common.bash-completion /etc/bash_completion.d/koha-common
- Open a new bash shell (I do it opening a new terminal on my Ubuntu box).
- Type one of the listed commands...
And repeatedly press <TAB>.
- Enjoy, and sign off if you believe it is usable. Otherwise report back.
Option 2:
- run:
$ . debian/koha-common.bash-completion
- Type one of the listed commands...
And repeatedly press <TAB>.
- Enjoy, and sign off if you believe it is usable. Otherwise report back.
Tests:
- Some koha-list option switches are mutually exclusive; -h should be
available in any context
- koha-enable should only autocomplete disabled instances
- koha-disable should only autocomplete enabled instances
- koha-email-enable should only autocomplete email-disabled instances
- koha-email-disable should only autocomplete email-enabled instances
- koha-*-zebra scripts should only autocomplete enabled instances.
- koha-*-sip scripts should only autocomplete sip-enabled instances.
Regards
To+
Note: writing bash-completion routines is a bit hacky; I tried to do
it the simplest way I could. Your comments are welcome.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Dobrica Pavlinusic <dpavlin@rot13.org>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Records hidden with OpacSuppression are filtered from the search
results, but the opac-detail page is still visible if you know the
biblio number. This patch hides the detail page for suppressed biblios
by redirecting (controlled by the syspref OpacSuppressionRedirect)
either to opac-blocked (the default), which explains that the record is
blocked (including optional explanatory text from the syspref
OpacSuppressionMessage), or to Koha's 404 page, which gives no hint that
a biblio with that number exists in the system.
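The redirect logic amounts to something like the following sketch; the
variable names and the syspref's stored values are assumptions, only the two
target pages come from the description above:

use CGI;
use C4::Context;

my $query      = CGI->new;
my $suppressed = 1;    # hypothetical: stands for 942$n being set

if ($suppressed) {
    if ( C4::Context->preference('OpacSuppressionRedirect') ) {
        # 404 option: give no hint that the biblio exists
        print $query->redirect('/cgi-bin/koha/errors/404.pl');
    }
    else {
        # default: explanatory opac-blocked page
        print $query->redirect('/cgi-bin/koha/opac-blocked.pl');
    }
    exit;
}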
Test plan:
Make sure you have at least one record with 942$n == 1.
Set OpacSuppression to "Don't hide".
Do an OPAC search that should bring up your hidden record and other
records.
Observe that your record is found.
Open the detail page for the record.
Observe that it is accessible. Copy the URL for later(!).
Set OpacSuppression to "Hide".
Leave OpacSuppressionByIPRange blank.
Set OpacSuppressionRedirect to "an explanatory page ('This record is
blocked')."
Leave OpacSuppressionMessage blank for now.
Disable queryparser(!) (because of bug 10542).
Do a full zebra reindex.
Do an OPAC search that should bring up your hidden record and other
records.
Observe that your record is not found.
Open the opac-detail URL of the record (the one you copied before).
Observe that you are redirected to opac-blocked and it displays a
short standard message.
Edit OpacSuppressionMessage and input some text.
Open the opac-detail URL of the record again (the one you copied
before).
Observe that the text you entered in OpacSuppressionMessage is
displayed under the standard text you have seen before.
Set OpacSuppressionRedirect to "the 404 error page ('Not found')."
Open the opac-detail URL of the record again (the one you copied before).
Observe that you are redirected to Koha's 404 error page.
Signed-off-by: Owen Leonard <oleonard@myacpl.org>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
If the template contains dynamic parts, the message won't be
considered a duplicate.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Duplicate messages will be queued, but when sending the queued messages
duplicates are found and are marked as failed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The previous patch checked whether a notice had already been sent at
the moment the current notice was queued, which is wrong!
We have to check whether a similar notice has been sent today.
This patch has been created after an observation on a production server:
If a user places holds on several items, they will receive 1 SMS per
hold. Here we only want 1 SMS for all the holds.
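A hedged sketch of the "already sent today?" test this implies; the SQL is an
assumption written against the message_queue table, not lifted from the patch:

use C4::Context;

my ( $borrowernumber, $letter_code ) = ( 42, 'HOLD' );    # hypothetical
my $dbh = C4::Context->dbh;
my ($sent_today) = $dbh->selectrow_array(q{
    SELECT COUNT(*)
    FROM   message_queue
    WHERE  borrowernumber         = ?
      AND  letter_code            = ?
      AND  message_transport_type = 'sms'
      AND  status                 = 'sent'
      AND  DATE(time_queued)      = DATE(NOW())
}, undef, $borrowernumber, $letter_code);
# skip queueing the new SMS when $sent_today is non-zero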
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
For PREDUE messages, one message is queued in the message_queue table
for each item due in advance, meaning that the patron could receive
duplicate notices.
The SMS parts for DUE and PREDUE often do not contain dynamic parts,
only a standard message.
Note that this patch *only* affects the SMS transport.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Currently Koha's SIP server ignores the return date part of an 09 (aka
CHECKIN) message. Koha should backdate the return and remove
fines accordingly.
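A hedged sketch only: this assumes AddReturn accepts a return date argument
in the position shown, which is the sort of call the SIP date field would be
wired into; check C4::Circulation for the real signature:

use C4::Circulation;

my ( $barcode, $branch, $return_date ) =
    ( '0001234', 'CPL', '2014-04-01 12:00' );    # hypothetical values

my ( $ok, $messages, $issue, $borrower ) = C4::Circulation::AddReturn(
    $barcode,
    $branch,
    undef,           # exemptfine
    undef,           # dropbox
    $return_date,    # second date field of the SIP 09 message
);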
Signed-off-by: Benjamin Rokseth <benjamin.rokseth@kul.oslo.kommune.no>
Works as described: the second date field in the SIP checkin (return
date) is used as the return date. The return shows up in history with
the correct date.
Comments:
- the patron is fined if the return date is before the issue date, but
that is largely irrelevant anyway.
Signed-off-by: Martin Renvoize <martin.renvoize@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The ZOOM specification defines that a ScanSet should provide a way
to retrieve terms suitable for display and another one for use
in further searches [1].
The Net::Z3950::ZOOM implementation actually provides both [2] but we
were using the wrong one.
Using $scanset->display_term(...) instead of $scanset->term(...) fixes
the problem.
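For illustration, the two accessors side by side; the connection target and
scan query here are placeholders:

use ZOOM;

my $conn = ZOOM::Connection->new('localhost:9999/biblios');
my $ss   = $conn->scan('@attr 1=4 fu');    # scan a title index

my ( $term, $occ )  = $ss->term(0);           # normalized form, for further searches
my ( $disp, $occ2 ) = $ss->display_term(0);   # human-readable form, for display
$ss->destroy();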
To test:
- Do an index scan search (advanced search > more options > check
'index scan')
- Notice non-latin characters are replaced by one or more '@' symbols.
- Apply the patch
- Re-do the search, everything shows as it should.
- Try to follow any of the terms (clicking on them) and notice that
it actually gives you relevant results (i.e. is not searching for
@!!!!).
[1] http://zoom.z3950.org/api/zoom-1.4.html#3.6.3
[2] http://search.cpan.org/~mirk/Net-Z3950-ZOOM/lib/ZOOM.pod#term()_/_display_term()
Sponsored-by: Universidad Nacional de Cordoba
Followed test plan. Patch behaves as expected.
Signed-off-by: Marc Véron <veron@veron.ch>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
I reproduced the issue and I confirm this patch fixes it.
I put "Fuß" in a title, reindex the record. Launch a search on Title
checking the "scan index" checkbox. And the non-latin characters are
well displayed.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The logout redirection function after a CAS authentication was misused.
This patch fixes it, and allows the CAS server to redirect the user back
to the opac after logout.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
From the Authen::CAS::Client documentation:
logout_url [%args]
"logout_url()" returns the CAS server's logout URL which can
be used to redirect users to end
authenticated sessions. %args may contain the following
optional parameter:
* url => $url
If present, the CAS server will present the user
with a link to the given URL once the user has logged out.
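Applied to Koha, that gives roughly this shape; the server and return URLs
are placeholders, and casServerUrl is the Koha syspref holding the CAS base
URL:

use Authen::CAS::Client;

my $cas_server = 'https://cas.example.org/cas';    # casServerUrl
my $back_to    = 'http://opac.example.org/cgi-bin/koha/opac-main.pl';

my $cas = Authen::CAS::Client->new($cas_server);
# the CAS server will offer a link back to the OPAC after logout
my $logout_url = $cas->logout_url( url => $back_to );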
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Change only affects CAS authentication and is correct
according to the module documentation.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This followup corrects the fact that $query->url() returns both
GET and POST params.
Using $query->url_param() will only get params passed directly in the URL.
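The distinction, per CGI.pm (the variable names are just for illustration):

use CGI;

my $query = CGI->new;
# url(-query => 1) rebuilds the query string from the current
# parameters, which for a POSTed login form includes userid/password
my $leaky_service_url = $query->url( -query => 1 );

# url_param() only ever sees what was literally in the request URL
my @url_only_params = $query->url_param();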
Test plan:
- Enable CAS
- Go to login page : cgi-bin/koha/opac-user.pl
- Try to connect with local login using random login and password
(they will be transmitted by POST)
- You stay on the login page
- Look at the CAS login URL
=> Without this patch it will contain the random login and password
as parameters of opac-user.pl
=> With this patch it does not contain any parameter
Signed-off-by: Matthias Meusburger <matthias.meusburger@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Bug 10029 tries to fix the use of URL parameters in CAS authentication,
but it does not work.
The full URL must be used in all methods of C4::Auth_with_cas.
Also, in checkpw_cas(), the 'ticket' parameter must be removed to find
the original URL.
This patch removes the 'ticket' parameter from the query before calling
checkpw_cas(), since the ticket is passed as a method argument.
In C4::Auth_with_cas, many methods use the same code to get the CAS
handler and the service URI. This patch adds a private method
_get_cas_and_service() to do the job.
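A hedged sketch of what such a helper might look like; the body is assumed,
only the helper's name and purpose come from the text above:

use Authen::CAS::Client;
use C4::Context;

sub _get_cas_and_service {
    my ($query) = @_;
    my $cas = Authen::CAS::Client->new(
        C4::Context->preference('casServerUrl') );

    # the service must be the full URL, minus the CAS 'ticket' parameter
    my $uri = $query->url( -query => 1 );
    $uri =~ s/([?&])ticket=[^&]*&?/$1/;
    $uri =~ s/[?&]$//;

    return ( $cas, $uri );
}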
Test plan:
- Enable CAS
- Go to the OPAC without being logged in
- Try to place a hold on a record
=> You get to /cgi-bin/koha/opac-reserve.pl?biblionumber=XXX showing
the authentication page
=> Check that the CAS link contains the query param "biblionumber"
- Click on the CAS link and log in
=> Check that you come back, logged in, to the reserve page with the
biblionumber param
- Check CAS logout
- Check proxy CAS auth
Signed-off-by: Koha team AMU <koha.aixmarseille@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests in t, xt, and t/db_dependent/Auth.t.
Also passes QA script.
As I have no working CAS server, I focused on regression testing:
Activated Persona and casAuthentication.
- Verified normal login against database still works.
- Verified Persona login works.
Note: With Persona you are always forwarded to the patron
account - so you have to search for the record again before
you can place a hold.
- Verified that the CAS URL contains the biblionumber when
logging in while placing a hold.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Retested 2014-04-12
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
At the OPAC, if the bootstrap theme is used, the modal
dialog does not allow using CAS authentication.
This patch proposes, when CAS is enabled, redirecting
to the opac-user.pl page (like the prog theme).
This is because the popup content should stay small
(thinking about mobile browsing), and the CAS URL is
currently only computed in opac-user (see C4/Auth.pm);
it would not be performant to compute it on all
pages.
Test plan:
- set syspref opacthemes to bootstrap
- enable the casAuthentication syspref
- fill the casServerUrl syspref with something like:
https://localhost:8443/cas
- go to the OPAC home page
- click on the "Log in to your account" link (top right)
=> You go to the cgi-bin/koha/opac-user.pl page where you
see the CAS link
- disable the casAuthentication syspref
- go to the OPAC home page
- click on the "Log in to your account" link (top right)
=> You see a popup for login and password
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Normal login still works with/without CAS.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch removes the use of POSIX::strftime, which depends on the
locales installed on the system.
The DateTime module translates month and day names successfully without
any locale installed.
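For illustration (the date matches the test plan below; 'fr' is an arbitrary
example locale):

use DateTime;

# Monday 2013-11-18, localized without any system locale installed
my $dt = DateTime->new(
    year => 2013, month => 11, day => 18, locale => 'fr',
);
print $dt->day_name(),   "\n";    # "lundi"
print $dt->month_name(), "\n";    # "novembre"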
For the seasons, I use the approach already used in Koha: write the
words in the templates. This way the translate script will match them
and allow translators to translate them.
This patch introduces a regression: the season names are not translated
following the selected locale.
This could be addressed when bug 8044 is pushed.
Test plan:
0/ Update your po files and translate the season name.
1/ Create a numbering pattern using season.
example:
Name: Seasonal
Numbering formula: {X}
X: Season, Add=1, Every=1, Set back to 0 when more than 3, formatting
"name of season"
And test the prediction pattern with:
frequency: 1/3 month
First issue : 2013-09-21
length: 12 months
X begins with 2 (September 21st is Fall)
2/ Click on the test pattern button, you should get:
Fall 21/09/2013
Winter 21/12/2013
Spring 21/03/2014
Summer 21/06/2014
Change the locale and verify the season names are *not* translated.
Change the Koha language and verify the season names are translated.
3/ Create a numbering pattern using day or month name.
example:
Name: day
Numbering formula: {X}
X: day, Add=1, Every=1, Set back to 0 when more than 6, formatting "name
of day"
Frequency: 1/day
First issue: 2013-11-18
length: 1 month
X begins with 0
You should get:
Monday 18/11/2013
Tuesday 19/11/2013
Wednesday 20/11/2013
[...]
Sunday 15/12/2013
Monday 16/12/2013
Tuesday 17/12/2013
Change the locale and verify the day names are translated.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors.
Tested on top of Bug 11265 and Bug 11263,
and resolved a merge conflict.
Updating the PO file gives the seasons to translate.
Tested using seasons, day and month.
The only note is the different behavior:
1) To use seasons you need to use the staff interface in the desired
language
2) To use day and month you only need to select the locale
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
No regressions found. Passes koha-qa.pl, t and xt
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Tested this again on top of 11263 and it works as described.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch produces the same dropdown list as the one in the advanced
search. This way, it won't be necessary to install additional templates
to fill the locale list.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The locales list for subscriptions should not be retrieved from the
locales of the system.
This patch retrieves the locales list from the Koha DB (in the same way
as the language and opaclanguages prefs).
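A hedged sketch of building such a list from Koha's own language data;
getTranslatedLanguages and the rfc4646_subtag key exist in C4::Languages,
but the exact call used by the patch may differ:

use C4::Languages qw(getTranslatedLanguages);

my $langs   = getTranslatedLanguages( 'intranet', 'prog' );
my @locales = map { $_->{rfc4646_subtag} } @$langs;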
Test plan:
Edit a subscription (or a numbering pattern) and verify the list of
languages is the same as languages available in Koha.
Note: with this patch we lose the season translation; this is
expected. See the linked report.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No koha-qa errors.
The new locale list is retrieved from the installed languages.
I wonder if that list could be restricted to
'enabled' ones (parsing the 'language' syspref value).
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes QA script and all tests.
Works according to description.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
1/ Edit a Perl script, for example mainpage.pl
2/ add "use Koha::I18N;" to the top of file
3/ add a translatable message somewhere in the script (this have
to be after the call to get_template_and_user). For example:
warn gettext("This is a translated warning");
4/ Create or update the PO files with
misc/translator/translate create LANGCODE
or
misc/translator/translate update LANGCODE
(LANGCODE should be enabled in the syspref 'languages')
5/ In misc/translator/po/LANGCODE-messages.po you should have
your string, translate it (using a text editor or a PO file
editor, make sure you don't have the "fuzzy" flag for this
string).
6/ Go to mainpage.pl in your browser with the active language set to
English and check your logs. You should see your string "This
is a translated warning".
7/ Now change the language to LANGCODE. Check your logs; you should
see the string translated.
Note: I chose to name the sub 'gettext' because it's the default
keyword for xgettext for Perl. We can change it to whatever we want.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Followed the test plan; works as described.
No koha-qa errors.
Tests pass.
Fixed a small merge conflict in t/Context.t
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests and QA script.
Copied test plan from bug.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Instead of writing
use CGI;
use Koha::I18N;
my $cgi = new CGI;
my $lh = Koha::I18N->get_handle_from_context($cgi, 'intranet');
print $lh->maketext('my translatable text');
you can now write
use Koha::I18N;
print gettext('my translatable text');
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Also store the interface (intranet, opac) in the context so it does not
have to be passed as a parameter.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comments on last patch.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
CHARSET is now automatically replaced by UTF-8, and 'update' creates the
PO file if it does not exist.
Also do not try to create PO files if POT file creation failed (when
there are no messages to translate, for example).
+ add some verbosity
+ add Locale::Maketext and Locale::Maketext::Lexicon to Koha
dependencies
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
You have to use the new module Koha::I18N
Code example:
use Koha::I18N;
use CGI;
my $input = new CGI;
my $lh = Koha::I18N->get_handle_from_context($input, 'intranet');
print $lh->maketext("Localized string!");
PO files are in misc/translator/po/LANG-messages.po.
Creation of PO files is integrated into the existing workflow, so to
create the PO file for a language, just run in misc/translator:
./translate create LANG
To update:
./translate update LANG
You can then translate the PO with your favorite editor. Strings will be
localized at runtime.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Works as advertised. Some details needing further attention are noted
on the bug report.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Add a separate phrases-icu.xml for phrase indexes.
The file is based on the one distributed with Zebra,
with a couple of additions to reflect Koha usage.
This patch adds a separate tokenizer variable
for phrase indexes so that default.idx is
correctly rewritten for sites using ICU
indexing.
Signed-off-by: Paola Rossi <paola.rossi@cineca.it>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
- Applied patch
- perl Makefile.PL --prev-install-log ../koha-dev/misc/koha-install-log
- make upgrade
- Restarted Zebra server
- Did a full reindex of bibliographic and authority records
- Checked various searches
- Linked records to authorities
- Checked that the created links work correctly
I couldn't find a regression with this patch.
Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>