Since nobody is currently working on the zebra layer introduced by bug
8233, Solr will never work.
Some code was introduced in 3.10 to prove that several search engines
can coexist in Koha, but no help or funding has been found to go further.
It is pointless to keep this code and maintain an ambiguous situation.
I think the indexes configuration page could be restored later if someone
else introduces a new search engine into Koha.
Test plan:
Look at the code introduced by bug 8233 and verify all is removed.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch makes GetMarcISSN test for empty subfields before pushing them
to the result array.
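A minimal sketch of the kind of guard this introduces, using the MARC::Record API (the tag/subfield and sample values below are the MARC21 ISSN field, used only for illustration and not taken from the patch):

    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new();
    $record->append_fields(
        MARC::Field->new( '022', ' ', ' ', a => '1234-5678' ),
        MARC::Field->new( '022', ' ', ' ', y => '0000-0000' ),    # occurrence with no $a
    );

    # collect ISSNs, skipping missing or empty subfields
    my @marcissns;
    foreach my $field ( $record->field('022') ) {
        my $value = $field->subfield('a');
        push @marcissns, $value if defined $value && $value ne '';
    }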
To test:
- Run the regression test
=> FAILS for all MARC flavours
- Apply the patch
- Run the regression test
=> SUCCESS: tests pass
- Sign off
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This changes the existing framework caching, which was using memoisation
if memcached was available, and memory in all cases, to use the
Koha::Cache system. This uses memcache if possible, and in-memory
otherwise. However it also clears the cache when the framework updates,
making sure that the changed version will be picked up.
Note that the in-memory cache clears itself after 10 seconds, so that if
memcached isn't available, this is the longest that old versions will
hang around.
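To make that behaviour concrete, here is a self-contained sketch of the pattern (a plain Perl stand-in, not the actual Koha::Cache API): entries read from memory expire after 10 seconds, and the entry is cleared explicitly whenever the framework is updated.

    use strict;
    use warnings;

    my %cache;    # key => [ value, timestamp ]

    sub cache_get {
        my ($key) = @_;
        my $entry = $cache{$key} or return;
        return if time() - $entry->[1] > 10;    # in-memory entries live at most 10 seconds
        return $entry->[0];
    }

    sub cache_set {
        my ( $key, $value ) = @_;
        $cache{$key} = [ $value, time() ];
    }

    sub cache_clear {
        my ($key) = @_;
        delete $cache{$key};    # called when the framework is updated
    }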
Test plan:
* work through
http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=11842#c0
and make sure that the erroneous result doesn't occur.
Note:
* The patch on bug 12041 is required for this to work.
Signed-off-by: Brendan Gallagher <brendan@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
When doing acquisitions and ordering from external z3950 targets, the item price is not inferred from the MARC record when the NORMARC framework is set.
This patch makes GetMarcPrice treat NORMARC the same as MARC21.
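A rough sketch of the idea (the field/subfield choices below are illustrative assumptions, not the exact GetMarcPrice logic):

    # treat NORMARC like MARC21 when picking the price field
    my $marcflavour = 'NORMARC';    # normally read from the marcflavour syspref
    my ( $pricefield, $pricesubfield );
    if ( $marcflavour eq 'MARC21' or $marcflavour eq 'NORMARC' ) {
        ( $pricefield, $pricesubfield ) = ( '020', 'c' );    # e.g. 020$c (terms of availability)
    }
    else {                                                   # UNIMARC
        ( $pricefield, $pricesubfield ) = ( '010', 'd' );    # e.g. 010$d (price)
    }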
Test plan
* Setup Koha with NORMARC framework
* Add a Norwegian z3950 search target (ex: z3950.bibsys.no:2100, database=BIBSYS)
* Create a new basket, and add an order to the basket from an external source
* Search for a title (ex: ISBN 8205341834) from the bibsys z3950 server
* Click to order the title
* Observe that vendor price is not set
* Apply patch, repeat search for same book
* Order, and observe the vendor price is filled in from the MARC record
http://bugs.koha-community.org/show_bug.cgi?id=12554
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Works as described. No errors.
Tested by changing the marcflavour syspref to NORMARC
and following the test plan; the bug exists and is fixed.
Changed the bug description on the patch, it was too long :)
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Patch adds a check for NORMARC to provide the same functionality
as for MARC21. No regressions found.
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The current GetMarcISBN implementation returns an array of ISBNs
in which all subfields of an ISBN field occurrence are appended.
For example, in MARC21, if you have $a and $c defined, they will
be appended in the output. The same happens with $z.
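A hedged sketch of the intended behaviour (the MARC21 ISBN field and sample values are shown only as an illustration, not copied from the patch): keep one ISBN per 020 occurrence instead of concatenating every subfield of that occurrence.

    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new();
    $record->append_fields(
        MARC::Field->new( '020', ' ', ' ', a => '9780300170788', c => '25.00 (U.S.)' ),
    );

    # keep only the ISBN proper ($a), ignoring $c, $z, etc.
    my @marcisbns;
    foreach my $field ( $record->field('020') ) {
        my $isbn = $field->subfield('a');
        push @marcisbns, $isbn if defined $isbn;
    }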
To reproduce:
- Run the regression tests attached to this bug.
To test:
- Apply the patch, regression tests pass.
- Sign off
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Now test pass, no koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
The function header says that DelBiblio checks to make sure there aren't
any issues on any of the items. What the code actually does, on the other
hand, is check whether the biblio has any items attached, and refuse to
delete the biblio if it does.
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
This patch fixes some potential SQL syntax errors, which can cause
fatal software errors in Koha when the environment variable DEBUG
is on.
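The error quoted in the test plan below is typical of an SQL statement built with an empty IN () list; here is a self-contained illustration of that failure mode and a guard (the table and column names are assumptions for the example, not the actual query in C4/Koha.pm):

    my @itemnumbers = ();    # a record with no items attached
    if (@itemnumbers) {
        my $placeholders = join ',', ('?') x @itemnumbers;
        my $sql = "SELECT * FROM items WHERE itemnumber IN ($placeholders)";
        # prepare and execute with @itemnumbers as bind values
    }
    else {
        # skip the query: "IN ()" with no values is a MySQL/MariaDB syntax error
    }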
_TEST PLAN_
Before applying:
0) Ensure that you don't have "SetEnv DEBUG 1" in your Apache config
1) Create a new bib record
2) Click on the "Holds" tab before creating any items
3) Note the message "Cannot place hold: this record has no
items attached."
4) Add "SetEnv DEBUG 1" to your Apache config
5) Restart Apache
6) Refresh your page
7) Note the following Software Error: "DBD::mysql::st execute failed:
You have an error in your SQL syntax; check the manual that
corresponds to your MariaDB server version for the right syntax to
use near ')' at line 3 at /koha/lib/C4/Koha.pm line 835."
8) Apply the patch
9) Refresh your page
10) Note the message from Step 3
Thorough tester:
11) Remove "SetEnv DEBUG 1" from your Apache config, restart Apache,
and refresh your page. You should see the message from Step 3.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Error reproduced, patch fixes it.
Tested following test plan, no koha-qa errors.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Under certain circumstances misc/link_bibs_to_authorities.pl creates
multiple authority records with identical headings.
Test plan:
1. Have some (2-3) biblio records with some repeated headings
Have BiblioAddsAuthorities = allow, AutoCreateAuthorities = generate
Have no authority records
2. Run misc/link_bibs_to_authorities.pl script
3. You will get multiple authority records -- one for each occurrence of a
heading in a biblio record.
4. Apply the patch.
5. Repeat 1-3 (remember to have "fresh" biblios, without $9, and no
authorities).
6. The problem should be fixed.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Tomas Cohen Arazi <tomascohen@gmail.com>
Works as described.
The TransformKohaToMarc() function in C4/Biblio.pm iterates through its
argument -- which is a hashref -- using 'each'. Perl is not guaranteed
to return hash keys in any particular order (not to mention that
in more recent Perl versions, explicit hash key order randomization
is to be expected).
As a consequence:
1) For biblio records added via acquisition (order from a new/empty
record, order from a suggestion), freshly created MARC biblio records
don't always have 260 $b and 260 $c stored in the proper order
2) Holdings data exported for zebra indexing as 952 fields may have
subfields generated in more-or-less random order. While it probably (?)
does not affect zebra indexing/searching in any significant way,
the end result is prone to be somewhat ugly (which can be a potential
issue e.g. for people running a Z39.50 server) and is not guaranteed
to be consistent; different records -- or even different items in the
same record -- can have 952 subfields generated in an indiscriminate order.
This patch fixes the above-mentioned issues by introducing explicit
sorting (by subfield code/letter) for subfield pairs before they
are added to the MARC record.
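A self-contained illustration of the sorting idea (the field, subfields, and values here are made up for the example; the real code works on the configured Koha-to-MARC mappings):

    use MARC::Record;
    use MARC::Field;

    # subfield code => value pairs that may come out of a hash in random order
    my %subfields = (
        b => 'Gallimard,',
        c => '2001.',
        a => 'Paris :',
    );

    # sort by subfield code so 260$a, 260$b, 260$c always come out in order
    my @pairs = map { $_ => $subfields{$_} } sort keys %subfields;

    my $record = MARC::Record->new();
    $record->append_fields( MARC::Field->new( '260', ' ', ' ', @pairs ) );
    print $record->as_formatted(), "\n";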
To test:
1/ Try to confirm and reproduce both issues (use perl 5.18.1 if possible
for more randomly ordered results).
2/ Apply patch.
3/ Redo the tests; ensure that both issues are now fixed and that there
are no apparent regressions of any kind (especially regarding the 952 fields
generated for zebra [re]indexing).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Test Plan:
1) Apply patch
2) Run updatedatabase.pl
3) View some records with authorities
4) Note your previously set authority separator should still be in use
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
No koha-qa errors.
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch changes the price parsing so that it can fall
back on the currency name if an ISO code is not supplied; this allows
for handling the very common situation where the currency name
as entered was already the same as the ISO code.
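A minimal sketch of the fallback, assuming the hash keys used here for the example (the 'isocode' column is mentioned in the patch below; 'currency' stands in for the currency name):

    my %currency = ( currency => 'USD', isocode => undef );
    my $code = defined $currency{isocode} && $currency{isocode} ne ''
        ? $currency{isocode}
        : $currency{currency};    # the name was often already the ISO code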
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Initial bug:
When there is a round price with no decimals after it,
or when the symbol comes after the digits, the price is not captured
by the regular expression in the MungeMarcPrice routine and the variable
is not initialized.
Enhancement:
The MungeMarcPrice routine has been extensively modified.
It is still possible to preferentially pick the active currency, but
unlike the previous mechanism, which worked only for prices preceded
by the currency sign, it now works wherever the symbol is placed.
As the symbol you may enter a pure currency sign as well as a string
including it, like '$US'. Moreover, an 'isocode' column has been
added to the currency table (editable in the staff interface from
Administration/Currencies and exchange rates). So the active
currency can be picked either through its symbol or through its ISO
code.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Passes all tests, especially t/db_dependent/MungeMarcPrice.t
Checked currencies can be added, edited and deleted.
Notes: the new ISO code field is mandatory.
Sample SQL files need to be updated (bug 12146)
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The current implementation of GetMarcISBN contradicts the documented API.
It currently returns an array of hashes with only one key (marcisbn),
which doesn't add any value.
I chose to fix GetMarcISBN to honour the API instead of changing the
docs, because changing the docs would be a really silly change.
To test:
- Run:
prove t/db_dependent/Biblio.t
=> SUCCESS
- catalogue/detail.pl should correctly show ISBNs.
- opac/opac-detail.pl should correctly show ISBNs in both prog and bootstrap.
- opac/opac-sendshelf.pl should correctly show ISBNs in the email.
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Nicolas Legrand <nicolas.legrand@bulac.fr>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch makes the logic inside GetMarcISBN simpler and
fixes the issue.
To test:
- Run the regression tests:
prove -v t/db_dependent/Biblio.t
=> FAIL
- Apply the patch
- Run:
prove -v t/db_dependent/Biblio.t
=> SUCCESS
- Verify that opac-detail.pl and catalogue/detail.pl look as usual regarding ISBN
- Sign off
Sponsored-by: Universidad Nacional de Cordoba
Signed-off-by: Nicolas Legrand <nicolas.legrand@bulac.fr>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The return from GetReservesFromBiblionumber contains an unnecessary
extra variable: in scalar context an array already returns its element count.
Maintaining a separate count can lead to unforeseen bugs
and imposes ugly constructions on the subroutine's users.
Remove the useless count variable from the return.
This patch also changes the parameters: the routine now takes a hashref.
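A self-contained illustration of the "array in scalar context" point, which is why the separate count was redundant (the data here is made up):

    my @reserves = ( { reserve_id => 1 }, { reserve_id => 2 } );
    my $count    = @reserves;            # scalar context: 2, no extra counter needed
    print "$count hold(s)\n";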
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Placed biblio holds, future holds and item holds. Works as expected.
Tested Holds.t and Reserves.t. Pass.
Tested /cgi-bin/koha/ilsdi.pl?service=GetRecords&id=999 with two holds on
one item. Fine.
C4/SIP/ILS/Item.pm: Looked for "whatever" and "arrayref" and could not find
them anymore. Looks good.
Handled a few unneeded calls in QA follow-up.
Left only one point to-do for serials/routing-preview.pl. See Bugzilla.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch adds the words 'biblio' and 'item' to the 'info'
of the cataloguing logs which were missing them (such as biblio
delete, biblio mod, item mod, upload cover image).
This patch also adds 'authority' for authority mod.
_TEST PLAN_
Before applying:
1) Create/view mods for items, biblios, and authorities.
2) Create/view biblio deletion
3) Create/view upload cover image log
4) Note that none of these contain the words 'biblio','item',or
'authority' in their "Info" columns.
Apply patch.
5) Repeat steps 1-3
6) Note that the new logs contain 'biblio','item', and 'authority'
in their "Info" column, while the past ones don't.
7) Note also that 'biblio' and 'item' will have 'Biblio' and 'Item'
appear in their "Object" column for the new logs
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Intermittently, problems in the calling environment
cause a C4::Biblio routine to be called with an undefined
MARC::Record object. This results in the process
dying and returning to the end user a low-level
message such as 'cannot call method x on an undefined
object'.
For exported subroutines taking a MARC::Record object,
check that the object is defined; otherwise return a logical
return value and log a stack trace to the error log.
A couple of cases were checking but dying; this may have
unwelcome results in a persistent environment, so croak has
been downgraded to carp.
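A hedged sketch of the guard pattern described above (the subroutine name is hypothetical; the real patch applies this to the exported C4::Biblio routines):

    use Carp;

    sub some_marc_routine {
        my ($record) = @_;
        unless ( defined $record && ref($record) eq 'MARC::Record' ) {
            carp 'some_marc_routine called without a MARC::Record object';
            return;    # a logical false return instead of dying
        }
        # normal processing would continue here
        return 1;
    }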
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Adds lots of checks for $record in various places, should
not affect behaviour.
Passed all tests and QA script, including new unit tests.
Tested adding and saving a new record.
Also tested detail and result pages without XSLT.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
No code implements the routines Get and TransformHtmlToMarc2,
so don't export them into users' namespace
Signed-off-by: Mark Tompsett <mtompset@hotmail.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
The patch corrects the issue -- the content of field 225 should now be
displayed under Series.
Signed-off-by: Bernardo Gonzalez Kriegel <bgkriegel@gmail.com>
Comment: Works as described, no koha-qa errors
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Comparing the XSLT with the normal view the patch seems to
work correctly. Passes all tests and QA script.
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
This patch switches from using a combination of
biblionumber/borrowernumber to using reserve_id where possible.
Test Plan:
1) Apply patch
2) Run t/db_dependent/Holds.t
Signed-off-by: Maxime Pelletier <maxime.pelletier@libeo.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Galen Charlton <gmc@esilibrary.com>
Removed NoZebra vestiges. This comprises several code blocks that depend on the NoZebra syspref and NZ related functions/methods.
C4::Biblio->
GetNoZebraIndexes
_DelBiblioNoZebra
_AddBiblioNoZebra
C4::Search->
NZgetRecords
NZanalyse
NZoperatorAND
NZoperatorOR
NZoperatorNOT
NZorder
C4::Installer->
set_indexing_engine
Sponsored-by: Universidad Nacional de Córdoba
Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Remove a line of debug code from EG, provide better error handling
when presented with weird data in the authority linker, and correct
queryparser configuration to reflect the correct configuration for
Zebra.
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
This patch rewrites the GetReserveStatus routine so that it takes the
itemnumber and/or the biblionumber as parameters.
In some places, the C4::Reserves::CheckReserves routine is called when
we just want to get the status of the reserve. In these cases,
C4::Reserves::GetReserveStatus is now called instead.
This routine executes one SQL query (two at most).
Test plan:
Check that there is no regression on the different pages where reserves
are used. The different statuses will be the same as before applying
this patch.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Added a new system preference to set the UNIMARC field 100 default language.
The default value for that system preference is 'fre'.
Changed Biblio.pm to use the system preference:
- if the language set in the preference is invalid, it uses 'fre' as the default value
- it only replaces the language when field 100 is empty
- if the language is filled in with the plugin, it only replaces positions 25-28 with 'y50'
Signed-off-by: Rolando Isodoro <rolando.isidoro@gmail.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
When a subfield in a framework has a number > 0 in 'hidden', it is hidden in the MARC view.
It should also be hidden in the ISBD view.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Does what it says, hides hidden fields on the OPAC
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
All tests and QA script pass.
To test:
1) Hide 245$b or another field from the ISBD view in your MARC
framework by assigning a hidden value greater than 0
2) Check the different views in OPAC, the field should be hidden now from
- Labelled MARC view (as it was before this patch)
- ISBD view
It will still show up for plain MARC and XSLT views.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Only changing some documentation about GetXmlBiblio
Signed-off-by: Mirko Tietgen <mirko@abunchofthings.net>
Added the word 'contain'
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Added a new system preference to control which fields should not appear in the separator.
Changed GetMarcNotes to use the new system preference so that only fields not in the list appear.
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
FIX some indentation in C4/Biblio.pm
FIX one closing parenthesis in sysprefs.sql
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
- Possibility to select for line and column: items.homebranch and
items.ccode
- Possibility to filter on these fields
- Possibility to count unique biblios (count(distinct biblionumber)),
ordered amount and spent amount (based on aqorders.datereceived)
Filtering on item homebranch and ccode works only on items that were
created at ordering or receiving (i.e. items linked to an order)
Some refactoring is done, mainly replacing switch-like if statements
with given/when
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
If the itemnumbers parameter is undef, Perl raises an error:
"Can't use an undefined value as an ARRAY reference"
Signed-off-by: Mason James <mtj@kohaaloha.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
EmbedItemsInMarcBiblio does not embed items when the itemnumbers param
is given. That breaks the export tools (used from the command line).
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Passed-QA-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
On 3.8.x, it was possible for multiple automatically generated
authorities to be linked to a single heading. This patch deletes
previous links from headings prior to linking them to
automatically-generated headings. This patch also corrects a
potential problem wherein multiple authorities might be generated if
a record is edited repeatedly in quick succession. The latter problem
exists on Master and 3.6.x as well, and the code that corrects the
multiple linkages is equally applicable there, even if seemingly unnecessary.
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
An eval { eval "require $module;" }; was replaced with
eval { eval { require $module; }; }; which is a no-op, meaning that
the linker was not getting loaded, and the catalog module was throwing
up a big nasty error every time someone tried to save a record with a
heading. This patch replaces the require with can_load from
Module::Load::Conditional, which is PBP-friendly, and offers equivalent
functionality.
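A sketch of the replacement described above (the module name is illustrative):

    use Module::Load::Conditional qw( can_load );

    my $module = 'C4::Linker::Default';
    if ( can_load( modules => { $module => undef } ) ) {
        # the module is loaded and its functions can be used
    }
    else {
        warn "Unable to load $module: $Module::Load::Conditional::ERROR";
    }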
Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
- Expression form of "eval" at line 492, column 12. See page 161 of PBP. (Severity: 5)
- "return" statement with explicit "undef" at line 891, column 5. See page 199 of PBP. (Severity: 5)
- Subroutine prototypes used at line 1148, column 1. See page 194 of PBP. (Severity: 5)
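To illustrate the second item in the list above: an explicit "return undef" behaves differently from a bare "return" in list context, which is why PBP flags it (the subs below are only for illustration).

    sub find_bad  { return undef }   # my @r = find_bad();  -> one element (undef), so @r is true
    sub find_good { return }         # my @r = find_good(); -> empty list, so @r is false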
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Cherry-picked from BibLibre's work on bug 5888:
opac-detail subject/author links improvements
Added a link to opac-authoritiesdetail.pl when possible.
Only affects 'Normal view'. Does not affect XSLT display.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Do not automatically populate $9 in bibliographic headings when the
$9 is set in the authorized heading field of the authority record.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
In the circulation page, you can now export (as csv or iso2709) a list
of items which are currently checked out by a borrower.
3 export types:
- iso2709 with items: export the list of items in iso2709 format with item
information.
- iso2709 without items: export the list of items in iso2709 format without
item information.
- CSV: export the list of items based on a CSV profile.
2 new system preferences:
- DontExportFields: a list of fields not to be exported
- CsvProfileForExport: the CSV profile name used for the CSV export
Test plan:
- Fill in the CsvProfileForExport syspref
- Go to a borrower's circulation page containing checkouts
- Select one or more items and export them to the 3 different formats.
- Check that the result file is what you expected
- Test that there is no regression with the authority export
- Test that there is no regression using tools/export.pl from the command
line interface
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
This reverts commit 215abc8024.
The 3 patches for bug 8089 have been reverted, because they break
jenkins & Koha.
A follow-up has been provided, but it does not solve the problem on my
test server, it just changes the error message.
After a discussion with Jared, Dobrica should work on another patch, so
the best option is to revert.
1. Replace all instances of memoize_memcached with appropriate calls
into Koha::Cache:
* reports/guided_reports.pl
* C4::Biblio::GetMarcStructure
* C4::Languages::getFrameworkLanguages
* C4::Languages::getAllLanguages
* C4::SQLHelper::GetPrimaryKeys
* C4::SQLHelper::_get_columns
2. Replace all references to memcached with the appropriate calls into
Koha::Cache in C4::Context.
Test plan:
* have the DEBUG env variable set to 1
* reach the addbiblio page to test the patch in Biblio.pm, or set up more than one
language
* you should see in the logs that you're reading and writing from cache
* run the test suite twice both with and without the following environment
variables set:
export MEMCACHED_SERVERS=127.0.0.1:11211
export MEMCACHED_NAMESPACE=KOHA
export CACHING_SYSTEM=memcached
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
I'm unsure about some of the caching times (10000 is a long, long time),
but other than that, it works fine.
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
New version implementing Paul's advice.
See Wiki http://wiki.koha-community.org/wiki/Age_restrictiotion
Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
fix updatedatabase.pl
New fix updatedatabase.pl to apply to current master by Marc Veron veron@veron.ch
...and fixed missing curly bracket after merging updatedatabase.pl
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
export.pl [--format=format] [--date=date] [--dont_export_items]
[--deleted_barcodes] [--clean] --filename=outputfile
* format is either 'xml' or 'marc' (default)
* date should be entered as the 'dateformat' syspref is set
(dd/mm/yyyy for metric, yyyy-mm-dd for iso, mm/dd/yyyy for us)
* records exported are the ones that have been modified since 'date'
* if --deleted_barcodes is used, a list of barcodes of items deleted
since 'date' is produced (or from all deleted items if no date is
specified)
* --clean removes NSE/NSB
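A sample invocation (the output path is illustrative, and the date here assumes the iso dateformat setting):

    perl tools/export.pl --format=marc --date=2012-01-01 --dont_export_items --filename=/tmp/biblios.mrc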
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Because updating the total issues count associated with a bibliographic
record on issue could cause a significant load on the server, this
commit adds the syspref UpdateTotalIssuesOnCirc (which defaults to OFF
to match existing behavior). The syspref has the following description:
Do/Do not update a bibliographic record's total issues count whenever
an item is issued (WARNING! This increases server load significantly;
if performance is a concern, use the update_totalissues.pl cron job
to update the total issues count).
Bug 6557: automatically increment totalissues
Adds the ability to automatically increment biblioitems.totalissues
whenever an item is issued.
To test:
1) Choose a record with at least one item that can circulate
2) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). Most likely there won't be any 942$0 at all
3) Enable UpdateTotalIssuesOnCirc
4) Check out the item you selected
5) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should now be one greater than before
6) Discharge the item
7) Disable UpdateTotalIssuesOnCirc
8) Check out the item you selected again
9) Check the value of 942$0 (you may need to look at the plain MARC view
on the OPAC). That value should not have changed
Bug 6557: add script to update totalissues from stats
NAME
update_totalissues.pl
SYNOPSIS
update_totalissues.pl --use-stats
update_totalissues.pl --use-items
update_totalissues.pl --commit=1000
update_totalissues.pl --since='2012-01-01'
update_totalissues.pl --interval=30d
DESCRIPTION
This batch job populates bibliographic records' total issues count
based on historical issue statistics.
--help Prints this help
-v|--verbose
Provide verbose log information (list every bib modified).
--use-stats
Use the data in the statistics table for populating total
issues.
--use-items
Use items.issues data for populating total issues. Note that
issues data from the items table does not respect the --since
or --interval options, by definition. Also note that if both
--use-stats and --use-items are specified, the count of biblios
processed will be misleading.
-s|--since=DATE
Only process issues recorded in the statistics table since
DATE.
-i|--interval=S
Only process issues recorded in the statistics table in the
last N units of time. The interval should consist of a number
with a one-letter unit suffix. The valid suffixes are h
(hours), d (days), w (weeks), m (months), and y (years). The
default unit is days.
--incremental
Add the number of issues found in the statistics table to the
existing total issues count. Intended so that this script can
be used as a cron job to update popularity information during
low-usage periods. If neither --since nor --interval is
specified, incremental mode will default to processing the
last twenty-four hours.
--commit=N
Commit the results to the database after every N records are
processed.
--test Only test the popularity population script.
WARNING
If the time on your database server does not match the time on your Koha
server you will need to take that into account, and probably use the
--since argument instead of the --interval argument for incremental
updating.
=== TESTING PLAN ===
NOTE: in order to test this script, you will need to have some sort of
circulation data already existing in your Koha installation.
1) Disable UpdateTotalIssuesOnCirc
2) Run: misc/cronjobs/update_totalissues.pl --use-items -t -v
3) If you have total checkout data in your item records (i.e. anything
in 952$l), you should see messages like "Processing bib 43 (1 issues)"
4) Choose one of the lines that shows more than 0 issues, and view the
record with that biblionumber in the staff client, choosing the "Items"
tab (moredetail.pl). Add up the "Total checkouts" listed for each item,
and confirm it matches what the script reported
5) Run: misc/cronjobs/update_totalissues.pl --use-stats -t -v
6) If you have any circulation statistics in your database (i.e. any
'issue' entries in your statistics table), you should see messages
like "Processing bib 43 (1 issues)";
7) Choose one of the lines and view the record with that biblionumber in
the staff client, choosing the "Items" tab (moredetail.pl). If you
count the number of checkouts listed in each item's checkout history,
the total should match what the script reported.
8) Check out an item
9) Run: misc/cronjobs/update_totalissues.pl --use-stats
--incremental --interval=1h -t -v
10) You should see one line reporting a single circ for the bib record
associated with the item you just checked out (there may be more if
you checked out any books in the hour prior to running these tests).
11) If the results in steps 4, 7, and 10 match the predictions, the
script worked
This patch to Koha was sponsored by the Arcadia Public Library and the
Arcadia Public Library Foundation in honor of Jackie Faust-Moreno, late
director of the Arcadia Public Library.
Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Tested this with my test data - numbers are correct and updated appropriately.
More importantly - if I do a popularity search, the most popular items *come up first*. Amazing.
- in various acquisition pages and the serials home page
- in the database: biblioitems.ean
- adds ean and its mapping in the default English bibliographic framework
- adds the ean mapping in the default French bibliographic framework
- ean search is not enabled for MARC21
The required mapping between the ean MARC field and the biblioitems.ean
database field will be automatically added on an existing UNIMARC installation.
However, if you already have records with an ean, you will have to
run misc/batchRebuildBiblioTables.pl to populate biblioitems.ean
Signed-off-by: jmbroust <jean-manuel.broust@univ-lyon2.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Passed QA at second run. Removed a merge marker only.
The root problem here is that the price is being pulled from the MARC record
and is then run through Number::Format::unformat_number. This routine is
really being misused, and should only be used to reverse the effects of
Number::Format on a number string. We are apparently using it to strip
out currency characters and the like.
Number::Format::unformat_number will choke if there is more than one period (.)
in the price field. MARC standards do not limit this field to a single period,
so unless there is only one period, we should skip number unformatting.
Examples that break unformat_number include '18.95 (U.S.)' and
'$5.99 ($7.75 CAN)', both of which are perfectly valid.
This commit adds the function MungeMarcPrice, which does a better job of
finding a real price value in a given price field. It does a very good
job at finding a price in any currency format, and attempts to find
a price in whichever currency is active before falling back to
the first valid price it can find.
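A self-contained illustration of the kind of price strings that defeat a naive unformat step, with a regex in the spirit of the description above (this is not the real MungeMarcPrice code):

    for my $raw ( '18.95 (U.S.)', '$5.99 ($7.75 CAN)', '20 EUR' ) {
        my ($price) = $raw =~ /(\d+(?:[.,]\d{2})?)/;    # first numeric amount found
        printf "%-20s => %s\n", $raw, defined $price ? $price : '0.00';
    }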
The variable $price may fail to hold an actual price, in which case
the price then defaults to '0.00', which is rarely if ever the
correct price. To combat this, I have added highlighting to any
price in the Order Details table that begins with 0 (i.e. '0.00').
Also fixed the incomplete table footer, adding a new td with a
span of 3 to fill in the nonexistent cells.
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>