Commit graph

481 commits

Author SHA1 Message Date
Fridolyn SOMERS
1aeff203be Bug 7921: Software error while placing order
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Passed-QA-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-11-09 20:19:10 -05:00
Jared Camins-Esakov
1d5b614d8b Bug 8823: CatalogModuleRelink Creates Multiple Links between Bib and Auth record
On 3.8.x, it was possible for multiple automatically generated
authorities to be linked to a single heading. This patch deletes
previous links from headings prior to linking them to
automatically-generated headings. This patch also corrects a
potential problem wherein multiple authorities might be generated if
a record is edited repeatedly in quick succession. The latter problem
exists on Master and 3.6.x as well, and the code that corrects the
multiple linkages is equally applicable there, even if it seems unnecessary.

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-28 19:27:27 +01:00
Jared Camins-Esakov
813c744008 Bug 8818: make sure we load modules before using them
An eval { eval "require $module;" }; was replaced with
eval { eval { require $module; }; }; which is a no-op, meaning that
the linker was not getting loaded, and the catalog module was throwing
up a big nasty error every time someone tried to save a record with a
heading. This patch replaces the require with can_load from
Module::Load::Conditional, which is PBP-friendly, and offers equivalent
functionality.
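
A minimal sketch of the difference (the module name is illustrative; can_load comes from Module::Load::Conditional as noted above):

    use Module::Load::Conditional qw( can_load );

    my $module = 'C4::Linker::Default';   # runtime-chosen module name (illustrative)

    # Old form: the string eval interpolates $module, so the module is found and
    # loaded, but perlcritic flags the expression form of eval.
    eval { eval "require $module;" };

    # Broken "fix": require with a plain scalar treats $module as a literal file
    # name, so nothing named 'C4::Linker::Default' is ever found and the error is
    # silently swallowed by the outer eval.
    eval { eval { require $module; }; };

    # PBP-friendly replacement: loads the module if possible, returns false otherwise.
    if ( can_load( modules => { $module => undef } ) ) {
        # $module is now loaded and its subs can be called
    }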

Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-10-01 19:01:50 +02:00
1ea7842b88 Bug 6679 - [SIGNED-OFF] fix 7 perlcritic violations in C4/Biblio.pm
- Expression form of "eval" at line 492, column 12.  See page 161 of PBP.  (Severity: 5)

- "return" statement with explicit "undef" at line 891, column 5.  See page 199 of PBP.  (Severity: 5)

- Subroutine prototypes used at line 1148, column 1.  See page 194 of PBP.  (Severity: 5)
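
Hedged illustrations of what each class of fix typically looks like (the sub and call names here are made up; only the perlcritic-friendly forms matter):

    # Severity 5: expression form of eval -> block form
    my $ok = eval { risky_call() };          # instead of: eval "risky_call()";

    # Severity 5: "return" with explicit "undef" -> bare return
    sub _lookup {                            # hypothetical sub
        my ($key) = @_;
        return if !defined $key;             # instead of: return undef;
        return uc $key;
    }

    # Severity 5: subroutine prototypes -> no prototype
    sub _count_items { return scalar @_ }    # instead of: sub _count_items ($) { ... }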

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-20 12:01:36 +02:00
Julian Maurice
b0ab204bfd Bug 8210: add links to authorities in normal mode headings
Cherry-picked from BibLibre's work on bug 5888:
opac-detail subject/author links improvements

Added a link to opac-authoritiesdetail.pl when possible.

Only affects 'Normal view'. Does not affect XSLT display.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-18 17:39:41 +02:00
Jonathan Druart
82dc7b55a8 Bug 4321: clean C4::Biblio::GetBiblio and uses
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-18 12:11:54 +02:00
Fridolyn SOMERS
2ca3663687 Bug 8071: link between bib and authorities with the authid
Do not automatically populate $9 in bibliographic headings when the
$9 is set in the authorized heading field of the authority record.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-14 13:29:14 +02:00
Jonathan Druart
0f3f61e756 Bug 7986: Export issues for patron
In the circulation page, you can now export (as csv or iso2709) a list
of items which are currently checked out by a borrower.

3 export types:
- iso2709 with items: Export the items list in iso2709 format with item
  information.
- iso2709 without items: Export the items list in iso2709 format without
  item information.
- CSV: Export the items list based on a CSV profile.

2 new system preferences:
- DontExportFields: a list of fields not to be exported
- CsvProfileForExport: the CSV profile name used for the CSV export

Test plan:
- Fill the CsvProfileForExport syspref
- go to the circulation page of a borrower with checkouts
- Select one or more items and export them to the 3 different formats.
- check that the result file is what you expected

- Test that there is no regression with the authority export
- Test that there is no regression using tools/export.pl from the command
  line interface

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-13 17:58:30 +02:00
Paul Poulain
8449b14dcb Revert "Bug 8089: Use Koha::Cache for all caching"
This reverts commit 215abc8024.

The 3 patches for bug 8089 have been reverted, because they break
Jenkins & Koha.
A follow-up has been provided, but it does not solve the problem on my
test server, it just changes the error message.

After a discussion with Jared, Dobrica should work on another patch, so
the best option is to revert.
2012-09-12 14:12:41 +02:00
Jared Camins-Esakov
215abc8024 Bug 8089: Use Koha::Cache for all caching
1. Replace all instances of memoize_memcached with appropriate calls
into Koha::Cache:
* reports/guided_reports.pl
* C4::Biblio::GetMarcStructure
* C4::Languages::getFrameworkLanguages
* C4::Languages::getAllLanguages
* C4::SQLHelper::GetPrimaryKeys
* C4::SQLHelper::_get_columns

2. Replace all references to memcached with the appropriate calls into
Koha::Cache in C4::Context.
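
As an illustration of the memoize_memcached replacement pattern, a minimal sketch; the cache key, the 500-second expiry and the exact Koha::Cache method signatures are assumptions, not the committed code:

    use Koha::Cache;

    my $cache = Koha::Cache->new();                    # assumption: default constructor
    my $key   = "MarcStructure-$frameworkcode";        # illustrative cache key

    my $structure = $cache->get_from_cache($key);
    if ( !$structure ) {
        $structure = _build_marc_structure($frameworkcode);   # hypothetical expensive builder
        $cache->set_in_cache( $key, $structure, 500 );        # expiry in seconds (assumed)
    }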

Test plan:
* have the DEBUG env set to 1
* reach the addbiblio page to test the patch in Biblio.pm, or set up more than
  one language
* you should see in the logs that you're reading from and writing to the cache
* run the test suite twice both with and without the following environment
  variables set:
export MEMCACHED_SERVERS=127.0.0.1:11211
export MEMCACHED_NAMESPACE=KOHA
export CACHING_SYSTEM=memcached

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>

I'm unsure about some of the caching times; 10000 is a long, long time,
but other than that, it works fine.
2012-09-07 16:28:29 +02:00
Marc Veron
01a7188d6a BUG 7621 [ENH] Circulation: Match age restriction of title with borrower's age without using categories
Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>

New version implementing Paul's advice.
See Wiki http://wiki.koha-community.org/wiki/Age_restrictiotion

Signed-off-by: Jonathan Druart <jonathan.druart@biblibre.com>
fix updatedatabase.pl

New fix updatedatabase.pl to apply to current master by Marc Veron veron@veron.ch
...and fixed missing curly bracket after merging updatedatabase.pl

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-05 14:32:53 +02:00
Fridolyn SOMERS
58f6c9c8fb Bug 8586: Small bug in die if no mapping in framework for biblioitems.biblioitemnumber
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-09-03 15:20:26 +02:00
Julian Maurice
4dea59847a Bug 5600: Command line interface for tools/export.pl
export.pl [--format=format] [--date=date] [--dont_export_items]
  [--deleted_barcodes] [--clean] --filename=outputfile

    * format is either 'xml' or 'marc' (default)
    * date should be entered as the 'dateformat' syspref is set
      (dd/mm/yyyy for metric, yyyy-mm-dd for iso, mm/dd/yyyy for us)
    * records exported are the ones that have been modified since 'date'
    * if --deleted_barcodes is used, a list of barcodes of items deleted
      since 'date' is produced (or from all deleted items if no date is
      specified)
    * --clean removes NSE/NSB
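
For example, a run that exports all records modified since a given date as binary MARC, without item data (shown with the 'iso' dateformat; the output path is illustrative):

    export.pl --format=marc --date=2012-01-01 --dont_export_items \
        --filename=/tmp/biblios.mrc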

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-08-03 16:57:08 +02:00
Jared Camins-Esakov
c0714a99f2 Bug 6557: Record bib popularity in totalissues
Because updating the total issues count associated with a bibliographic
record on issue could cause a significant load on the server, this
commit adds the syspref UpdateTotalIssuesOnCirc (which defaults to OFF
to match existing behavior). The syspref has the following description:

  Do/Do not update a bibliographic record's total issues count whenever
  an item is issued (WARNING! This increases server load significantly;
  if performance is a concern, use the update_totalissues.pl cron job
  to update the total issues count).

Bug 6557: automatically increment totalissues

Adds the ability to automatically increment biblioitems.totalissues
whenever an item is issued.

To test:
1) Choose a record with at least one item that can circulate
2) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). Most likely there won't be any 942$0 at all
3) Enable UpdateTotalIssuesOnCirc
4) Check out the item you selected
5) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). That value should now be one greater than before
6) Discharge the item
7) Disable UpdateTotalIssuesOnCirc
8) Check out the item you selected again
9) Check the value of 942$0 (you may need to look at the plain MARC view
   on the OPAC). That value should not have changed

Bug 6557: add script to update totalissues from stats

NAME
       update_totalissues.pl

SYNOPSIS
         update_totalissues.pl --use-stats
         update_totalissues.pl --use-items
         update_totalissues.pl --commit=1000
         update_totalissues.pl --since='2012-01-01'
         update_totalissues.pl --interval=30d

DESCRIPTION
       This batch job populates bibliographic records' total issues count
       based on historical issue statistics.

       --help  Prints this help

       -v|--verbose
               Provide verbose log information (list every bib modified).

       --use-stats
               Use the data in the statistics table for populating total
               issues.

       --use-items
               Use items.issues data for populating total issues. Note that
               issues data from the items table does not respect the --since
               or --interval options, by definition. Also note that if both
               --use-stats and --use-items are specified, the count of biblios
               processed will be misleading.

       -s|--since=DATE
               Only process issues recorded in the statistics table since
               DATE.

       -i|--interval=S
               Only process issues recorded in the statistics table in the
               last N units of time. The interval should consist of a number
               with a one-letter unit suffix. The valid suffixes are h
               (hours), d (days), w (weeks), m (months), and y (years). The
               default unit is days.

       --incremental
               Add the number of issues found in the statistics table to the
               existing total issues count. Intended so that this script can
               be used as a cron job to update popularity information during
               low-usage periods. If neither --since or --interval are
               specified, incremental mode will default to processing the
               last twenty-four hours.

       --commit=N
               Commit the results to the database after every N records are
               processed.

       --test  Only test the popularity population script.

WARNING

If the time on your database server does not match the time on your Koha
server you will need to take that into account, and probably use the
--since argument instead of the --interval argument for incremental
updating.
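
A hedged sketch of what the --use-stats counting boils down to (a simplification: the real script goes through the Koha API so the 942$0 in the MARC record is kept in sync as well; table and column names are from a stock Koha schema):

    use C4::Context;

    my $dbh = C4::Context->dbh;

    # Count historical 'issue' rows per biblio and write them to biblioitems.totalissues.
    # --incremental would add to the existing count instead of replacing it.
    my $sth = $dbh->prepare(q{
        SELECT items.biblionumber, COUNT(*) AS issues
          FROM statistics
          JOIN items USING (itemnumber)
         WHERE statistics.type = 'issue'
      GROUP BY items.biblionumber
    });
    $sth->execute();
    while ( my ( $biblionumber, $issues ) = $sth->fetchrow_array ) {
        $dbh->do( 'UPDATE biblioitems SET totalissues = ? WHERE biblionumber = ?',
            undef, $issues, $biblionumber );
    }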

=== TESTING PLAN ===

NOTE: in order to test this script, you will need to have some sort of
circulation data already existing in your Koha installation.

1) Disable UpdateTotalIssuesOnCirc
2) Run: misc/cronjobs/update_totalissues.pl --use-items -t -v
3) If you have total checkout data in your item records (i.e. anything
   in 952$l), you should see messages like "Processing bib 43 (1 issues)"
4) Choose one of the lines that shows more than 0 issues, and view the
   record with that biblionumber in the staff client, choosing the "Items"
   tab (moredetail.pl). Add up the "Total checkouts" listed for each item,
   and confirm it matches what the script reported
5) Run: misc/cronjobs/update_totalissues.pl --use-stats -t -v
6) If you have any circulation statistics in your database (i.e. any
   'issue' entries in your statistics table), you should see messages
   like "Processing bib 43 (1 issues)";
7) Choose one of the lines and view the record with that biblionumber in
   the staff client, choosing the "Items" tab (moredetail.pl). If you
   count the number of checkouts listed in each item's checkout history,
   the total should match what the script reported.
8) Check out an item
9) Run: misc/cronjobs/update_totalissues.pl --use-stats
   --incremental --interval=1h -t -v
10) You should see one line reporting a single circ for the bib record
    associated with the item you just checked out (there may be more if
    you checked out any books in the hour prior to running these tests)
11) If the results in steps 4, 7, and 10 match the predictions, the
    script worked

This patch to Koha was sponsored by the Arcadia Public Library and the
Arcadia Public Library Foundation in honor of Jackie Faust-Moreno, late
director of the Arcadia Public Library.

Signed-off-by: Liz Rea <wizzyrea@gmail.com>
Tested this with my test data - numbers are correct and updated appropriately.

More importantly - if I do a popularity search, the most popular items *come up first*. Amazing.
2012-06-29 14:29:22 +02:00
Chris Cormack
509d673f10 Bug 7941 : Fix version numbers in modules
Signed-off-by: Kyle M Hall <kyle@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-06-11 17:29:38 +02:00
Matthias Meusburger
7f957077dd Bug 5337: EAN management : Adds ean for various searches
- in various acquisition pages and serials home
  - in database: biblioitems.ean
  - adds ean and its mapping in the default English bibliographic framework
  - adds the ean mapping in the default French bibliographic framework
  - ean search is not enabled for MARC21

The required mapping between the ean MARC field and the biblioitems.ean
database field will be automatically added on an existing UNIMARC installation.

However, if you already have records with an ean, you will have to
run misc/batchRebuildBiblioTables.pl to populate biblioitems.ean

Signed-off-by: jmbroust <jean-manuel.broust@univ-lyon2.fr>
Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
Passed QA at second run. Removed a merge marker only.
2012-06-09 18:26:34 +02:00
285f06e394 Bug 7112 - Having two prices in 020$c causes basket creation to fail from staged marc import
The root problem here is that the price is being pulled from the MARC record
and is then run through Number::Format::unformat_number. This routine is
really being misused, and should only be used to reverse the effects of
Number::Format on a number string. We are apparently using it to strip
out currency characters and the like.

Number::Format::unformat_number will choke if there is more than one period (.)
in the price field. MARC standards do not limit this field to a single period,
so unless there is only one period, we should skip number unformatting.
Examples that break unformat_number include '18.95 (U.S.)' and
'$5.99 ($7.75 CAN)', both of which are perfectly valid.

This commit adds the function MungeMarcPrice that will better handle
finding a real price value in a given price field. It does a very good
job at finding a price in any currency format, and attempts to find
a price in whichever currency is active before falling back to
the first valid price it can find.
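
A hedged sketch of the kind of extraction this implies (not the actual MungeMarcPrice code; it simply pulls the first decimal-looking token out of the price string):

    # '18.95 (U.S.)'      -> 18.95
    # '$5.99 ($7.75 CAN)' -> 5.99
    sub _first_price {                            # hypothetical helper, not MungeMarcPrice itself
        my ($pricefield) = @_;
        my ($price) = $pricefield =~ /(\d+(?:[.,]\d+)?)/;
        return defined $price ? $price : '0.00';  # same '0.00' fallback as described below
    }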

The variable $price may fail to have an actual price, in which case
the price then defaults to '0.00', which would rarely if ever be the
correct price. To combat this, I have added highlighting to any
price in the Order Details table that begins with 0 (i.e. '0.00').

Also, fixed the incomplete table footer, adding a new td with a
span of 3 to fill in the nonexistent cells.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-05-24 15:59:21 +02:00
377b2fd673 Bug 7722 - Insidious problem with searching
I cannot find the root cause of this issue, but multiple libraries that I am aware of have problems searching on particular search terms (and never the same terms at the same library). The error they get when they trigger this problem is:

Tag "" is not a valid tag. at /home/koha/kohaclone/C4/Biblio.pm line 1849

Something somewhere is adding empty keys to C4::Context->marcfromkohafield; I think it may have something to do with the analytics feature that was added.

In the while loop for TransformKohaToMarc, there is a line

next unless my $dtm = $db_to_marc->{''}->{$name};

I don't think it's working.
If I dump $dtm, for each search, I see the dump twice.
It looks like this:
$VAR1 = [
           '952',
           'w'
         ];
 $VAR1 = [];
I think the second time, when it is empty, is what's breaking this.
The 'next' never fails because even though the arrayref is empty, it is still a valid arrayref.

The solution I have come up with is to skip over the elements where the arrayref is empty.
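
In code, the guard amounts to one extra line in the loop (a sketch of the fix, not the exact committed hunk; $hash and $db_to_marc are the variables already used in TransformKohaToMarc):

    while ( my ( $name, $value ) = each %$hash ) {
        next unless my $dtm = $db_to_marc->{''}->{$name};   # existing guard: [] still passes
        next unless @$dtm;                                  # the fix: skip empty arrayrefs too
        my ( $tag, $letter ) = @$dtm;
        # ... build the MARC field from $tag/$letter and $value as before ...
    }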

Signed-off-by: Ian Walls <koha.sekjal@gmail.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-05-15 16:08:51 +02:00
Paul Poulain
0486d0c6b7 Merge remote-tracking branch 'origin/new/bug_6199' 2012-03-28 17:54:55 +02:00
Robin Sheat
b96c8b7ffa Bug 6199 - allow bulkmarcimport.pl to remove duplicate barcodes
This adds the -dedupbarcode option that allows bulkmarcimport to erase
the barcode but keep the item for any items it finds with duplicate
barcodes.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-28 17:30:54 +02:00
Juan Romay Sieira
020c095377 Bug 7263 - Determine maximum length of some fields or subfields when cataloguing a biblio or an item.
Signed-off-by: Henri-Damien LAURENT <henridamien.laurent@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-26 10:51:22 +02:00
Katrin Fischer
454386a849 Bug 7700: Cart's more details view shows identity numbers
We already remove $9 with Koha's authority number from output
of GetMarcSubjects and GetMarcAuthors.
Patch additionally removes $0 subfields with identity numbers.

Patch also affects detail pages with normal (non-XSLT) views.

Revised to always remove $0 subfields; they are not used in UNIMARC.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-22 05:18:49 +01:00
4d26038dd2 Bug 6027 - Delete biblios if deleting all their items in batch deletion
Optionally delete bibliographic record when batch deleting items, if no items remain on the record.

Adds deleting of reserves to DelBiblio. Since subscriptions are deleted automatically,
it made sense for deletion of reserves to maintain the same behavior.

Signed-off-by: Liz Rea <wizzyrea@gmail.com>
I like the way this works, and it does. Passes tests.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-21 14:31:38 +01:00
Katrin Fischer
0a9ba9d9df Bug 6831 follow-up: add support for UNIMARC and NORMARC
1) Removes unused subroutine get_host_control_num
2) Fixes small mistake, correct subfield for ISBN 020 is z
3) Checks system preference for correct marcflavour instead of
   assuming MARC21
4) Fixes MARC21 to not use author(), because it would also add
   fields like $w and $0 to 773$a
5) Fixes MARC21 to not use title(), but 245$a, because it would
   also add too many subfields.
6) Adds definitions for UNIMARC and includes NORMARC

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Works properly with all supported MARC flavours.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-03-21 11:30:37 +01:00
Colin Campbell
adb3777e2e Bug 6831: Add ability to enter adding child record from parent
Simplifies the adding of analytical records and ensures that
the data populating the 773 tag is correct. From the host record,
'add child record' is selected and 'create bib' is entered to generate
a new record with the host item tag populated from the parent.

Caveat: currently prepare_host_field only returns a field for
MARC21. Values for UNIMARC and NORMARC can easily be added but
should be done by someone familiar with those formats
and conventions.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
To test:
- create a new record
--> if you enter a value in 001, analytics will use that in $w for linking later
--> if you set 000/LDR 19 - Multipart resource record level to 'a' there will
be a link from the parent record to the child record later
- save your record and go to the staff detail page
- in toolbar select 'New' > 'New child record'
- check field 773; 245 and 001 from the parent record should have been copied there
- check links between child and parent in staff

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Fixed conflicts in all 3 files.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Works properly for MARC21, and follow-up adds support for NORMARC and UNIMARC.
2012-03-21 11:30:35 +01:00
Julian Maurice
3b0d4e04e0 Bug 6440: Implement OAI-PMH Sets
New SQL tables:
  - oai_sets: contains the list of sets, described by a spec and a name
  - oai_sets_descriptions: contains a list of descriptions for each set
  - oai_sets_mappings: conditions on MARC fields to match for a biblio to be
    in a set
  - oai_sets_biblios: list of biblionumbers for each set

New admin page: allows configuring sets:
  - Creation, deletion, modification of spec, name and descriptions
  - Define mappings which will be used for building oai sets

Implements OAI Sets in opac/oai.pl:
  - ListSets, ListIdentifiers, ListRecords, GetRecord

New script misc/migration_tools/build_oai_sets.pl:
  - Retrieve marcxml from all biblios and test if they belong to defined
    sets. The oai_sets_biblios table is then updated accordingly

New system preference OAI-PMH:AutoUpdateSets. If on, update sets
automatically when a biblio is created or updated.

Use OPACBaseURL in oai_dc xslt
2012-03-20 11:38:26 +01:00
Jared Camins-Esakov
5207699f98 signed off Bug 7284: Authority matching improvements
Squashed patch incorporating all previous patches (there is no functional
change compared to the previous version of this patch, this patch merely
squashes the original patch and follow-up, and rebases on latest master).

=== TL;DR VERSION ===
*** Installation ***
1. Run installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt1
and installer/data/mysql/atomicupdate/bug_7284_authority_linking_pt2
2. Make sure you copy the following files from kohaclone to koha-dev:
etc/zebradb/authorities/etc/bib1.att,
etc/zebradb/marc_defs/marc21/authorities/authority-koha-indexdefs.xml,
etc/zebradb/marc_defs/marc21/authorities/authority-zebra-indexdefs.xsl,
etc/zebradb/marc_defs/marc21/authorities/koha-indexdefs-to-zebra.xsl, and
etc/zebradb/marc_defs/unimarc/authorities/record.abs
3. Run misc/migration_tools/rebuild_zebra.pl -a -r

*** New sysprefs ***
* AutoCreateAuthorities
* CatalogModuleRelink
* LinkerModule
* LinkerOptions
* LinkerRelink
* LinkerKeepStale

*** Important notes ***
You must have rebuild_zebra processing the zebraqueue for bibs when testing this
patch.

=== DESCRIPTION ===

*** Cataloging module ***
* Added an additional box to the authority finder plugin for "Heading match,"
  which consults not just the main entry but also See-from and See-also-from
  headings.

* With this patch, the automatic authority linking will actually work properly
  in the cataloging module. As Owen pointed out while testing the patch,
  though, longtime users of Koha will not be expecting that. In keeping with
  the principles of least surprise and maximum configurability, a new syspref,
  CatalogModuleRelink makes it possible to disable authority relinking in the
  cataloging module only (i.e. leaving it enabled for future runs of
  link_bibs_to_authorities.pl).  Note that though the default behavior matches
  the current behavior of Koha, it does not match the intended behavior.
  Libraries that want the intended behavior rather than the current behavior
  will need to adjust the CatalogModuleRelink syspref.

*** misc/link_bibs_to_authorities.pl ***
Added the following options to the misc/link_bibs_to_authorities.pl script:
--auth-limit        Only process those headings that match the authorities
                    matching the user-specified WHERE clause.
--bib-limit         Only process those bib records that match the
                    user-specified WHERE clause.
--commit            Commit the results to the database after every N records
                    are processed.
--link-report       Display a report of all the headings that were processed.

Converted misc/link_bibs_to_authorities.pl to use POD.

Added a detailed report of headings that linked, did not link, and linked
in a "fuzzy" fashion (the exact semantics of fuzzy are up to the individual
linker modules) during the run.

*** C4::Linker ***
Implemented new C4::Linker functionality to make it possible to easily add
custom authority linker algorithms. Currently available linker options are:
* Default: retains the current behavior of only creating links when there is
  an exact match to one and only one authority record; if the 'broader_headings'
  option is enabled, it will try to link to headings to authority records for
  broader headings by removing subfields from the end of the heading (NOTE:
  test the results before enabling broader_headings in a production system
  because its usefulness is very much dependent on individual sites' authority
  files)
* First Match: based on Default, creates a link to the *first* authority
  record that matches a given heading, even if there is more than one
  authority record that matches
* Last Match: based on Default, creates a link to the *last* authority
  record that matches a given heading, even if there is more than one record
  that matches

The API for linker modules is very simple. All modules should implement the
following two functions:
<get_link ($field)> - return the authid for the authority that should be
linked to the provided MARC::Field object, and a boolean to indicate whether
the match is "fuzzy" (the semantics of "fuzzy" are up to the individual plugin).
In order to handle authority limits, get_link should always end with:
    return $self->SUPER::_handle_auth_limit($authid), $fuzzy;

<flip_heading ($field)> - return a MARC::Field object with the heading flipped
to the preferred form. At present this routine is not used, and can be a stub.
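
Putting the two required functions together, a skeleton custom linker might look like this (the module name and base class are illustrative assumptions; the SUPER call is the one mandated above):

    package C4::Linker::MyLinker;          # illustrative module name
    use strict;
    use warnings;
    use base qw(C4::Linker);               # assumption: common linker base class

    sub get_link {
        my ( $self, $field ) = @_;
        my ( $authid, $fuzzy ) = ( undef, 0 );
        # ... look up an authority record matching the MARC::Field in $field here ...
        return $self->SUPER::_handle_auth_limit($authid), $fuzzy;
    }

    sub flip_heading {
        my ( $self, $field ) = @_;
        return $field;                     # stub: not used at present
    }

    1;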

Made the linking functionality use the SearchAuthorities in C4::AuthoritiesMarc
rather than SimpleSearch in C4::Search. Once C4::Search has been refactored,
SearchAuthorities should be rewritten to simply call into C4::Search. However,
at this time C4::Search cannot handle authority searching. Also fixed numerous
performance issues in SearchAuthorities and the Linker script:
* Correctly destroy ZOOM recordsets in SearchAuthorities when finished. If left
  undestroyed, efficiency appears to approach O(log n^n)
* Add an optional $skipmetadata flag to SearchAuthorities that can be used to
  avoid additional calls into Zebra when all that is wanted are authority
  records and not statistics about their use

*** New sysprefs ***
* AutoCreateAuthorities - When this and BiblioAddsAuthorities are both turned
  on, automatically create authority records for headings that don't have
  any authority link when cataloging. When BiblioAddsAuthorities is on and
  AutoCreateAuthorities is turned off, do not automatically generate authority
  records, but allow the user to enter headings that don't match an existing
  authority. When BiblioAddsAuthorities is off, this has no effect.
* CatalogModuleRelink - when turned on, the automatic linker will relink
  headings when a record is saved in the cataloging module when LinkerRelink
  is turned on, even if the headings were manually linked to a different
  authority by the cataloger. When turned off (the default), the automatic
  linker will not relink any headings that have already been linked when a
  record is saved.
* LinkerModule - Chooses which linker module to use for matching headings
  (current options are as described above in the section on linker options:
  "Default," "FirstMatch," and "LastMatch")
* LinkerOptions - A pipe-separated list of options to set for the authority
  linker (at the moment, the only option available is "broader_headings," which
  is described below)
* LinkerRelink - When turned on, the linker will confirm the links for headings
  that have previously been linked to an authority record when it runs. When
  turned off, any heading with an existing link will be ignored.
* LinkerKeepStale - When turned on, the linker will never *delete* a link to an
  authority record, though, depending on the value of LinkerRelink, it may
  change the link.

*** Other changes ***
* Cleaned up authorities code by removing unused functions and adding
  unimplemented functions and added some unit tests.

* This patch also modifies the authority indexing to remove trailing punctuation
  from Match indexes.

* Replace the old BiblioAddAuthorities subroutines with calls into the new
  C4::Linker routines.

* Add a simple implementation for C4::Heading::UNIMARC. (With thanks to F.
  Demians, 2011.01.09) Correct C4::Heading::UNIMARC class loading. Create
  biblio tag to authority types data structure at initialization rather than
  querying DB.

* Ran perltidy on all changed code.

*** Linker Options ***
Enter "broader_headings" in LinkerOptions. With this option, the linker will
try to match the following heading as follows:
=600  10$aCamins-Esakov, Jared$xCoin collections$vCatalogs$vEarly works to
1800.

First: Camins-Esakov, Jared--Coin collections--Catalogs--Early works to 1800
Next: Camins-Esakov, Jared--Coin collections--Catalogs
Next: Camins-Esakov, Jared--Coin collections
Next: Camins-Esakov, Jared (matches! if a previous attempt had matched, it
would not have tried this)

This is probably relevant only to MARC21 and LCSH, but could potentially be of
great use to libraries that make heavy use of floating subdivisions.

=== TESTING PLAN ===

Note: all of these tests require that you have some authority records,
preferably for headings that actually appear in your bibliographic data. At
least one authority record must contain a "see from" reference (remember which
one contains this, as you'll need it for some of the tests). The number shown
in the "Used in" column in the authority module is populated using Zebra
searches of the bibliographic database, so you *must* have
rebuild_zebra.pl -b -z [-x] running in cron, or manually run it after running
the linker.

*** Testing the Heading match in the cataloging plugin ***
1.  Create a new record, and open the cataloging plugin for an
    authority-controlled field.
2.  Search for an authority by entering the "see from" term in the Heading Match
    box
3.  Confirm that the appropriate heading shows up
4.  Search for an authority by entering the preferred heading into the Main
    entry or Main entry ($a only) box (i.e., repeat the procedure you usually
    use for cataloging, whatever that may be)
5.  Confirm that the appropriate heading shows up

*** Testing the cataloging interface ***
6.  Turn off BiblioAddsAuthorities
7.  Confirm that you cannot enter text directly in an authority-controlled field
8.  Confirm that if you search for a heading using the authority control plugin
    the heading is inserted (note, however, that this patch does not AND IS NOT
    INTENDED TO fix the bugs in the authority plugin with duplicate subfields;
    those are wholly out of scope- this check is for regressions)
9.  Turn on BiblioAddsAuthorities and AutoCreateAuthorities
10. Confirm that you can enter text directly into an authority-controlled field,
    and if you enter a heading that doesn't currently have an authority record,
    an authority record stub is automatically created, and the heading you
    entered linked
11. Confirm that if you enter a heading with only a subfield $a that fully
    *matches* an existing heading (i.e. the existing heading has only
    subfield $a populated), the authid for that heading is inserted into
    subfield $9
12. Confirm that if you enter a heading with multiple subfields that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
13. Turn on BiblioAddsAuthorities and turn off AutoCreateAuthorities
14. Confirm that you can enter text directly into an authority-controlled field,
    and if you enter a heading that doesn't currently have an authority record,
    an authority record stub is *not* created
15. Confirm that if you enter a heading with only a subfield $a that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
16. Confirm that if you enter a heading with multiple subfields that *matches*
    an existing heading, the authid for that heading is inserted into
    subfield $9
17. Create a record and link an authority record to an authorized field using
    the authority plugin.
18. Save the record. Ensure that the heading is linked to the appropriate
    authority.
19. Open the record. Change the heading manually to something else, leaving
    the link. Save the record.
20. Ensure that the heading remains linked to that same authority.
21. Change CatalogModuleRelink to "on."
22. Open the record. Use the authority plugin to link that heading to the
    same authority record you did earlier.
23. Save the record. Ensure that the heading is linked to the appropriate
    authority.
24. Open the record. Change the heading manually to something else, leaving
    the link. Save the record.
25. Ensure that the heading is no longer linked to the old authority record.

*** Testing link_bibs_to_authorities.pl ***
26. Set LinkerModule to "Default," turn on LinkerRelink and
    BiblioAddsAuthorities, and turn AutoCreateAuthorities and
    LinkerKeepStale off
27. Edit one bib record so that an authority controlled field that has already
    been linked (i.e. has data in $9) has a heading that does not match any
    authority record in your database
28. Run misc/link_bibs_to_authorities.pl --link-report --verbose --test (you may
    want to pipe the output into less or a file, as the result is quite a lot of
    information)
29. Look over the report to see if the headings that you have authority records
    for report being matched, that the heading you modified in step 2 is
    reported as "unlinked," and confirm that no changes were actually made to
    the database (to check this, look at the bib record you edited earlier, and
    check that the authid in the field you edited hasn't changed)
30. Run misc/link_bibs_to_authorities.pl --link-report --verbose (you may want
    to pipe the output into less or a file, as the result is quite a lot of
    information)
31. Check that the heading you modified has been unlinked
32. Change the modified heading back to whatever it was, but don't use the
    authority control plugin to populate $9
33. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
34. Confirm that the heading has been linked to the correct authority record
35. Turn LinkerKeepStale on
36. Change that heading to something else
37. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
38. Confirm that the $9 has not changed
39. Turn LinkerKeepStale off
40. Create two authorities with the same heading
41. Run misc/migration_tools/rebuild_zebra.pl -a -z
42. Enter that heading into the bibliographic record you are working with
43. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
44. Confirm that the heading has not been linked
45. Change LinkerModule to "FirstMatch"
46. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
47. Confirm that the heading has been linked to the first authority record it
    matches
48. Change LinkerModule to "LastMatch"
49. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --bib-limit="biblionumber=${BIB}" (replacing ${BIB} with the biblionumber
    of the record you've been editing)
50. Confirm that the heading has been linked to the second authority record it
    matches
51. Run misc/link_bibs_to_authorities.pl --link-report --verbose
    --auth-limit="authid=${AUTH}" (replacing ${AUTH} with an authid)
52. Confirm that only that heading is displayed in the report, and only those
    bibs with that heading have been changed

If all those things worked, good news! You're ready to sign off on the patch
for bug 7284.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master and squashed follow-up, 16 February 2012
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Rebased on latest master, 21 February 2012

Signed-off-by: schuster <dschust1@gmail.com>
2012-03-07 17:34:11 +01:00
Katrin Fischer
a529262501 Bug 7576: Add ISSN to SearchForTitleIn preference
Adds a new placeholder {ISSN} to the system preference SearchForTitleIn.
For a record with multiple ISSNs only the first ISSN will be used.

Addition: Makes a small change to GetMarcControlnumber so that it checks for
NORMARC too. If you set your system preference to NORMARC, it should output
{CONTROLNUMBER} correctly now.

For testing, add the following code to the system preference and check the output
of SearchForTitleIn for different records in your OPAC and all 3 available
views (normal, MARC and ISBD):
<li>ISSN: {ISSN}</li>
<li>ISBN: {ISBN}</li>
<li>001: {CONTROLNUMBER}</li>

Patch also includes some unit tests:
perl t/db_dependent/Biblio.t

Signed-off-by: Magnus Enger <magnus@enger.priv.no>
Tested with marcflavour = NORMARC, on one book and one periodical record.

* Book

- Before the patch:
ISSN: {ISSN}
ISBN: 0375726446
001:

- After the patch:
ISSN:
ISBN: 0375726446
001: 022976914

* Journal

- Before the patch:
ISSN: {ISSN}
ISBN:
001:

- After the patch:
ISSN: 1890-6931
ISBN:
001: 080721370

Looks good in all 3 views! Thanks for fixing the 001 thing for NORMARC!

Also tested with marcflavour = MARC21, on the same records with the same good
results. Signing off!

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Tested marcflavour= UNIMARC, works fine too
2012-02-27 11:44:20 +01:00
Paul Poulain
9cc9db0753 Bug 6875 follow-up for Items/Biblio de-nesting
The sub _find_value is used only in the PrepareItemRecord sub, which has been moved to the Items package.

This patch moves _find_value into Items as well.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Editing an already arrived serial issue with an attached item
resulted in an error. After applying the patch it's fixed.
2012-02-27 11:13:02 +01:00
52afe06ddd Bug 6193 - Follow up: use SetEnv and remove memcached from koha-conf.xml
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Confirmed that memcached is still being used after the memcached configuration
in koha-conf.xml was removed, and the following two lines were added to
both virtual hosts in koha-httpd.conf:
SetEnv MEMCACHED_SERVERS "127.0.0.1:11211"
SetEnv MEMCACHED_NAMESPACE "KOHA"
2012-02-20 23:24:02 +01:00
Paul Poulain
f7a525fa09 Merge remote-tracking branch 'origin/new/bug_6875' 2012-02-20 16:45:42 +01:00
Paul Poulain
49b167e848 Bug 6875 de-nesting C4::Biblio
C4::Biblio is used in many, many places. The goal of this cleaning is to make C4::Biblio a package with as few dependencies as possible.

* C4::Heading is called in only 1 place and is rarely used (only in misc/link_bibs_to_authorities.pl), so it is moved to a require
* PrepareItemrecordDisplay is a sub that is more related to Items, so it is moved there. This means some scripts that used this sub must be checked for 'use C4::Items'
* C4::Items is needed only in EmbedItemsInMarcBiblio, so it is loaded only in this sub, switching to a require
* 2 subs, z3950_extended_services and set_service_options, are totally useless; they are removed

Signed-off-by: Marcel de Rooy <m.de.rooy@rijksmuseum.nl>
No test plan included, but tested some opac and cataloging functions.
Code looks good. Marked as Passed QA.
2012-02-20 16:35:17 +01:00
795dc61f75 Bug 3264 UnCloneField() / minus button in MARC editor can clear all subfields (authorities AND biblio)
All subfields following the removed subfield were not saved.
The problem is in the C4::Biblio routine TransformHtmlToMarc.
If the field is emptied, the param list contains a code param but no subfield
param. The while loop handling the subfields could not handle that. Also added
a FIXME because the whole routine depends on an assumption about the order of
CGI parameters that is not strictly guaranteed.

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
I was unable to replicate the problem, but can confirm that the patch does
not break anything under any of the following platforms/browsers:
Mac OS X 10.6.8:
Chrome 16.0.912.77
Firefox 9.0.1

Windows 7:
Firefox 3.6.3
Firefox 9.0.1
IE 8.0.7600.16385

Ubuntu 11.10
Firefox 8.0
Chromium 15.0.874.106 (Developer Build 107270 Linux)

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-02-16 18:52:46 +01:00
Chris Cormack
70237c49ef Bug 7432 : Fix how we are setting expiry time when caching
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
2012-02-15 10:26:22 +01:00
Paul Poulain
1637af349e Bug 6210 follow-up, removing warn
See http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=6210#c27
and http://bugs.koha-community.org/bugzilla3/show_bug.cgi?id=6210#c28
2012-02-02 15:35:26 +01:00
Srdjan Jankovic
91d870f67a bug_6210: Select framework if merging two records with different frameworks
ModBiblio() - set framework to "" if "Default"

Signed-off-by: Nicole C. Engard <nengard@bywatersolutions.com>

All 4 tests passed:

Test 1:  Merge two records with the same framework
    Desired result:  shouldn't get any prompting to pick a framework, and the
same framework should be used

    Test 2:  2 records, different frameworks, into the kept record's framework
    Desired result:  merge with kept records framework used

    Test 3:  2 records, different frameworks, into the discarded record's
framework
    Desired result:  merge with used records framework used

    Test 4:  2 records, different frameworks, into a third framework
    Desired result:  merge with third framework used
2012-02-01 17:40:43 +01:00
Paul Poulain
b70e4f3d6d BZ4376 Minor change in GetMarcAuthor
A minor change in the GetMarcAuthors function of C4/Biblio.pm allows differentiating the type of authors in the templates.

This change allows doing things like this in the templates:
<TMPL_IF EXPR="tag == 700" && code eq 'a' >
 <strong>Author:</strong>
<!-- TMPL_ELSE -->
 <TMPL_IF EXPR="tag == 710" && code eq 'a' >
  <strong>Corporation Author:</strong>
<!-- /TMPL_IF -->
<!-- /TMPL_IF -->

(html template syntax, but also applicable to template toolkit)

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Works as claimed and doesn't break existing functionality.
2012-01-18 12:07:12 +01:00
Paul Poulain
6456afea95 Bug 6990 follow up PODDOC & unit test updating
This patch removes the TransformKohaToMarcOneField sub, but there was some remaining POD & unit test about it
2012-01-17 12:07:03 +01:00
02d392a502 TransformKohaToMarc enhancement
The TransformKohaToMarc function is called for each biblio and item that has
to be built. This function executes a DB statement for each Koha field
that has to be mapped to a MARC tag/letter. This deeply impacts
performance for scripts like rebuild_zebra, especially since items are
no longer in biblio records and have to be rebuilt on the fly.

I'm proposing a patch which reads the Koha field to MARC field mapping just
once and caches it. My tests show a 30% execution time improvement on the
rebuild_zebra.pl script. It uses the mapping already cached in C4::Context.
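
A hedged sketch of the pattern: rather than one SELECT against marc_subfield_structure per mapped Koha field, the whole mapping is read once from the cache C4::Context already holds (the field name used for the lookup is illustrative):

    # Before (per field): a DB statement for every Koha column to map, e.g.
    #   SELECT tagfield, tagsubfield FROM marc_subfield_structure
    #    WHERE frameworkcode = '' AND kohafield = ?
    #
    # After: one lookup in the mapping cached by C4::Context
    my $db_to_marc = C4::Context->marcfromkohafield;
    my ( $tag, $letter ) = @{ $db_to_marc->{''}{'biblio.title'} || [] };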

Signed-off-by: Chris Cormack <chris@bigballofwax.co.nz>

http://bugs.koha-community.org/show_bug.cgi?id=6990
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2012-01-17 12:07:02 +01:00
Chris Hall
48f25fec43 bug 7239 fix to avoid error being thrown even though a record is created
Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
When creating an item in acquisitions while ordering and not filling out
any fields, a Perl error message is no longer shown.

Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-12-14 17:37:41 +01:00
Ian Walls
f56e6c0a58 Bug 7073: GetCOinSBiblio should take $record object, not biblionumber
This patch changes the GetCOinSBiblio subroutine to take a MARC record object
(as returned from GetMarcBiblio) instead of a biblionumber.  The first thing the subroutine
did was GetMarcBiblio, and the $biblionumber passed was never used again.

This subroutine was only used in 3 places: opac/opac-search.pl, opac/opac-detail.pl,
and C4/VirtualShelves/Page.pm.  In the first and last cases, it was used in a loop.
In the last two cases, a call to GetMarcBiblio had already been done.  This is expensive, and
we were doing it twice per record.

For opac/opac-search.pl, the call to GetMarcBiblio was moved to just outside GetCOinSBiblio;
this will not change the performance at all.  But for opac/opac-detail.pl and C4/VirtualShelves/Page.pm,
a redundant call to GetMarcBiblio is now avoided.
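
In caller terms the change reads like this (a sketch; the sub names are those discussed in this commit):

    # Before: GetCOinSBiblio fetched the record itself from the biblionumber
    # my $coins = GetCOinSBiblio($biblionumber);

    # After: reuse the MARC::Record object the caller already has
    my $record = GetMarcBiblio($biblionumber);
    my $coins  = GetCOinSBiblio($record);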

To Test:
1. Enable COinSinOPACResults in system preferences.  Perform a search in the OPAC.
   Verify that the COinS spans are showing up
2. View the detail record of one of the returned items.  Confirm that the COinS span exists on the detail page.
3. View a list in the OPAC.  Confirm that COinS spans are still showing up

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-11-24 11:44:02 +01:00
Joy Nelson
1d0192b85c Bug 7221: C4/Biblio.pm documentation is incorrect.
$dbh is not a valid first parameter.  Removed $dbh from line 392 to show correct syntax.

Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
2011-11-17 16:47:28 +01:00
7ee0565f5c 7146 (Update timestamps when deleting a biblio)
Currently, when you delete an item, the timestamp column in deleteditems is
updated with current time. (This comes from an [unintentional] additional
update statement in DelItem.) It makes deletion time visible.
In the past, the marcxml was updated too at that moment, resulting in an
updated timestamp in biblioitems too. The timestamp in biblio was not touched.

If you delete a biblio however, the timestamps in deletedbiblio and
deletedbiblioitems do not reflect time of deletion. They still show the time of
last update before the record was deleted. This last update can be extracted
from MARC field 005 too.

This behavior is neither consistent nor logical. I would suggest adding a statement
in DelBiblio to force updating the timestamp in deletedbiblio(items) too. It
makes the time of deletion visible in the record too. The time of deletion of a
biblio can be very useful for e.g. synchronizing purposes.
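
The suggested statement boils down to something like this inside DelBiblio (a sketch of the idea, not a committed hunk; $dbh and $biblionumber are the ones DelBiblio already has):

    # After moving the rows to the deleted* tables, stamp the deletion time explicitly
    $dbh->do( 'UPDATE deletedbiblio      SET timestamp = NOW() WHERE biblionumber = ?',
        undef, $biblionumber );
    $dbh->do( 'UPDATE deletedbiblioitems SET timestamp = NOW() WHERE biblionumber = ?',
        undef, $biblionumber );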

Signed-off-by: Julian Maurice <julian.maurice@biblibre.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
2011-11-05 05:50:02 +01:00
Christophe Croullebois
4d67e69107 Bug 5680: Order cancelling improvement: delete attached items & biblio if available
- all items attached to the order are deleted
- if there are no more items, and if the biblio is not in other orders and has no subscriptions and no holds, then the biblio is proposed for deletion
Now we have 2 links: "delete order" and "delete order and catalog record"; the second one appears only if the deletion is possible.
Note that if a hold is related to the item, or if the item is unique for the biblio, the link "Delete order" is cancelled due to the remaining hold.
On mouse over, explanations are shown with counts.
More warning lines with counts are shown depending on the case.

Signed-off-by: Katrin Fischer <Katrin.Fischer.83@web.de>
Configuration:
AcqCreateItem = on order

Test cases and results:

1) Order new record with 2 items
a) From basket
- delete order: only deletes items, OK!
- delete order and catalog record: deletes record and items, OK
b) From shipment/receive
- delete order: only deletes items, OK!

2) Order 1 additional item for existing record with 1 item
a) From basket:
- delete order: works, existing item and record remain, OK
- Can't delete order and catalog record, 1 item left, OK!

3) Order new record with 1 item, title level hold on record
a) From basket:
- delete order: not possible, OK!
- delete orer and catalog record: not possible, OK!
b) From shipment/receive page
- Cancel: Deletes order, record and hold silently.
NO WARNING. NOT OK. See note below.

4) Order 1 additional item for existing record with 1 item,
item level hold on existing item
a) From basket:
- delete order: works, hold and existing item remain, OK!
- delete order and catalog record: not possible, OK!
b) From shipment/receive page
- Cancel: on order item is deleted, other item and hold remain.

5) Order new serial record, create subscription
a) From basket:
- delete order: works, record and subscription remain, OK!
- delete order and catalog record: not possible, OK!
b) From shipment/receive page:
- Cancel: Subscription and record are silently deleted. NOT OK.

6) Order additional item for existing record with other on order items
a) From basket:
- delete order: works, existing on order items remain, OK!
- delete order and catalog record: not possible, OK!
b) From shipment:
- Cancel: deletes order and ordered item. OK.

Changes made:
I changed the wording of the error messages a bit in the template.
I changed the message 'Can't delete order and catalog record' to not be
shown as a link, as the link does nothing. Tooltip still appears.
I attached a screenshot to the bug showing some of my changes.

Hope that's ok.

Necessary enhancements:
Cancelling orders when receiving items should work the same as from the
basket summary page. We need the same checks and messages there before
deleting records and items automatically.
I am signing off on this, but to go into Koha it needs a follow-up for the
order receive page.
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-10-19 16:58:46 +13:00
Ian Walls
4e95e94727 Bug 6789: biblios with many items can result in broken search results link
This patch fixes an issue whereby biblios with many items (often > 500) would index,
but not the biblionumber itself, resulting in search results with a) inaccurate item counts
and b) no biblionumber to use in the link to the details page.  This is due to Net::Z3950::ZOOM  not providing
a mechanism for specifying different connection attributes; the maximumRecordSize ZOOM connection attribute,
if not specified, defaults to 1MB, which is less than the size of a MARC record with many, many 952 fields.  Since
it is unlikely we can fix Net::Z3950::ZOOM in a timely fashion, this patch aims to build a workaround on the Koha end.

This patch changes EmbedItemsInMarcBiblio to use append_fields instead of insert_ordered_fields,
so the 999$c will come before the item records.  It's VERY unlikely we will encounter more than 1MB of biblio-level MARC
content, as this would break the ISO-2709 standard by a large factor.

To this end, it also moves the fix_biblio_ids portion of get_corrected_marc_record out of rebuild_zebra.pl,
and makes it a part of GetMarcBiblio (right before EmbedItemsInMarcBiblio, so the 952s still come last).  fix_biblio_ids
is kept as a subroutine for the deletion portion of rebuild_zebra.pl, which still uses it.

It also uses the subroutine parameter in GetMarcBiblio to do the EmbedItemsInMarcBiblio action, rather than having
rebuild_zebra.pl perform it on the itemless record returned from GetMarcBiblio.  Simpler and cleaner that way.
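
In rebuild_zebra.pl terms, the simplification reads roughly like this (a sketch; the second argument is the embed-items flag mentioned above):

    # Before: rebuild_zebra.pl fetched the itemless record, then embedded items itself:
    #   my $record = GetMarcBiblio($biblionumber);
    #   EmbedItemsInMarcBiblio( $record, $biblionumber );

    # After: GetMarcBiblio embeds the items (and fixes the 999 ids) when asked to
    my $record = GetMarcBiblio( $biblionumber, 1 );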

To verify the bug:
1. Find a biblio with over 700 items (or enough that the resulting MARCXML is greater than 1MB)
2. search for this biblio (in a search that would return multiple results, not just this title).  You should get the title in
the results list
3. attempt to click the link to this biblio's details page; the biblionumber should be blank, leading to a 404

To test solution:
1. Apply patch
2. modify the biblio slightly (click the 005 for example) and save
   OR manually add the biblio to zebraqueue for reindexing
3. after rebuild_zebra.pl -z -b -x runs, use the same search as above. The title should still appear.
4. click the link, and find yourself on the biblio detail page as desired

Signed-off-by: D Ruth Bavousett <ruth@bywatersolutions.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-10-15 13:47:24 +13:00
7621591ae6 Bug 6912 Test 008 presence in get CoinS function
Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Paul Poulain <paul.poulain@biblibre.com>
Signed-off by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-10-13 15:57:18 +13:00
Jared Camins-Esakov
f09e2ca27e Bug 5528: Analytic records support
Display links to parent biblios, show linked items in holdings, allow holds on
linked items. This uses MARC to maintain relationships.

Sponsored by the Mississippi Department of Archives and History and RapidRadio
Solution. Originally developed by Savitra Sirohi and Amit Gupta at OSSLabs, with
UNIMARC support added by Zeno Tajoli. Commits squashed and merge conflicts
resolved by Chris Cormack from Catalyst. Respect for NORMARC and some small
framework portability fixes made by Jared Camins-Esakov of C & P Bibliography
Services.

IMPORTANT NOTE: A bug in the 773 coding for MARC21 was corrected from the
original OSS Labs code. The 773s generated by the pre-release code did not have
the first indicator set to '0', which means that they were not supposed to
display. Going forward, the first indicator will be set correctly, but existing
records created with this code will no longer appear (they appeared before only
due to another bug). To correct this, you could globally (or, to make sure you
only modify records created with the Analytics tool, for records with 773$0)
change the first indicator of the 773 from blank to '0'.

== Background ==
An analytic record for an item is a more detailed, monographic biblio for an
item attached to a serial record.  This is often used for special issues of a
journal that are released as books on their own (assigned an ISBN, as well as an
ISSN/volume/issue).  It is important for researchers to be able to search for
these items both as issues of the serial, and as monographs.  It is equally
important for the library to not have duplicate item records for the item in
question to have to keep synchronized.

== Establishing relationships ==
Analytical records are connected to items belonging to parent or host
bibliographic records. This can be accomplished by:
* From an analytical bibliographic record linking to an host item by providing
  the item barcode as input
* From a host item by using the "analyze" option; this creates a new empty
  bibliographic record with field 773 (MARC21) populated
* Running a new CLI script that establishes a relationship between the
  analytical record and the host item identified by the barcode in the
  analytical record's 773$o (MARC21)

== Connecting Records ==
The relationships are maintained in the MARC records, we have not used database
tables at all.

== MARC Representation ==
In MARC21/NORMARC we have used:
* 773$9 to store the Koha item number of the host item
* 773$0 to store the Koha biblio number of the host bibliographic record

The above fields are used to display the relationships in various screens in the
OPAC and the staff interface. Additionally, when populating field 773 with the host
item's details, we have used the following MARC21 mapping:
* 'a' <= 100/110/111 $a (author main)
* 'b' <= 250$a (edition)
* 'd' <= 260$a, 260$b, 260$c (place, publisher, year)
* 'o' <= barcode
* 't' <= 245$a (title)
* 'w' <= (003)001 --> if no 001 is available, we can populate biblionumber
* 'x' <= 022$a (issn)
* 'z' <= 020$a (isbn)
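
As a concrete illustration of that mapping, building the host-item field for a MARC21 analytic could look like this (a sketch using MARC::Record; all values are placeholders):

    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new();        # the analytical record being built (placeholder)
    $record->append_fields(
        MARC::Field->new(
            '773', '0', ' ',                 # first indicator '0' so the link displays
            a => 'Smith, John',              # 100/110/111 $a (author main)
            t => 'Journal of Examples',      # 245$a (title)
            d => 'Oslo : Example Press, 2011',  # 260 $a $b $c
            o => '39999000001234',           # host item barcode
            w => '(KOHA)12345',              # (003)001, or the biblionumber if no 001
            x => '1234-5678',                # 022$a (ISSN)
            z => '9780000000000',            # 020$a (ISBN)
            0 => '12345',                    # 773$0: Koha biblionumber of the host record
            9 => '678',                      # 773$9: Koha itemnumber of the host item
        )
    );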

In UNIMARC, this code uses:
* 461$9 to store the Koha item number of the host item
* 461$0 to store the Koha biblio number of the host bibliographic record

When populating field 461 in UNIMARC, the following mapping is used:
* 't' <= 200$a (title)

== Treatment of Holds ==
A key requirement was to allow holds to be placed on host items from the
analytical record. We have accomplished this by allowing holds on specific
copies only. Biblio level holds are not allowed. This ensures that holds are
placed on specific items that are relevant to the analytical record.

== Deleting host items with linked analytical records ==
As we have not used database tables to maintain relationships, we had to use
search to find out if any linked analytical records are present. If 1 or more
analytical records are present, we do not allow deletion of items. This is similar to
what we see when we try to delete authority records.

== Importing analytical records ==
Analytical records can be imported using bulkmarcimport or the GUI tools. The
new CLI script can be executed after the import to establish relationships with
host items. The script will establish relationships using the host item's
barcode, the barcode must be present in 773$o of the analytical record.

== What if there are two or more copies of the host item? ==
The current design will require that there be two host (773) fields, one for
each copy.

== What if there is no barcode available for the host item? ==
It is still possible to establish a relationship, by populating 773$9 with the
host's item number. However the CLI script uses barcode in 773$o to establish
relationships, so it won't work where barcodes are unavailable. Also, from an
analytical record it is possible to establish a relationship to a host item by
providing the barcode as input; this option will not be available either.

Commits that added the following features were squashed by Chris Cormack (this
is not a list of every commit):
* Display links to host records from biblio detail screens
* Support for UNIMARC, respecting the system preference 'marcflavor'
* Support holds from the OPAC
* Ability to link to items belonging to host records from an analytical record
* Display items belonging to host records in the moredetail page
* Ability to edit items belonging to host records, also ability to delink from
  them
* Move get host items code into a C4 routine, also calling the new routine in
  related perl scripts
* Move host field population to a C4 routine, all changes in pl files to call
  new routine
* Allow only specific copy holds for analytical records plus changes to use new
  C4 routines
* Support for holds on items linked via host records
* Storing bibnumber and itemnumber in subfields 0 and 9, plus other mapping
  changes
* New command line script that establishes relationships between analytical
  records and host items and bibs. The script looks for host field (MARC21 773)
  in records, and based on barcode in subfield 'o' populates host bibnumber in
  subfield '0' and host itemnumber in subfield '9'. The script can be run after
  an import of analytical records; it can also be run from the crontab to maintain
  the relationships
* Ability to create analytical records from items, to view linked analytics, and
  prevent deletion of items that have linked analytics
* New template for catalogue/detail.pl (NOTE: not a new template file, just a
  new way of displaying analytics), template displays linked analytics and
  allows creation of analytical records
* New zebra index for item number in host fields. This index will be used to
  display links to analytical records from host records
* Display title of host record instead of the phrase host record
* Using detail.tmpl for analytics tab instead of a new template file
* Improved qualification info preparation in Prephostmarcfield
* Check for linked analytics before deleting item
* Display link to host record and more meaningful anchor text for edit item link
* Analytical record: Unimarc index in record.abs and help in
  create_analytical_rel.pl
* Adding a sys pref that controls display of options to create analytical
  relationships
* Add host entry in XSLT stylesheet in staff item detail
* Added host record support to OPAC detail XSLT
* Adding 773$0 and 773$9 to all frameworks
* Adding 773 subfields 0 and 9 to default marc framework via updatedatabase.pl
* Display create analytics and used in links in catalog detail
* Fixed problem where analytical records not showing in OPAC search results
  because GetMarcBiblio now needs a flag to add item records
* Fixed problem where analytics count was set to 1 for all records, not just
  those with analytics
* Fixed catalogue detail page not to show analytics counts if count is 0

Conflicts:
	installer/data/mysql/updatedatabase.pl
	koha-tmpl/intranet-tmpl/prog/en/modules/cataloguing/addbiblio.tt
	kohaversion.pl

Co-author: Savitra Sirohi <savitra.sirohi@osslabs.biz>
Co-author: Zeno Tajoli <tajoli@cilea.it>

Signed-off-by: Jared Camins-Esakov <jcamins@cpbibliography.com>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-10-13 10:03:39 +13:00
Srdjan Jankovic
c453a45f38 bug_6576: Submit when changing framework rather than reloading
TransformHtmlToMarc(): changed interface - no point passing params when
they can be accessed from $cgi

Signed-off-by: Liz Rea <lrea@nekls.org>
Signed-off-by: Ian Walls <ian.walls@bywatersolutions.com>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-09-22 09:05:44 +12:00
Stéphane Delaune
97964bc3e9 Bug 6135: insert fields ordered in C4::Biblio::ModBiblioMarc
Fixing the order of subfields for biblionumber and biblioitemnumber

BibLibre MT5951

Signed-off-by: Frédéric Demians <f.demians@tamil.fr>
Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-08-01 15:23:37 +12:00
Janusz Kaczmarek
f16abec68a Bug 6480 - Koha produces a lot of apache logs for UNIMARC
For Koha with UNIMARC, a lot of Apache log entries are produced.

In this patch, corrections to the GetCOinSBiblio function have been introduced
in the UNIMARC section (i.e. || '' at the end of lines that can create this
problem), analogous to what is done in the MARC21 section.

Signed-off-by: Chris Cormack <chrisc@catalyst.net.nz>
2011-07-16 20:25:31 +12:00