connection object by doing:
my $Zconn = C4::Context->Zconn;
My initial tests indicate that as soon as your function ends
(i.e., when you're done doing something) the connection will be
closed automatically. There may be some other way to make the
connection more stateful, I'm not sure...
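As a rough sketch, assuming the returned object is a ZOOM::Connection (the query below is purely illustrative):
use C4::Context;
my $Zconn = C4::Context->Zconn;
my $rs = $Zconn->search_pqf('@attr 1=4 "history"');   # PQF title search
printf "found %d records\n", $rs->size();
print $rs->record(0)->render() if $rs->size() > 0;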
install search-test.pl on your opac (or the intranet; if on the intranet, you'll need to put the tmpl file in the intranet too)
NOT FOR PRODUCTION, purely for testing
vary between different uses of the same authorised subject heading, causing
linked subject searches from the detail view to fail. Other presentation fixes
within getMARCsubjects.
Replacing zebraserver and zebraport with zebradb in koha.conf. The Zebra connection can now be described in a single variable, "server:port/database". I used this in the dirty searchMarc.pm as well as in Biblio.pm. I've replaced your code in Search.pm
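A hedged sketch of splitting the combined setting back into its parts (the variable names are illustrative):
my $zebradb = C4::Context->config("zebradb");   # e.g. "localhost:2100/kohadb"
my ($server, $port, $database) = $zebradb =~ m{^(.+):(\d+)/(.+)$};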
It just does a simple CQL search at the moment, and takes a hashref keyed by variable.
I have introduced 2 new variables to koha.conf:
zebraserver and zebraport. I'll add these to the installer to get them set.
Very very very much a work in progress still. Thanks to paul for getting things up to this point.
Seems not to break too many things, but I'm probably wrong here.
at least, new features/bugfixes from 2.2.5 are here (some features tested on my local HEAD copy)
- removing useless directories (koha-html and koha-plucene)
some explanations:
- updater/updatedatabase => will convert all tables to InnoDB (not related to utf8, just to warn you) AND collate them in utf8 / utf8_general_ci. The SQL command is: ALTER TABLE tablename DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci.
- *-top.inc will show the pages in utf8
- THE HARD THING: for me, mysql-client and mysql-server were set up to communicate in iso8859-1, whatever the mysql collation! Thus, pages were shown improperly, as data was transmitted in iso8859-1 format! After a full day of investigation, someone on usenet pointed me to "set NAMES 'utf8'" as the way to say that I wanted utf8. I could put this in my.cnf, but if I did that, ALL databases would "speak" utf8, and that's not what we want. Thus, I added a line in Context.pm: every time a DB handle is opened, the communication is set to utf8 (see the sketch after this list).
- using the marcxml field instead of the raw iso2709 marc in biblioitems.marc.
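A minimal sketch of the Context.pm change mentioned above (the DBI connect details are illustrative):
use DBI;
my $dbh = DBI->connect($dsn, $user, $pass);
# force the client<->server communication to utf8, regardless of my.cnf
$dbh->do("set NAMES 'utf8'");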
- the last 5 issues are now shown, and their status can be changed (but not reverted to "waited", as there can be only one "waited")
- the library can create a "distribution list". This sheet contains a list of borrowers (selected from the borrower list, or manually entered), and can be printed for a given issue. Once printed, the sheet can be put on the issue and distributed to every reader on the list (one by one).
* synch with rel_2_2. Probably the last non-manual synch, as rel_2_2 should not be modified deeply.
* code cleaning (cleaning warnings from perl -w) continued
actually existed; so if there was no ISBN, and the ISSN was blank,
the item would be assigned a random biblionumber and the breeding farm
would report that the item already existed in the catalog (even though
it didn't). This fix adds a check to determine whether the imported
record has an ISSN before assigning a matching biblionumber.
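A hedged sketch of the check; findbiblionumberbyissn is a hypothetical helper, not the real sub name:
my $biblionumber;
if ( defined $issn && $issn ne '' ) {
    # only look for a match when the record really carries an ISSN
    $biblionumber = findbiblionumberbyissn($issn);
}
# with no ISBN and no ISSN, leave $biblionumber unset instead of guessing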
But C4::Date uses Date::Manip, which, in the author's own words:
"If you look in CPAN, you'll find that there are a number of Date and Time packages. Is Date::Manip the one you should be using? In my
opinion, the answer is no most of the time."
He goes on to say that because Date::Manip is powerful and is written fully in Perl, it's also slow.
Now Circulation needs to be as fast as possible, and C4::Date isn't actually doing anything particularly tricky,
so I'm working on C4::Circulation::Date as a replacement, in an attempt to gain some speed.
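A minimal sketch of the kind of lightweight handling meant here, assuming ISO yyyy-mm-dd dates (not the actual C4::Circulation::Date code):
use POSIX qw(strftime);
# plain strftime/sprintf is far cheaper than Date::Manip's parser
sub today_iso { return strftime("%Y-%m-%d", localtime); }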
This module is for dealing with user-submitted reviews of items.
Currently it allows (with some scripts) a user to review any item on their reading record.
The review is marked unvetted, and a librarian must vet and approve the review before it can be shown to the public.
The scripts to add/edit a review, and to display them in the OPAC, are done.
The script to display the list of reviews awaiting vetting for the librarians has also been done.
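A hypothetical sketch of the vetting step; the reviews table and its column names are assumptions:
my $dbh = C4::Context->dbh;
# mark one review as approved so it becomes visible in the OPAC
$dbh->do( "UPDATE reviews SET approved = 1 WHERE reviewid = ?",
          undef, $reviewid );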
IMPORTANT NOTE: the MARCkoha2marc sub API has been modified. Instead of biblionumber & biblioitemnumber, it now gets a hash.
The sub is used only in Biblio.pm, so the API change should be harmless (except for me, but I'm aware ;-) )
* run updater/updatedatabase to create the imageurl field in itemtypes.
* go to Koha >> parameters >> itemtypes >> modify (or add) an itemtype. You will see around 20 nice images to choose from (thanks to owen). If you prefer your own image, you can also type a complete url (http://www.myserver.lib/path/to/my/image.gif)
* go to the OPAC and search for something. In the result list, you now have the picture instead of the text itemtype.
replacing the 2.2 marc search with a Net::Z3950 search (waiting for Perl/ZOOM)
works only for title/author/isbn search; any other search is treated as a 'keyword search' (=anywhere)
* go to koha cvs home directory
* in misc/zebra there is a unimarc directory. I suggest that marc21 libraries create a marc21 directory
* put your zebra.cfg files here & create your database.
* from the koha cvs home directory, ln -s misc/zebra/marc21 zebra (I mean, create a symbolic link to YOUR zebra directory)
* now, every time you add/modify a biblio/item, your zebra DB is updated correctly.
NOTE:
* this uses a system call in perl. CPU-consuming, but we are waiting for Indexdata's Perl/ZOOM (a rough sketch of the call follows the config excerpt below)
* deletion still does not work
* UNIMARC zebra config files are provided in the misc/zebra/unimarc directory. The most important lines being:
in zebra.cfg:
recordId: (bib1,Local-number)
storeKeys:1
in .abs file:
elm 090 Local-number -
elm 090/? Local-number -
elm 090/?/9 Local-number !:w
(090$9 being the field mapped to biblio.biblionumber in Koha)
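As for the system call mentioned above, a hedged sketch of what the shell-out could look like; the zebradir config key and the paths are assumptions:
my $zebradir = C4::Context->config("zebradir");
# reindex the exported records; zebraidx ships with Indexdata's Zebra
system("zebraidx -c $zebradir/zebra.cfg update $zebradir/records") == 0
    or warn "zebraidx update failed: $?";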
* removing useless subs
* removing some subs that are also elsewhere
* renaming all OLDxxx subs to REALxxx subs (should not change anything, as the OLDxxx, as well as the REALxxx, subs are supposed to be for Biblio.pm internal use only)
It provides the user with the list of items that were ordered more than a given delay ago and are NOT yet received.
The user may filter by supplier, branch, or delay.
This page is still under development.
The goal is to make it ready to print, in order to reorder the books.
2 new functions have been written in the Acquisition module:
getsupplierlistwithlateorders
getlateorders
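A hypothetical usage sketch; the real signatures and return values may differ:
my @suppliers  = getsupplierlistwithlateorders($delay);
my @lateorders = getlateorders($delay, $supplierid, $branch);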
branches has been modified to manage branch independence.
Request for comment.
STILL UNDER development
don't update your cvs if you want to have a working head...
this commit contains:
* updater/updatedatabase: gets rid of the marc_* tables, but DOESN'T remove them. As a lot of things use them, it would not be a good idea for now to drop them. If you really want to play, you can rename them to test head without them while still being able to reintroduce them...
* Biblio.pm: modify MARCgetbiblio to find the raw marc record in the biblioitems.marc field, not in marc_subfield_table; modify MARCfindframeworkcode to find the frameworkcode in biblio.frameworkcode; modify some other subs to use biblio.biblionumber & get rid of bibid.
* other files: get rid of bibid and use biblionumber instead.
What is broken:
* does not do anything on zebra yet.
* if you rename marc_subfield_table, you can't search anymore.
* you can view a biblio & bibliodetails, and go to the MARC editor, but NOT save any modification.
* don't try to add a biblio, it would add data poorly... (don't try to delete either; it may work, but that would be a surprise ;-) )
IMPORTANT NOTE: you need the MARC::XML package (http://search.cpan.org/~esummers/MARC-XML-0.7/lib/MARC/File/XML.pm), which requires a recent version of MARC::Record
Updatedatabase stores the iso2709 data in the biblioitems.marc field & an xml version in biblioitems.marcxml. Not sure we will keep both when releasing the stable version, but I think it's a good idea to have something readable in sql, at least for the development stage.
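A minimal sketch of producing the marcxml flavour from the stored iso2709 blob (the variable names are illustrative):
use MARC::File::USMARC;
use MARC::File::XML;
# decode the raw iso2709 from biblioitems.marc, then serialize as MARCXML
my $record  = MARC::File::USMARC->decode($iso2709blob);
my $marcxml = $record->as_xml();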
- a mail is sent every time an issue is received in the serial module. The mail is sent to all borrowers that have put an alert on the subscription (remember that you can put an alert only if the librarian has defined a "letter" as the mail to send)
- the librarian can see, for a given subscription, who has put an alert.
* adding a package Letters.pm, that manages letters & alerts.
* adding feature: it's now possible to define a "letter" for any subscription created. If a letter is defined, users in the OPAC can put an alert on the subscription. When an issue is marked "arrived", all users on the alert will receive a mail (as defined in the "letter"). This last part (= sending the mail) is not yet developed. (Should be done this week)
* adding feature: it's now possible to "put an alert" in the OPAC, for any serial subscription. The alert is stored in a new table, called alert. An alert can be put only if the librarian has activated them in the subscription (and they activate it just by choosing a "letter" to send to borrowers on new issues)
* adding feature: the librarian can see in the borrower detail which alerts a borrower has put, and a user can see in opac-detail which alerts they have put too.
Note that the system should be generic enough to manage any type of alert.
I plan to extend it soon to virtual shelves: a borrower will be able to put an alert on a virtual shelf, to be warned when something changes in the virtual shelf (the mail being sent once a day by cron, or manually by the shelf owner. Anyway, a mail won't be sent on every change; users would be spammed by Koha ;-) )
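A rough sketch of the "mail everyone on the alert" idea; the alert table comes from the description above, while the column names and the sendalert helper are assumptions:
my $dbh = C4::Context->dbh;
my $sth = $dbh->prepare("SELECT borrowernumber FROM alert WHERE externalid = ?");
$sth->execute($subscriptionid);
while ( my ($borrowernumber) = $sth->fetchrow ) {
    sendalert( $borrowernumber, $letter );   # hypothetical helper
}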
- Trying to get a basket not owned by someone of the user's own branch leads to the main page.
- Lists only the baskets owned by someone of the user's branch.
Auth.pm now sends a cookie with userenv information.
Adding a cookie containing user-specific vars such as:
branch,
firstname,
surname,
cardnumber...
This may be criticized from a lawyer's point of view, since name and surname are given.
But the real need is for userid and branch.
And it is achieved.
Auth now passes TWO cookies:
a session cookie
and an environment cookie.
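A hedged sketch of the second cookie; its name and value layout here are assumptions:
use CGI::Cookie;
my $envcookie = CGI::Cookie->new(
    -name  => 'userenv',
    -value => join( '/', $userid, $branch, $firstname, $surname, $cardnumber ),
);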
Needs two updates in the database...
One more table (action_logs)
and one more syspref, Activate_Log, with On|Off values.
Maintenance has been swept of the previous Log functions.
addbiblio.pl contains a sample of code using Log.pm (see the sketch below).
To be generalized to Authorities, acquisitions, members soon.
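A sketch of such a guarded log call; logaction's signature here is an assumption:
if ( C4::Context->preference("Activate_Log") ) {
    logaction( $loggedinuser, "CATALOGUING", "ADD", $biblionumber, "biblio" );
}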
SearchBiblio.pm forked from SearchMarc.pm
opac-search-biblio.pl forked from opac-search.pl (just change the module)
an attempt at a new search using FULLTEXT indexes.
NB: Boolean won't work without MySQL >v4.0
NNB: Will be slow without indexes added on the biblio table as follows:
ALTER TABLE biblio ADD FULLTEXT (author,title,unititle,seriestitle);
Only searching on "Any word" field just now.
more to come.
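A hedged example of a boolean-mode query against the indexes above:
my $dbh = C4::Context->dbh;
my $sth = $dbh->prepare(
    "SELECT biblionumber, title FROM biblio
      WHERE MATCH(author, title, unititle, seriestitle)
            AGAINST(? IN BOOLEAN MODE)"
);
$sth->execute('+history +france');   # require both words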
how it works:
* create the table marc_Tword with the following structure:
CREATE TABLE `marc_Tword` (
`word` varchar(80) NOT NULL default '',
`usedin` text NOT NULL,
`tagsubfield` varchar(4) NOT NULL default '',
PRIMARY KEY (`word`,`tagsubfield`)
) TYPE=MyISAM;
* open a console & type export PERL5LIB & export KOHA_CONF as usual.
* fill this table with misc/build_marc_Tword.pl. Warning: this script uses a very memory-consuming but very fast method to fill the table: it does everything in memory, then writes everything at once (a sketch follows this list). Another method is provided (& commented out), but it's 100x slower (really!)
* open opac-search.pl and replace use C4::SearchMarc; with use C4::SearchMarcTest; as the API hasn't changed, it will work immediately.
* go to opac-search (advanced search) & search for whatever you want. Should work fine.
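A hedged sketch of that "everything in memory" strategy; the hash layout and the filling loop are assumptions:
my $dbh = C4::Context->dbh;
my %usedin;   # $usedin{$word}{$tagsubfield} = "biblio1,biblio2,..."
# ... loop over every biblio, split each subfield value into words,
#     appending each biblionumber to $usedin{$word}{$tagsubfield} ...
my $sth = $dbh->prepare(
    "INSERT INTO marc_Tword (word, usedin, tagsubfield) VALUES (?,?,?)");
for my $word ( keys %usedin ) {
    for my $tagsubfield ( keys %{ $usedin{$word} } ) {
        $sth->execute( $word, $usedin{$word}{$tagsubfield}, $tagsubfield );
    }
}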
LIMITS:
* build_marc_Tword has problems with extended chars (mainly accented ones). So don't be afraid if you get sql errors. They are not a problem for a POC.
* search always orders by title, whatever you choose.
* search only handles WORDA and WORDB, not yet WORDA or WORDB, nor WORDA except WORDB.
A sub was not using the C4::Context->marcfromkohafield array, which caches DB data.
This is only a small improvement for normal DB modifications, but it almost doubles the speed of bulkmarcimport... from 6 records/second to more than 10.
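A sketch of reading the cached mapping; the exact hash layout here is an assumption:
# cached koha field => marc (tag, subfield) mapping, no DB round-trip
my $map = C4::Context->marcfromkohafield;
my ( $tag, $subfield ) = @{ $map->{$frameworkcode}->{'biblio.biblionumber'} };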
if this parameter is defined, the url is used instead of the default one.
So, you can have your own stylesheet somewhere, and use it instead of the official Koha one.
* partial support of the "linkage" MARC feature: if you enter a "link" on a MARC subfield, the magnifying glass won't search on the field, but on the linked field. I agree it's partial support. It will be improved, but I need to investigate the MARC21 & UNIMARC differences on this topic.
http://bugs.koha.org/cgi-bin/bugzilla/show_bug.cgi?id=858
* added a button to cancel an issue
* added a checkbox to cancel the reserve on the book (checked by default)
* the reserve cancellation is done on reserves placed for the given item or for any item
Nelsonville, please test & confirm it's OK
In 2.4, a new DB structure will speed things up considerably and this limit will be removed.
FindDuplicate is activated again; the perf problems were due to this problem.
For now, matching is done on ISBN only. Will be improved soon.
When a duplicate is detected, the biblio is not saved, but the user is asked for confirmation.
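A hedged sketch of the confirmation flow; FindDuplicate's return values here are assumptions:
my ( $dupbiblionumber, $duptitle ) = FindDuplicate($record);
if ($dupbiblionumber) {
    # don't save; ask the user to confirm it really is a new biblio
}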
* in subscription Add, the issue number & date are used to calculate the 1st issue (previously, a "next issue number & date" was applied, meaning you had to enter the number & date of a previous issue)
* the "inner loop" used for serial numbering ({XYZ}) is now shown & can be modified. The inner loop is used for numbering formulas that say "change the number only once every 4 issues".
This field is useful when the callnumber contains no information on the room where the item is stored.
With this field, we now have 3 levels of information to find a book:
* the branch.
* the location.
* the callnumber.
This should be versatile enough to accommodate any storage method.
This hack is quite simple, thanks to the nice Biblio.pm API. The MARC => koha db link is managed automatically. Just add the link in the parameters section.
moving the getalltemplates and getalllanguages subs out of Search.pm (which will be deprecated soon) into Koha.pm
moving changelanguage.pl to OPAC scope
* acquisition rewrite: create an aqbasket table to deal with the "bookseller order header".
* add "close basket" feature: a closed basket can't be modified
* suggestion feature: manage suggestions in acquisition (after a suggestion is filled in the OPAC)