Based on David Schuster's improvement patch.
For David:
- To send the output into an HTML file, there is no need to add a
  parameter to this script; just redirect to a file:
  check-url --html --host-prot=http://koha-pro.mylib.org \
  > /usr/local/koha/koha-tmpl/badurls.html
- If you want the result as a table with alternating row colors, use CSS and
  JavaScript. For example, with jQuery (found via Google):
<style type="text/css">
  table {width:400px; border:1px solid blue;}
  .oddrow {background-color:#E5E5E5;}
</style>
<script type="text/javascript"
  src="http://code.jquery.com/jquery-latest.min.js"></script>
<script type="text/javascript">
  $(function(){
    $("table.tiger-stripe tr:even").addClass("oddrow");
  });
</script>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This adds a shell script which turns HTML files
into PDF and prints them on a printer.
It takes a directory on which to apply the html2pdf transform, a CSS filename,
a printer host and a printer name.
printoverdues : generates PDF files from HTML files in a directory and prints them
usage :
printoverdues.sh directory [css [printer_host [printer_name]]]
- directory      directory on which to apply the html2pdf transform
- css            css file to apply to the html
- printer_host   network name or IP of the printer (port possibly included)
- printer_name   name of the printer
Note that css, printer_host and printer_name are optional parameters
Note that this script uses the xhtml2pdf command,
which comes with pisa (a Python library).
To install pisa you need the setuptools library for Python,
then type: easy_install pisa
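For example, installing pisa and then printing the overdues in a directory might look like this (the directory, CSS file, printer host and printer name below are illustrative values only, not taken from the patch):
easy_install pisa
printoverdues.sh /tmp/overdues overdues.css 192.168.1.50:9100 lp1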
A new Koha Offline Circulation client has been written by Kyle M Hall in C++/Qt4.
Unfortunately, it requires an SQLite3 database, whereas the PHP/Gtk client needs an SQLite2 database.
This update adds the switches --sqlite2 and --sqlite3 to the script to output either format.
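For example (the script name below is illustrative; only the two switches come from this patch):
create_koc_db.pl --sqlite2   # SQLite2 database for the PHP/Gtk client
create_koc_db.pl --sqlite3   # SQLite3 database for the C++/Qt4 client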
These bugs must have been introduced by a merge:
- Overdues to all libraries with overdue rules do not work
- Overdues to a specific library do not work either
Enabled the -n (nomail) option, which was previously doing nothing. In
addition, I have added an -itemscontent option to allow for <<items.content>>
to be used in the notices for DUE and PREDUE.
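A sample invocation exercising both options (assuming -itemscontent takes a comma-separated list of item fields; the exact list shown is illustrative):
misc/cronjobs/overdue_notices.pl -n -itemscontent title,author,barcode,date_due
Here -n (nomail) suppresses sending mail, and -itemscontent controls what is substituted for <<items.content>> in the DUE and PREDUE notices.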
This adds two new options to overdue_notices.pl to select overdues only for certain categorycodes, or to exclude certain categorycodes.
Conflicts resolved in misc/cronjobs/overdue_notices.pl
This updates the way members are added and edited so that imports and edits
can be better automated.
GetMember evolves and allows people to search on a hash of data.
Adds SQLHelper, a new package to handle INSERT, UPDATE and SELECT queries.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
The script already has a param -b for batch mode, which should silence informational
messages, but it missed a couple. This patch fixes that.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This patch takes care of similar entries, whether they contain diacritics or not.
(cherry picked from commit 776c177e3debedaf08fec65fbf8111675ccc93e7)
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Allows temporary locations corresponding to 'in processing' and 'shelving'
so that newly-created items and newly-returned items do not show
immediately as available. Three new system preferences govern the usage
of these features.
NewItemsDefaultLocation. If system pref NewItemsDefaultLocation is set to a location code,
all newly catalogued items will be set to the location set in this preference.
Location code must be a valid LOC authorized value type.
InProcessingToShelvingCart. If the system pref InProcessingToShelvingCart is turned on,
any items run through returns.pl with a location code of 'PROC' will be modified to
have a new location code of 'CART'.
ReturnToShelvingCart. If the syspref ReturnToShelvingCart is turned on,
all items returned other than confirmed holds will have a new location code of 'CART'.
Any item issued is automatically taken off the shelving cart.
Adds a cron script shelf_to_cart.pl which should be run hourly.
Updates all items with a location of CART to the item's permanent location.
The original location code is stored in the new items column 'permanent_location'.
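For example, a crontab entry to run it hourly could look like this (the install path is illustrative; the patch does not mandate one):
0 * * * *  cd /path/to/koha && misc/cronjobs/shelf_to_cart.pl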
Original Author: PTFS Contractor <dbavousett@ptfs.com>
This work co-sponsored by
Middletown Township Public Library, Middletown, NJ USA and
East Brunswick Public Library, East Brunswick, NJ USA
Signed-off-by: Colin Campbell <colin.campbell@ptfs-europe.com>
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
Fixed the holds queue job so that it correctly
ignores hold requests that are not yet scheduled
to be filled.
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
This little script establishes a framework for database cleanup on some regular
schedule. Initial implementation provides for brute truncation of the sessions
table, and selective-by-age cleanup of the zebraqueue.
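A possible crontab entry for a nightly run (the script name, path and schedule are illustrative; the commit does not state them):
0 4 * * *  cd /path/to/koha && misc/cronjobs/cleanup_database.pl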
Signed-off-by: Galen Charlton <gmcharlt@gmail.com>
It appears, in Ryan's patch, that he wants to throw a warning to the log if
the directory specified in --out is not present. (Further messages will
be given when the open-or-die occurs a few lines later.) However, it was
throwing the warning if --out was not specified at all, which is
undesirable. This patch modifies that bit to check for the presence of
whatever directory is going to be used, either --out, ENV{TMPDIR}, or /tmp.
As before, if the write to the directory fails for any reason--including
its non-existence--that is handled later, but this message will help
inform the troubleshooter.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
The Offline Circ tool by Kyle Hall uses PHP's SQLite, which is SQLite
v2. Gnope, which Kyle links, ships with libSQLite v2.
Let's not count on libsqlite3 not being installed for perl. If it is
installed, DBD::SQLite will use it, whereas DBD::SQLite2 will not.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
When marcxml cannot be parsed into a MARC::Record object, the biblio is
undisplayable and it obviously breaks many features in Koha. This script
tests parsing of every marcxml record and alerts on failures. Optionally, the
marcxml can be replaced from the marc field.
See extensive perldoc for details.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
The problem is that we do not ensure that the issues table has valid
borrowernumber in each line. This is exacerbated by Getoverdues()
returning data sorted BY borrowernumber. So one NULL borrowernumber
in issues prevented ALL fines from being assessed. The actual error
from fines.pl cron log is:
No branchcode argument to new. Should be C4::Calendar->new(branchcode => $branchcode)
at /home/user/kohaclone/misc/cronjobs/fines.pl line 98
This patch deals only with getting fines to avoid crashing. It does
not fix the underlying data integrity problem.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
execute_query now refactored, returns reliable results, does
zero presentation-layer crap. Arguments reduced, client scripts
adapted to new API and performance improved. Text::CSV now used
to generate CSV output, ensuring portability, encoding and accuracy.
Replaced tools/runreport.pl with misc/cronjobs/runreport.pl:
~ security fixed
~ documentation improved
~ TODO: finish sendmail option.
Bug 3077 also fixed.
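A crontab sketch for running a saved report nightly (the report id argument is an assumption, not documented here; only the script location comes from this patch):
15 1 * * *  cd /path/to/koha && misc/cronjobs/runreport.pl 16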
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
The old location of runreport.pl was under tools, leaving it exposed
to web requests. This is a security flaw since it does NOT check any
Auth and allows the user to request any Saved Report be run. This is
not a problem under misc/cronjobs/ and it suggests the more appropriate use.
Guided.pm is not fixed here (see bug 3066), but it is prepared to be fixed
and made compatible with runreport as detailed in the perldoc.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
Removed cronjob, which was written to work around
a bug in 2.2 that no longer applies and is specific
to a single library in any event.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
This patch just fixes the script which exports overdues to CSV, and fills in the missing fields.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
This does not fix all problems recorded in bug 2883 (see all the FIXME's), but
it does improve the script's basic feedback to an intelligible level.
It also adjusts the documentation and examples to correct bogus usage
instructions.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
* removed ersatz YAML::XS dependency
* use 'return' instead of 'return undef'
* minor language changes
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
Improve the URLs checker script in the way (half way) pointed out by Galen:
- A C4::URL::Checker class handles URL checking. This class is not yet
  in a separate file in the C4 directory. This class could easily be
  extended to accommodate checking of authorities URLs.
- Script output can now be formatted in CSV or HTML. The HTML version
  links directly to the MARC biblio record editor.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
Item type was not retrieved in a query, leading to a case
where an item could be selected by build_holds_queue.pl
to fill a hold request even where forbidden by the
library and item type-level policy.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
* use item branch instead of patron's branch to
look up the applicable hold policies - this makes
requesting in the OPAC consistent with the intranet.
* when generating pick list using build_holds_queue.pl, only match items
to patrons if request is allowed.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
This reverts commit def09f5a21.
As I emailed to the patches list Oct 06, 2008:
I suggest we need to revert Josh' commit def09f5a21.
The effect on the crontab example is to invalidate the lines being executed. The lines were apparently copied in from a cron source, not crontab, despite the header describing it NOT being an example for cron. It also runs longoverdue twice, instead of fines.pl.
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>
As rss.pl is not a CGI script, moved it to join the
other cronjobs. Full documentation of the script
is in misc/cronjobs/rss/rss.pl, but to summarize:
[1] rss.pl is run on the command line to produce
an RSS XML document. The output should be
placed in a directory accessible to the OPAC
(or staff) web interface so that users can download
the RSS feed. An example of usage:
misc/cronjobs/rss.pl lastAcquired.conf
Normally rss.pl should be run periodically (e.g., daily)
to keep the feed up-to-date; see the example crontab entry after this list.
[2] The configuration file (e.g., lastAcquired.conf) lists
* name of the template file to use
* path of output file
* SQL query
rss.pl runs the SQL query, then feeds the output of the
query through the template to produce the output file.
[3] The template file (e.g., lastAcquired.tmpl) uses
HTML::Template syntax like any of the HTML
templates for the web interface.
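For example, a daily crontab entry to regenerate the feed could be (the schedule and install path are illustrative):
30 2 * * *  cd /path/to/koha && misc/cronjobs/rss.pl lastAcquired.conf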
Signed-off-by: Galen Charlton <galen.charlton@liblime.com>