This distribution includes a number of non-cgi scripts (i.e. scripts which are not run in conjunction with a web browser). There are a few that are critical to the operation of the system, and a few more that are good examples of extending the usefulness of the system by generating reports and exporting to other data formats. This page describes what a few of these scripts do.
Digital Library and Archives
generate_title_pages
This script generates the actual title pages for each author. It's used by the review interface whenever an ETD is approved or updated, so that a nice static HTML title page appears for our web users. This script should probably also be run every evening, so that any changes to the database that aren't yet reflected in the static HTML are picked up.
generate_browse_pages
This script generates a hierarchy of static HTML pages that can be used to browse the collection by author or title. ETDs that have been approved or relocated will not appear on the static browse pages until you run this script, so it's a good idea to run it at least once a day.
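Since both generators should run nightly, they are natural candidates for cron. A sketch of crontab entries, assuming the scripts are installed in /usr/local/etd-db/bin (the path is an assumption, not the real install location):

```shell
# Hypothetical crontab entries; adjust the path to your installation.
# Regenerate title pages at 2:00 AM and browse pages at 2:30 AM nightly.
0 2 * * *  /usr/local/etd-db/bin/generate_title_pages
30 2 * * * /usr/local/etd-db/bin/generate_browse_pages
```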
backup_databases
This script is a simple wrapper around the mysqldump utility included with MySQL. It cycles through the list of relevant databases and creates a backup of each in a time-coded directory. Before this script will work, you'll need to change the root password. If you want to use this script to back up data from a remote host, set up a separate MySQL user that has read-only permission from the remote host, instead of allowing the root user to connect remotely (not a good idea).
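As a rough sketch of what such a wrapper does (the database names, the read-only backup user, and the output root below are all assumptions, not the script's actual configuration):

```python
import datetime
import subprocess
from pathlib import Path

# Assumed database names -- substitute your site's actual list.
DATABASES = ["etd", "etd_review"]

def backup_dir(root, now):
    """Return a time-coded directory path, e.g. backups/2024-05-01_0230."""
    return Path(root) / now.strftime("%Y-%m-%d_%H%M")

def dump_command(db, outfile, user="backup", host="localhost"):
    """Build a mysqldump invocation; a read-only 'backup' user is assumed."""
    return ["mysqldump", "-u", user, "-h", host, db,
            f"--result-file={outfile}"]

def run_backups(root="backups"):
    """Dump every database into a fresh time-coded directory."""
    out = backup_dir(root, datetime.datetime.now())
    out.mkdir(parents=True, exist_ok=True)
    for db in DATABASES:
        subprocess.run(dump_command(db, out / f"{db}.sql"), check=True)
```

Passing the password on the command line is deliberately avoided here; mysqldump can read credentials from an option file instead.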
This script was developed by Nuno Freire of the National Library of Portugal (which is why some of the comments are in Portuguese... :) ). It has been modified by me and by Robert France of our Digital Library Research Lab here at Tech.
The script basically spits out a file containing a USMARC record for each ETD. At our site, we use these records with zgate (a free Z39.50 server) to provide Z39.50 searchability of our records. You could also use this script to dump records for use with an existing card catalog system. Whatever you use this for, chances are you'll want to customize this script a little, as the interpretation of MARC varies slightly from place to place.
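To make the record format concrete, here is a minimal, hand-rolled sketch of assembling one binary USMARC record (leader, directory, variable fields). The tags and values are illustrative only; the actual script maps far more ETD metadata, and a real deployment would follow your catalog's MARC conventions.

```python
FT = b"\x1e"  # field terminator
RT = b"\x1d"  # record terminator
SF = b"\x1f"  # subfield delimiter

def make_marc(fields):
    """fields: list of (tag, data-bytes) pairs; returns one USMARC record.

    A USMARC record is: 24-byte leader, directory (one 12-byte entry per
    field: 3-digit tag, 4-digit length, 5-digit start offset), then the
    field data, then a record terminator.
    """
    directory = b""
    body = b""
    for tag, data in fields:
        data = data + FT
        directory += tag.encode("ascii") + b"%04d%05d" % (len(data), len(body))
        body += data
    directory += FT
    base = 24 + len(directory)          # base address of data
    total = base + len(body) + 1        # +1 for the record terminator
    # Leader: length(5), status/type/level "nam", 2 blanks, "22",
    # base address(5), 3 blanks, entry map "4500".
    leader = b"%05d" % total + b"nam  22" + b"%05d" % base + b"   4500"
    assert len(leader) == 24
    return leader + directory + body + RT
```

For example, a title field would be passed as `("245", b"10" + SF + b"aExample title")`, where `10` are the indicators and `\x1fa` introduces subfield a.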
I've included a few other simple scripts that are not quite as critical, but are decent examples. I use the "register_files" script pretty often; it searches an ETD's files and registers anything that's on the system but not in the database.
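The core idea behind register_files can be sketched as a set difference between what's on disk and what the database knows about. The directory layout and the registered-file list below are assumptions for illustration, not the script's actual interface:

```python
from pathlib import Path

def unregistered_files(etd_dir, registered):
    """Return names of files present under etd_dir but missing from
    the database's list of registered files."""
    on_disk = {p.name for p in Path(etd_dir).iterdir() if p.is_file()}
    return sorted(on_disk - set(registered))
```

The real script would then insert a database row for each name this returns, rather than just reporting them.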