VPIEJ-L 11/93

VPIEJ-L Discussion Archives

November 1993

=========================================================================
Date:         Thu, 4 Nov 1993 15:01:18 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         John Franks <john@hopf.math.nwu.edu>
Organization: Northwestern University, Dept. of Mathematics
Subject:      The Impact of Electronic Publication on Scholarly Journals
 
 
Reprinted from "The Impact of Electronic Publication on Scholarly Journals"
by John Franks, Notices of the American Mathematical Society, Volume 40,
Number 9, November 1993, pp. 1200-1202, by permission of the American
Mathematical Society.
 
 
    THE IMPACT OF ELECTRONIC PUBLICATION ON SCHOLARLY JOURNALS
         by John Franks, Dept. of Math. Northwestern Univ.
 
 
 
What will electronic journals be like?  When most mathematicians
consider this question they naturally interpret it to mean what will
electronic journals be like for them -- how will their daily use of
research journals be different when these journals are routinely
accessed on their desktop computers?  But another interpretation of
the question may be more important -- it is certainly more
controversial.  What will the organization and economics of electronic
journals be like?  What does electronic publication mean for the
publisher and the librarian?  Will the editorial or peer review
process be affected?  Will the way journals are funded and marketed
change?  And how will all this affect the mathematician?
 
DECLINING AND SHIFTING COSTS
 
Electronic publishing will come to scholarly research journals before
it comes to other kinds of publications.  There are several reasons
that we academics will be the first.  The content of our journals is
relatively homogeneous and generally amenable to being put in an
electronic form.  Also, the readership is much more highly connected to
the internet than the general population, and we are generally more
favorably inclined to deal with documents in electronic format.
Finally, at the present time most libraries are under extreme
financial pressure.  What librarians refer to as the ``serials
crisis'' has been brought about by dramatic increases in the cost of
scholarly journals (an average of 13.5\% annually for more than a
decade) combined with financial constraints facing most academic
institutions today.  The result is a strong economic motive to find
less expensive ways for scholars to communicate their work.
 
We don't know what the economics of electronic publishing will turn
out to be, but several of its aspects are becoming clear.  The
relationship of electronic publishing to traditional publishing is
analogous to the relationship of personal computing to mainframe
computing.  Electronic publishing is likely to be much more
decentralized.  It is much easier to do and the costs can be much less
than traditional publishing.  This is especially true for an
organization or department which must have a computing infrastructure
for other purposes.  In such a setting the marginal cost of an
electronic publication can be quite low.
 
Moreover the costs which remain are being redistributed.  For example,
the emergence of \TeX\ as a near standard for mathematics moves much
of the cost of composition and typesetting to the author or the
author's institution.  This trend will accelerate -- there will be
increasing pressure on authors to provide manuscripts in a standard
form which needs little massaging by the publisher.  Since transferring
articles from an electronic source to paper will normally be done by
the reader, the printing costs will be shifted to the user or to the
user's library.  Distribution {\it via} the internet is not free,
contrary to popular opinion, but the costs are largely borne by the
reader's institution and perhaps to a smaller extent by the taxpayer.
Since the reader's institution usually must have internet access for
other reasons, the marginal cost is low.
 
The effect on scholarly journals of all of these shifts is to decrease
the value added by the publisher and to increase the value added by
the institutions of the author and reader.  It is worth examining who
contributes to the value of a journal and which ``value added'' items
contribute to the cost. The contribution  of the author of a scholarly
article (presumably the primary value) is almost never a direct
contributing factor to the cost of the journal.
 
The next most important value added is the certification achieved
by the editorial and peer review process.  While the publisher plays
an important organizational role in this process, the work of editor
and referee is usually done by unpaid volunteers, at least for primary
journals.  What the publisher of a traditional journal does provide is
printing, distribution, and ``production editing.''  Production
editing is a relatively costly item which improves and standardizes
the style of articles.  It is what makes the difference between a
``camera ready copy'' journal and a typeset one.
 
The cost shifts which will come with electronic publishing, together
with the dramatic increases in the cost of commercial scholarly
journals have led some in the library community to suggest that it
may be time for a dramatic restructuring of the process of research
publication.  Patricia Battin, then University Librarian and Vice
President for Information Systems at Columbia, urged that universities
take a much greater role in the publishing enterprise [1].
 
     ``The advent of electronic capabilities provides the
     university with the potential for becoming the primary
     publisher in the scholarly communication process.  At the
     present time, we are in the untenable position of generating
     knowledge, giving it away to the commercial publisher, and
     then buying it back for our scholars at increasingly
     prohibitive prices.  The electronic revolution provides the
     potential for developing university controlled publishing
     enterprises through scholarly networks supported either by
     individual institutions or consortia.''
 
Battin might have added that we are also giving away our efforts
as editors and peer reviewers and then buying them back, at
rapidly increasing prices.
 
 
 
ACCESS VERSUS OWNERSHIP
 
Not surprisingly others, especially publishers, have a very different
vision of how electronic delivery might change the nature of scholarly
publication.  To quote Ann Okerson of the Association of Research
Libraries [2],
 
 
     ``We have lived for many generations with a world in which
     the technology of publication meant that access {\it
     required} ownership, \dots New electronic technologies allow
     the possibility of uncoupling ownership from access, the
     material object from its intellectual content. This
     possibility is revolutionary, perhaps dramatically so.''
 
I don't know what Okerson has in mind as a revolutionary possibility,
but I think I know what commercial publishers mean by uncoupling
ownership from access.  It is a code phrase for a possibility I find
disturbing.  If I subscribe to a mathematics journal and receive an
issue, what I have acquired, aside from a certain quantity of paper,
is a limited right to use the content of the articles printed on the
paper.  I have the right to read the issue and to make limited
photocopies for my personal use.  I do not have the right to make
large-scale reproductions or to sell them.  But my limited rights to
the contents last as long as the paper on which they are printed.
 
Unfortunately, what publishers seem to have in mind when they speak of
access without ownership is a model in which a one year subscription
entitles the subscriber to one year of access.  When the subscription
ends, so does the access.  This has little to do with ownership -- no
matter what the form of the journal, the subscriber always owns the
{\it medium} and never owns the {\it message}.  It is a question of
limited rights in perpetuity versus the same rights but with a fixed
time limit which makes it necessary to repeatedly repurchase access.
 
Such an arrangement is very advantageous to the publisher. It becomes
much harder for a library to cancel a long running subscription, if,
for example, the quality of a journal declines or the price increases
dramatically.  The big loser in such a model is the librarian who is
supplanted as archiver and probably cataloger by the publisher.  In
fact, with this model the librarian becomes little more than a
purchasing agent for scholarly journals.  Even this role could be
threatened by publishers who have hopes of marketing their journals
directly to individuals rather than to libraries.
 
A NEW FORM OF PUBLISHING
 
In light of these very different visions of electronic publishing
it is interesting to see what new developments are actually taking
place.
 
Since the time when photocopying machines became widely available the
informal distribution of ``preprints'' of scholarly articles has been
an important component of scholarly communication.  Some would argue
that it is now the most important component because formal publication
is such a time consuming process.  A major criticism of this
``scholarship by preprint'' system is that scholars who have not
managed to break into the right distribution circles may not have
timely access to scholarly work in their field.
 
It is natural that with the arrival of electronic mail, this process
is tending to move from paper to an electronic format.  The ease and
economy with which articles can be widely distributed electronically
have led a number of volunteers to set up article data bases for their
subdiscipline or their organization.  For example, one of the best of
these is run by the Institute for Mathematical Sciences at Stony
Brook.  It provides access to the articles in their preprint series in
both \TeX\ and Postscript format [3].
 
As scholars have gained experience with this kind of publishing they
have learned that e-mail is not a very good way to do electronic
document distribution.  Its sole advantage is widespread availability
but it is by far the most cumbersome method for the user.  Anonymous
ftp is a substantial improvement, but still better are distributed
electronic document browsers like Gopher and Mosaic, which were
explicitly designed for this purpose.
 
These article collections certainly constitute a form of electronic
publication, but are they journals?  The main thing missing, of
course, is the peer review process.  Also, at present, it is normally
assumed that articles in such a collection are preprints and will be
formally published elsewhere in a traditional journal.
 
This raises several questions, as yet unanswered.  Commercial
publishers would like to have preprints removed from electronic data
bases when an article appears in a traditional journal.  This may seem
reasonable at first, but, in fact, tends to conflict with the interests
of the author.  When I publish an article, I receive a certain number
of reprints which I am free to distribute to interested scholars.
If that supply is exhausted I routinely photocopy more and continue to
distribute them.  This practice is surely widespread and, I believe,
perfectly consistent with copyright transfer agreements of most journals.
 
It would be both easier and cheaper for me to respond to these reprint
requests electronically.  I would be disinclined to publish in a
journal which did not permit me to do so.  And if requests for
electronic reprints become routine, I will certainly want to automate
the process of sending them to such an extent that no intervention on
my part is required.  At that point I am running my own electronic
data base in conflict with the publisher's wishes.  Indeed, with modern
electronic document delivery systems using client programs like Gopher
and Mosaic it is quite possible for my electronic reprints to reside
physically on my desktop computer but appear to others to be in a
discipline oriented data base.  This is an issue whose resolution is
not clear, but current practice seems to be to leave articles
available electronically after publication and I know of no instances
of publishers attempting to enforce their removal.
 
Not surprisingly it has occurred to some that the main thing required
to turn a preprint data base into a true journal is a volunteer
editor.  In fact, there now exist electronic journals, edited and fully
refereed, and available to scholars electronically without cost.  The
costs of running such a journal are comparable to those of an article
data base and are typically borne by the editor's institution or
research grant.
 
WHAT HAPPENS NEXT?
 
In some form or other, free electronic access to scholarly articles
seems to be here to stay.  There are simply too many advantages for
the author, the librarian and the university as a whole.  The author
has a strong incentive to have his or her work easily accessible to
other scholars and he or she has the final say on where it is
published.  How much competition free journals will provide for
commercial journals remains to be seen.  Certainly a younger scholar
trying to establish research credentials for promotion or tenure may
prefer a traditional publication even if it offers less accessibility
to his or her work.
 
A great deal depends on the reaction of librarians to these innovative
publications.  A serious journal needs to be archived by libraries or
consortia of libraries.  Librarians have two strong incentives to
support this new model of publication.  First, it preserves their
traditional role of archiver and cataloger -- a role which at least
some publishers seem to covet.  Second, library budgets can surely
better support the cost of archiving than the ``prohibitive prices''
that Patricia Battin complains about.
 
Faced with increases in the prices of traditional journals, librarians
have begun to take action by campaigning to persuade faculty not to
submit papers to the most expensive journals or to serve on their
editorial boards.  Whether they will equally vigorously support
inexpensive or free electronic publications by archiving them or
forming archive consortia remains to be seen.  The rapid change in the
nature of publication is just beginning to impact the librarian, but
that impact will be profound.  For the most part librarians are not
well prepared for the changes being forced upon them by the pace of
technological developments.  It will be a difficult time for them.
But on the plus side, being a librarian is rapidly becoming a
glamorous high-tech position.
 
How will it all turn out?  No one knows for sure, but likely we will
see a mixture of models.  I doubt that primary research journals can
be successfully marketed in a way that requires an annual repurchase
of access rights, but specialized data bases which have greater value
added by the publisher perhaps can.
 
We will see more free subscription journals, but I doubt they will
supplant traditional journals.  However, as traditional journals
gradually move to an electronic format they will be under great
pressure to cut costs and pass on the savings to subscribers.
Printing and distribution account for about 30\% of the cost of a
traditional research journal and hence represent substantial potential
savings.  Journals can also save by making greater efforts to shift
composition costs to the author.  This will take the form of stricter
requirements for the format of submissions (or real page charges if
those requirements are not met).  This is a desirable thing.  The
scholar too must contribute to ameliorating the serials crisis.
 
Most likely traditional journals will sell electronic subscriptions
much as they currently do subscriptions to paper versions.  Their
customers will continue for the most part to be libraries.  With a
journal subscription a library will receive the right to redistribute
the contents to its institutional patrons and the responsibility of
assuring that it distributes only to those patrons.  The library also
will acquire the right and responsibility to archive those journal
contents, perhaps as part of a consortium.
 
 
REFERENCES
 
[1] Battin, Patricia.  "The Library: Center of the Restructured
University."  College and Research Libraries 45 (1984), pp. 170-176.
 
[2] Okerson, Ann.  "Synopsis," in University Libraries and Scholarly
Communication: A Study Prepared for The Andrew W. Mellon Foundation.
Association of Research Libraries, 1992.  Available via gopher to
orion.lib.virginia.edu, port 70.
 
[3] Institute for Mathematical Sciences preprint series.  Anonymous ftp
to math.sunysb.edu.
 
 
 
John Franks     Dept of Math. Northwestern University
                john@math.nwu.edu
=========================================================================
Date:         Fri, 5 Nov 1993 10:09:03 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         "A. Ralph Papakhian" <papakhi@iubvm.bitnet>
Organization: Indiana University Music Library
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
In-Reply-To:  Message of Thu,
              4 Nov 1993 15:01:18 EST from <john@hopf.math.nwu.edu>
 
On Thu, 4 Nov 1993 15:01:18 EST John Franks said:
>
>But on the plus side, being a librarian is rapidly becoming a
>glamorous high-tech position.
 
Is that a plus? Or should we really believe, by implication, that
librarians have been in an unglamorous low-tech position until
now? Notice that librarians haven't even really made it yet to glamor,
we are only "rapidly becoming" glamorous.
I, for one, am offended by such a comment.
Perhaps the mathematicians, oh so glamorous for so long, are
now envious of the librarians wallowing in the land of low tech.
 
No. It is not a plus. Glamor, glitz, and high-tech are not in and
of themselves plusses. Something else drives libraries and, I hope,
academe.
 
*GO*   Cordially, forever more,
*!!*   A. Ralph Papakhian, Music Library (Co-Listowner for MLA-L@IUBVM)
*IU*   Indiana University, Bloomington, IN 47405
       (812) 855-2970  papakhi@iubvm.bitnet papakhi@iubvm.ucs.indiana.edu
=========================================================================
Date:         Mon, 8 Nov 1993 06:04:05 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Ron Zweig <ron@ccsg.tau.ac.il>
Subject:      newspaper archives as textual corpora
 
Can anyone point me to Internet sources for discussing the creation of
machine-readable copies of library holdings of newspapers?
 
Have any/many significant newspapers undertaken retroactive scanning projects?
Would there be a demand for the conversion of microfilmed library holdings
of newspapers?
 
We are contemplating such a project, and need all the encouragement, advice
or words of warning that anyone cares to offer.
 
Apologies to those who are plagued with this message more than once - I am
posting it on a number of lists.
 
Dr. Ron Zweig
Dept. of Jewish History,
Tel Aviv University
Ramat Aviv, Israel 69-978
ron@ccsg.tau.ac.il
phone: 972-3-6409383 (work)
       972-2-332173 (home)
       972-2-345924 (fax)
=========================================================================
Date:         Mon, 8 Nov 1993 06:05:09 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Peter Scott <scottp@herald.usask.ca>
Organization: University of Saskatchewan
Subject:      Mother Jones available via Gopher
 
Mother Jones can now be read online via gopher. Here's a blurb, followed
by the bookmark:
 
Mother Jones is a magazine of investigation and ideas for independent
thinkers.  Provocative and unexpected articles inform readers and
inspire action toward positive social change. Colorful and personal,
Mother Jones challenges conventional wisdom, exposes abuses of power,
helps redefine stubborn problems and offers fresh solutions.
 
Name=Mother Jones Magazine
Type=1
Port=70
Path=
Host=mojones.com
=========================================================================
Date:         Mon, 8 Nov 1993 06:06:07 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         phil-preprints-admin@cogsci.l.chiba-u.ac.jp
Subject:      Announcements from the IPPE
 
We are pleased to report that the International Philosophical Preprint
Exchange continues to serve upwards of 150 users per day, and that
submissions continue to arrive steadily.
 
We are also glad to report that we shall soon be able to announce
several exciting new services on the IPPE, which devolve from
cooperative ventures between the IPPE and journal editors and conference
organizers.  (We encourage the editors of scholarly journals, and the
organizers of philosophical conferences, to contact us regarding the
possibility of such ventures.)
 
We are also involved in establishing a North American mirror archive,
which will maintain a copy of the IPPE's holdings at a location with
better and faster connectivity to our largest user population.
 
Despite the high level of activity, we have considerable resources which
remain unused.  Not only do we have large amounts of online storage
still available, but we have been able to hold the time required for
review of submissions to only a few days.  Thus we remain able to offer
philosophers a means of bringing a working paper to the attention of
colleagues almost immediately.
 
We encourage the submission of working papers in all areas of
philosophy.  Papers are subjected to a process of "minimal refereeing",
designed to assure only that they meet the standard of being "of
interest to contemporary academic philosophers".  (Thus far, we have
been fortunate in receiving fewer than a dozen unacceptable papers.)
 
If you wish to submit a working paper to the IPPE, please contact
Carolyn Burke, who will be pleased to assist you.  Carolyn can be
reached by email at the address cburke@nexus.yorku.ca, and by telephone
at (416) 736-5113, or from outside North America +1 (416) 736-5113.
 
If you are in a hurry and are familiar with Internet file transfer
techniques, just read the instructions available in the README file in
the submissions directory on the IPPE; or mail a diskette (MSDOS or
Macintosh format) containing your paper (in any well-known wordprocessor
format) to:
 
     IPPE                                   IPPE, c/o Syun Tutiya
     Dept. of Philosophy                    Dept. of Philosophy
     York University            or          Faculty of Letters
     4700 Keele St.                         Chiba University
     Toronto, Ontario                       1-33 Yayoi-cho, Inage-ku, Chiba
     M3J 1P3  Canada                        263  JAPAN
 
In either case, please provide a separate file containing a 150 word
abstract of your paper, in the format used for abstracts on the IPPE.
 
 
 
Accessing the International Philosophical Preprint Exchange:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
By ftp:    "ftp Phil-Preprints.L.Chiba-U.ac.jp".
By gopher: "gopher apa.oxy.edu" or "gopher kasey.umkc.edu".
By email:  "mail phil-preprints-service@Phil-Preprints.L.Chiba-U.ac.jp".
Questions: "mail phil-preprints-admin@Phil-Preprints.L.Chiba-U.ac.jp".
To upload a paper or comment: see pub/submissions/README.
=========================================================================
Date:         Mon, 8 Nov 1993 11:31:00 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         ghermanp@kenyon.edu
Subject:      Re: newspaper archives as textual corpora
 
Most newspapers are already available in electronic form through commercial
providers. Americast, Viewtext, and the Lexis/Nexis system all have major
newspapers in e-format. They do not go back much beyond the early 1980s.
Paul
Paul M. Gherman
Director of Libraries
Olin and Chalmers Library
Kenyon College
Gambier, OH 43022
614-427-5186 voice
614-427-2272 fax
ghermanp@kenyon.edu
=========================================================================
Date:         Mon, 8 Nov 1993 16:05:28 EST
Reply-To:     "Tansin A. Darcos & Company" <0005066432@MCIMAIL.COM>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         "Tansin A. Darcos & Company" <0005066432@MCIMAIL.COM>
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
 
From: Paul Robinson <tdarcos@mcimail.com>
Organization: Tansin A. Darcos & Company, Silver Spring, MD USA
-----
John Franks <john@hopf.math.nwu.edu>, writes:
 
> Reprinted from "The Impact of Electronic Publication on Scholarly Journals"
> by John Franks, Notices of the American Mathematical Society, Volume 40,
> Number 9, November 1993, pp. 1200-1202, by permission of the American
> Mathematical Society.
 
> Patricia Battin, then University Librarian and Vice President
> for Information Systems at Columbia, urged that universities
> take a much greater role in the publishing enterprise [1].
>
> ``The advent of electronic capabilities provides the university
>   with the potential for becoming the primary publisher in the
>   scholarly communication process.  At the present time, we are
>   in the untenable position of generating knowledge, giving it
>   away to the commercial publisher, and then buying it back for
>   our scholars at increasingly prohibitive prices.  The electronic
>   revolution provides the potential for developing university
>   controlled publishing enterprises through scholarly networks
>   supported either by individual institutions or consortia.''
 
This, I think, hits it right on the head, and solves a thing that stuck in
my craw.  The Journal "Software, Practice and Experience", which might be
dubbed "The New England Journal of Medicine" for Computer Software issues,
has the interesting idea of (1) charging a nice chunk of change for its
publication (2) requiring those who place articles in the publication to
assign copyright to the publisher and (3) requiring the author to get
permission to use his own work elsewhere!
 
> To quote Ann Okerson of the Association of Research Libraries [2],
>   ``We have lived for many generations with a world in which the
>     technology of publication meant that access {required}
>     ownership... New electronic technologies allow the possibility
>     of uncoupling ownership from access, the material object from
>     its intellectual content. This possibility is revolutionary,
>     perhaps dramatically so.''
>
> I don't know what Okerson has in mind as a revolutionary possibility,
> but I think I know what commercial publishers mean by uncoupling
> ownership from access...If I subscribe to a... journal...I have
> acquired...a limited right to use...my limited rights to
> the contents last as long as the paper on which they are printed.
> Unfortunately, what publishers seem to have in mind when they speak
> of access without ownership is a model in which a one year
> subscription entitles the subscriber to one year of access.  When
> the subscription ends, so does the access.
 
I think you've got another winner here.  Compuserve offers the Associated
Press Newswire.  One of the conditions for access is that you may not
reprint, reuse, or capture off screen what you obtain from the AP
Newswire.  So what am I paying $6.50 an hour for?  To read off the screen?
I can get almost all of this for free from broadcast and nearly free from
a 35c local paper.
 
It seems that commercial organizations which distribute electronically
have the nasty habit of thinking that an electronic release is somehow
intrinsically more valuable than the same information on paper.  Part of
the cost may be because of the lack of commercials.  Part of the cost is
the overhead of the distributor.  And some is profit.  But the fact is
that electronic reproductions are commercially more expensive than paper
ones in any case I know of.
 
> It is natural that with the arrival of electronic mail, this process
> is tending to move from paper to an electronic format.  The ease and
> economy with which articles can be widely distributed electronically
> have led a number of volunteers to set up article data bases for their
> subdiscipline or their organization.  For example, one of the best of
> these is run by the Institute for Mathematical Sciences at Stony
> Brook.  It provides access to the articles in their preprint series in
> both \TeX\ and Postscript format [3].
>
> As scholars have gained experience with this kind of publishing they
> have learned that e-mail is not a very good way to do electronic
> document distribution.  Its sole advantage is widespread availability
> but it is by far the most cumbersome method for the user.  Anonymous
> ftp is a substantial improvement,
 
I would like to introduce you to the method by which the people who design
the Internet and all the parts of it decide things.  First, someone gets
an idea on doing something.  They will write up an Internet Draft and
submit it to the internet draft mail box so that it may be seen by the
members of the Internet Engineering Task Force (IETF) and other interested
parties. The draft is then posted to an ANON FTP site, and a message is
E-Mailed to the IETF Mailing list mentioning the name of the document and
the abstract of it if one was provided.  Those interested can then
retrieve the document, which is in ASCII and may also be provided in
PostScript.
 
If the document covers a regular ongoing working group dealing with an
issue of the Internet, it is referred to them.  If there is interest in
discussing it, an E-Mail list will be set up to allow people to post
comments about it.  I have seen people take someone else's document and
traverse it paragraph by paragraph, asking questions and making
comments; these sometimes get very technical and complicated.  Yet this
cooperative review not only tends to create a better product, but also
means that nobody need feel left out, since anyone who is on the list
can toss in their two cents, just as I am doing now.
 
Eventually, if the item is of importance to the Internet, it will be
published as an RFC and become one of the Internet Standards.
 
> but still better are distributed
> electronic document browsers like Gopher and Mosaic, which were
> explicitly designed for this purpose.
 
I'll toss the rodent out by saying that Gopher is not a good choice for
browsing (nor is Mosaic) if you are using, for example, a dial-up
terminal to a service; to use these effectively on complicated documents
(subscript inline, font changes, graphics and tables) requires an
X-Windows terminal as a direct connection.  Some people might have one,
most probably don't.  Many schools probably still use standalone PCs
hooked up to the campus network, and running X is probably an expensive
proposition.
 
> These article collections certainly constitute a form of electronic
> publication, but are they journals?  The main thing missing, of
> course, is the peer review process.  Also, at present, it is normally
> assumed that articles in such a collection are preprints and will be
> formally published elsewhere in a traditional journal.
 
I think that the standard IETF list, and "publication" by announcing the
existence of a document on that list, does provide this type of feature.
Perhaps what is needed is a publications FTP site set up just for this
purpose: preprints in one section, published documents in another,
divided up by discipline.
 
Add to this a set of lists, each discussing one discipline in the area
in question.  When someone releases a preprint (what the IETF calls an
Internet Draft), they would subscribe to the list dealing with their
specialty.  As people read the document, they can post comments to the
list; one advantage of this is that everyone who subscribes gets to see
the comments.  Also, if the list sends mail out only under its own
address, comments can effectively be posted anonymously: unless posters
put their names in the message, you can't tell who posted what.
 
Once the document becomes finalized it can be printed and copies sent to
the Library of Congress for permanent record, microfilmed, whatever.  The
nice thing is that it is possible to leave the original document, and the
mail sent on the subject, on line to allow future readers to see how the
final document was arrived at from the original draft preprint.
 
Inexpensive hard drives are about $1 a meg; a small computer can be set up
with 3GB of storage for perhaps $6,000.  This would probably be enough to
last a couple of years.
 
Also, at the end of each year or period, the entire year's output could be
reproduced on CD-ROM, both the preprints and the published articles.  At
the 5,000-unit quantity rate, CD-ROMs can be mastered for about $3,000 and
duplicated for $1 each, making the production cost per CD under $2 each;
call it $3 with packaging.  If the CDs were then sold for, say, $25
apiece, libraries (even individuals) could afford to purchase enough of
them to make the whole thing self-supporting, even including the cost of
running an Internet connection and a computer to store the data.  With a
page running about 2.5K, does someone want to tell me how many documents
it would take to create 200,000 pages of text, the amount needed to fill
one CD-ROM?
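To answer that question, here is a quick back-of-the-envelope sketch.  The ~500 MB usable-capacity figure for a CD-ROM of the era and the 20-page average document length are assumptions I am supplying for illustration, not figures from the discussion above:

```python
# Back-of-the-envelope check of the CD-ROM figures above.
# Assumptions (mine, for illustration): a CD-ROM of the era holds
# roughly 500 MB of usable data, and a typical document runs
# about 20 pages.

KB_PER_PAGE = 2.5            # "a page running about 2.5K"
CD_CAPACITY_KB = 500 * 1024  # ~500 MB usable (assumed)

pages_per_cd = CD_CAPACITY_KB / KB_PER_PAGE
print(f"Pages per CD-ROM: {pages_per_cd:,.0f}")   # ~204,800 pages

# At an assumed 20 pages per document:
docs_per_cd = pages_per_cd / 20
print(f"Documents (at 20 pages each): {docs_per_cd:,.0f}")  # ~10,240
```

So the 200,000-page estimate is about right: at 2.5K a page, a disc holds on the order of two hundred thousand pages, or roughly ten thousand average-length documents.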
 
The nice thing would be that, since it's on CD-ROM, you can view it all
you want; if you want a printed copy, you just copy it to the printer.  It
is essentially as permanent as printed copies.  If documents are indexed
as they come in, the indexes are available immediately and require no
extra effort, especially since most authors supply keyword indexes with
their documents.
 
> When I publish an article, I receive a certain number of reprints
> which I am free to distribute to interested scholars.  If that
> supply  is exhausted I routinely photocopy more and continue to
> distribute  them.  This practice is surely widespread and, I
> believe, perfectly  consistent with copyright transfer agreements
> of most journals.
 
Did you ever check the agreements?  The "copies as payment for the
article" arrangement is standard, because the idea is that, since you have
assigned the copyright to them, once your supply is exhausted you are
supposed to _purchase_ more copies from the copyright holder and _pay_
them for the copies that you _give_ away.  You no longer own the work; you
are supposed to pay the copyright holder for use of his work, e.g. by
purchasing reproductions from the owner, which is *not* you, but the
publisher!
 
> It would be both easier and cheaper for me to respond to these
> reprint requests electronically.  I would be disinclined to
> publish in a journal which did not permit me to do so.  And if
> requests for electronic reprints become routine, I will
> certainly want to automate the process of sending them to such
> an extent that no intervention on my part is required.
 
This is another area in which the IETF's Internet Drafts and the RFC
publication service really shine.  They have set up an FTP site for
anonymous FTP requests, but they also have a daemon (a computer program
that reads a mailbox) running which, when you send it a command, returns a
file in response to that command.  So you can send e-mail, or you can FTP
the files.  If the files are large, the daemon breaks them into small
numbered pieces to fit through various gateways.
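The splitting step is simple enough to sketch.  The 60 KB chunk size and the "part N of M" labels below are illustrative assumptions of mine, not the actual RFC mail server's behavior:

```python
# Minimal sketch of how a mail-based file server might split a large
# document into numbered pieces small enough to pass through mail
# gateways.  The 60 KB chunk size and "part N of M" labels are
# illustrative assumptions, not the real RFC server's format.

def split_for_mail(text, chunk_size=60_000):
    """Return a list of (label, piece) tuples covering `text` in order."""
    pieces = [text[i:i + chunk_size]
              for i in range(0, len(text), chunk_size)]
    total = len(pieces)
    return [(f"part {n} of {total}", piece)
            for n, piece in enumerate(pieces, start=1)]

# The requester can then reassemble the numbered pieces in order:
document = "x" * 150_000                 # a 150 KB "document"
pieces = split_for_mail(document)
print([label for label, _ in pieces])    # three numbered pieces
reassembled = "".join(p for _, p in pieces)
assert reassembled == document
```

Because each piece carries its number and the total count, a recipient can tell when a piece has been lost in transit and request just that one again.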
 
> A great deal depends on the reaction of librarians to these
> innovative publications.  A serious journal needs to be
> archived by libraries or consortia of libraries.  Librarians
> have two strong incentives to support this new model of
> publication.  First, it preserves their traditional role of
> archiver and cataloger -- a role which at least some
> publishers seem to covet.  Second, library budgets can surely
> better support the cost of archiving than the ``prohibitive
> prices'' that Patricia Battin complains about.
 
Okay, here's an idea.  Set up a monthly distribution service.  Let's
assume that, with comments and other materials, an individual journal
might use 2,000 pages a month.  That means we can put 100 serials on a
single CD-ROM each month.  Charge the libraries $15 a month for the disk,
which then gives them access to 100 or more serials on a monthly basis for
less than the cost of the electric bill of the computer that reads the
CD-ROM.
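The 100-serials figure checks out.  A short sketch, again assuming ~500 MB of usable space per disc (my figure, not from the discussion above):

```python
# Checking the monthly-distribution arithmetic: at ~2.5 KB per page
# and 2,000 pages per serial per month, how many serials fit on one
# disc?  The ~500 MB usable capacity is an assumed figure.

KB_PER_PAGE = 2.5
PAGES_PER_SERIAL = 2_000
CD_CAPACITY_KB = 500 * 1024

kb_per_serial = KB_PER_PAGE * PAGES_PER_SERIAL   # 5,000 KB, ~5 MB
serials_per_cd = CD_CAPACITY_KB // kb_per_serial
print(f"Serials per monthly disc: {serials_per_cd:.0f}")  # ~102
```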
 
Or go one better.  A place would pay $25 a month per terminal for access
to the preprints/publications archive site; for this, it gets on-line
access to the FTP archives, immediate access to the publications, notice
of new publications, and, at the end of the month, everything entered or
published that month on a CD-ROM, one copy per subscription.  By making it
cheap enough, many places can afford to telnet or FTP in and pick up the
papers.  By charging a fee, they get the advantage of having an
inexpensive permanent copy of their own, so that old documents can be
removed from the archives, say, a year after publication or three months
after abandonment.
 
It might also be possible to offer "anonymous" access for those who just
want to pick up one or two items, by allowing access to documents over a
month old or so.  This lets those who do not want, or cannot afford, to
pay for immediate access obtain the files after a period of time; or the
author can e-mail them a copy.
 
In fact, except for the CD-ROM capability, the whole thing could be done
at essentially no cost by any institution with a free computer on the
Internet and a large hard drive.
 
The idea here is to set prices very low - $25 a month is probably the cost
of one printed journal - yet make the whole thing self-funding so that it
can continue to operate without having to depend on scrounged supplies or
borrowed equipment.  Yet, if there are just a few thousand subscribers, it
will pay for all costs involved.
 
The massive UUNET communications service started because Rick Adams wanted
to develop an inexpensive means to provide news and mail to people.  UUNET
does nothing else on its computers except provide Internet connectivity.
 
I think this would work.  Maybe I should look at financing this...
 
---
Note: All mail is read/responded every day.  If a message is sent to this
account, and you expect a reply, if one is not received within 24 hours,
resend your message; some systems do not send mail to MCI Mail correctly.
 
Paul Robinson - TDARCOS@MCIMAIL.COM
Voted "Largest Polluter of the (IETF) list" by Randy Bush <randy@psg.com>
-----
The following Automatic Fortune Cookie was selected only for this message:
 
                Another Glitch in the Call
                ------- ------ -- --- ----
        (Sung to the tune of a recent Pink Floyd song.)
 
We don't need no indirection
We don't need no flow control
No data typing or declarations
Did you leave the lists alone?
 
        Hey!  Hacker!  Leave those lists alone!
 
Chorus:
        All in all, it's just a pure-LISP function call.
        All in all, it's just a pure-LISP function call.
=========================================================================
Date:         Tue, 9 Nov 1993 09:41:50 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Stevan Harnad <harnad@princeton.edu>
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
 
                IMPLEMENTING PEER REVIEW ON THE NET:
        SCIENTIFIC QUALITY CONTROL IN SCHOLARLY ELECTRONIC JOURNALS
 
                         Stevan Harnad
   Cognitive Science Laboratory      Laboratoire Cognition et Mouvement
   Princeton University              URA CNRS 1166 I.B.H.O.P.
   221 Nassau Street                 Universite d'Aix Marseille II
   Princeton NJ 08544-2093           13388 Marseille cedex 13, France
   harnad@princeton.edu              harnad@riluminy.univ-mrs.fr
 
In Press: International Conference on Refereed Electronic Journals:
Towards a Consortium for Networked Publications. Implementing Peer
Review on the Net: Scientific Quality Control in Scholarly Electronic
Journals. University of Manitoba, Winnipeg 1-2 October 1993 (in press)
Retrievable by anonymous ftp: ftp.cc.umanitoba.ca  Directory: e-journals
 
    ABSTRACT: Electronic networks have made it possible for scholarly
    periodical publishing to shift from a trade model, in which the
    author sells his words through the mediation of the expensive and
    inefficient technology of paper, to a collaborative model, in which
    the much lower real costs and much broader reach of purely
    electronic publication are subsidized in advance, by universities,
    libraries, and the scholarly societies in each specialty. To take
    advantage of this, paper publishing's traditional quality control
    mechanism, peer review, will have to be implemented on the Net,
    thereby recreating the hierarchies of journals that allow authors,
    readers, and promotion committees to calibrate their judgments
    rationally -- or as rationally as traditional peer review ever
    allowed them to do it. The Net also offers the possibility of
    implementing peer review more efficiently and equitably, and of
    supplementing it with what is the Net's real revolutionary
    dimension: interactive publication in the form of open peer
    commentary on published work. Most of this "scholarly skywriting"
    likewise needs to be constrained by peer review, but there is room
    on the Net for unrefereed discussion too, both in high-level peer
    discussion forums to which only qualified specialists in a given
    field have read/write access and in the general electronic vanity
    press.
 
    KEYWORDS: citation, economics of publication, editing, peer
    commentary, peer review, publication lag, quality control, referee
    selection, scholarly publication, skywriting, trade publication.
 
There is no special problem of scientific quality control that is
peculiar to the electronic medium. Scholars criticize and evaluate the
work of their peers before it appears formally in print. The system is
called "peer review." Like democracy, it has imperfections, but it has
no viable alternative, whether on paper or on the electronic airwaves
(Harnad 1982, 1986).
 
        TRADE PUBLISHING VS. SCHOLARLY PUBLISHING
 
Later in this paper I will describe how peer review has been
implemented in the refereed electronic journal of which I am the
Editor, PSYCOLOQUY, but first I would like to induce some perspectival
perestroika in my readers: I was once at a reception with a
vice-president of NBC, so I took the opportunity you would all no doubt
have liked to take in my place, to chastise him roundly for the low
quality of his network's programs. He smiled and asked why I thought he
was to blame for that. After all, what did I think the "product" of NBC
TV was? I replied that it was TV programs, of course. He shook his head
and informed me that it was nothing of the sort: "NBC's product is
eyeballs, yours and mine, and we sell them to our advertisers. We're
perfectly content to put on the screen whatever it is that will make
your eyeballs adhere to it. So you get exactly what you pay for -- with
your eyeballs."
 
Well, I don't know if this revelation turned things as much on end for
you as it did for me, but I'd like to induce an equally radical shake
up in your conception of the real "product" and reward structure of
scholarly and scientific publication. First, a disclaimer. What I am
about to say does NOT apply to trade publication. Trade publication
really works the way I had thought TV did: The product is the author's
words. These are sold as print on paper. In order to reach your
eyeballs (and pocketbook), the author relies on the mediation of the
publisher and his technology for setting word to page and then selling
page to reader. The alliance between author and publisher is necessary
and mutually beneficial; even in today's era of desk-top publishing,
most authors prefer not to be masters of all trades in their finite
lifetimes; they leave editing, copy-editing, type-setting,
proof-reading, printing, marketing and distribution to the experts.
 
In this symbiotic relationship between trade author and trade
publisher, it is quite natural that the author should transfer the
copyright for his words to the publisher, for COPYING is precisely what
the author wishes the publisher to do, as many times as possible, to
reach all those eyeballs with pocketbooks. And author and publisher
share exactly the same interest in protecting that copyright from
theft. No one else should be able to sell that author's words as his
own, and no one should have access to them without paying for it. These
shared interests are clear. If the author were not prepared to transfer
copyright to the publisher, the publisher would be investing his time,
technology and expertise in a product that was available by alternative
means (courtesy of someone else's time, technology and expertise,
presumably), and it would be quite reasonable for a publisher to
decline to invest in a product without the protection provided by
exclusivity (there are exceptions to this in the case of different
editions, countries or languages, but such details are not really
relevant here). For paper publication DOES require a large investment,
even in today's era of desktop publishing. The technology of print is
not a trivial one, nor is it cheap.
 
So although I am sure that trade authors would be happy if there were a
way to sell their words to readers without the mediation of an
expensive technology (one that, among other things, raises the price of
their words relative to what they would have cost if they could be
marketed directly, like paintings, perhaps, or performances), the
sales-deterrent effect of the add-on cost of the print technology is a
small price to pay for the advantages of mass production.
 
It would accordingly NOT be music to a struggling author's ears to hear
that his limited print-run work, which was not selling all that well,
was widely available and read in a contraband xerox edition for free
(or rather, for the cost of polycopying). Hearing this, the author
would be as indignant as his publisher, and both would try to take
whatever steps were possible to protect their product from theft, and
themselves from loss of rightful revenue.
 
Now comes the perestroika. Suppose this author were not a writer by
trade, but a scholar or scientist, someone whose work is likely to be
read by at most a few hundred peers (and usually much fewer) in an
entire lifetime. Suppose such a one heard about a similar contraband
xerox trade being done in his words and work: What would HIS reaction
be? The first response would surely be one of delight (as long as the
work was not being attributed to someone else): Why are we scholars if
it is not to make a contribution to knowledge and inquiry? And surely
that contribution is made only if we are read by and influence the work
of our fellow scholars, present and future. The scholar/scientist, in
other words, wants to reach his peers' eyeballs so as to influence the
contents of their minds; his interest is not in the contents of their
pocketbooks.
 
Upon reflection, however, our scholar/scientist would have to wince and
duly report the infraction to his publisher and allow him to take steps
to put an end to the illicit trade in his words, because if such
infractions were allowed to proliferate unchecked, the very vehicle
that carries his words to his peers would be at risk. If contraband
trade undermines publishers' investment and revenues, then there will
be no primary publication to base the contraband on in the first
place.
 
So this potential conflict of interest between scholar and publisher
(the former wanting to maximize eyeballs and minimize any barriers --
such as price-tags -- between them and his work, the latter wanting to
recover real costs and make a fair return on his investment, and for
expert services rendered) is resolved along the same lines as in the
trade publishing model: A financial barrier is erected between word and
reader, with the reluctant acquiescence of the author and the justified
insistence of the publisher.
 
This was true in the Gutenberg age, in which the only means of reaching
eyeballs was through the mediation of the expensive (and slow and
inefficient and  unecological) technology of paper: That was the only
vehicle in town.
 
Today, this is no longer true, and although the scholarly community is
slow to come to a realization of it, the implications of the
PostGutenberg technology of electronic networked communication are
truly revolutionary (Harnad 1991). I will illustrate with one seemingly
innocent new feature, "anonymous ftp," and with it I will recreate the
contraband scenario above, but with a rather different outcome:
 
        THE SUBVERSIVE POTENTIAL OF ANONYMOUS FTP
 
We again have our fabled scholar/scientist, whose motivation in
publishing is to reach the eyes and minds of the highest possible
proportion of his relatively small community of peers, as quickly and
with as few impediments as possible during his fleeting lifetime, while
he is still compos mentis and productive and his work still able to
benefit from the interaction. He is accustomed to the fact that his
article appears in a journal that is subscribed to by about 1000
libraries and individuals world-wide, that he receives (usually) a
dozen to (rarely) a few hundred reprint requests after his article
appears (generally the week or two after it appears in Current Contents
or some other bibliographic alerting service). Before publication, an
unrefereed draft of his manuscript might also have been circulated to a
variable number of individuals as a preprint, and an updated final
draft may continue to be disseminated in that form after publication
(as long as his publisher is willing to turn a blind eye to this
limited form of auto-contraband). Then there are the citations (ranging
from none, to the median half dozen, to the occasional hundreds -- or
even higher in those rarities qualifying as Current Contents "Citation
Classics"). This is the small, esoteric world of scholarly/scientific
communication among the peers of the realm. (There sometimes appears a
second and larger incarnation of the rare paper that is reprinted for
educational purposes.)
 
Enter anonymous ftp ("file transfer protocol" -- a means of retrieving
electronic files interactively): The paper chase proceeds at its usual
tempo while an alternative means of distributing first preprints and
then reprints is implemented electronically: An electronic draft is
stored in a "public" electronic archive at the author's institution
from which anyone in the world can retrieve it at any time. No more
tedious scanning of Current Contents and mailing of reprint request
cards by would-be readers, and then costly and time-consuming (but
willing) mailing of reprints by authors who would be read: The reader
can now retrieve the paper for himself, instantly, and without ever
needing to bother the author, from anywhere in the world where the
Internet stretches -- which is to say, in principle, from any
institution of research or higher learning where a fellow-scholar is
likely to be.
 
Splendid, n'est-ce pas? The author-scholar's yearning is fulfilled:
open access to his work for the world peer community. The
reader-scholar's needs and hopes are well served: free access to the
world scholarly literature (or as free as a login on the Internet is to
an institutionally affiliated academic or researcher). And the
publisher? Well there's the rub. For, unlike the xerox contraband
economy, which has not had its predicted disastrous effects on
scholarly publication (apart from whatever role it might have played in
raising the prices of scholarly journals), the ftp-contraband economy
is a tail that can quickly outgrow the dog by orders of magnitude. Will
there be any buyers left to pay the real costs of publication?
 
That all depends on what we mean by publication, and what the real
costs of THAT will turn out to be. Paper publishers currently estimate
that electronic publication would cost only 20-30% less than paper
publication (e.g., Garson, this volume).
If that is true, then a loss of the revenues needed to
cover the remaining 70-80% because of ftp contraband could well do
in the entire scholarly publishing enterprise even if it went totally
electronic. In which case there would be nothing in those anonymous ftp
archives to retrieve. Is this possible?
 
I think not. Not only do I think that the true cost of purely
electronic publishing would be more like the reciprocal of the paper
publishers' estimates (which are based largely on how much electronic
processing saves in PAPER publication), i.e., SAVINGS of 70-80%, but I
also think this will put us over the threshold for an entirely
different model of how to recover those costs and create a viable
purely electronic scholarly publication system. That would be a
scholarly subsidy model, whereby universities (especially their presses
and libraries) and scholars' own learned societies support electronic
publications, in place of a trade revenue model. Such a system would
reflect more accurately the true motivational structure of scholarly
publishing, in which, unlike in trade publishing, authors are willing
to PAY to reach their colleagues' eyeballs, rather than the reverse:
In physics and mathematics, page charges to the author's institution to
offset part of the cost of publication are already a common practice in
PAPER publication today. In electronic publication, where these charges
would already be so much lower, they seem to be the most natural way to
offset ALL of the true expenses of publication that remain. That,
however, is not the subject of my paper, so I mention it only in
passing. One thing of which I feel confident, however, is that, in line
with the real motivation of scholarly publishing, scholars and
scientists will NOT accept to have anonymous ftp access blocked by
paper publishers invoking copyright. Either a collaborative solution
will be reached, with paper publishers retooling themselves to perform
those of their services that will still be required in purely
electronic publishing, or scholars will simply bolt, and create their
own purely electronic publishing systems.
 
What would they need to do to accomplish this? There will always be a
need for expertise in editing, copy-editing, page-composition, graphics
and lay-out, and proof-reading. But, most important of all, there has
to be a mechanism of quality control -- quality of content, rather than
just quality of form, which is what the former are all concerned with.
Publishers formerly furnished this quality control -- or did they? In
the case of book publishing, house editors are often scholars who make
judgments and recommendations about the content of their authors'
manuscripts, though usually in conjunction with other scholars,
unaffiliated with the publisher, whom they ask to serve as reviewers,
to provide criticism and make recommendations about acceptability and
revision, if necessary. In scholarly periodical publishing, the journal
editor is usually not directly affiliated with the publisher, and the
referees he consults certainly are not. They are us: the author's
community of peers. This is why this quality control system is called
peer review.
 
So the ones who monitored and guided the quality of the content of
scholarly publishing were always the members of the scholarly community
itself. Nor were they PAID for their efforts (in periodical
publication, which accounts for the lion's share of the scientific
literature and a good portion of the rest of scholarly publishing too;
in book publishing they were paid a pittance, but hardly enough to make
it worth their while if they were not already resigned to rendering
this scholarly community service in any case). So the COST of scholarly
quality control certainly cannot be written off as part of the cost of
paper publishing: Scholars have been subsidizing this with their free
efforts all along. For purely electronic publication, they would simply
have to be investing these efforts in another medium.
 
        THE ANARCHIC INITIAL CONDITIONS ON THE NET
 
And it is indeed another medium, one of which most serious scholars
today are still quite wary. Why? I think it is because of the peculiar
initial conditions of the making of the new medium, its initial
demography, and the style that has become associated with it. Erroneous
conclusions about the medium itself have been drawn from these its
first messages.
 
The Net was created, and is continuing to evolve, as the result of a
collective, anarchic process among computer programmers ("hackers") and
professional, student, and amateur users -- a networked effort, so to
speak. Hence it was perfectly natural to imagine that this creative and
enterprising anarchic spirit, which has proven so effective in forging
these remarkable new tools, should also be the means of deploying them.
Indeed, the rapid proliferation of bulletin boards, discussion groups,
alerting services and preprint archives, complemented now by simple and
powerful search and retrieval tools, all pointed in the direction of a
new "ultrademocratic" approach to information production and
distribution in this new medium.
 
Problems immediately manifested themselves, however, in this
informational Utopia: Discussions would wax verbose and sometimes
abusive; misinformation was difficult to distinguish from information;
an ethos of egalitarian dilettantism prevailed; and, worst of all,
serious scholars and scientists distanced themselves or kept their
distance from the Net, concluding, understandably, that it was much too
chaotic and undiscriminating a medium to be entrusted with the
communication and preservation of their substantive ideas and
findings.
 
And so things stand today. There are a few brave new electronic
journals, but the medium is still widely perceived as unfit for serious
scholarship, more like a global graffiti board for trivial pursuit.
Yet the remedy is obvious and simple; and, as I have suggested, it is
not, nor has it ever been, medium-dependent: The filtering of scholarly
and scientific work by some form of quality control has been implicit
in paper publication from the outset, yet it is not, and never has been,
in any way peculiar to paper.
 
The scholarly communicative potential of electronic networks is
revolutionary. There is only one sector in which the Net will have to
be traditional, and that is in the validation of scholarly ideas and
findings by peer review. Refereeing can be implemented much more
rapidly, equitably and efficiently on the Net, but it cannot be
dispensed with, as many naive enthusiasts (who equate it with
"censorship") seem to think.
 
        IMPOSING ORDER THROUGH PEER REVIEW
 
I will now describe how peer review is implemented by PSYCOLOQUY, an
international, interdisciplinary electronic journal of open peer
commentary in the biobehavioral and cognitive sciences, supported on an
experimental basis by the American Psychological Association.
PSYCOLOQUY is attempting to provide a model for electronic scholarly
periodicals. All contributions are refereed; the journal has an
editorial board and draws upon experts in the pertinent subspecialties
(psychology, neuroscience, behavioral biology, cognitive science,
philosophy, linguistics, and computer science) the world over (Harnad
1990; Garfield 1991; Katz 1991).
 
In addition to refereed "target articles," PSYCOLOQUY publishes
refereed peer commentary on those articles, as well as authors'
responses to those commentaries. This form of interactive publication
("scholarly skywriting") represents the revolutionary dimension of the
Net in scholarly communication (Harnad 1992), but it too must be
implemented under the constraint of peer review.
 
The objective of those of us who have glimpsed this medium's true
potential is to establish on the Net an electronic counterpart of the
"prestige" hierarchy among learned paper journals in each discipline.
Only then will serious scholars and scientists be ready to entrust
their work to them, academic institutions ready to accord that work due
credit, and readers able to find their way to it amidst the anarchic
background noise.
 
How is peer review normally implemented, in conventional paper
journals? The journal has an Editor and an Editorial Board. With some
journals it is the Editor in Chief, with others it is the Editor in
consultation with the Board, or with Action Editors, who selects the
referees, usually one or two per manuscript, a third or more consulted
if a deadlock needs to be broken. The referees advise the Editor(s) by
submitting reports (sometimes anonymous, sometimes not) evaluating the
manuscript and making recommendations about acceptance/rejection and
revision. The reports are advisory rather than binding on the Editor,
who makes the actual decision, but a good Editor chooses his referees
well and then for the most part trusts them; besides, it is only the
very narrow specialty journal whose Editor has the expertise to judge
all submissions on his own. The idea of peer review is also to free
publication from the domination of any particular individual's
preferences, making it answerable to the peer community as a whole --
within the discipline or specialty. (Interdisciplinary journals always
have added problems in achieving peer consensus, and indeed, even with
specialty journals, referee disagreement rates suggest that consensus
is more than one can expect from peer review; nor is it clear that it
would be desirable; Harnad 1985).
 
In the social sciences and humanities, journals pride themselves (and
rank their quality) on the magnitude of their rejection rates. Eighty
to ninety percent of submissions rejected is not unusual for the most
prestigious journals in these fields. Prestige in the physical sciences
and mathematics is not associated with such high rejection rates;
indeed, they tend to be the reciprocal of social science rates, and
biological, medical and engineering periodicals' rates fall somewhere
in between (Hargens 1990). In all these fields, however, irrespective
of the prevailing rejection rates, there is a prestige hierarchy among
journals, with some known to accept only the best work in the field,
and some not much more selective than the unrefereed vanity press that
exists at the bottom of each field's hierarchy. It is thought that the
lower rejection rates in physics may occur because in this field
authors exercise more self-selection in choosing which journal to
submit their work to, saving only the best for the elite journals. It
is also true that in all fields virtually everything that is written
gets published somewhere; in the social sciences a manuscript may be
submitted to a succession of lower and lower standard journals until it
finds its niche; in physics authors may head more directly for
something within their reach the first time round.
 
Another pertinent feature of this hierarchical system of quality
control is that most published work is rarely if ever cited. Only a
small percentage of what is published is ever heard of again in the
literature. This may be because too much is being published, but it may
also reflect the inevitable wheat-to-chaff ratio in all human
endeavor. As a consequence, a scholar is protected on both sides: There
is not much risk that a truly valuable piece of work will fail to be
published, though it may not make it to its rightful level in the
hierarchy, at least not right away. (Peer review is far from
infallible.) On the other hand, it is also safe for a scholar, in this
monumental information glut, to let the quality control mechanism
calibrate his reading, saving it for only the best journals. Again,
there is some risk of missing a gem that has inadvertently been set too
low, but, given the prevailing odds, that risk is itself low.
 
I have not described a perfect or ideal system here; only the reality
of peer review and the reasonably reliable rank-ordering it imposes on
scholarly output. It should be apparent that there is nothing about
this system that could not be implemented electronically, indeed, there
are several ways in which electronic peer review can be made more
efficient, fairer, and perhaps even more valid in the electronic
medium. The "point faible" of the peer review system is not so much the
referee and his human judgment (though that certainly is one of its
weaknesses); it is the SELECTION of the referee, a function performed
by the Editor. Hence it is really the Editor who is the weak link if he
is selecting referees unwisely (or, worse, not heeding their counsel
when it is wise). Editors usually have "stables" of referees (an apt if
unflattering term describing the workhorse duties this population
performs gratis for the sake of the system as a whole) for each
specialty; in active areas, however, these populations may be saturated
-- a given workhorse may be in the service of numerous stables. So one
must turn to less expert or less experienced referees. In practice, the
problem is less the saturation of the true population of potentially
qualified referees than the saturation of that portion of it that an
Editor KNOWS of and is in the repeated habit of consulting.
 
One of the results of this overuse of the workhorses is that the entire
refereeing process is a very sluggish one. One does one's duty, but one
does it reluctantly, other duties take priority, manuscripts sit unread
for unconscionably long times, referees are delinquent in meeting the
deadlines they have agreed to, and sometimes, out of guilt, hasty
last-minute reports are composed that do not reflect a careful,
conscientious evaluation of the manuscript. There is much muttering
about publication delay, a real enough problem, especially in paper
publication, but peer review is often responsible for as much of
the delay as the paper publication and distribution process itself.
 
Now, as I said, there are no ESSENTIAL differences between paper and
electronic media with respect to peer review. And the Net is populated
by frail human beings, just as the paper world is. But the Net does
offer the possibility of distributing the burdens of peer review more
equitably, selecting referees on a broader and more systematic basis
(electronic surveys of the literature, citation analysis, even posting
Calls for Reviewers to pertinent professional experts' bulletin boards
and allowing those who happen to have the time to volunteer
themselves). The speed with which a manuscript can be sent
electronically is also an advantage, as is the convenience that many
are discovering in reading and commenting on manuscripts exclusively
on-screen. All in all, implementing the traditional peer review system
purely electronically is not only eminently possible, but it is likely
to turn out to be optimal, with even paper journal editors preferring
to conduct refereeing in the electronic medium (I am certainly doing
this more and more with the paper journal I edit).
 
Once peer review is in place on the Net, once the quality hierarchy has
been established, serious scholars will no longer have reason to
hesitate to entrust their best work to the electronic-only medium. Yet
my prediction is that this state of affairs will NOT prove to be the
critical factor in drawing the scholarly community onto the Net with
their serious work. Much has been said about what the critical "value
added" feature of the Net will be that succeeds in winning everyone
over. We have spoken of decreased costs, but I think that even my
estimate that the true expenses of electronic publication will be only
20-30% of paper publication will not be what does the trick. Decreased
publication lags and more equitable refereeing on the Net will also be
welcome but still not, I think, the decisive factors. Not even the
global access to eyeballs unrestrained by the barriers of subscription
cost, xeroxing, mailing or postage, nor the possibility of a
(virtually) free world electronic periodical library sitting on every
scholar's desk thanks to network links, nor the powerful electronic
search and retrieval tools (built on anonymous ftp, archie, wais,
gopher, veronica, and their progeny) that will be within everyone's
reach -- none of these, remarkable as they are, will be the critical
value-added feature that tilts the papyrocentric status quo
irreversibly toward the electronic airways.
 
        INTERACTIVE PUBLICATION: "SCHOLARLY SKYWRITING"
 
The critical factor will be a spin-off of that very anarchy that I said
had given the new medium such a bad image in the eyes of serious
scholars, what had made it look as if it were just a global graffiti
board for trivial pursuit: For once it is safely constrained by peer
review, this anarchy will turn into a radically new form of INTERACTIVE
PUBLICATION that I have dubbed "Scholarly Skywriting," and this is what
I predict will prove to be the invaluable new communicative possibility
the Net offers to scholars, the one that paper could never hope to
implement.
 
I think I may be peculiarly well placed to make this prognostication.
For over fifteen years I have edited a paper journal specializing in
"Open Peer Commentary": BEHAVIORAL AND BRAIN SCIENCES (BBS, published
by Cambridge University Press) accepts only articles that report
especially significant and controversial work. Once refereed and
accepted, these "target" articles are circulated (formerly only as
paper preprints, but these days in electronic form as well) to as many
as 100 potential commentators across specialties and around the world,
who are invited to submit critical commentary, to which the author will
respond (Harnad 1979, 1984b). Among the criteria referees are asked to
use in reviewing manuscripts submitted to BBS is whether open peer
discussion and response on that paper would be useful to the scholars
in the fields involved (and it must impinge on at least three
specialties). Each target article is then copublished with the 20 - 30
(accepted) peer commentaries it elicits, plus the author's Response to
the commentaries. These BBS "treatments" have apparently been found
useful by the biobehavioral and cognitive science community, because
already in its 6th year BBS had the 3rd highest "impact" factor
(citation ratio; adjusted:  see Drake 1986; Harnad 1984a) among the
1200 journals indexed in the Social Science Citation Index.
BBS's pages are in such demand by readers and authors alike that it has
(based on an informal survey of authors) one of the highest reprint
request rates among scholarly periodicals and, of course, the
characteristically high rejection rate for submissions -- attesting as
much to the fact that there is more demand for Open Peer Commentary
than BBS can fill as to the fact that BBS's quality control standards
are high.
 
Yet BBS has some inescapable limitations, because its tempo is far too
slow. Peer review (using 5-8 referees from 3 or more specialties) is,
as usual, a retardant, but even if one starts the clock at the moment a
target article is accepted, and even if one allows for the fact that
preprints are in the hands of one hundred peers within two weeks from
that moment, their commentaries received six weeks after that, the
author's response four weeks after that, and then the entire treatment
appears in print 4-6 months later, these turnaround times, though
perhaps respectable compared to conventional forms of paper
publication, are in fact hopelessly slow when compared to the potential
SPEED OF THOUGHT.
 
I have discussed the chronobiology of human communication in more
detail elsewhere (Harnad et al. 1976; Harnad 1991). Suffice it to say
here that the tempo of a spoken conversation is in the same
neighborhood as the speed of thought; weeks, months, or years of lag
between messages are not. Whatever ideas could have been generated by
minds interacting at biological tempos are forever lost at
paper-production tempos. Scholarly Skywriting promises life for more of
those potential brainchildren, those ideas born out of scholarly
intercourse at skyborne speeds, progeny that would be doomed to
stillbirth at the earthbound speeds of paper communication.
 
I hasten to add -- so as to dispel misunderstandings that have already
been voiced in the literature (e.g., Garfield 1991) -- that I am not
advocating oral speeds for all scientific "publication." First of all,
the time to pass through the filter of peer review already puts some
brakes on the speed of interaction. Second, even unmoderated electronic
mail correspondence is not as fast as a conversation (nor would it be
comfortable if it were -- as anyone who has engaged in real-time
e-writing "conversations" can attest). Nor is the goal the
undisciplined babbling that we all recognize from "live" symposium
transcripts. The goal is something in between: Much faster than
paper-mediated interaction, but not as fast or unconstrained as oral
dialogue. Moreover, the virtue of "Scholarly Skywriting" is as an
available OPTION. Just as not every article is suitable for BBS, not
every idea or finding is a candidate for interactive publication. But
at last the option is there.
 
And once you have tasted it (as I have -- e.g., see Hayes et al. 1992),
I think you too will be convinced that it adds a revolutionary new
dimension to scholarly publication and, even more important, will, I
predict, increase individual scholars' productivity by an order of
magnitude (all those stillborn ideas that now have a lease on life!).
 
        CREATIVE CHAOS
 
Let me close by returning to the question of quality control. I have
argued that peer review can be and should be implemented on the Net,
and hierarchically, much as it was in paper, generating a pyramid of
periodicals, with the highest quality ones at the top and the
unrefereed vanity press at the bottom. This, I have suggested, should
allay the apprehensions of scholars who had wrongly inferred that the
Net was intrinsically anarchic. But now let me say a few words in
praise of the chaotic regions of such a partially constrained system:
Sometimes the brakes applied by referees are "unbiological" too: If all
of our ideas and findings had to pass through narrow peer scrutiny
before they could elicit wider peer feedback, perhaps some of them
would remain stillborn. Within the many possible structures
and nonstructures one can implement on a Net, unrefereed discussion,
perhaps among a closed group of specialists with read/write privileges
(while others have read-only privileges) would be a useful complement
to conventional peer review or even to electronic adaptations of
BBS-style editor-filtered peer commentary in the form of
editor-filtered "skywriting" of the kind that BBS's electronic
counterpart, PSYCOLOQUY, specializes in.
 
Peer commentary, after all, whether refereed or not, is itself a form
of peer review, and hence of quality control (Mahoney 1985). Let us be
imaginative in exploring the remarkable possibilities of this brave new
medium. My argument here has been on behalf of conventional peer review
as the principal means of controlling quality, whether on paper or on
the Net, and whether for target articles or commentaries. But once such
rigorous, conventional constraints are in place, there is still plenty
of room on the Net for exploring freer possibilities, and the
collective, interactive ones are especially exciting.
 
        REFERENCES
 
Drake, R.A. (1986) Citations to articles and commentaries: A
reassessment. American Psychologist 41: 324-325.
 
Garfield, E. (1991) Electronic journals and skywriting: A complementary
medium for scientific communication? Current Contents 45: 9-11,
November 11, 1991.
 
Hargens, L.L. (1990) Variation in journal peer review systems: Possible
causes and consequences. Journal of the American Medical Association
263: 1348-1352.
 
Harnad, S. (1979) Creative disagreement. The Sciences 19: 18-20.
 
Harnad, S. (ed.) (1982) Peer commentary on peer review: A case study in
scientific quality control. New York: Cambridge University Press.
 
Harnad, S. (1984a) Commentary on Garfield: Anthropology journals: What
they cite and what cites them. Current Anthropology 25: 521-522.
 
Harnad, S. (1984b) Commentaries, opinions and the growth of scientific
knowledge. American Psychologist 39: 1497-1498.
 
Harnad, S. (1985) Rational disagreement in peer review. Science,
Technology and Human Values 10: 55-62.
 
Harnad, S. (1986) Policing the Paper Chase. (Review of S. Lock, A
difficult balance: Peer review in biomedical publication.)
Nature 322: 24-25.
 
Harnad, S. (1990) Scholarly Skywriting and the Prepublication Continuum
of Scientific Inquiry. Psychological Science 1: 342-343 (reprinted in
Current Contents 45: 9-13, November 11, 1991).
 
Harnad, S. (1991) Post-Gutenberg Galaxy: The Fourth Revolution in the
Means of Production of Knowledge. Public-Access Computer Systems Review
2 (1): 39-53 (also reprinted in PACS Annual Review Volume 2,
1992; in R. D. Mason (ed.) Computer Conferencing: The Last Word, Beach
Holme Publishers, 1992; and in M. Strangelove & D. Kovacs, Directory of
Electronic Journals, Newsletters, and Academic Discussion Lists (A.
Okerson, ed.), 2nd edition. Washington, DC: Association of Research
Libraries, Office of Scientific & Academic Publishing, 1992).
 
Harnad, S. (1992) Interactive Publication: Extending the
American Physical Society's Discipline-Specific Model for Electronic
Publishing. Serials Review, Special Issue on Economics Models for
Electronic Publishing, pp. 58-61.
 
Harnad, S., Steklis, H. D. & Lancaster, J. B. (eds.) (1976) Origins and
Evolution of Language and Speech. Annals of the New York Academy of
Sciences 280.
 
Hayes, P., Harnad, S., Perlis, D. & Block, N. (1992) Virtual Symposium
on Virtual Mind. Minds and Machines 2: 217-238.
 
Katz, W. (1991) The ten best magazines of 1990.
Library Journal 116: 48-51.
 
Mahoney, M.J. (1985) Open Exchange and Epistemic Progress.
American Psychologist 40: 29-39.
=========================================================================
Date:         Tue, 9 Nov 1993 09:43:08 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         bob jansen <jansen@syd.dit.csiro.au>
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
 
>From: Paul Robinson <tdarcos@mcimail.com>
>> Patricia Battin, then University Librarian and Vice President
>> for Information Systems at Columbia, urged that universities
>> take a much greater role in the publishing enterprise [1].
>>
>> ``The advent of electronic capabilities provides the university
>>   with the potential for becoming the primary publisher in the
>>   scholarly communication process.  At the present time, we are
>>   in the untenable position of generating knowledge, giving it
>>   away to the commercial publisher, and then buying it back for
>>   our scholars at increasingly prohibitive prices.  The electronic
>>   revolution provides the potential for developing university
>>   controlled publishing enterprises through scholarly networks
>>   supported either by individual institutions or consortia.''
>
>This, I think, hits it right on the head, and solves a thing that stuck in
>my craw.  The Journal "Software, Practice and Experience", which might be
>dubbed "The New England Journal of Medicine" for Computer Software issues,
>has the interesting idea of (1) charging a nice chunk of change for its
>publication (2) requiring those who place articles in the publication to
>assign copyright to the publisher and (3) requiring the author to get
>permission to use his own work elsewhere!
 
I thought this was much the same with all journals. Electronic publishing,
or at least electronic access to information in the public domain, has the
potential to change all of this, but which way??
>> I don't know what Okerson has in mind as a revolutionary possibility,
>> but I think I know what commercial publishers mean by uncoupling
>> ownership from access...If I subscribe to a... journal...I have
>> acquired...a limited right to use...my limited rights to
>> the contents last as long as the paper on which they are printed.
>> Unfortunately, what publishers seem to have in mind when they speak
>> of access without ownership is a model in which a one year
>> subscription entitles the subscriber to one year of access.  When
>> the subscription ends, so does the access.
>
>I think you've got another winner here.  Compuserve offers the Associated
>Press Newswire.  One of the conditions for access is that you may not
>reprint, reuse, or capture off screen what you obtain from the AP
>Newswire.  So what am I paying $6.50 an hour for?  To read off the screen?
>I can get almost all of this for free from broadcast and nearly free from
>a 35c local paper.
>
>It seems that commercial organizations which distribute electronically
>have the nasty habit of thinking that an electronic release is somehow
>intrinsically more valuable than the same information on paper.  Part of
>the cost may be because of the lack of commercials.  Part of the cost is
>the overhead of the distributor.  And some is profit.  But the fact is
>that electronic reproductions are commercially more expensive than paper
>ones in any case I know of.
 
This surely is the main problem with electronic access to information. The
very nature of electronic access is not intrinsically more valuable, in
fact if you talk to end-users you find that the technology just isn't up to
scratch yet. They can do more with a piece of paper than they are able to
do with an electronic piece of paper. Remember, we are all educated in
using this paper stuff, and we require a mind change to really use the
electronic medium. Try to create an electronic book and you will quickly
see that you need to think differently, if you really want the user to
buy your book and not the equivalent paper version. Electronic publishing
has the potential to add value to the written word, by use of associative
links, multi-media etc, but this does not eclipse the paper version. Unless
an end-user can see advantage in using the electronic version, we're all
wasting our time getting electronic books into the public arena.
 
bobj
 
-------------------------------------------------------
Dr. Bob Jansen
email: jansen@syd.dit.csiro.au
 
Address until March 1994
CNRS Institut des Textes et Manuscrits Modernes,
61, rue de Richelieu,
75084 Paris Cedex 02,
France
Phone (+33 1) 42 96 30 94  Fax (+33 1) 47 03 89 40
 
Normal address
CSIRO Division of Information Technology
Physical: Building E6B, Macquarie University Campus, North Ryde NSW 2113
Postal: Locked Bag 17, North Ryde NSW 2113
Phone: +612 325 3100  Fax: +612 325 3101
-------------------------------------------------------
=========================================================================
Date:         Tue, 9 Nov 1993 09:43:49 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Howard Pasternack <blips15@brownvm.bitnet>
Subject:      Re: newspaper archives as textual corpora
 
 
I'd like to raise the question of the degree to which historians would
be able to cite scanned newspaper texts in lieu of the originals or
microforms.  A microform of a newspaper is basically a photographic
reproduction of the original and retains all of the photographs,
illustrations, advertisements, type idiosyncrasies etc. of the original.
In contrast, scanned text is really a secondary derivative work.  While
there are many advantages to having the text in machine-readable form
and being able to search the text, citing the scanned text in a scholarly
work in lieu of the microform poses a number of problems.
 
Howard Pasternack
Brown University
=========================================================================
Date:         Tue, 9 Nov 1993 11:01:21 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         mark@csc.albany.edu
Subject:      Re: newspaper archives as textual corpora
 
Howard Pasternack writes:
 
>A microform of a newspaper is basically a photographic reproduction of
>the original and retains all of the photographs, illustrations,
>advertisements, type idiosyncrasies etc. of the original.  In contrast,
>scanned text is really a secondary derivative work.  While there are
>many advantages to having the text in machine-readable form and being
>able to search the text, citing the scanned text in a scholarly work
>in lieu of the microform poses a number of problems.
 
On the other hand, it should be easy to provide access to the ascii
text from which the newspaper was produced in the first place,
eliminating the potential errors of a scanning process.
 
And photographs and illustrations could be provided in some sort of
graphics medium such as postscript or gif.
 
Alternatively, the entire newspaper could be presented via a
graphics-oriented format, using the code that generated the print
version in the first place.
 
Under either alternative, the quality of the graphics would
undoubtedly be better than that produced for the print version.
 
--Mark
--------------------------------------------------------------------------
Mark Steinberger       |            mark@sarah.albany.edu
Dept. of Math. & Stat  |
SUNY at Albany         | Nonlinear similarity begins in dimension six.
Albany, NY 12222       |
--------------------------------------------------------------------------
=========================================================================
Date:         Tue, 9 Nov 1993 11:03:05 EST
Reply-To:     john@math.nwu.edu
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         John Franks <john@math.nwu.edu>
Organization: Dept of Math, Northwestern Univ
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
 
In article <0119931108124000/0005066432NA1EM-e100000@MCIMAIL.COM>, "Tansin A.
Darcos & Company" <0005066432@MCIMAIL.COM> writes:
> From: Paul Robinson <tdarcos@mcimail.com>
> Organization: Tansin A. Darcos & Company, Silver Spring, MD USA
> -----
> John Franks <john@hopf.math.nwu.edu>, writes:
>
>
> >
> > As scholars have gained experience with this kind of publishing they
> > have learned that e-mail is not a very good way to do electronic
> > document distribution.  It's sole advantage is widespread availability
> > but it is by far the most cumbersome method for the user.  Anonymous
> > ftp is a substantial improvement,
>
> I would like to introduce you to the method by which the people who design
> the Internet and all the parts of it decide things.  First, someone gets
> an idea on doing something.  They will write up an Internet Draft and
> submit it to the internet draft mail box so that it may be seen by the
> members of the Internet Engineering Task Force (IETF) and other interested
> parties. The draft is then posted to an ANON FTP site, and a message is
> E-Mailed to the IETF Mailing list mentioning the name of the document and
> the abstract of it if one was provided.  Those interested can then
> retrieve the document - which is in Ascii or can also be in Postscript as
> well.
>
> If the document covers a regular ongoing working group dealing with an
> issue of the Internet, it is referred to them.  If there is interest in
> discussing it, an E-Mail list will be set up to allow people to post
> comments about it. ...
>
> Eventually, if the item is of importance to the Internet, it will be
> published as an RFC and become one of the Internet Standards.
>
 
What is appropriate for an IETF working group is quite different from
what is needed for scholarly publishing.  In general LISTSERVs and
similar e-mail based systems have an important role, but distributing
scholarly journal articles is not part of it.
 
A LISTSERV is at its best in a private discussion among a number of
people.  It is fundamentally designed to function as a many-to-many
vehicle in which participation is controlled by a list maintainer.
For many-to-many discussions which are open to the public USENET is
both much more efficient and much more user friendly with the right
software.  Neither of these are appropriate for a scholarly journal.
 
In my opinion the best methods available at the moment are gopher
and WWW.  These are systems designed as browsers.  WWW has greater
functionality.  It currently trails gopher in usage, but seems to
be rapidly gaining.  WWW seems to be emerging as the standard in the
sciences, while gopher holds a lead in the humanities.
 
The difference between a true browser like gopher or Mosaic (a WWW client)
and an e-mail based system like the one used by PACS Review, for example,
is the difference between inter-library loan and having a book in
the stacks.  To access a PACS Review article there is "paperwork"
to do in submitting a request.  Then you have to wait for the article
to arrive.  A bigger problem is you have to know what you want in
advance.  With Mosaic or gopher you "browse the stacks" looking at
what is there, reading a few paragraphs from an article to see if you
are interested etc.  You have a good chance of finding something
related to the article you want "right next to it" on the menu.
 
>
> I'll toss the rodent out by saying that Gopher is not a good choice for
> browsing (as neither is Mosaic) if you are using, for example, a dial-up
> terminal to a service;
 
I disagree.  Both gopher and WWW are used very widely with dialup systems.
They are commonly used as Campus Wide Information Systems.  They both
have good clients for dialup terminals and for Macs and PCs.  They both
allow some fancy features like images and sound that won't be available
to dialup terminal users, but you couldn't have these features by e-mail
anyway.
 
--
 
John Franks     Dept of Math. Northwestern University
                john@math.nwu.edu
=========================================================================
Date:         Wed, 10 Nov 1993 08:34:58 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         "Filip M. J. A. Evenepoel" <filip.evenepoel@esat.kuleuven.ac.be>
Subject:      Re: newspaper archives as textual corpora
 
Dear,
 
In reply to the request by Dr. Ron Zweig, I would like to point out that there
is a machine-independent way of holding newspapers. The magic word here is SGML
(Standard Generalised Markup Language) and, more specifically for newspapers,
the CAPSNEWS DTD, which has been developed by the CAPS Consortium.
 
The CAPS projects are two E.C.-funded projects (TIDE Office of DG XIII) which
are making "documents" available to the blind and print-disabled. The first of
those projects was specially dedicated to newspapers, and the CAPSNEWS DTD was
developed at that stage.
 
Of course, the newspapers still have to be converted from the form they are
stored in, to the DTD. However, research in this direction has been done
and is still under way.
 
Converting a newspaper to this format not only gives the advantage of
portability, but also allows it to be read by the blind by means of the
workstation that was developed in the first CAPS project.
 
The CAPSNEWS DTD is currently available via anonymous FTP from
gate.esat.kuleuven.ac.be in the directory /pub/CAPS/CAPSNEWS . We are working on
an update for this DTD that will be released very soon.
 
There will be a paper presented at the Open Information Interchange Workshop in
Luxembourg on the 3rd of December 1993. For further information, one can always
contact me at the address below.
 
Greetings,
 
Filip Evenepoel
 
Kath. Universiteit Leuven            | Phone : +32 16 22 09 31 (ext. 1123)
Dept. Electrotechniek, Afd. T.E.O.   |         +32 16 29 04 20
Kard. Mercierlaan 94                 | Fax   : +32 16 22 18 55
B-3001 LEUVEN - HEVERLEE             |
 
E-mail: Filip.Evenepoel@esat.kuleuven.ac.be
=========================================================================
Date:         Wed, 10 Nov 1993 08:39:57 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Rich Wiggins <wiggins@msu.bitnet>
Subject:      Re: The Impact of Electronic Publication on Scholarly Journals
In-Reply-To:  Your message of Tue, 9 Nov 1993 11:03:05 EST
 
>I disagree. Both gopher and WWW are used very widely with dialup systems.
>They are commonly used as Campus Wide Information Systems. They both
>have good clients for dialup terminals and for Macs and PCs. They both
>allow some fancy features like images and sound that won't be available
>to dialup terminal users, but you couldn't have these features by e-mail
>anyway.
 
Dr. Franks makes his points extremely well, but I'd like to reiterate a
couple of them. The analogy of LISTSERV to interlibrary loan for
e-journals is extremely apt. I am an experienced LISTSERV user (in fact
my group maintains it here at MSU) but when I first started browsing
back issues of PACS Review it was a great relief to discover the CICNet
archive of back issues. It's heaps easier to browse a list of titles
with a scroll bar, then click on the titles of interest.
 
And yes, dial-up browsing of Gopher/WWW is not only possible; it's
common. More than once I've seen folks claim that one can browse Gopher
but not WWW, or vice versa, or that the dial-up access to one of them
isn't viable. Browsing under dial is viable with either, using the
public curses client for Gopher, or the Lynx curses client for WWW.
 
(Dial-up Mosaic *is* possible, using PPP or using Term to export the
X session, by the way; until vendors make PPP support a no-brainer,
this is not for the masses.)
 
Of course, browsing isn't the whole story. Once our collections grow
beyond a certain size, browsing to hunt for items of interest will
become increasingly impractical. We'll do an index search of some sort
first, just like with an OPAC, then browse the resulting list of
candidate items.
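The search-then-browse workflow described above can be sketched with a toy
inverted index over article titles, of the kind an OPAC might maintain; the
titles and query here are hypothetical, not drawn from any real catalog:

```python
from collections import defaultdict

# Hypothetical e-journal article titles standing in for a collection.
titles = [
    "Electronic Journals and the Research Library",
    "Gopher as a Campus Wide Information System",
    "Peer Review in Electronic Journals",
]

# Build a toy inverted index: word -> set of title positions.
index = defaultdict(set)
for i, t in enumerate(titles):
    for word in t.lower().split():
        index[word].add(i)

def candidates(query):
    """Index search first: return the browsable list of titles
    matching every word of the query."""
    hits = set.intersection(*(index[w] for w in query.lower().split()))
    return sorted(titles[i] for i in hits)

print(candidates("electronic journals"))
```

The user then browses only the short candidate list rather than the whole
collection, which is the point once collections grow large.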
 
/Rich Wiggins, CWIS Coordinator, Michigan State U
=========================================================================
Date:         Wed, 10 Nov 1993 08:42:22 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Ron Zweig <ron@ccsg.tau.ac.il>
Subject:      Re: newspaper archives as textual corpora
In-Reply-To:  <9311091446.AA22761@ccsg.tau.ac.il>
 
 
I share Howard Pasternack's concern about the failings of reconstructed (by
OCR) text files to represent faithfully a newspaper as a historical
source. That is why the project we are contemplating will preserve *both*
an image of the full page, and a linked text file containing an ascii
version of the articles on that page. The basic idea is that the user will
be able to search/retrieve the text file and see the original article in
image format.
 
On the one hand, this gives the best of both worlds. On the other, it
*might* make proofreading of the text files unnecessary. Because the text
file is not the authoritative archival copy, a low level of errors caused
by the OCR would be acceptable.
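The two-layer design described above amounts to searching the OCR text and
handing back the linked page image as the authoritative copy; a minimal
sketch in Python, where the article IDs, file paths, and sample text are
all hypothetical:

```python
# Sketch of the two-layer design: searchable (possibly error-laden) OCR
# text linked to authoritative page images. All data here is hypothetical.

articles = [
    {"id": "1946-03-02-p1-a3",
     "text": "Parliament debated the refugee question at length...",
     "image": "images/1946-03-02/page1.tif"},
    {"id": "1946-03-02-p1-a4",
     "text": "Markets closed higher on news of the grain harvest...",
     "image": "images/1946-03-02/page1.tif"},
]

def search(query):
    """Search the OCR text; return matching article IDs together with
    the page image that serves as the archival copy for viewing."""
    q = query.lower()
    return [(a["id"], a["image"]) for a in articles if q in a["text"].lower()]

print(search("refugee"))
```

Because the user always ends at the image, a residual OCR error rate only
degrades recall in the search step, not the fidelity of what is cited.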
 
That's the theory. Any comments?
 
 
Ron Zweig
=========================================================================
Date:         Wed, 10 Nov 1993 12:24:48 EST
Reply-To:     mzltov@nwu.edu
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Herbert Jacob <mzltov@nwu.edu>
Subject:      Publishing scholarly journals on listserv
 
John Franks writes:
>
>A LISTSERV is at its best in a private discussion among a number of
>people.  It is fundamentally designed to function as a many-to-many
>vehicle in which participation is controlled by a list maintainer.
>For many-to-many discussions which are open to the public USENET is
>both much more efficient and much more user friendly with the right
>software.  Neither of these are appropriate for a scholarly journal.
>
That is not my experience in publishing the Law & Politics Book Review.  The
subscribers to the list on which this journal is published, PSRT-L,
automatically receive each review in their e-mail boxes. They can scan it,
read it, and/or trash it immediately. In addition, the journal is archived
in a gopher which permits rapid retrieval. Receiving a journal in one's
e-mail box is much more convenient, in my opinion, than having to go into the system
and search for the newest issue.  I will be happy, however, to migrate to
something better when it becomes widely available.
Herbert Jacob, Northwestern University
Voice Mail 708 491-2648
e-mail mzltov@nwu.edu
=========================================================================
Date:         Wed, 10 Nov 1993 12:26:06 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         bob jansen <jansen@syd.dit.csiro.au>
Subject:      Re: newspaper archives as textual corpora
 
>Howard Pasternack wrote
 
>I'd like to raise the question of the degree to which historians would
>be able to cite scanned newspaper texts in lieu of the originals or
>microforms.  A microform of a newspaper is basically a photographic
>reproduction of the original and retains all of the photographs,
>illustrations, advertisements, type idiosyncrasies etc. of the orginal.
>In contrast, scanned text is really a secondary derivative work.  While
>there are many advantages to having the text in machine-readable form
>and being able to search the text, citing the scanned text in a scholarly
>work in lieu of the microform poses a number of problems.
 
The problem here is surely not the status of citing scanned (I presume the
concept here is OCR'd rather than merely scanned) newspapers, but the
inability to hold the various data types within one physical document
(ie. like a newspaper). This will be resolved with improvements in the
technology and the adoption of conventions regarding file formats that make
the data 'machine/viewer independent' (if this is at all possible).
 
If the issue is 'the scanned form can be changed, and thus we are unsure
whether this is a true representation' (which is a real issue in citing
previous work), there was an article on the net recently about a facility
for calculating a checksum for a file. Applied to scanned text, this
ensures that if the file is changed, the checksum makes that fact known.
Thus we now have the ability to ensure that computer data files are safe
from uninformed tampering.
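A checksum scheme of the kind mentioned above can be sketched with Python's
standard hashlib; the choice of SHA-256 and the sample data are assumptions
of mine, not details from the article cited:

```python
import hashlib

def checksum(data: bytes) -> str:
    """Return a hex digest that changes whenever the data changes."""
    return hashlib.sha256(data).hexdigest()

original = b"Scanned text of a newspaper article."
recorded = checksum(original)  # published alongside the file

# Later, anyone can verify the file is unaltered...
assert checksum(b"Scanned text of a newspaper article.") == recorded

# ...and any tampering, however small, is detectable.
tampered = b"Scanned text of a newspaper artlcle."
assert checksum(tampered) != recorded
```

The recorded digest, distributed separately from the file itself, is what
lets a reader detect uninformed tampering.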
 
As to 'scanned text is really a secondary derivative work', I don't
understand this. Why is a scanned copy a derivative form but a microfiche
copy not, unless the problem is again the lack of data-type support?
 
bobj
 
-------------------------------------------------------
Dr. Bob Jansen
email: jansen@syd.dit.csiro.au
 
Address until March 1994
CNRS Institut des Textes et Manuscrits Modernes,
61, rue de Richelieu,
75084 Paris Cedex 02,
France
Phone (+33 1) 42 96 30 94  Fax (+33 1) 47 03 89 40
 
Normal address
CSIRO Division of Information Technology
Physical: Building E6B, Macquarie University Campus, North Ryde NSW 2113
Postal: Locked Bag 17, North Ryde NSW 2113
Phone: +612 325 3100  Fax: +612 325 3101
-------------------------------------------------------
=========================================================================
Date:         Fri, 12 Nov 1993 15:36:49 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Morris Simon <msimon7@ua1ix.ua.edu>
Subject:      Re: newspaper archives as textual corpora
In-Reply-To:  <9311101351.AA198525@ua1ix.ua.edu>
 
 
 
On Wed, 10 Nov 1993, Ron Zweig wrote:
 
>
> I share Howard Pasternack's concern about the failings of reconstructed (by
> OCR) text files to represent faithfully a newspaper as a historical
> source. That is why the project we are contemplating will preserve *both*
> an image of the full page, and a linked text file containing an ascii
> version of the articles on that page. The basic idea if that the user will
> be able to search/retrieve the text file, and see the original article in
> image format.
.........
> That's the theory. Any comments?
 
How would you propose handling the very large size of the image files?
Even with GIF compression, a relatively small set of pages would run to
gigabyte length fairly quickly. With lossy compression techniques like
JPEG, you might save a lot of space, but OCR routines would probably have
even more difficulty with the loss of grey-scale pixels. I agree that the
ideal solution would include both, but perhaps it's feasible only with
advanced WORM or even permanent CD-ROM media.
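The storage concern can be put in rough numbers; a back-of-the-envelope
sketch in Python, where the page dimensions, resolution, and compression
ratios are illustrative assumptions rather than measured figures:

```python
# Rough storage estimate for scanned broadsheet newspaper pages.
# Dimensions, DPI, and compression ratios are illustrative assumptions.

width_in, height_in, dpi = 15, 22, 300   # broadsheet page at 300 dpi
bits_per_pixel = 8                       # grey-scale

raw_bytes = width_in * dpi * height_in * dpi * bits_per_pixel // 8

def pages_per_gigabyte(compression_ratio):
    """How many pages fit in 1 GB at a given compression ratio."""
    return (10**9) // (raw_bytes // compression_ratio)

print(raw_bytes / 1e6, "MB raw per page")
print(pages_per_gigabyte(5), "pages/GB at 5:1 (lossless, GIF-like)")
print(pages_per_gigabyte(20), "pages/GB at 20:1 (lossy, JPEG-like)")
```

Under these assumptions a single page runs to roughly 30 MB raw, so even
lossless compression yields only a few hundred pages per gigabyte, which is
why a modest collection quickly reaches gigabyte scale.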
 
Morris Simon <msimon7@ua1ix.ua.edu>
Stillman College
=========================================================================
Date:         Fri, 12 Nov 1993 15:38:56 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Ellen Sleeter <esleeter@ani.ieee.org>
Subject:      Re: newspaper archives as textual corpora
In-Reply-To:  <9311101354.AB03225@omega.ani.ieee.org>
 
re:  Ron Zweig's intent to combine "fuzzy" OCR and page images
 
Such an effort is the M.O. of AT&T's RightPages product, being applied at
U.C. San Francisco under the aegis of the "Red Sage" project.  One of the
several happy benefits of the OCR is to provide a current awareness and
"alerting" service based on user-defined interest profiles.
 
see:  Journal of ASIS, September 1991 44:8, Melia M. Hoffman, et al. "The
RightPages Service:  An Image-Based Electronic Library." pp. 446-452.
 
Probably most of you know about this project --- and since it is an
after-the-fact electronic publishing process rather than an electronic-only
process, it may not be uppermost in your thinking on this list.
 
Nonetheless, I commend to you this issue of JASIS, which is devoted to the
topic of digital libraries.
 
|======================================================================|
| Ellen L. Sleeter  < e.sleeter@ieee.org >     | v-mail: 212-705-7146  |
| Project Manager                              |    fax: 212-705-7122  |
| Publications Information Services            |                       |
| IEEE                                         |                       |
| 345 East 47th Street, New York NY  10017     |                       |
|======================================================================|
=========================================================================
Date:         Fri, 12 Nov 1993 15:41:59 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         JAYMACHADO@delphi.com
Subject:      Re: newspaper archives as textual co
 
>> This will be resolved with improvements in the
>> technology and an adoption of conventions regarding file formats to enable
>> the data to be 'machine/viewer independent' (if this is at all possible).
 
In the PC world, several products along these lines recently hit the
marketplace. Adobe Systems' Acrobat and No Hands Software's Common Ground
both offer different technologies for creating cross-platform,
application-independent documents.
The Adobe product uses PostScript and a proprietary .PDF (Portable
Document Format) to enable document exchange between Macintoshes and PCs,
complete w/ graphics, layouts, and hypertext links. Common Ground is
similar, but faster, and uses a different proprietary technology to achieve
its effects. There is a 3-page writeup about both in the October 93
issue of BYTE (p. 133). It's a start, but BYTE says both products need work.
======================================================================
Jay Machado              =  Internet:
1529 Dogwood Drive       =  JAYMACHADO@delphi.com
Cherry Hill, NJ 08003    =  jmachado@pacs.pha.pa.us
phone (day) 215/209-2396 =  slakmaster@aol.com
      (eve) 609/795-0998 =  Editor, Bits and Bytes Online Edition
======================================================================
=========================================================================
Date:         Mon, 15 Nov 1993 12:10:48 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         "JOHN DUKE - VCU (804) 367-1100" <jduke@vcuvax.bitnet>
Subject:      Re: newspaper archives as textual corpora
 
I believe that Ellen Sleeter meant to say the September 1993 (not 1991)
issue of JASIS has the RightPages article.
 
*********************************************************************
John Duke                                BITNET:   jduke@vcuvax
Assistant Director                     INTERNET:   jduke@ruby.vcu.edu
Network and Technical Services            VOICE:   804/367-1100
Virginia Commonwealth University            FAX:   804/367-0151
Richmond, VA  23284-2033
=========================================================================
Date:         Mon, 15 Nov 1993 12:11:58 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Rich Wiggins <wiggins@msu.bitnet>
Subject:      Newspaper archives -> Acrobat v Common Ground
In-Reply-To:  Message of Fri,
              12 Nov 1993 15:41:59 EST from <jaymachado@delphi.com>
 
Acrobat is not just in the PC world; it's out for Windows (not DOS) and
Mac. Unix versions of the Distiller, Exchange, and Reader are on the
way.
 
Common Ground started on the Mac and is now supposed to be out for
Windows.
 
Common Ground is faster because it is simpler. From what I can tell it's
basically a print driver plus a tool to wrap the captured print in a
portable application. One problem with that is that as you rescale a
document you can end up with huge files. A more serious issue -- in fact
a killer as far as I'm concerned -- is the basic design of having people
mail around executable programs. This, it seems to me, is asking for all
kinds of virus problems. With Acrobat you have the one-time problem of
getting a Reader installed on the users' desktops, after which you do
not move an executable program with each document. Unlike PostScript,
the PDF language does not include primitives for opening files, so
networked PDFs plus a static copy of the Reader are network-safe.
 
You say "Common Ground is similar" -- Does Common Ground have or promise
to have any of the features of Acrobat Exchange, such as embedding
hypertext links? This is useful for tables of contents. There is also a
feature for annotation that could be very useful for editing cycles --
like a PostIt note that expands with a mouse click. What about
compression -- it seems to me that Acrobat does this "right", with
scalable font information preserved one way, and images preserved as
embedded JPEGs. Adobe also promises other tools, such as OCR and
indexing tools; does Common Ground? From what I can tell Common Ground
is a much simpler tool, not in the same ballpark as the Acrobat family
at all.
 
I agree Acrobat viewing can be slow; screen paints are very visible on
my 25 MHz 486.
 
/Rich Wiggins, CWIS Coordinator, Michigan State U
 
 
>In the PC world, several products along these lines recently hit the
>marketplace. Adobe Sytems' Acrobat and No Hands Software's Common ground
>both offer different technologies for creating cross platform, application
>independant documents. The Adobe product uses Postscipt and a
>proprietary .PDF (Portable Document) format to enable document exchange
>between Macintoshes and PC complete w/ graphics, layouts, and hypertexts,
>links. Common Ground is similar, but faster and uses a different
>proprietary technology to acheive its effects. There is a 3 page writeup
>about both in the October 93 issue of BYTE. (p. 133). It's a start, but
>BYTE says both products need work.
>======================================================================
>Jay Machado = Internet: 1529 Dogwood Drive = JAYMACHADO@delphi.com
=========================================================================
Date:         Mon, 15 Nov 1993 15:36:39 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Ellen Sleeter <esleeter@ani.ieee.org>
Subject:      JASIS issue --- yep, wrong year
 
John's absolutely right about the incorrect pub year --- Rich Wiggins had
sent me a note straightaway, questioning the cite, and I sent errata same
day to the list, but --- alas, misspelled the LIST name, so the message
bounced back and I found it in my mailbox this a.m.
 
Thanks to Rich and John --- and check out that *1993* issue of JASIS.
 
;)
 
|======================================================================|
| Ellen L. Sleeter  < e.sleeter@ieee.org >     | v-mail: 212-705-7146  |
| Project Manager                              |    fax: 212-705-7122  |
| Publications Information Services            |                       |
| IEEE                                         |                       |
| 345 East 47th Street, New York NY  10017     |                       |
|======================================================================|
=========================================================================
Date:         Wed, 17 Nov 1993 08:42:31 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Mike Lanza <lanza@dnc.com>
Subject:      Re: JASIS issue --- yep, wrong year
 
>John's absolutely right about the incorrect pub year --- Rich Wiggins had
>sent me a note straightaway, questioning the cite, and I sent errata same
>day to the list, but --- alas, misspelled the LIST name, so the message
>bounced back and I found it in my mailbox this a.m.
>
 
One other question about that cite: is the actual name of the journal the
"Journal of ASIS"? That seems like a strange name for a journal (except for
the handful of people who know what ASIS is...).
 
I hope to find this somewhere at Stanford University.
 
Thanks.
 
- Mike Lanza
=========================================================================
Date:         Fri, 19 Nov 1993 11:39:01 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Gene Wiemers <wiemers%nuacvm.acns.nwu.edu@uicvm.uic.edu>
Subject:      CIC Libraries announcement
 
The following is going out on multiple listservs.  Please excuse any
duplication.  Gene Wiemers
_________________________________________________________________________
 
The Task Force on the CIC Electronic Collection was formed early
in 1993 by the directors of the libraries of the Committee
on Institutional Cooperation (which includes the libraries of
the University of Chicago, the University of Illinois at
Chicago, the University of Illinois at Urbana-Champaign,
Indiana University, the University of Iowa, the University of
Michigan, Michigan State University, the University of
Minnesota - Twin Cities,  Northwestern University,  the Ohio
State University,  Pennsylvania State University, Purdue
University and the University of Wisconsin - Madison)  to
explore ways these libraries can share and jointly manage
electronic collections.  The first Task Force report was
received and approved in October by the CIC Library
Directors.
 
Task Force recommendations propose policy and management
structures for collecting and cataloging titles to form a CIC
Libraries Electronic Journal Collection for joint use by  CIC
Libraries. The Task Force report suggests ways CIC libraries can
cooperate in the development of full text electronic collections.  The
Task Force will serve as an ongoing forum to explore and propose
solutions that advance cooperation in the management of CIC
electronic collections.
 
The following is the executive summary of the Task Force report.
Full text is available via FTP at CICNet (ftp.cic.net) in the
directory pub/reports.  An ASCII version is found under the file
name electronic.task.force.txt; and a PostScript version under
electronic.task.force.ps.   The report is also found on the CICNet
gopher  as ==> 7. Other CICNet Information and Resources ==> 2.
CICNet FTPable Resources ==> nic.cic.net FTP archives ==> 16.
Reports.
 
Inquiries about the work of the task force can be directed
to the chair, Gene Wiemers at Northwestern University Library,
(e-wiemers@nwu.edu).
------------------------------------------------------------
Committee on Institutional Cooperation
Task Force on the CIC Electronic Collection
Report to the CIC Library Directors
 
EXECUTIVE SUMMARY
 
The Task Force on the CIC Electronic Collection was charged by
CIC Library Directors to explore the management and use of
shared electronic resources and to consider relevant issues
for the CIC libraries.  The Task Force was specifically asked
for recommendations on collection policy, organization,
bibliographic control, and access policies for electronic
journals and full-text electronic collections.
 
The Task Force has defined the CIC Electronic Collection as
all electronic resources within any CIC library that are
accessible over the Internet by students and faculty at CIC
universities. The Task Force has limited its attention to
collections where centralized storage and shared access are
desirable.
 
A.   Recommendations for immediate implementation:  Electronic
     journals
 
     CICNet staff have accumulated on a server more than 600
freely distributed electronic journals across a wide range of
subjects and treatment levels.  To manage this collection, the
Task Force recommends:
 
     1.   that a complete, authoritative, and permanent
          collection of electronic journals shared by CIC
          libraries be selected from those on the server for
          active management, cataloging and ongoing
          maintenance.  The collection should be identified as
          the CIC Libraries Electronic Journal Collection
          (CICL EJC).
 
     2.   that the CICL EJC, and the CICNet server be governed
          by a broad collecting policy endorsed by the CIC
          Library Directors.
 
     3.   that the CICL EJC be managed on an ongoing basis by
          a small CIC committee reporting to the CIC
          Collection Development Officers.
 
     4.   that bibliographic records be created for each title
          selected for the CICL EJC and entered into national
          databases and each CIC library's OPAC.
 
     5.   that the collection be maintained in text format by
          CICNet with Gopher as the access tool.
 
B.   Recommendations for intermediate development: Full-text
     collections
 
     Management of shared full text resources will require
agreements among CIC libraries about sophisticated access and
text manipulation software, as well as shared standards for
tagging texts.  The complexity of these projects calls for
further work to determine their feasibility.  Several CIC
institutions are developing such expertise, and will soon be
in a position to explore use of shared collections and build a
common base of expertise through a pilot project.  The Task
Force recommends:
 
     6.   that the CIC Library Directors authorize the Task
          Force to convene librarians and computing staff
          members from CIC and CICNet institutions to review
          available tools, explore possible pilot projects,
          and build a common vision of a shared collection.
 
     The Task Force discussed several avenues of cooperation
with CIC university presses.  Several ideas were discussed,
and offer possibilities for future cooperative activity.  The
Task Force recommends:
 
     7.   that the Task Force and CICNet, Inc. facilitate
          cooperative projects initiated by the CIC university
          presses.
 
C.   Recommendations for long term exploration and
     development.
 
     The Task Force has made considerable progress in fulfilling
its charge with respect to shared collections stored centrally
on CICNet.  Its deliberations have just begun to touch on
alternative modes of sharing electronic collections, and have
revealed a number of issues which will require thought and
deliberation.  The Task Force recommends:
 
     8.   that its life be extended, either on a continuing or
          periodic basis, to explore, evaluate and make
          recommendations on ongoing issues, including
 
               * monitoring the development of archival
                 standards
               * licensing fee based journals
               * managing numeric data
               * shared bibliographic databases in multiple
                 formats
               * management of government data
               * sharing CD-ROM products
               * providing common gateway services.
 
These issues will need periodic sifting to discover which are
amenable to shared solutions.  In particular, as imaging
technology and experience with mixed image and text documents
increases, options for shared use of collections promise large
potential cost savings that should be explored.
 
Task Force Members:
 
Gay N. Dannelly, Collection Development, The Ohio State
     University
William A. Gosling, Technical Services, University of Michigan
John L. Hankins, Deputy Director, CICNet, Inc.
Sharon A. Hogan, University Librarian, University of Illinois
     at Chicago
Charlene K. Mason, Automated Systems, University of Minnesota
Salvatore M. Meringolo, Collection & Reference Services,
     Pennsylvania State University
Eugene L. Wiemers, Collection Management, Northwestern
     University (Chair)
 
Gene Wiemers
Assistant University Librarian for Collection Management
Northwestern University Library
Evanston, IL  60208-2300
 
e-wiemers@nwu.edu
(708) 491-5622
=========================================================================
Date:         Mon, 22 Nov 1993 08:07:01 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Allen Renear <allen@brownvm.bitnet>
Subject:      Re: newspaper archives as textual corpora
 
 
Pasternack and Zweig raise an important issue, I think, but given the
varied responses I think they should say more explicitly just what they
take the issue to be -- a little too much was left to the reader's
inference.
 
My reading of Pasternack's original posting was this: given the
particular evidentiary role of newspapers as historical documents,
transcripts (and I think that includes, e.g., SGML encoded transcripts)
will not serve certain research projects in the appropriate way --
those projects require images of an original document.
 
He is certainly right about this.  Most literary textbase projects have
discovered (if they were indeed under any illusions about this) that
they cannot entirely serve the needs of, e.g., the physical
bibliographer or the manuscript scholar.  Similarly the historian of
printing history or of style and decoration can hardly be expected to,
at least in every case, work effectively with a transcript.
 
But I think that common sense suggests two observations -- and also
suggests that they be treated as commonplaces, and not as insights.
 
1) No method of data capture or representation scheme will satisfy every
scholarly need.
 
2) Where historical and literary texts are concerned the vast
preponderance of research needs are best served, and usually fully
served, by some form of "editorial transcript".
 
By an editorial transcript I mean a linguistic transcription (i.e.
letters and words) combined with a structural identification of features
such as paragraphs, lists, extracts, chapters, etc.  Such transcripts
may be very usefully enhanced with other sorts of sense-based features,
such as the distinction of proper names and place names.  These features
may not be strictly editorial, but they share with editorial features a
close connection with the meaning of the linguistic text, rather than
the conventions of layout and typography -- or the vicissitudes of
physical existence (worm holes, stains, etc.). One connection with the
physical document is often appropriately included: a canonical reference
apparatus, perhaps to pages in a key edition or the copytext -- but
other features of the physical text -- signatures, line breaks,
ligatures, stains, broken type, turned letters, worm holes, blots, etc.
-- may be acceptably ignored.
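An editorial transcript of this kind can be made concrete with a small
encoded fragment; a sketch in Python, where the TEI-flavoured tag names and
the sample text are illustrative assumptions, not any project's actual
encoding scheme:

```python
import xml.etree.ElementTree as ET

# An "editorial transcript": linguistic content plus structural features
# (chapter, paragraph), sense-based features (personal and place names),
# and a canonical page reference -- but no line breaks, ligatures, or
# stains. Tag names and sample text are illustrative only.
fragment = """
<div type="chapter" n="3">
  <pb n="42"/>
  <p><persName>Mr. Gladstone</persName> spoke at
     <placeName>Manchester</placeName> on the question of reform.</p>
</div>
"""

root = ET.fromstring(fragment)
places = [e.text for e in root.iter("placeName")]
print(places)
```

Because the place names are encoded as features of meaning rather than of
layout, a scholar can query them directly across an entire corpus --
precisely what a facsimile image cannot offer.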
 
Large corpora of transcriptions of this sort will bring enormous value
to historians and other scholars -- and to any inquirer.  That is not to
say that the physical bibliographer, codicologist, historian of printing
practices, or historian of page design need not consult a facsimile (or
an original).  Nor is it to say that even an intellectual historian will
not need to examine a facsimile, on some occasions, in order to gain
useful information.
 
Facsimile images are great.  But they are very much a sideshow.  For one
thing they just aren't as important as transcripts, sociologically
speaking.  Not nearly as important.  And technologically they are also a
red herring with little significance for future developments in
information technology and scholarship.
 
Tell me I'm wrong.  But not by just pointing to some specialty that
regularly requires access to facsimiles or to some crux that was
resolved by reference to facsimiles.  I live those cases every day.
What I am arguing is that:  (i) however important facsimiles are to some
scholars corpora of encoded editorial transcripts are *far* more
important, in general, to research and scholarship, than are corpora of
facsimiles and (ii) encoded editorial transcripts, unlike facsimiles,
are methodologically consistent with key aspects of the evolution of
information technology.
 
(On another, but related topic, there is no reason for such miserable
and incoherent half measures as we see in Adobe Acrobat.  Products like
Acrobat are either appalling failures of intelligence or cynical short-term
business moves.  Or both.  Anyway, they have nothing to do with the
future.  I hope.)
 
-- allen
 
Allen Renear
Brown University
401 863-7312
Allen_Renear@Brown.Edu
=========================================================================
Date:         Mon, 22 Nov 1993 08:07:26 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Allen Renear <allen@brownvm.bitnet>
Subject:      Re: newspaper archives as textual corpora
 
 
After thinking about Pasternack and Zweig's notes I now suspect that
I may be more at fault than anyone in not addressing their
immediate concerns, which they put clearly enough. Well, my
note was on a closely related topic, I think, if not the one currently
under discussion. Sorry. -- allen
 
(this short note may pass the longer one and reach the list first)
=========================================================================
Date:         Mon, 22 Nov 1993 08:07:55 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Stevan Harnad <harnad@princeton.edu>
Subject:      Threat to the Mathematical Literature
 
This article is posted, with commentary and response, with the
permission of the author, Frank Quinn. (Commentator: Stevan Harnad)
 
> Date: 15 Nov 1993 13:20:17 -0500
> From: Frank Quinn <quinn@math.vt.edu>
> Subject: no subject (file transmission)
> To: cpub@MATH.AMS.ORG
>
>                 Roadkill on the Electronic Highway?
>             The Threat to the Mathematical Literature
>
>                 Frank Quinn
>                 Virginia Tech
>                 Blacksburg VA 24061-0123
>                 quinn@math.vt.edu
>
>    Submitted to the Notices of the American Mathematical Society,
>    and written for a mathematical audience. This version not designed
>    for distribution outside this community.
>
> There has been much discussion of the promise and perils of electronic
> publication, including an excellent Forum article by John Franks.
> In most of this discussion mathematics is considered as one of
> many essentially similar branches of science or scholarship. The
> problems threaten scholarly journals generally, not just math. And
> experiences in theoretical physics, molecular biology, or psychology,
> are assumed to have direct relevance for mathematics. In this article
> the focus is on issues specific to mathematics. The conclusion is that
> in fact mathematics is different: we have more to lose than do the
> other sciences. We should therefore be more careful about the
> precedents we set, and more prepared to take action to protect our
> advantages.
>
> A key issue is: who should be served by publication, authors or
> readers? Is the primary purpose to establish priority and record the
> achievements of authors, or is it to be a useful resource for readers?
>
> Many of the ideas presented here are not new. I would particularly like
> to acknowledge the influence of John Franks, Arthur Jaffe, Andrew
> Odlyzko, and Dick Palais. The synthesis and alarmist conclusions are my
> own.
>
> Reliability
>
> The published mathematical literature is, by and large, reliable; much
> more so than in other sciences. Few disciplinary mathematicians
> understand how unusual this reliability is, but it is highly visible
> to interdisciplinary workers. Mathematical papers may be boring or
> useless, but they are usually correct.
>
> Reliability is not an accident, and indeed is quite a lot of trouble to
> maintain. There is first of all (at least in the West, and with a few
> exceptional areas) a long tradition of careful work and critical
> self-examination. Next, published papers are refereed, often very
> carefully. This catches most errors, and reinforces the tradition of
> careful work. Papers usually circulate as preprints for long periods,
> sometimes years, before publication. This exposes them to the scrutiny
> of a larger community, and offers further opportunities for detection
> of errors. Finally mathematics is unique among the sciences in that its
> methods, when correctly applied, do yield conclusions which are in
> practice completely reliable.
 
Refereeing, at any level of rigor one may wish, can be as readily
implemented in the electronic medium as in the paper medium. Peer review
is medium-independent.
 
As to unrefereed preprints, the Net offers the possibility of much
wider scrutiny for these than paper, because of its parallelism,
immediacy, interactiveness and global reach.
 
> This reliability influences mathematical practice in many ways. For
> instance mathematicians are more likely to be familiar with the older
> literature, and build upon it. When a literature is unreliable there is
> less benefit to knowing it: it is often easier to rediscover material
> than sort through and check publications. There is a greater tendency
> to work only from preprints, and depend on word of mouth to identify
> the good ones.
 
The preference for unrefereed preprints over refereed publications
seems a bit inconsistent with what was said earlier, but even setting
that aside, the unreliability of preprints (electronic or otherwise)
has three remedies: (1) ignore unrefereed literature altogether, (2)
restrict reading to preprints by reliable authors (appearing in
high-level peer groups), (3) trust and/or contribute to the much
broader and more powerful prepublication evaluation that prepublication
peer scrutiny makes possible on the Net.
 
Science is self-corrective, partly because it is not possible to build
substantively on faulty work and partly because peers are vigilant for
flaws in one another's work. This is true both on paper and on the Net,
and it is true both of formal peer review and informal peer commentary
(concerning both refereed and unrefereed results). The Net merely makes
it possible to increase the scale, scope, speed and interactiveness of
what already exists -- and to allow some things to happen at the
biological tempo of real-time thought, rather than the sluggish
turnaround time that the constraints of paper arbitrarily imposed on
learned inquiry (Harnad 1990, 1991).
 
NB: This newfound speed, scope, scale and interactiveness are an OPTION
not an OBLIGATION. They merely increase the available possibilities.
Prior tempos (and media) are still available; they are now subsumed as
special cases.
 
To put it more directly: Even with the tachyonic speeds that the Net
puts at the author's disposal, he is still free to reflect as long and as
selectively as he likes.
 
> There are important exceptions to the familiarity of mathematicians
> with the literature. The best mathematicians often internalize their
> subjects (develop intuition) and expand it to such an extent that they
> do not use the literature. They are also able to rapidly assess the
> plausibility of new work, so often do not see reliability as a key
> issue. It is the rank-and-file mathematician who tends to know and
> depend on the literature. Reliability of the literature enables such
> people to contribute significantly to the mathematical enterprise
> rather than simply being camp followers of the great. So it is
> mathematicians of average ability who stand to lose if reliability is
> compromised.
 
But since there is no reason whatsoever why reliability should be
compromised (once peer review and peer discussion hierarchies are
carved out of the current anarchy on the net), it is perhaps also those
of average ability who stand to benefit most -- in terms of
productivity.
 
This is not to imply that the great don't stand to benefit from the
speed, scope and interactiveness of the Net. If their work draws AT ALL
on that of others (preprints, publications, seminars, conferences) then
this can only be strengthened by the Net. And if they have
benefited at all from feedback from others on their own work, here too
the great stand to profit from the Net -- even if the communication is
restricted to the handful (or fewer) of peers that they have in the world!
 
> Ignorance or distrust of the literature also leads to duplication of
> effort and repetitive publication. In some disciplines with unreliable
> literatures, like theoretical high-energy physics, there is a relaxed
> attitude toward publication of very similar papers. It is sometimes
> even comforting: like duplication of an experiment it seems to increase
> the probability that the conclusion is correct. By contrast the
> reliability of mathematics has led to a definite avoidance of
> overlapping publication. When something is done there is seldom any
> benefit to doing it again.
 
True, but a readily searchable world electronic mathematical literature
will diminish rather than increase the likelihood of duplication,
particularly with the remarkable new literature searching tools being
developed on the Net.
 
> A final difference between mathematics and other fields is that we have
> less tradition of review articles: secondary literature which sifts and
> consolidates the primary literature. It is less necessary because the
> primary literature is more directly usable. There is also less benefit
> since with fewer errors and less duplication to discard there is less
> compression.  In conclusion, mathematics has developed customs which
> result in a reliable and useful (reader-oriented) primary literature.
> Our practices are adapted to this. Further, we lack customs used in
> other fields to correct and refine an unreliable literature. It is
> therefore particularly important for mathematicians to be aware of, and
> cautious about, changes that might threaten reliability and the reader
> orientation.
 
In other disciplines it is primarily peer review, rather than published
reviews, that controls the quality of the literature.
 
> Speed Kills
>
> The first threat to the mathematical literature comes from the lure of
> sheer speed. In many areas there are already electronic bulletin
> boards through which papers can be immediately circulated world-wide.
> They sometimes hit the wires a few seconds after the final keystrokes,
> and needless to say not in final form. If the work is reviewed and
> corrected before it is frozen into the literature, then the additional
> exposure is a good thing.  But there are pressures to regard this
> instantaneous circulation as publication. Information can be
> transmitted instantly, so authors want credit instantly. They want to
> stay in the flow of ideas rather than take the time to nail the last
> one down firmly.
 
Others have voiced this same worry about undue haste (e.g. Garfield
1991), and my reply here would be the same:  Don't throw out the baby
with the bathwater; quick dissemination of unrefereed, unreliable
results is not to be encouraged; but the way to discourage it is to
STRUCTURE the Net, so it is clear where the reliable results are: in
the refereed electronic journals or the high-level peer groups, where
posting access is restricted by levels of expertise, as in a scientific
academy -- all the way from the rarest elite (the 2-4 greatest minds in
a given subarea, with others allowed read-only access) to intermediate
levels, all the way down to the unrefereed, vanity press and chat
groups among dilettantes or know-nothings.
 
Currently the Net is unstructured, so a know-nothing's latest results
are on a par with those of a Bourbaki. But surely this is just a
demographic and entropic initial condition, readily remedied by
providing structure. Readers will be only too happy to calibrate their
reading of this monstrously large literature with some reliable cues as
to the level of quality control and expertise that lies behind a given
piece of text.
 
But don't under-rate the self-corrective dimension of even the
near-vanity press: It is not possible to build on an unsound foundation.
Overhasty, invalid results will cave in under their own weight because
of the interactive dimension of the Net, as others try (much more
quickly than on paper) to build on them -- and fail, and then come
back and SAY SO.
 
> There is nothing new about most of this, of course. Fast-moving fields
> have always engendered a sense of urgency. And there have been fields
> and times when giving a lecture at Princeton was considered tantamount
> to instant publication. But in the past the people who moved on too
> fast, or only lectured at Princeton, did not seriously damage the
> literature. Instead they reduced their own long-term impact on
> mathematics. Now it is technically feasible to damage the literature.
 
The electronic "literature" (most of it unrefereed) is right now a vast
anarchy. What is needed is structures for providing QUALITY CONTROL
(peer review and peer commentary) and QUALIFICATION CONTROL (restricted
peer groups). The Net has the means to pollute the landscape with
unregulated noise, but it also has the means to regulate that noise so
as to produce a vast net improvement in intellectual productivity.
 
Let us not confuse the medium (whose potential is revolutionary) with
its current message (which, because of demographic initial conditions is
more like the contents of a global graffiti board for trivial pursuit).
 
> Part of this issue is the question of whom publication should serve:
> authors or readers. In an author-oriented literature, like much of
> theoretical physics, fast publication and loose standards are accepted,
> if not appropriate. In a reader-oriented literature, like most of
> mathematics, high standards come first. Sometimes authors will phrase
> their interests in reader-oriented language: fast publication is
> important because there are unknown persons out there who will be
> vitally interested in the latest results. But this misrepresents the
> interests of the unknown persons: there is little benefit in getting
> something fast if it is wrong or unreliable. And even correct
> statements with sketchy or incomplete proofs can be damaging: if
> someone else was working on the same problem this can render their work
> obsolete and at the same time deny them access to the tools necessary
> to reproduce or extend the result.
 
It seems to me that this is getting into some of the psychology of
mathematical productivity that has little to do with the Net itself.
Mathematics is admirable for knowing whom and what to trust. There is no
reason this cannot be carried over to the Net. The differences between
the old and new medium, after all, are just differences of scale. And
the Net, along with its qualitatively greater speed, scope and
interactiveness also has the means to FILTER information more minutely,
partly by establishing the structures I have described, and partly by
using and developing the new tools for navigating the informational
landscape in a rational, selective way.
 
In short, if it is a psychological danger to a mathematician to have a
colleague prematurely announcing a solution without providing that
solution, then there should be refereed electronic journals where
announcements must pass peer review before appearing, and there should
also be peer groups where mathematicians themselves determine the level
of reliability of their unrefereed announcements:  One could ignore all
but the most elite of these groups, and wait instead till their
announcements pass the filter of peer review.
 
I also think this is a bit of a red herring, because if mathematicians
are really this fragile psychologically -- so that a mere unsupported
announcement can render their ideas stillborn -- then this was a crisis
waiting to happen, and was just as possible in paper as on the Net.
 
> The conclusion is that a careful distinction should be made between
> DISTRIBUTION and PUBLICATION of material. Distribution
> of preprints should be as fast and wide as possible. But something
> should not be considered published until it has been through the
> traditional editing and refereeing process. Further, speed per se
> should not be a goal of the publication process. It might even be a
> good thing for journals to have the policy that at least six months
> MUST elapse between submission and publication.
 
There seems to be an element of incoherence here: Earlier it was said that
unrefereed preprints are the main source of new information. Fine.
And the distinction between unrefereed preprints/announcements (tagged
and weighted by the level of rigor of the peer group in which they are
posted) and refereed publications (likewise tagged and weighted by the
level of the journal in the peer-review hierarchy) should certainly be
prominently made and maintained. But what else? What is the FURTHER
psychological desideratum that is being realized by the arbitrary
paper-derived legacy of the "publication lag"?
 
It takes real time to do work. It takes real time for peers to referee
that work carefully, or to provide peer discussion on preprints of that
work. But once the work is refereed and "authenticated," what earthly
advantage is there in delaying its dissemination by a second?
 
I think incidental features of the old medium are here being mistaken
for essential features of scientific communication and collaboration:
They are not. They are retardants to it.
 
> Alternate Paths to Knowledge
>
> Another hazard to the mathematical literature is a growing uncertainty
> about what should be counted as ``mathematical knowledge'' and so,
> among other things, eligible for publication in a mathematical journal.
> This is not specifically an electronic publication problem, but
> electronic publication is likely to weaken the barriers against
> defective information.
>
> Here again reliability seems to be the key. Our experience is that
> things proved using traditional methods (correctly) are completely
> reliable. Mathematical practice has adapted to this: we accept very
> long and elaborate proofs by contradiction which would be ridiculous if
> the ingredients were not completely reliable. In the other sciences
> information may be very good, but can never be absolutely reliable. And
> accordingly elaborate arguments are viewed with suspicion: the output
> is at best a hypothesis which should be tested.
>
> For example one can determine that a number is very, very likely to be
> prime {3}, or that a complicated identity is virtually certain to
> be true {5}. These are useful and acceptable conclusions. But no
> matter how high the probability, it would be dangerous to accept
> primeness or an identity as ``mathematical knowledge'' without the
> caveat that they are not completely reliable. Our experience is that
> one often encounters low-probability events in long and elaborate
> arguments. Indeed many important mathematical developments are based on
> low-probability events: the representation theory of Lie groups can be
> regarded as giving solutions to some almost certainly insoluble systems
> of equations. Therefore an argument in which some steps are only
> probably true should be handled like arguments in other sciences: the
> conclusion is a hypothesis which may need further testing even to
> conclude that it is probably true.
 
This seems reasonable, but, as the author indicates, has NOTHING to do
with electronic publication/communication, so why is it mentioned?
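
For readers outside mathematics, the "very, very likely to be prime"
knowledge Quinn describes can be made concrete. The sketch below uses the
Miller-Rabin test, one standard probabilistic primality method (not
necessarily the method of Quinn's reference {3}); it illustrates his point
exactly: the verdict "prime" carries overwhelming, but never complete,
certainty.

```python
import random

def is_probably_prime(n, rounds=40):
    """Miller-Rabin probabilistic primality test.

    False means n is certainly composite. True means n is prime except
    with probability at most 4**(-rounds): "very, very likely", yet --
    and this is Quinn's point -- never a proof.
    """
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):       # trial division by small primes
        if n % p == 0:
            return n == p
    d, s = n - 1, 0                       # write n - 1 = d * 2**s, d odd
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)    # random base in [2, n - 2]
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # a witnesses that n is composite
    return True                           # no witness found in any round
```

Each round that fails to find a witness at most quarters the chance of
error, so forty rounds leave a residual uncertainty far below any
physical failure rate -- useful and acceptable as a conclusion, but, as
Quinn argues, still to be labeled as less than absolute reliability.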
 
> From this point of view some earlier concerns about computer proofs
> (e.g. in the 4-color problem) were misguided. A proof in which a
> computer checks 20,000 cases is not essentially different from a proof
> in which a person checks 20 cases: the argument is still designed to
> give completely reliable results. Indeed if the algorithms are
> carefully explained, and documented source code is available, then a
> computer proof is preferable to a bald assertion that the author has
> identified and checked all cases.
>
> There are also non-electronic alternate paths to knowledge being
> explored, see {2}, and the responses to it (to appear soon in the
> Bulletin; see particularly {4}). They are ``organic'' in the sense
> that they do not necessarily involve computers. But to the extent that
> the knowledge they produce is not absolutely reliable, or not
> reproducible, they also threaten the integrity of the literature.
>
> The conclusion is that absolute reliability should not be abandoned as
> a goal in mathematics. Knowledge obtained by methods which should
> produce complete certainty should be distinguished from understanding
> which is not certain, no matter how likely.
 
My humble conclusion here, as a nonmathematician, is that (1) the
ELECTRONIC ISSUE has nothing to do with the problem of RELIABILITY
(except inasmuch as peer review and peer hierarchies are needed in the
electronic medium, just as they were needed in the paper medium), and,
symmetrically, (2) the problem of RELIABILITY (probable mathematical
truth, etc.), such as it is, has nothing to do with the ELECTRONIC
ISSUE.
 
> Blurring Journals
>
> The final threat discussed is the growing uncertainty over what
> constitutes publication. This is considered in the context of journals
> because at present they are the primary gatekeepers for the
> literature.
>
> It used to be that publication offered a suite of benefits: editing,
> typesetting, marketing, distribution, archiving, refereeing and
> certification. It was not necessary to identify the relative importance
> of these functions because they always went together. It was also easy
> to tell the difference between reprints and preprints: reprints of
> published material were typeset. TeX and laser printers changed
> this. Fortunately typesetting turned out to be a relatively inessential
> part of the publication package. Cutbacks in library budgets have
> compromised the distribution and marketing of some journals.
> ``Camera-ready'' journals have already abandoned the copyediting
> function, and electronic bulletin boards are rapidly challenging the
> distribution function of journals. In short the traditional
> publication package is disintegrating.
 
True, but again a completely medium-independent matter. Editing,
copy-editing, etc., like peer review, can be implemented just as
rigorously electronically as on paper. One must not confuse the medium
with what happens to be the current (primitive) state of its messages!
 
Archiving and retrieval (setting aside the silly, superstitious worries
some people have about the relative durability of the "objects" in the
two media) are, without contest, much more powerfully and optimally
implementable in the electronic medium. Distribution too.
 
As to "marketing" -- well that's a real can of worms. I have written
elsewhere (Harnad 1994) that until now scientists' writing had to be
"marketed" on the trade publication model because of the economics of
paper production and distribution. As absurd as it was, scientists were
SELLING their words, for all the world as if they were trade authors.
With the new economics of electronic publication, this may no longer be
necessary; universities, libraries and learned societies can instead
subsidize the much-reduced real costs of electronic-only publication
[paper publishers will tell you these will be 70-80% of what they were;
I suspect it'll be more like the complement of that, i.e., 20-30% or
even lower]; hence price-tags, copyrights, and copyright protection
measures need no longer form a barrier between the author and his (on
average) lifetime readership of ~20 for an average article (I'm
guessing, but I think I have the order of magnitude about right, which
should point out the absurdity of the trade model in the case of
scientific and esoteric scholarly publication).
 
> The functions still essentially unique to traditional publication are
> editing, refereeing, certification, association with a publisher, and
> archiving by libraries. To this list we might add DELAYS in
> publication. This is significant here because the period between
> submission and publication is important for detection of errors by
> referees, readers of preprints, and authors themselves. We claim that
> most of these remaining functions are vital--we are near the
> irreducible core beyond which the process will begin to unravel.
 
I had difficulty deriving a clear or coherent point from this. We have
(1) unrefereed preprints (potentially appearing in peer groups of
varying levels of qualification) and we have (2) refereed publications
(appearing in refereed journals of varying degrees of peer-review
rigor). The appearance in (2) is of necessity AFTER whatever real-time
delays peer review takes, and mostly what appears in (2) appeared
previously in some form in (1). But what is the point here? Surely one
can tell the difference between something in a BEFORE-forum and an
AFTER-forum, and one can also appreciate (and ignore at one's peril)
the difference between a high-quality forum and a low-quality forum in
both cases. Now what is this FURTHER salutary delay that is being
urged upon us?
 
Nor is (2) the end of the road, because the literature (both at various
levels of (2), formally, and of (1), informally) can continue
scrutinizing, elaborating and attempting to build upon what has already
appeared in (2). This too involves time. But where are we being too
quick?
 
> By certification we mean official acceptance into the literature. At
> present what is being certified is that the work has passed the
> scrutiny of editors and referees, and should be useful and reliable. So
> to maintain a useful literature it is certainly vital that editing and
> refereeing remain closely tied to certification. The usefulness of
> certification depends on a generally agreed distinction between
> published and unpublished material. The clarity and significance of
> this distinction is what keeps referees and editors willing to do their
> jobs, and motivates authors to write to high standards and submit their
> work to the rigors of the process.
 
But look: This essay began by noting that in paper we have two
categories, unrefereed preprints and refereed publications. The very
same dichotomy can be duplicated on the Net. So what is the distinction,
in danger of being blurred, that is at issue here?
 
> The real significance of the publisher-library connection is less
> obvious. Historically a publisher was necessary because publishers
> applied ink to paper. Libraries were necessary because they collected
> the paper produced by publishers, and made it accessible to their
> communities. Elimination of paper cuts out these historical functions,
> but important secondary functions remain. The most important of these
> is that publishers and libraries control the distinction between
> published and unpublished material. We can no longer tell at a glance
> that something is published by seeing that it is typeset. Now we
> determine if it appears in a journal or monograph issued by a publisher
> and archived by libraries. Soon we are going to need more sophisticated
> criteria. Whatever they are must be unambiguous, and sufficiently
> controlled to maintain the standards of the literature.
 
This is of course true, but why is it put forward as a problem? It is
obvious that there will have to be quality control on the Net, as on
paper. Paper publishers traditionally provided this. Perhaps they
will become electronic publishers, if the electronic publishing model
proves to make it worth their while. Otherwise their expertise will have
to be provided by individuals who call themselves something else
(editors, copy-editors, designers, proof-readers). But let us not forget
that the REAL foot-soldiers of peer review are US, the peer community
itself! The patina or imprimatur of a prestigious publication was not
established by the prettiness of its typefaces but by the rigor of the
refereeing performed by US. There is no reason whatsoever that the same
marks of quality cannot be duplicated on the Net, once peer review is
implemented there.
 
> A second benefit of publishers and libraries is that they promote
> stability. It is difficult to start a journal, get it printed, and get
> libraries to buy it. But once one develops a good track record and is
> accepted by libraries there is a lot of momentum. The mechanisms of
> electronic communications do not promote this sort of stability, so
> substitutes must be found.
 
This will still be true on the Net. A vanity press will flourish, of
course, because of the minimal real costs involved, but what IMPACT will
it have? Who will read and cite it? As elsewhere, it will matter to an
author, and to his intellectual impact, and to his career, WHERE on the
Net his work appears, how high in the peer group hierarchy as a
preprint, and how high in the journal hierarchy as a refereed
publication. The Net will not remain the flat anarchy it is currently.
(Like many others, the author is again being constrained in the
exercise of his imagination by the incidental initial conditions on the
Net, mistaking the current state of the medium for the ultimate state
of the message.)
 
> A final important aspect of publishers and libraries is the cash flow.
> Individual users usually receive nothing directly when they put
> information into the system, and usually pay nothing directly when they
> take information out. Libraries, for the most part, support the
> financial costs of publication. In the electronic era it seems that the
> major expenses can be stripped away: electronic distribution instead of
> paper and postage; requiring submission of TeX files instead of
> typesetting; no copyediting; volunteer editors and referees. There are
> still expenses, though they are small enough that an individual,
> department, or university could underwrite them. So we may expect to
> see electronic ``journals'' supported in this way and distributed free
> directly to users. But at least in the short term most journals will
> continue to be associated with publishers, for historical and legal
> reasons, and for the legitimacy and stability it confers. Publishers
> must make a profit. This is a worthwhile expense: legitim
 
I cannot follow the reasoning here either. There are two issues.
The economics of electronic peer groups and their unrefereed, unedited
preprint archives are one thing, the economics of electronic journals
and their refereed, edited (etc.) publications are another. I have
conjectured that the LATTER will cost 20-30% of what paper publication
costs FOR THE SAME QUALITY, both in form and content. It is not clear
that the trade publication model applies to that -- and we certainly
aren't going to back away from its vast potential benefits in order to
keep paper publishers in the black! If they do not wish to take it on,
purely electronic counterparts for them will evolve, probably more
directly administered by universities (and their libraries) and learned
societies.
 
The second issue is that of keeping the paper fleet (on which 99.999% of
our intellectual output currently relies) afloat during the perestroika
of converting to electronic-only publication. There are some HUGE
potential problems there, the main one having already been mentioned
here: electronic preprint (and reprint) archiving and distribution of
PAPER publications. None of us wants to sink the paper fleet (and that's
not to keep publishers in profit but to keep our work in print until the
Network structures are available to take over). I strongly believe that
a collaborative solution can be found that will be fair to all, but this
CANNOT be based on paper publishers' trying to stop or retard the
inevitable, for authors (whose lifetimes are finite) will not tolerate
arbitrary obstacles being placed between their work and the eyes and
minds of their potential peer readership when the means of reaching them
on such an unprecedented scale are available (Harnad 1994).
 
> One mode of support for commercial publication is to bypass libraries
> and pass expenses directly to users as access fees. This is the
> commercial database model. It is unlikely to be successful for
> journals: some drawbacks are discussed in {1}. In the near term
> probably the only reliable way to support publication expenses is to
> maintain the publisher-library connection. Publishers should continue
> to regard libraries as their principal customers, and users should
> continue to get access through a library. A corollary of this is that
> users must be tolerant of some limits to access. In order to spread the
> costs around, many libraries must subscribe (or buy site licenses). For
> this to work the use of the material must be limited to the community
> served by the library. Typical forms of these limits are social
> security numbers as passwords, or access only through machines with
> certain internet addresses.
 
Scientists and scholars have (reluctantly) tolerated restrictions on
access to their work because the economics of paper publications were
such that if they wanted to make their work accessible AT ALL, they had
to be prepared to cooperate in the recovery of its real costs (plus a
fair profit). Whether this trade model will apply to electronic-only
journals will depend on what those real costs turn out to be. If I am
right, that they are closer to 20-30%, then an up-front subsidy by the
information PROVIDERS (the scientist's institutions, libraries and
learned societies) may prove to be a more sensible model than the trade
model -- particularly as these same individuals and institutions are
also the consumers of one another's information, so they gain on both
ends.
 
We must not forget that one of the biggest costs of scholarly journal
publication has ALWAYS been covered on a subsidized basis: it is the
subsidy we all contribute when we perform refereeing!
 
> The conclusions are that editing, refereeing and certification are the
> key functions of publication. In order for these functions to produce a
> high-quality (reader-oriented) literature there must be a clear
> distinction between published and unpublished material, and there must
> be mechanisms to maintain high standards for certification. Currently
> this distinction and many of these mechanisms are byproducts of the
> rigors of paper publication. Electronic publication is free of many of
> these rigors. So unless new quality-control mechanisms are developed,
> the literature is likely to suffer.
 
Quality control mechanisms will certainly have to be implemented, but
they will not be new (though I think they will be implementable more
efficiently and equitably on the Net). The preprint/publication
distinction will be maintained.
 
> Some Ideas
>
> Here we discuss some ways to maintain quality and stability in
> electronic mathematical publication. When these issues are properly
> posed they can be seen to be far from unique to mathematics or
> publication, and there are many models for addressing them.
>
> Stability and Accountability
>
> Corporations, universities, and organizations like the AMS need
> predictable stability over long periods. Most address this need
> through something like a board of trustees. Trustees oversee the major
> aspects of operation, and have final authority. CEOs, presidents and
> directors are accountable to them, so cannot act with impunity, and do
> not simply pass the baton on to their choice of successor. Neither
> should editors. So perhaps every journal should have a ``board of
> trustees'' of eminent mathematicians who ``own'' the journal, to whom
> the editor is accountable, and who control transitions.
>
> This may seem like an intrusion on the autonomy of an editor, and an
> insult to his or her integrity. But currently this function is
> implicitly performed by the publisher (and often not too well). The
> question is not whether to impose such control, but rather whether to
> allow it to disappear. And if it is done consciously with mathematical
> goals it may work better than when driven by the profit motive.
 
Editors will continue to be Editors, whether their journals are
published electronically or in paper, and whether the publishers are
for-profit, nonprofit, university or learned-society. Often Editors are
wholly or partially accountable to or constrained by their Editorial
Boards. This too should and will continue in the electronic
implementation of peer review. There seems to be no especial need for
active outside overseeing on the Net (though there is no reason it
should not be experimented with); the only thing that is really
essential is credible institutional (i.e., peer) support. The prestige
hierarchy will then form of its own accord, as it did in paper, based
on the quality and impact of particular electronic journals.
 
> Global standards
>
> To maintain the sharpness of the published/unpublished distinction
> there must be some way to identify ``real'' journals. Something better
> and more to the point than association with a publisher, or archiving
> by a library. Problems with identifying reliable processes have been
> faced by users of universities, medical, engineering and law schools,
> and generally by users of professional services. The mechanisms
> employed include accreditation, board certification, and licensing. So
> perhaps there should be an accrediting agency for mathematical
> journals. Publication would mean appearance in an ACCREDITED
> journal, independently of the format.
>
> This may seem like a reactionary plot to strengthen the control of the
> establishment over journals, and extend this control to new media. It
> would seem to infringe on the right of anyone to start their own
> journal. But again there is such control now (through publishers), and
> the question is whether or not to give it up. There is a general
> feeling that the end users are well served by the accreditation of
> schools, certification of lawyers and physicians, etc. If readers are
> to be the end users of the mathematical literature, then they would
> probably be well served by accreditation of journals.
>
> Note that this would also solve a problem facing libraries: how can
> they tell what is published and should be archived, if it is not
> associated with a publisher? Accredited journals should be archived,
> everything else should be considered informal communications.
 
These recommendations for the administration of peer review are
reasonable, but hardly revolutionary. It is OBVIOUS that quality control
structures will have to be established on the Net, as they are in paper.
Universities (already publishers) and Learned Societies (ditto) are
natural prima facie candidates, but nothing stops peers from evolving
new forms of overseeing structures on their own. Relative status
in the prestige hierarchy will sort things out (through reading patterns,
citing patterns, demography and practice), just as it did on paper.
This is how new journals find their proper place in the pantheon even in
paper.
 
> Local Standards
>
> So far the focus has been on the PROCESS of publication, on the
> principle that when authors, editors, and referees all go through the
> right motions then the outcome will usually be good. It might be useful
> to have some feedback about the product itself. Purchasers of
> defective merchandise have some recourse with vendors, manufacturers,
> and the Better Business Bureau. Perhaps there should be some place for
> individual mathematicians to direct complaints about unreliable
> published papers. For instance an accredited journal might carry the
> statement ``This journal has been accredited by the Accreditation Board
> of the AMS. Complaints about unreliable material appearing in this
> journal can be addressed to the Accreditation Board.'' Some such
> mechanism might also encourage authors and editors to be more
> conscientious about errata and retractions.
>
> This may seem draconian, and certain to chill the free exchange of
> ideas. But the suggestion applies to official publication, not bulletin
> boards and other prepublication exchanges. These informal media are
> getting freer all the time. The real issue is whether we want a
> reader-oriented or author-oriented published literature. A complaint
> mechanism is certainly reader-oriented.
>
> Such a mechanism might also help with ethics and fraud problems. There
> is in fact a body to which complaints can be addressed: the Committee
> on Professional Ethics of the AMS. Complaints must be formulated as
> charges of ethical misconduct, but some abuses of publication can be
> seen this way (cf. 6). These complaints can make sense to the
> popular press and cause a great deal of discomfort, even if in the end
> there is little technical merit. It would be nice to recognize a less
> serious class of transgressions, and provide a less destructive outlet
> for objections.
 
The suggestions are all reasonable, not at all draconian, but also
obvious, once one recognizes that the objective is to set up quality
control structures on the Net that are homologues of the ones that
already exist for paper. Let's first implement those electronically --
they are virtually nonexistent in the virtual world at the moment;
there is, for example, only one purely electronic refereed journal in
mathematics, as far as I know, and very few in any of the other
disciplines either -- and THEN let's see whether there's any need for
"accreditation" boards over and above the usual structures of peer
review.
 
> A Dialogue
>
> The theses developed in this paper have many ramifications and
> consequences for the AMS. We explore a few of these.
>
> The Bulletin is now available free online over e-math. Should the
> Proceedings and Transactions be next? Probably not. Receiving the
> Bulletin is a ``privilege of membership'' in the AMS, and therefore
> supported by membership fees. Going online has made it a privilege
> conferred by the EXISTENCE of the AMS rather than membership in
> it. This is acceptable to the AMS since the additional expenses are
> insignificant, and experience with an electronic journal has been
> valuable. The Proceedings and Transactions in contrast produce income
> which helps support other AMS activities. There is little interest in
> raising membership fees to replace this income, so distribution should
> remain tied to libraries. In this instance the AMS should act as a
> commercial publisher rather than a professional society.
>
> The research announcements in the Bulletin are scheduled to be
> phased out after November 1993. Should they be published in some other
> format, for instance electronically? Published no, distributed
> maybe. Research announcements are by nature incomplete, so are on the
> wrong side of the line in a careful distinction between published and
> unpublished material. They should be ineligible for formal
> incorporation into the literature. They could be made available on an
> electronic bulletin board. Since this would cost money and not raise
> revenue the main question is a practical one: is this important enough
> that the AMS should subsidize it?
 
These are all valid considerations, but the AMS will soon be a hybrid
entity: both a paper and an electronic publisher. Considerations for the
safe-guarding of the paper flotilla will invariably influence decisions
about what to do electronically, what it costs, and what/whether to
charge. That's fine; but the conclusions might be different for a
non-hybrid, electronic-only project.
 
> What should an electronic journal look like? In the near term,
> like a traditional journal. We should do experiments one at a time:
> first electronic publication of traditional journals, then wait for the
> dust to settle before tinkering with the journal format itself. In the
> long term there are exciting possibilities, like interactive texts with
> video and adaptable graphics. There are also important issues to be
> resolved, for instance how computer programs can be effectively
> published and documented. But the basic rules of electronic publication
> must be settled first.
 
I disagree. I think electronic journals should not emulate paper
journals in form any more than necessary. AAAS/OCLC's electronic journal
"On-Line Journal of Clinical Trials" made a huge investment in making the
journal emulate as many of the features of paper as possible. As a
consequence, they had to adopt the trade model and sell it for over
$100 (lower than the usual paper journal, to be sure, but still a lot,
and a lot more than nothing, which is what some electronic-only journals
-- mostly the text-only ones, admittedly -- are charging). I don't know
which properties of paper are important to conserve; that will have to
be determined empirically, on the basis of what readers in the end want
and use; but I think it's a wrong starting point to try to explicitly
set out to emulate all or most of them a priori.
 
> In this scenario electronic journals would be distributed by
> libraries to a restricted community. What about people not associated
> with a library, and individual subscriptions? The old methods of
> access would still be available, including reprints from the author or
> physically going to a library. Many journals would no doubt be
> accessible through commercial databases (for a fee). It is possible,
> but unattractive for technical reasons, for publishers to offer direct
> individual electronic subscriptions for a fee. It is more likely that
> this would be handled through a commercial database. There is an
> attractive alternate suggestion for the AMS: once a year members could
> be offered a CD ROM containing the previous year's issues of all AMS
> journals. Current issues would still have to be accessed through a
> library. Such personal archives would make extensive searches more
> feasible, among other things.
 
My vote would be for trying to offer them for free, accessible to
everyone on the Net through gopher, archie, WAIS, WWW, CICNet and new
structures, and for trying to subsidize the real costs up-front.
Embeddedness in a free, fully accessible and searchable world scientific
literature is much better than getting your annual CD ROM (and perhaps
even some of the bells and whistles of paper journals).
 
> What about the copyright issue? There has been an active debate
> in the AMS (and the publication community) about the meaning and
> function of copyrights in the electronic age. The main focus has been
> on the conflicting interests and rights of publishers and authors. Here
> we have been more concerned about the interests of readers. Copyright
> issues do not seem to have much direct impact on these concerns.
 
I again disagree. If without unnecessary frills the electronic journal
DOES turn out to cost 20-30% of a paper journal (or even less) and is
hence subsidized up front, then there is no need for the publisher to
request exclusive copyright. If copyright simply continues to be
demanded and assigned, on the assumption that the matter is somehow
irrelevant to the concerns under discussion here, then this will make
it all the more likely that a trade model -- and the associated
unnecessary and counterproductive barriers between work and reader --
will be retained and imposed on the Net.
 
> Surely mathematics is resilient enough to adapt to these new
> circumstances. Can't we just relax and let the new age find its own
> equilibrium? We could. Even the worst cases envisioned here would
> not fatally cripple the mathematical enterprise. Road KILL may
> be putting it too strongly. For instance an unreliable literature would
> mostly disadvantage mathematicians of average ability. It is widely
> believed that most advances happen at the top, so from this point of
> view average mathematicians are more-or-less expendable. ``In groups''
> would develop private folklore about the hazards of their local
> literatures. This would place outsiders at a disadvantage, but probably
> most advances are made by insiders. And theoretical high-energy physics
> has an unreliable author-oriented literature, but is far from dead. So
> it is not a matter of life-or-death, but quality-of-life. Still, the
> potential discomforts are serious enough to fully justify concern and
> care.
 
Most of the literature IS by definition the average literature, so
decisions about publication are mostly decisions about it. And I do not
agree that the elite themselves would not benefit from refereed
electronic journals and peer group hierarchies.
 
> Summary
>
> The mathematical literature is very reliable. Indeed most of the
> material is completely reliable in a sense impossible to achieve in the
> other sciences. This deeply affects both technical and social
> practices in mathematics, and has the effect of making the literature
> unusually useful to readers. This reliability, and its benefits, are
> threatened in several ways by electronic publication. Speeding
> publication and blurring the distinction between published and
> unpublished material will erode the standards and effectiveness of the
> editing and refereeing process. A proliferation of journals will lower
> the defences against individuals who do not understand, or do not
> respect, the significance of reliability. However there are many steps
> the American Mathematical Society could take to preserve this heritage,
> including accreditation of journals and rapid development of its own
> electronic publication program to set a wholesome precedent.
>
> Frank Quinn
> Virginia Tech
> quinn@math.vt.edu
> Blacksburg VA 24061-0123
 
The AMS as a Learned Society should certainly take a large role in these
questions, but in some ways a nonpublishing Learned Society is better
placed, at least with respect to the conflicting interests of a hybrid
publisher. It's a trade-off, though, because a nonpublishing Learned
Society also lacks publishing experience. But how much is paper
publishing experience really going to be RELEVANT to electronic
publishing (as opposed to RETARDANT)? The real expertise and experience
is in the peer community itself -- the population from which all those
editors and referees are drawn. The rest is just copy-editing and
collecting the tickets at the gate...
 
Stevan Harnad
Editor, Behavioral & Brain Sciences, PSYCOLOQUY
 
Cognitive Science Laboratory |    Laboratoire Cognition et Mouvement
Princeton University         |    URA CNRS 1166 I.B.H.O.P.
221 Nassau Street            |    Universite d'Aix Marseille II
Princeton NJ 08544-2093      |    13388 Marseille cedex 13, France
harnad@princeton.edu         |    harnad@riluminy.univ-mrs.fr
609-921-7771                 |    33-91-66-00-69
 
------------------------------------------------------------------------
                RESPONSE BY FRANK QUINN:
 
> From: Frank Quinn <quinn@math.vt.edu>
> Date:         Fri, 19 Nov 1993 15:59:37 -0500
>
> I believe that some of the issues raised in the commentary can be
> resolved by a closer reading of the text. Mathematicians tend to give
> precisely logical readings, and writers for this audience sometimes
> abuse this tendency by being overly terse. There are significant real
> differences, and I will try to focus our positions on two of these. But
> first I would like to repeat that different fields have different
> needs. Mathematics, with its emphasis on logic and proof, and with its
> 5,000 subspecialties, may very well have different needs than
> psychology. My analysis is explicitly specific to mathematics, and I do
> not assert that my conclusions are portable to any other field. Maybe
> more to the point I also deny that conclusions valid in other fields
> are necessarily portable to mathematics.
>
> FILTERING AND GRADING DISCOURSE: currently scientific discourse is
> filtered (by editors, referees, etc) and graded (by prestige of
> journal, reputation of author, etc). We agree that the Net is still
> nearly flat at least in terms of formal structure, and that this must
> change.
>
> *  Harnad feels that the new navigation and search tools mean that to
> some degree filtering can be shifted to readers. I am not happy with
> this. Until the software is so good that I can specify "quality" as a
> search attribute, I want the burden to remain with editors and
> referees.
>
> *  Harnad expects structure to develop primarily along "peer group"
> lines, in other words, filtered by author. The mathematical tradition is
> to filter by content of individual papers. I believe this practice has
> served us well and should be continued, though I would not presume to
> suggest imposing it on psychology. Filtering by papers rather than
> author requires a much more elaborate structure, which may have more
> trouble evolving without deliberate intervention.
>
> GETTING TO THE FUTURE: Our primary differences are not over the
> far future but over how to get there. Harnad seems to be content to
> let things evolve, confident that the "self-correcting nature of
> science" will guide it to a proper conclusion. I favor a proactive
> approach, shortcutting or preempting this evolution. This difference
> is probably motivated by different estimates of discomfort in the
> transition.  Most fields have active antibodies against errors in their
> literatures. Some seem to have literatures so unreliable they may as
> well be bulletin boards anyway. Such fields should not suffer greatly
> from dislocations in the filtering process. But my argument is that
> mathematics has far fewer such antibodies, so is more likely to really
> suffer from the disease. Our dependence on an unusually reliable
> literature is so deep and pervasive (and has been so beneficial) that I
> feel it justifies vigorous protection. Our customs have also evolved
> over a much longer period than other sciences, and I am unwilling to
> risk having to wait a proportionally longer time to regain our
> equilibrium.
>
> *  So my suggestions are intended to ease the near-term transition,
> not shape the far future. They may also be uniquely appropriate to
> mathematics.
 
STEVAN HARNAD: By way of clarification, I would add:
 
(i) I agree that there are likely to be some differences in the way
different disciplines implement electronic publication and communication.
 
(ii) In the structure I advocate, the Net WILL provide the kind of
quality-tagging Quinn desires, both at the unrefereed preprint peer
group level and at the refereed archival journal level.
 
(iii) I do not expect the structure I advocate to develop primarily at
the unrefereed peer group level. On the contrary, I think it should be
systematically imposed at the refereed (electronic) journal level.
 
(iv) Self-corrective constraints operate at both (a) the unrefereed peer
group level (for preprints and informal discussion) and (b) the refereed
journal level (for articles and formal discussion). I advocate active
intervention in both: Structuring a peer group hierarchy based on
expertise for (a) and establishing refereed journals and peer review for (b).
 
SUMMARY (Harnad):
 
Quinn has indicated that most mathematicians get their information from
unrefereed (paper) preprints (PRE), rather than the reliable,
authenticated, later, refereed (paper) publications (PUB), but he
stresses the importance of the reliable, authenticated sources of
information (PUB). Quinn is NOT advocating that mathematicians stop
heeding preprints. On the Net, however (and to a certain extent in
paper preprints typeset in TeX), Quinn suggests that the distinction
between PRE and PUB is becoming blurred, and that this has to do with
the speed and scope of the Net, and that this is somehow compromising
the reliability of the mathematical literature (both PRE and PUB,
presumably). Quinn's recommended remedy is, basically, to slow down,
and perhaps to set up "accreditation" boards for refereed electronic
journals (PUB).
 
The reason I disagree with this proposal is that, to date, two
prominent features are virtually absent from the Net, features that
have nothing whatsoever to do with speed or with accreditation:
(a) There are almost no peer-reviewed journals on the Net (PUB);
everything is just unrefereed forums (PRE); and (b) the forums
themselves are all on a par -- there are no distinctions among them
based on the quality of the material appearing and the qualifications
of the contributors. So I suggested that what was needed on the Net was
(1) peer review (in fact, a duplication of the usual prestige hierarchy
of refereed journals, just as it now exists in print) plus (2) explicit
demographic constraints on the qualifications and level of expertise of
the unrefereed peer groups (both preprint and informal discussion).
Until these two features are implemented, there is no basis whatsoever
for determining whether speed in itself kills, and whether some special
accreditation mechanisms, over and above the normal ones for peer
review, would have to be implemented on the Net.
 
The problem, in other words, is not the blurring of the PRE/PUB
distinction, it is that the Net ONLY has PRE on it at the present time
(and the PRE is flat).
 
I disagreed also with Quinn's suggestion that paper should be emulated and
that reader-charges should be instituted (I think both of those are
empirical questions that can only be answered when we find out the real
costs of producing high-quality, refereed, electronic-only journals -- and
hardly anyone has even begun doing that yet).
 
Stevan Harnad
 
> Refs
>
> 1. John Franks, "The impact of electronic publication on scholarly
> journals," Notices of the AMS 40 (November 1993), 1200-1202.
>
> 2. Arthur Jaffe and Frank Quinn, "'Theoretical mathematics': toward a
> cultural synthesis of mathematics and theoretical physics," Bull. AMS
> 29 (1993), 1-13.
>
> 3. R. G. E. Pinch, "Some primality testing algorithms," Notices of the
> AMS 40 (November 1993), 1203-1210.
>
> 4. William Thurston, "The nature of progress in mathematics," preprint,
> June 1993.
>
> 5. Doron Zeilberger, "Theorems for a price: Tomorrow's semi-rigorous
> mathematical culture," Notices of the AMS 40 (October 1993), 976-981.
>
> 6. "Report of a special committee on professional ethics," Notices of
> the AMS 40 (November 1993), 1217-1219.
 
-----------------------------------------------------------------------
The following files are retrievable from directory pub/harnad/Harnad on
host princeton.edu (citation is followed by FILENAME and, where
available, ABSTRACT):
 
Harnad, S. (1990) Scholarly Skywriting and the Prepublication Continuum
of Scientific Inquiry. Psychological Science 1: 342 - 343 (reprinted in
Current Contents 45: 9-13, November 11 1991).
FILENAME: harnad90.skywriting
 
Harnad, S. (1991) Post-Gutenberg Galaxy: The Fourth Revolution in the
Means of Production of Knowledge. Public-Access Computer Systems Review
2 (1): 39 - 53 (also reprinted in PACS Annual Review Volume 2 1992; and
in R. D. Mason (ed.) Computer Conferencing: The Last Word. Beach Holme
Publishers, 1992; and in A. L. Okerson (ed.) Directory of Electronic
Journals, Newsletters, and Academic Discussion Lists, 2nd edition.
Washington, DC, Association of Research Libraries, Office of Scientific
& Academic Publishing, 1992).
FILENAME: harnad91.postgutenberg
 
Harnad, S. (1992) Interactive Publication: Extending the American
Physical Society's Discipline-Specific Model for Electronic Publishing.
Serials Review, Special Issue on Economics Models for Electronic
Publishing, pp. 58 - 61.
FILENAME: harnad92.interactivpub
 
Harnad, S. (1994) Implementing Peer Review on the Net:
Scientific Quality Control in Scholarly Electronic Journals. Proceedings
of International Conference on Refereed Electronic Journals: Towards
a Consortium for Networked Publications. University of Manitoba,
Winnipeg 1-2 October 1993 (in press)
FILENAME: harnad94.peer.review
ABSTRACT: Electronic networks have made it possible for scholarly
periodical publishing to shift from a trade model, in which the author
sells his words through the mediation of the expensive and inefficient
technology of paper, to a collaborative model, in which the much lower
real costs and much broader reach of purely electronic publication are
subsidized in advance, by universities, libraries, and the scholarly
societies in each specialty. To take advantage of this, paper
publishing's traditional quality control mechanism, peer review, will
have to be implemented on the Net, thereby recreating the hierarchies
of journals that allow authors, readers, and promotion committees to
calibrate their judgments rationally -- or as rationally as traditional
peer review ever allowed them to do it. The Net also offers the
possibility of implementing peer review more efficiently and equitably,
and of supplementing it with what is the Net's real revolutionary
dimension:  interactive publication in the form of open peer commentary
on published work. Most of this "scholarly skywriting" likewise needs
to be constrained by peer review, but there is room on the Net for
unrefereed discussion too, both in high-level peer discussion forums to
which only qualified specialists in a given field have read/write
access and in the general electronic vanity press.
=========================================================================
Date:         Mon, 22 Nov 1993 08:08:56 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         HIREMK@VM2.YorkU.CA
Subject:      material on education
 
Can anyone direct me to the right sources on the Internet?  I am compiling
a list of electronic journals and newsletters on the education of adults
and children.  Any language is fine.  Thank you very much for your help.
 
Hirem Kurtarici
York University
hiremk@vm2.yorku.ca
=========================================================================
Date:         Mon, 22 Nov 1993 12:07:41 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Morris Simon <msimon7@ua1ix.ua.edu>
Subject:      _Books-In-Print_ online?
 
 
Does anyone on this list know if _Books-In-Print_ is online anywhere?
Perhaps by Gopher?
 
Thanks
Morris Simon <msimon7@ua1ix.ua.edu>
Stillman College
=========================================================================
Date:         Mon, 22 Nov 1993 15:43:11 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Marcos Athanasios Athanasoulis <mathanas@uclink.berkeley.edu>
Subject:      Re: material on electronic newspapers
In-Reply-To:  <9311221309.AA16709@uclink.berkeley.edu>
 
 
Can anyone refer me to concise and up-to-date summaries of the current
state of electronic newspapers?  I'm interested particularly in sources
that describe either what's available or what the current issues and tools
(hardware/software/intellectual property rights etc.) are.
 
Thanks
 
Marcos Athanasoulis
Department of Biomedical and Environmental Health Services
UC Berkeley
(mathanas@uclink.berkeley.edu)
=========================================================================
Date:         Tue, 23 Nov 1993 08:34:30 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Sam Demas <sgd1@cornell.edu>
Subject:      Re: _Books-In-Print_ online?
 
In response to the query:
>Does anyone on this list know if _Books-In-Print_ is online anywhere?
>Perhaps by Gopher?
 
I am unaware of any internet source which is accessible free of charge.
However, "Books in Print" is available online over the net through DIALOG
Information Services as File 470.
 
+-+-+-+-+-+-+-+-+-*-+-*-+-*-+-*-+-*-+-*-+-*-+-*-+-*-+-+-+-+-+-+-+-+
Sam Demas                                              sgd1@cornell.edu
Head, Collection Development & Preservation       voice: (607)255-6919
Albert R. Mann Library                                 fax: (607)255-0850
Cornell University
Ithaca, N.Y. 14853
=========================================================================
Date:         Tue, 23 Nov 1993 08:36:00 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Ulrich Riehm <afs778@ucla.hdi.kfk.d400.de>
Subject:      Re: material on electronic newspapers
 
One source for the availability of electronic versions of newspapers
is "Fulltext Sources Online", edited by R. M. Orenstein and published
annually (I believe) by BiblioData in Needham Heights, MA.
 
Ulrich Riehm
=========================================================================
Date:         Tue, 23 Nov 1993 08:36:37 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         "Filip M. J. A. Evenepoel" <filip.evenepoel@esat.kuleuven.ac.be>
Subject:      Re: material on electronic newspapers
 
>Can anyone refer me to concise and up-to-date summaries of the current
>state of electronic newspapers?  I'm interested particularly in sources
>that describe either what's available or what the current issues and tools
>(hardware/software/intellectual property rights etc.) are.
 
In late 1992, the CAPS Consortium published a report on electronic
newspapers describing the current state of the art: "Digital Newspapers
for the Print Disabled: State of the Art Report".
 
The CAPS Consortium is an international group of researchers, users, and
companies active in the field of the print disabled (blind, partially
sighted, ...). During the first phase (Pilot Phase) of the project
(sponsored by the European Community under the TIDE Programme), the
group's main objective was to make newspapers available to the print
disabled electronically, in a structured way.
 
The report mentioned above can be obtained through:
InfoVisie VZW
Kapucijnenvoer 33
B-3000 LEUVEN
BELGIUM
Fax: +32 16 22 18 55 (to: Filip Evenepoel, Afd. T.E.O.)
The report will be sent to you; however, a small fee will be charged for
the report (1000 Belgian Francs = approx. 33 US $) and for administrative
and banking costs (500 Belgian Francs = approx. 17 US $).
 
More information on the CAPS projects and the published reports can be
obtained from me. (You can also send a request for the report to me and
I shall forward it to InfoVisie.)
 
Greetings,
 
Filip Evenepoel
 
Kath. Universiteit Leuven            | Phone : +32 16 22 09 31 (ext. 1123)
Dept. Electrotechniek, Afd. T.E.O.   |         +32 16 29 04 20
Kard. Mercierlaan 94                 | Fax   : +32 16 22 18 55
B-3001 LEUVEN - HEVERLEE             |
 
E-mail: Filip.Evenepoel@esat.kuleuven.ac.be
=========================================================================
Date:         Tue, 30 Nov 1993 08:24:40 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         Misha Glouberman <misha@cam.org>
Organization: Communications Accessibles Montreal, Quebec Canada
Subject:      Bark Magazine?
 
 
Hi. I'm trying to find out about an on-line magazine called Bark.
A friend heard about it recently on CNN, but couldn't remember the
details. It's apparently a nicely-formatted deal, aimed at young readers
(teens?) and sponsored by Polygram. I tried the Usual Net Tools, but
couldn't find it. Does anyone know where the magazine, or more
information about it, can be found? Any info. would be much appreciated.
 
                                - Misha
                                  misha@cam.org
=========================================================================
Date:         Tue, 30 Nov 1993 08:25:25 EST
Reply-To:     "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
Sender:       "Publishing E-Journals : Publishing, Archiving,
              and Access" <vpiej-l@vtvm1.bitnet>
From:         IAN.WORTHINGTON@classics.utas.edu.au
Subject:      *ELECTRONIC ANTIQUITY* 1,6
 
As a subscriber to *Electronic Antiquity* you are now being contacted to
let you know that this month's issue (Volume 1 Issue 6) is now available.
A list of contents and access instructions follow.
 
*ELECTRONIC ANTIQUITY:
COMMUNICATING THE CLASSICS*
 
ISSN 1320-3606
 
Peter Toohey (Founding Editor)
Ian Worthington (Editor)
 
VOL. 1 ISSUE 6 - NOVEMBER 1993
 
(01) LIST OF CONTENTS
 
(02) FEATURES
 
Dale, Peter, 'The Voice of Cicadas: Linguistic Uniqueness,
        Tsunoda Tadanobu's Theory of the Japanese Brain, and
        Some Classical Perspectives'
Farrell, Joseph, 'Allusions, Delusions and Confusions: A Reply'
Keen, Antony G., 'Grain for Athens.  Notes on the Importance
        of the Hellespontine Route in Athenian Foreign Policy
        before the Peloponnesian War'
 
(03) OPINIONS
 
Goetsch, Sallie R., 'Training the Tragic Actor'
Slater, Niall, 'The Greek Project: Review of Aeschylus,
        *Agamemnon* and Sophocles, *Electra*'
 
(04) *DIDASKALIA: ANCIENT THEATER TODAY*
        Announcing a new electronic venture for ancient theatre
 
(05) EMPLOYMENT
 
Australia:
Ancient Historian (2 posts): Macquarie University
Research Fellow in Ancient History: Macquarie University
 
New Zealand:
Classicist: Victoria University of Wellington
 
U.S.A.:
Classicist: Rutgers University
 
(06) KEEPING IN TOUCH
 
Classics in Canada
 
Conference:
Ancient Ceremony and Spectacle,
        UNC, Chapel Hill (call for papers)
 
Conference:
Gender and Material Culture,
        University of Exeter (call for papers)
 
Conference:
Israel Society for the Promotion of Classical Studies
        Ben Gurion University (call for papers)
 
Conference:
Kingship and the Origins of Power in Greek Society,
        University of Texas, Austin (programme)
 
Conference:
Pilgrims and Travellers to the Holy Land,
        Creighton University, Nebraska (call for papers)
 
Conference:
Women and Slaves in Classical Culture, A.P.A./A.I.A
        Panel Discussion 1994 (call for papers)
 
Electronic Forums & Repositories for the Classics
        by Ian Worthington
 
(07) GUIDELINES FOR CONTRIBUTORS
 
*Electronic Antiquity* Vol. 1 Issue 6 - November 1993
edited by Peter Toohey and Ian Worthington
antiquity-editor@classics.utas.edu.au
ISSN 1320-3606
 
------------------------
 
A general announcement (aimed at non-subscribers) that
the journal is available will be made in approximately 12
hours' time over the lists - as a subscriber you will
automatically be contacted in advance when future issues
are available.
 
Access is via gopher or ftp (instructions below).
 
Volume 1 Issue 7 will be published in December 1993.
 
The editors welcome contributions.
 
HOW TO ACCESS
 
Access is via gopher or ftp.
The journal file name of this issue is 1,6-November1993;
Volume 1 Issues 1-5 may also be accessed in the same way.
 
GOPHER:
 
-- info.utas.edu.au and through gopher:
-- open top level document called Publications
-- open Electronic Antiquity.
-- open 1,6-November1993
-- open (01)contents first for list of contents, then other files as appropriate
 
FTP:
 
-- FTP.utas.edu.au (or ftp.info.utas.edu.au)
        --> departments --> classics --> antiquity.
-- In Antiquity you will see the files as described above.
 
Since a few people had problems accessing the journal via ftp,
here are the stages in more detail:
 
at your system prompt: FTP
at the subsequent prompt: open FTP.utas.edu.au
at login prompt: anonymous
at password: your username (which won't show)
then: cd departments
then: cd classics
then: cd antiquity
then: ls -l
then: cd 1,6-November1993
then: ls -l
   You will now have a list of the various directories (the 'd'
   beginning each line 'drwx....' indicates you're dealing with
   a directory)
then: cd (into whichever directory you want)
then: ls -l
   If the first character in the line is not 'd', you've got a file.
   Use the 'get' command plus the file name to download.  If you're
   still in a directory, use the 'ls -l' command to list its contents.
        Use 'get' to transfer files.
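For anyone who would rather script the retrieval than type the commands
by hand, the same session can be sketched with Python's ftplib. This is
a modern convenience offered here only as an illustration: the host
ftp.utas.edu.au and the directory names are taken from the instructions
above, and whether that server still answers is not guaranteed; the
function and file names are the author's own.

```python
# Sketch of the anonymous-FTP session described above, using ftplib.
# Host and directory names come from the announcement; the server may
# no longer be reachable, so treat this as illustrative only.
from ftplib import FTP

def antiquity_path(issue="1,6-November1993", section="(01)Contents"):
    """Build the directory path to one section of an issue,
    mirroring the cd steps in the instructions."""
    return "/".join(["departments", "classics", "antiquity", issue, section])

def fetch_contents(host="ftp.utas.edu.au", out="contents"):
    ftp = FTP(host)
    ftp.login()                        # anonymous login, as above
    ftp.cwd(antiquity_path())          # one cwd replaces the chain of cd's
    with open(out, "wb") as f:
        ftp.retrbinary("RETR contents", f.write)   # the 'get contents' step
    ftp.quit()
```

Passing the whole path to a single cwd also sidesteps the stray-space
problem with the parenthesised directory names noted below.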
 
To move back up the directory tree:
 
type: cdup
then: ls -l
 
And repeat the process.
 
If you are still having trouble, then once you have the directory list
for the journal:
 
Type (for example)       cd (01)Contents
Your response should be 'CWD command successful', but no list.
Type                     ls -l
Your response should be in a form such as:
-rw-rw-r--  1  1689  77030  Nov 30 16:40  contents
Type  get contents
and you should have a copy.
 
A final alternative, if a space is magically inserted in the parentheses
of the file number, is to specify:
 
CD ./(01)Contents
 
Please also be very careful when ftping *not* to leave *any* spaces
in file names or make typos.
 
Do NOT use Telnet.
 
The best way to access the journal (in terms of both ease and
time) is by gopher, and we would urge you to do so.  The
structure of the journal is also more easily recognisable on gopher.
 
Please try to access *here* in Tasmania either during the night,
very early morning or at weekends, since during the business
day the lines are crammed.  This means you'll need to check
with (e.g.) the international operator for the right time difference,
but at the moment (the following is not an exhaustive list)
Britain is 11 hours behind Tasmania; Europe, west
to east, 10-8 hours; East Coast U.S.A. 16 hours; West
Coast U.S.A. 19 hours; South America, coastal to eastern,
15-17 hours; South Africa 10 hours; Singapore 3 hours;
and Japan 3 hours.
 
Queries and contributions may be directed to the editors at:
antiquity-editor@classics.utas.edu.au
 
Peter Toohey (ptoohey@metz.une.edu.au)
Ian Worthington (ian.worthington@classics.utas.edu.au)
 
(end)
---------
Ian Worthington,
Department of Classics,
University of Tasmania,
Hobart, Tasmania 7001,
Australia.
Tel. (002) 202-294 (direct)
Fax (002) 202-288
e-mail:  Ian.Worthington@classics.utas.edu.au
