The University of Virginia Library's Experiment with Benchmarking
by Lynda S. White
Benchmarking is an ongoing, systematic process for measuring and comparing the work processes of one organization to those of others that exhibit functional "best practices." The goal is to provide an external standard for measuring the quality and cost of internal processes, and to help identify opportunities for improvement. Benchmarking analyzes data collected over time; it is a learning process that helps institutions discover how best to improve the services, direct or indirect, they offer to their customers.
For the 1998/2000 biennium, the University of Virginia Library chose as one of its goals to institute benchmarking as a tool for the analysis of internal processes and to establish "benchmarks" against which we could measure those processes.
Choosing the Project
The benchmarking pilot project had been chosen collaboratively by our managers after review and discussion of the results of several user surveys conducted in the spring of 1998. The library had just completed its triennial comprehensive student survey of library services, as well as two SERVQUAL surveys, one in the main library and one in the fine arts library. Both the service ratings and the comments from respondents pointed to our reshelving process as an area needing improvement.
For various reasons, patrons sometimes had difficulty finding books in the stacks. There was some indication that they failed to notice the online catalog note explaining that the book they wanted was checked out. In addition, our online catalog indicates that a book is on the shelf the moment it is checked in, so books listed as available in the catalog might not yet be on the shelf. Often patrons simply did not understand how the call number system worked.1 It was also true that books left unshelved by other patrons were not picked up and reshelved in a timely manner. Others languished in sorting areas or on parked trucks. Some books, of course, were lost or misshelved. To address some of these causes, shelving was chosen as a pilot for learning the benchmarking process. For our project, we looked specifically at how long it took for a book to reach its shelf after it had been discharged at any library at the University and at how accurately it was shelved.
Team members were chosen by Management Information Services2 staff partly by considering staff members who had similar experience on a previous process-improvement team. We also looked for representation from several departments and service units. Two team members were from Management Information Services to provide statistical skills and continuity for future benchmarking projects. Additional team members were drawn from the Cataloging Department, the Science/Engineering Library, Social Sciences Services, and the stacks staff of the main and music libraries, for a total of seven team members. It is crucial to have on the team staff who work in the area to be studied.
This first benchmarking team was created in January 1999. The team had two challenges: create a benchmarking process for the library and carry out a short-term benchmarking project as a pilot.
The charge read in part:3
"The project this Team is charged to undertake is benchmarking the shelving/reshelving process in all University Library service units. The project should include these processes:
- map and measure the current process in each library
- determine benchmarking partners with best practices for the shelving process
- communicate with those partners about their process
- compare their practices with the University Library's process
- recommend improvements in current practices based on the best practices of our benchmarking partners."
The Learning Curve
Benchmarking is a process improvement tool that has been used by the business community for over a decade, but it has only recently migrated to the non-business academic community. This made it difficult to find both information about the process as it relates to libraries and information about other benchmarking projects on the shelving process specifically.
We began our task by identifying books, articles, and Internet resources on benchmarking in business, libraries, and the military. There was some literature on benchmarking specifically relating to libraries, but details on how to carry out the process were generally lacking. In addition, we could not determine that there was any training available locally through the University or any of its schools and departments, or through the Association of Research Libraries. There were, in essence, no resources other than our own research and reading for learning the process.
A query regarding benchmarking made to the LARGE_PSD (Public Services Directors) listserv brought a response from Sally Kalin, Pennsylvania State University Associate Dean of University Libraries. She graciously consented to spend some time on the phone explaining the process and to send a packet of information on the benchmarking projects she had participated in at Penn State. Her enthusiasm for the process was contagious. Our team spent several weeks reading books and articles on benchmarking; fortunately, after this short time our readings became repetitive, a sign that we had covered the essentials.
The basic benchmarking process is straightforward:
- Determine what to benchmark.
- Form a benchmarking team.
- Identify benchmarking partners.
- Collect, analyze, and compare benchmarking information from your own institution and from your partners.
- Take action; implement the new plan.4
We had already accomplished steps 1 and 2 by deciding the topic, having the team in place, and learning the process. But we had some difficulty getting to steps 3 and 4. Just what are the critical factors for successful reshelving? What points need to be measured and compared? How does one find other libraries with best practices that would be willing to be benchmarking partners?
As we had only a semester to complete the project, we undertook steps 3 and 4 of the process simultaneously. We needed to learn more about our own shelving process in each library in the system. Since there were minimal data available on our shelving process, we began to flowchart the process in the eleven libraries at UVa and in the Government Documents department. We also began work on a survey instrument that would help us gather data about the process as practiced at the University of Virginia Library. We tested this questionnaire by interviewing a few stacks supervisors. The outcome of the test was messy at best, and it was necessary to revise the questionnaire several times to garner more usable answers (see Appendix 2). We learned how each location shelved its books and journals as well as some of the factors that contributed to the shelving process, such as training, number and level of employees, pay rates, LEO (on-campus) delivery, new book routines, pick-up routines, sorting areas, etc. We discovered that many of our libraries already shelved excellently, finishing by the end of each day.
Another component of the project was to identify institutions that exhibited best practices for shelving. The literature on the shelving process was as sparse as the literature on benchmarking in libraries. Unlike the business world, there was no place to go to determine which library had best practices in shelving; there were no Malcolm Baldrige Award winners among libraries. It should be noted that we also considered comparing our process to similar processes in the business world, such as stocking grocery or shoe store shelves, or refiling videos at a video rental store. We learned quickly that businesses are not as willing as libraries to share information about how they operate; the library culture, on the contrary, assumes the sharing of information.
Rather than rely on a sparse literature, we began querying two electronic listservs (LARGE_PSD and CollDev) as an alternative way to establish which institutions had best practices for shelving. We initially asked whether those on the listserv would be willing to participate in a brief survey. The 19 institutions that responded were sent a short survey (Appendix 3) devised to ferret out best practices at institutions similar to the University of Virginia Library. Thirteen institutions responded over the next two months, revealing much interesting data about shelving standards, staff sizes, and other resources.
The data from three institutions suggested that their operations constituted best practices for library shelving. Two of these institutions were chosen because of their reports of a remarkable 4- or 5-hour turnaround time, 94%+ accuracy rates, and previously completed shelving studies. Cost was also a factor in the choices. Although the business literature insists that cost should not be a factor in choosing a site to visit, we were fortunate to have a best practices site within a three-hour drive of Charlottesville.
Communicating the process to staff and stakeholders is a key element of benchmarking. Our contact at Penn State assisted us in finding a benchmarking consultant, Gloriana St. Clair, director of the libraries at Carnegie Mellon University, who could help us with this. She graciously consented, on very short notice, to present basic benchmarking information to the entire library staff. She also assisted the team in revising the local-practices questionnaire and in deciding which of the institutions exhibiting best practices it would be best to visit. She suggested that the team was moving toward its objective at a good pace in spite of our reservations about our lack of training in the benchmarking process, and that we needed to "just get on with it."
We began planning for site visits to the University of Arizona in Tucson and to Virginia Polytechnic Institute and State University in Blacksburg. The site visits were essential for understanding how the best practices really worked. There is no substitute for walking through a process and having an opportunity to ask questions along the way. In addition, the host libraries were asked to fill out the same survey that had been completed by our own stacks staff. This allowed us to identify procedures that were alike and different, and thus point to how our process could be improved. We were also fortunate to be very graciously received by staff of both institutions. Often institutions that have been identified as having best practices are inundated with requests for benchmarking data and site visits. Over time, these institutions become less cooperative since the benchmarking process requires work on their part as well. We felt privileged that our partners were so willing to help us.
While two team members and our AUL for User Services conducted the out-of-state site visit, the other five team members measured several things for which we had no data: how much we shelved (number of books and journals), what our turnaround time was (from return desk to shelf), how accurately we shelved, and what the turnaround time was for pick-ups.5 Our MIS programmer developed the protocol and ran the reports against our Sirsi database on which the studies were based. For example, for one measurement we produced a list of call numbers of books checked in on a particular day for each of our larger libraries: the main library, science/engineering, the undergraduate library, fine arts, and government documents. These were libraries where we were not sure how long it took to shelve the books. The lists were checked each day until all (or nearly all) the books were found in the right place. Notations were made about when each book was found and whether it was in the correct place. We found that turnaround time ranged from 1.3 to over 5 days; some items were never found shelved during the study. For this study, team members carried out the measurement in libraries other than their home libraries in order to avoid influencing the results.
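The arithmetic behind this daily check-list measurement is simple enough to sketch in a few lines of Python. The records below are purely hypothetical illustrations (our actual data came from Sirsi circulation reports); each holds a call number, the check-in time, the time the book was found on the shelf (None if never found during the study), and whether it was in the correct place:

```python
from datetime import datetime

# Hypothetical records (illustrative only): call number, check-in time,
# time the book was found on the shelf (None if never found during the
# study), and whether it was in the correct place when found.
records = [
    ("PS3545 .H16", datetime(1999, 4, 5, 9, 0), datetime(1999, 4, 6, 14, 0), True),
    ("QA76 .S65", datetime(1999, 4, 5, 9, 30), datetime(1999, 4, 10, 11, 0), False),
    ("ND553 .M7", datetime(1999, 4, 5, 10, 0), None, False),
]

# Turnaround in days for items that were eventually found.
found = [(r[2] - r[1]).total_seconds() / 86400 for r in records if r[2] is not None]
avg_days = sum(found) / len(found)

# Accuracy: of the items found, how many were in the correct place.
correct = sum(1 for r in records if r[2] is not None and r[3])
accuracy = correct / len(found)

print(f"average turnaround: {avg_days:.1f} days")
print(f"shelving accuracy among found items: {accuracy:.0%}")
print(f"not found during study: {sum(1 for r in records if r[2] is None)}")
```

Run over a full day's check-ins for each library, this yields exactly the figures reported above: an average turnaround per library and the share of items shelved correctly.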
By the time we finished these projects and the internal questionnaire, we had enough information about our own process to compare ourselves to other institutions that shelved more quickly and accurately than we did. From the surveys of other libraries and from the site visits, we had information on exactly how libraries with best practices performed reshelving and with what resources.
Communication with Staff
Communication is a critical part of the benchmarking process if the rest of the staff is going to accept the concept of benchmarking and the idea that a process they have been performing for years can be done better. It is otherwise difficult to counter the "we've always done it that way" objection to changing processes. At various points during the project, the team apprised staff and stakeholders of progress by:
- making direct contact with other library stacks supervisors both to gather information and to let them know about the process and how the project was progressing.
- inviting our consultant to present information on benchmarking to the entire staff.
- sending an interim report mid-way through the project to our staff e-mail list.
- posting the final report on the staff Web site.
- creating a working group of all shelving supervisors to exchange ideas, solve problems, and train new shelvers and shelving supervisors.
It is important not to underestimate this part of benchmarking. It is a part that we could have done much better.
Using and comparing data from the questionnaire, the best practices email survey, the site visit reports, and our own local measurements, the team was able to develop recommendations for changes in the shelving process at the University of Virginia. A report on the project, with recommendations for action, was submitted to the Library's Administrative Council for approval by early June 1999. It is currently available at http://www.lib.virginia.edu/mis/benchmarking.
The last critical step of benchmarking entails using the data collected to make changes in the chosen process. Lack of follow-through will lead to skepticism on the part of the staff about the benchmarking process and will mean that a considerable amount of time and effort has been wasted.
In the business literature, the reader is often admonished to adopt, not adapt, the best practices process of another business. Once we decided which model we wanted to emulate, however, we found that we could not exactly duplicate the process in all of our libraries simply because of the configuration of physical space that we could not alter. We also decided that it was not necessary to implement the full process in the smaller libraries. These libraries had neither the need nor the resources (staff and funding) to change their processes; they simply did not shelve very many books and typically did so as the books were turned in.
The cornerstone of the new process was to reduce the number of times the book is touched after it is returned. All but one of our libraries could institute two-touch shelving. The book is "touched" when it is checked in and placed, in order, on a book truck at the desk; it is "touched" again when it is placed on the shelf. There is no batching or waiting until a book truck is filled. Once we learned to do two-touch shelving, most of our sorting areas were turned into much needed regular shelving. We also venture that simply paying attention to shelving helped change the perception of its importance and thus how well it is done. It is now recognized as an integral part of delivering materials to users, of customer service, and as such it needs to be done quickly and accurately. The critical point is to ensure that users are accurately directed to materials by the online catalog.
The new process was implemented in five shelving units over two years. While sample measurements were taken periodically in the smaller libraries, these five largest libraries were provided funding to hire additional shelvers and to measure both the turnaround time and accuracy of each book truck. The average turnaround time for each truck was reduced to about four to eight hours (except, of course, at the end of each semester). This regular assessment is part of the new process, so much so that shelving speed and accuracy became one of the metrics for the Library's Balanced Scorecard, a management and assessment technique designed to provide a view of an organization from four perspectives: user, internal processes, financial, and future/learning potential.6
During these two years it became apparent that the original assessment method took a considerable amount of time and effort. With the state-wide budget cuts, it became imperative to reduce the cost of this program while still providing the improved service to our users. One way to do this is to sample turnaround time and accuracy weekly rather than try to assess, and record data for, each book truck. In addition, we relaxed our turnaround-time goal from 4 hours to 24 hours, still an improvement over the time we measured during our benchmarking study.
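The weekly sampling approach can be sketched as follows; the truck counts and turnaround times here are invented for illustration, and the confidence interval uses a simple normal approximation rather than any method the Library actually specified:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical turnaround times (hours) for every truck shelved in a week.
# In practice only the sampled trucks would actually be timed.
all_trucks = [random.uniform(2, 30) for _ in range(120)]

sample = random.sample(all_trucks, 12)  # time roughly 10% of the trucks
mean = statistics.mean(sample)
sd = statistics.stdev(sample)
# Rough 95% confidence interval for the mean (normal approximation).
half_width = 1.96 * sd / (len(sample) ** 0.5)

print(f"estimated mean turnaround: {mean:.1f} +/- {half_width:.1f} hours")
print("meets 24-hour goal" if mean + half_width <= 24 else "review full data")
```

The design trade-off is the usual one in sampling: timing a dozen trucks costs a fraction of the effort of timing every truck, at the price of a confidence interval around the estimate rather than an exact figure.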
Benchmarking can be a powerful tool for assessing and changing methods and procedures. Learning how our colleagues elsewhere manage similar processes is in itself a fascinating process.
Appendix 1: Charge to the Benchmarking Team
January 22, 1999
Thank you all for agreeing to serve on the University Library's first Benchmarking Team. The purpose of this Team is to fulfill goal 6f of the library's priorities for 1998-2000: Develop and implement performance standards and benchmarks for selected library services.
Benchmarking is an ongoing, systematic process for measuring and comparing the work processes of one organization to those of others that exhibit functional "best practices." The goal is to provide an external standard for measuring the quality and cost of internal processes, and to help identify where there may be opportunities for improvement. Benchmarking should be integrated into operations throughout the organization and should be an ongoing process that analyzes data collected over time. The Library's priorities indicate that it is time to embark on this learning process in order to discover how we can best improve the services, direct or indirect, that we offer to our patrons.
To accomplish this the Team is charged with learning the benchmarking process and applying it to a specific project. The intent is that the members of this Team become the Library's core staff with knowledge of benchmarking. After learning the process, the Team members should be able to:
- assist other groups with their benchmarking projects
- assist in developing benchmarking expertise among other staff members, for example, by participating in a training program.
Each May the membership of the Team will be reviewed. Those who want to remain on the Team will be joined by new members so that the Benchmarking Team can be a constantly renewed central group of experts in the process. New projects will be determined at the same time that membership is reviewed.
The project this Team is charged to undertake is benchmarking the shelving/reshelving process in all University Library service units. The project should include these processes:
- map and measure the current process in each library
- determine benchmarking partners with best practices for the shelving process
- communicate with those partners about their process
- compare their practices with the University Library's process
- recommend improvements in current practices based on the best practices of our benchmarking partners.
Additional staff from stacks operations should be invited to join the Team during this initial benchmarking process. Recommendations should be ready by May 15, 1999.
Appendix 2: Internal Shelving Questionnaire
- If you have a flowchart of your shelving process, please attach a copy.
- How many times is a book handled from the time it is returned to your library until it is shelved? How many additional times is it handled if it is returned to a different library?
- How much time does it take for an item to get from the return desk in your library to the shelf?
- Do you have standards for shelving quantity and quality? If so, what are they? How were they measured? For example: number of books shelved per hour; percentage of errors/accuracy rate.
- How often are book return drops cleared, including the drop at the desk?
- How often are tables, photocopiers, carrels, shelves, etc. cleared of books left by patrons? What is the process for reshelving the items collected?
- What additional steps, if any, are involved in processing new books for shelving?
- Describe each step of your book sorting process, from check-in to final sorting onto trucks before shelving, and who performs it.
- If you have set due dates, how do you handle the massive returns at those dates?
- Are there areas in which it is difficult to shelve? How do you manage shelving in those areas?
- Are the stacks shelf-read?
- Who normally does the shelving? [See chart below.*]
- Do shelvers have a fixed work schedule?
- How is shelving assigned and supervised?
- What other tasks do shelvers perform in the stacks besides shelving?
- Do shelvers have work assignments outside of the stacks?
- Describe the process of training shelvers and who does the training.
Appendix 3: Questionnaire for Respondents to Listserv Query
- Do you have standards for reshelving already in place? In particular, standards for books shelved per hour, percent shelved without error, turnaround time from check-in to shelf.
- Have you done a shelving study of any kind? If so, what was the focus?
- How many branch libraries are on your campus?
- How many of these branches fall into the 75,000-300,000 volume range?
- How many volumes are in your main library?
- How many items were returned from circulation to your main library last year (1997/98)?
- How many new books were added to your main library collection last year?
- How many items were picked up around the library (from the floor, photocopiers, tables, etc.) and re-shelved last year?
- What is the frequency of these pick-ups (daily, each shift, etc)?
- How many searches for missing materials were requested by patrons at your main library?
- How many FTE stacks employees work in your main library?
- How many FTE student shelvers work in your main library?
- How do you manage massive returns at the end of a term or academic year?
- Would you be willing to host a site visit by a small team from the University of Virginia? (I can provide some details of what we have in mind if you would like, but this would basically be an opportunity to share information face-to-face.)
A few newer items have been added to our original bibliography below. There are now many more publications on both benchmarking and shelving, including many dating from the time UVa was conducting its project that have since been indexed. A search of Library Literature for both "benchmarking" and "shelving" will yield a more abundant harvest. The Spendolini volume was the one we relied upon most for the process.
Allan, Ferne C., "Benchmarking: practical aspects for information professionals," Special Libraries, v. 84, no. 3 (summer 1993), p. 123-129.
Alstete, Jeffrey W., Benchmarking in higher education: adapting best practices to improve quality, Washington: George Washington University Graduate School of Education and Human Development, 1996.
Anderson, Dawn R., "Method without madness: shelf-reading methods and project management at the University of Illinois," College & Undergraduate Libraries v. 5, no. 1 (1998), p. 1-13.
Buchanan, Holly Shipp, and Joanne Marshall, "Benchmarking reference services: an introduction, its use in TQM programs in health sciences libraries in US and Canada," Medical Reference Services Quarterly, v. 14 (fall 1995), p. 59-73.
Buchanan, Holly S. and Joanne Marshall, "Benchmarking reference services: step-by-step," Medical Reference Services Quarterly, v. 15, no. 1 (spring 1996), p. 1-13.
Camp, Robert C., Benchmarking: the search for industry best practices that lead to superior performance, Milwaukee: ASQC Press, 1989.
Coult, Graham, "Measuring up to the competition," The Library Association Record, v. 98, no. 9 (September 1996), p. 471.
Davis, Robert I., and Roxy A. Davis, How to Prepare for And Conduct a Benchmark Project. The Electronic College of Process Innovation from the Office of the Assistant Secretary of Defense (Command, Control, Communications & Intelligence). Department of Defense, 7/15/94, http://www.c3i.osd.mil/bpr/bprcd/0135.htm.
Edwardy, Jeffrey M., and Jeffrey S. Pontius, "Monitoring book reshelving in libraries using statistical sampling and control charts," Library Resources & Technical Services, v. 45, no. 2 (April 2001), p. 90-4.
Egghe, L., "The amount of actions needed for shelving and reshelving," Library Management, v. 17, no. 1 (1996), p. 18-24.
Finnigan, Jerome P., The manager's guide to benchmarking, San Francisco: Jossey-Bass Publishers, 1996.
Garrod, Penny. "Benchmarking development needs in the LIS sector," Journal of Information Science, v. 23, no. 2 (1997), p. 111-18.
Gohlke, Annette, "Benchmarking Basics for Librarians," Military Librarians Workshop, Dayton, Ohio, Nov., 1997.
Gohlke, Annette, "Benchmarking for strategic performance improvement," Information Outlook (August 1997), p. 22-24.
Hanson, Emil O., "In search of the benchmark institution," College & University, v. 70, no. 3 (spring 1995), p. 14-19.
Henczel, Susan, "Benchmarking: measuring and comparing for continuous improvement," Information Outlook, v. 6, no. 7 (July 2002), p. 12-20.
Johnson, Mary. "Benchmarking an automated circulation control system," in ASIS '85, Knowledge Industry Publs., 1985.
Jurow, Susan, et al, Benchmarking Interlibrary Loan: a pilot project. OMS Occasional Paper #18. Washington: Association of Research Libraries, 1995.
_____ et al, "Tools for measuring and improving performance," Journal of Library Administration, v. 18, nos. 1-2 (1993), p. 113-126.
Kaplan, Nancy, "Performance Measures as Incentives for Redesigning Access and Delivery Services" in Association of Research Libraries Proceedings of the 125th Membership Meeting, October 1994.
Kaufman, Roger and William Stuart, "Beyond conventional benchmarking: integrating ideal visions, strategic planning, reengineering, and quality management," Educational Technology, v. 35, no. 3 (May-June 1995), p. 11-14.
Kelly, Anthony, Benchmarking for school improvement: a practical guide for comparing and improving effectiveness. London: RoutledgeFalmer, 2001.
Kendrick, Curtis L., "Performance measures of shelving accuracy at SUNY, Stony Brook," The Journal of Academic Librarianship, v. 17 (March 1991), p. 16-18.
Lawes, Ann, "The benefits of quality management to the library and information services profession," Special Libraries, v. 84, no. 3 (summer 1993), p. 142-146.
Library Benchmarking International, http://www.librarybenchmarking.com/index.php3.
Meyerson, Joel W., editor, New thinking on higher education: creating a context for change. Bolton, MA: Anker Publishing Co., 1998.
NACUBO Benchmarking Project, http://www.nd.edu/~spbusop/benchmrk/nacubo1.htm.
Northern Territory University Library, "Publications and Projects-Benchmarking," http://www.ntu.edu.au/library/pubbench1.html.
O'Neil, Rosanna M., Total quality management in libraries: a sourcebook, Englewood, CO: Libraries Unlimited, 1994.
Pedersen, Wayne A., "Statistical measures for shelf reading in an academic health sciences center library," Bulletin of the Medical Library Association, v. 77 (April 1989) p. 219-22.
Poling, Nikki, "Ahead or behind the curve: library benchmarking; responses from three special librarians," Information Outlook, v. 6, no. 7 (July 2002), p. 22-5.
Pritchard, Sarah M., "Library benchmarking: old wine in new bottles?" The Journal of Academic Librarianship, v. 21, no. 6 (November 1995), p. 491-495.
Robertson, Margaret, and Isabella Trahn, "Benchmarking academic libraries: an Australian case study," Australian Academic & Research Libraries, v. 28, no. 2 (June 1997), p. 126-41.
Schabo, Pat, and Diana Breuer Baculis, "Speed and accuracy for shelving: Cedar Rapids Public Library has developed a shelving method and a motivation plan for part-time employees," Library Journal, v. 114 (Oct. 1, 1989), p. 67-8.
Schwartz, Charles A., ed., Restructuring academic libraries: organizational development in the wake of technological change. Chicago: Association of College and Research Libraries, 1997.
Sharp, S. Celine. "A library shelver's performance evaluation as it relates to reshelving accuracy. Brigham Young University study, 1988-89." Collection Management v. 17, no. 1-2 (1992), p. 177-92.
Shaughnessy, Thomas W., "Benchmarking, total quality management, and libraries." Library Management & Administration, v. 7, no. 1 (winter, 1993), p. 7-12.
Sheather, Graeme, and Tony Nolan, "Solving reshelving backlogs in a university library: a case study using an interactive problem-solving technique with a TQM application at the University of Technology, Sydney," Australian Library Journal, v. 44 (February 1995), p. 27-46.
Smith, Margo, and Melissa A. Laning, "Zen and the art of stacks maintenance: rethinking an ancient practice at the University of Louisville," The Southeastern Librarian, v. 49, no. 3/4 (fall/winter 2001), p. 15-18.
Spendolini, Michael J., The Benchmarking Book. NY: Amacom, 1992.
St. Clair, Gloriana, "Benchmarking and restructuring at Penn State Libraries," in Restructuring Academic Libraries, edited by Charles A. Schwartz, Chicago: Association of College and Research Libraries, 1997.
St. Clair, Guy, "Benchmarking, total quality management, and the learning organization: new management paradigms for the information environment, introduction," Special Libraries, v. 84, no. 3 (summer 1993), p. 120-122.
_____, "Benchmarking, total quality management, and the learning organization: new management paradigms for the information environment, a selected bibliography," Special Libraries, v. 84, no. 3 (summer 1993), p. 155-157.
_____, "The future challenge: management and measurement," Special Libraries, v. 84, no. 3 (summer 1993), p. 151-154.
Stuart, Crit, and Miriam Drake, "TQM in research libraries," Special Libraries, v. 84, no. 3 (summer 1993), p. 131-136.
Tucker, Sue, Benchmarking, a guide for educators. Thousand Oaks, CA: Corwin Press, 1996.
White, Lynda S., "Report on Shelving Process," University of Virginia, May 1999, http://www.lib.virginia.edu/mis/benchmarking/bench-shortrpt.html.
_____, "Report on Benchmarking Process," University of Virginia, May 1999, http://www.lib.virginia.edu/mis/benchmarking/bench-ProcessRept.html.
1 We have since discovered that this is true about 50% of the time. Books for which we receive search requests are found 50.4% of the time in the correct place on the shelf within 24 hours of the request.
2 Management Information Services conducts assessments (surveys, focus groups, etc.) and gathers data for Library management to assist them in making decisions on programs and resources.
3 See Appendix 1 for entire charge.
4 The number of steps varies considerably in the literature, depending on how finely they are broken down.
5 Pick-ups are books that are "picked up" and reshelved by staff after patrons have used them in the library and left them lying on tables, chairs, shelves, photocopiers, sorting areas, reshelving areas, the floor, etc.
6 For more information on our Balanced Scorecard project see http://www.lib.virginia.edu/bsc/metrics.html.