by Cris Ferguson (Director of Technical Services, 222 Waterfield Library, Murray State University, Murray, KY 42071; Phone: 270-809-5607) <[email protected]>
A library considers a myriad of factors when undertaking a monographic deselection project. The need for space, institutional priorities, and the obsolescence of materials all play a role in determining what and how much to remove from the collection. Whether items are being withdrawn or simply stored in an off-site facility, the criteria factoring into the decision as to whether to keep a particular item could include circulation and in-library use data; reviews and authoritative title lists; availability of the title in eBook archives like the HathiTrust; how widely (or scarcely) the title is held at other libraries; and the availability of the item through interlibrary loan or possibly a shared print archive.
Given that much of this information is freely available, it is not surprising that many libraries opt to gather data for deselection projects on their own, pulling circulation data from their OPACs, searching WorldCat for holdings at other libraries, examining reviews, and investigating online availability for titles under consideration for weeding. However, compiling data from these disparate sources into a single interface and generating functional reports requires a significant investment of time and staff effort. I would argue that this manual investigation is often inadequate and that the cost in staff time is simply too high.
A rules-based approach to weeding that utilizes a collection analysis tool offers a practical alternative to this time-consuming investigation and title-by-title analysis. Collection analysis tools bring several data points together under one umbrella, streamlining data gathering and simplifying the analysis process, which provides tangible benefits for a library. Establishing rules-based weeding criteria reduces the subjectivity of collection analysis and speeds up the deselection process. Overall, this approach is more time-efficient, expedites overlap and gap analysis within the collection, and facilitates batch processing of both records and materials.
Some examples of collection analysis tools available, both commercial and open source, include OCLC’s WorldShare Collection Evaluation (formerly known as WorldCat Collection Analysis), Sustainable Collection Services, Bowker’s Book Analysis System, Intota Assessment, GIST Gift and Deselection Manager, INN-Reach Union Catalog, and ProQuest Title Matching Fast. While these systems vary in their services and functionality, each is designed to help librarians assess their holdings for both collection development and deselection purposes.1
It is important to note that, as a profession, we are collectively responsible for archiving materials and preserving access to information, but, with improved print resource sharing and online availability, it is not necessary that every library retain a copy of every book. To ensure that libraries are not all weeding the same titles and that content is archived both in print and electronically, it is necessary to compare our holdings with those of other libraries and repositories and to analyze where there are overlaps or gaps in our collections. We must have some knowledge of what is held by peer institutions, what may be available in shared print archives, and what is available electronically before we can decide what we can remove from our own collections.
While we can certainly export our holdings and the accompanying circulation statistics from our catalogs, the work involved in aggregating our own data with data from other libraries, WorldCat, or the HathiTrust is not insignificant, requiring batch processes for both retrieval and matching.2 In a 2014 article in the Journal of Library Administration, George Machovec points out that, while manual comparison of title lists is certainly possible, it requires substantial effort and time.3 Machovec goes on to say, “Except for projects that are small in scope, it is worthwhile investigating commercial and open source tools for monographic and serial overlap and gap analysis.”4 Collection analysis tools are specifically designed to help libraries navigate this type of large-scale analysis by normalizing data, matching data points, and producing institution-level reports.
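As a rough illustration only (not a depiction of any vendor's actual matching process), the core of overlap and gap analysis reduces to set operations on normalized identifiers such as OCLC numbers. All of the identifiers and holdings in this sketch are invented:

```python
# Illustrative overlap/gap analysis using set operations on OCLC numbers.
# Every identifier and holding below is invented for demonstration.

our_holdings = {"ocn100", "ocn101", "ocn102", "ocn103"}
peer_holdings = {
    "Library A": {"ocn100", "ocn102", "ocn200"},
    "Library B": {"ocn100", "ocn103", "ocn201"},
}

# Everything held by at least one peer library
peer_union = set().union(*peer_holdings.values())

overlap = our_holdings & peer_union  # titles also held by a peer
unique = our_holdings - peer_union   # titles only we hold (weed with caution)
gaps = peer_union - our_holdings     # titles peers hold that we lack

print(sorted(overlap))  # ['ocn100', 'ocn102', 'ocn103']
print(sorted(unique))   # ['ocn101']
print(sorted(gaps))     # ['ocn200', 'ocn201']
```

In practice the hard work lies upstream of these set operations, in normalizing records and resolving the same title to the same identifier across catalogs, which is precisely what the tools discussed above automate.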
One of the challenges libraries face in the deselection process is establishing objective measures for making reasonable and unbiased decisions. A rules-based approach to deselection used in conjunction with a collection analysis tool can streamline the decision-making process. When using a rules-based approach to deselection, libraries “define categories of books that could be withdrawn without title-by-title review, enabling a batch approach to some weeding decisions.”5 Establishing and adhering to clearly defined rules for what should be weeded and what should be kept reduces, if not eliminates, time-consuming title-by-title analysis.
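A rules-based batch decision is, at bottom, a filter applied uniformly to title records. The sketch below shows one way such a filter might look; every field name and threshold here is a hypothetical example, not a standard or any library's actual policy:

```python
# Minimal sketch of rules-based deselection as a batch filter.
# Fields and thresholds are hypothetical examples only.
from dataclasses import dataclass

@dataclass
class Title:
    oclc: str
    circ_count: int       # total recorded checkouts
    last_circ_year: int   # year of most recent circulation
    us_holdings: int      # holding libraries reported in WorldCat
    in_hathitrust: bool   # digitally preserved copy exists

def is_withdrawal_candidate(t: Title) -> bool:
    """Flag a title for batch withdrawal only when ALL rules are met."""
    return (
        t.circ_count == 0
        and t.last_circ_year < 2005
        and t.us_holdings >= 100   # widely held elsewhere
        and t.in_hathitrust        # archived electronically
    )

catalog = [
    Title("ocn100", 0, 1998, 250, True),
    Title("ocn101", 7, 2012, 30, False),
]
candidates = [t for t in catalog if is_withdrawal_candidate(t)]
print([t.oclc for t in candidates])  # ['ocn100']
```

Because the rules are explicit and conjunctive, every title either passes or fails the same test, which is what removes subjectivity and makes the review batchable rather than title-by-title.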
In anticipation of a new library as well as the implementation of an automated storage and retrieval system, Grand Valley State University (GVSU) undertook a weeding project in 2009. Working with Sustainable Collection Services (SCS), GVSU established a set of criteria and used those to generate lists of potential weeding candidates. Julie Garrison, Associate Dean of Research and Instructional Services at GVSU, cited the ability to look at their collections through many lenses and to quickly identify titles that were widely held but had not circulated as two of the benefits of using SCS. “With this project, the assumption was that if a book was a withdrawal candidate then it should be withdrawn unless there was a reason to keep the book. The library had used this method in the past on smaller weeding projects and found it increased the yield and seemed to reduce librarian anxiety.”6
Removing more than 30,000 books over the course of a few summer months, GVSU made several important decisions that helped streamline the project: librarians were required to provide a rationale for every book that was retained, and physical review was not performed for every item that was withdrawn. By basing their weeding decisions on data and pre-defined rules, GVSU was able to save time and improve the consistency of its deselection.
Libraries strive to make the most effective use of their spaces, and, as part of that effort, it is of the utmost importance to make educated, unbiased, and timely decisions about our collections. Use of a collection analysis tool in conjunction with a rules-based approach to weeding offers libraries an alternative to manual data gathering and title-by-title analysis. Collection analysis tools can expedite overlap and gap analysis, facilitate batch processing of both records and materials, and ultimately speed up the deselection process. Libraries embarking on a large-scale deselection project would be well-served by a collection analysis tool and the implementation of a rules-based decision-making process.
- For more information on individual products and the services they provide, I would suggest George Machovec’s 2014 article in the Journal of Library Administration. Cited below, Machovec’s article provides information regarding the services provided by several of the products mentioned in this article, as well as some comments on the strengths and weaknesses of each.
- Rick Lugg. “Data-driven deselection for monographs: a rules-based approach to weeding, storage, and shared print decisions.” Insights 25, no. 2 (July 2012): 203.
- George Machovec. “Shared Print Archiving – Analysis Tools.” Journal of Library Administration 54, no. 1 (2014): 67.
- Rick Lugg and Ruth Fischer. “Future Tense — The Disapproval Plan: Rules-Based Weeding & Storage Decisions,” Against the Grain v.20#6 (December 2008 – January 2009): 74.
- Doug Way and Julie Garrison. “Developing and implementing a disapproval plan.” College and Research Libraries News 74, no. 6 (June 2013): 285.