v28 #5 Being Earnest with Collections — Let’s Get In Formation: Standardized Data Review for eResource Management

by Kelli Getz  (Assistant Head of Acquisitions, University of Houston)

and Lindsay Cronk  (Coordinator of Online Resources and Collections, University of Houston)

Column Editor:  Michael A. Arthur  (Associate Professor, Head, Resource Acquisition and Discovery, University of Alabama Libraries, Box 870266, Tuscaloosa, AL  35487;  Phone:  205-348-1493;  Fax:  205-348-6358)

Column Editor’s Note:  I met Kelli Getz (University of Houston) at the North American Serials Interest Group (NASIG) Annual Conference in 2008.  As fellow graduates of Indiana University, we have collaborated on several projects since that first meeting.  It has been rewarding to watch her success in Houston and her various professional contributions to NASIG and the Association for Library Collections & Technical Services.  Given her expertise and career success, I was not surprised to learn that Kelli played a key role in developing the criteria and procedures for a thorough and ongoing review of e-resources at the University of Houston.  ATG readers will benefit from her experience and the hard work she put into this project, efforts that resulted in the development of an effective framework for the review of existing e-resources.

The current process at the University of Houston is in place to ensure that e-resources are effectively meeting the needs of library users in a cost-efficient manner.  I am pleased that Lindsay Cronk (University of Houston) has joined Kelli on this article because, working together, they have developed a process that established new procedures, proved successful with selectors, and is designed to be a recurring aspect of collection analysis.  While I have not had the opportunity to work with Lindsay, a cursory evaluation of her professional experiences and contributions makes it clear why Kelli collaborated with her on this article.

Lindsay’s holistic understanding of collections is in no small part due to her background working in library services consortia, where she considered herself a conduit for communication, negotiation, and mutual understanding between small academic libraries and library vendors.  Lindsay is an innovator and boundary-pusher, applying technologies and analysis to the practice of collections in her work at the University of Houston.  She has an active interest in developing a community of practice in collection data visualization, and blogs on that topic and others in her role as Editor of the LITA Blog.

I hope that ATG readers will find in this article a few best practices for establishing criteria and managing an ongoing review of e-resources.  This project came to my attention during similar discussions at The University of Alabama.  We are in the early stages of establishing a systematic way to measure the impact of our existing e-resources, and having tested best practices to serve as a guide will help us as we move forward with our own review. — MA

Faculty and researcher needs are transforming, and collections processes must keep pace.  Consider, for example, the increasingly interdisciplinary research being produced between the sciences and humanities, from nutrigenomics (the study of the complex interplay between food and genetic expression) to cliodynamics (which combines economic history with macropsychology and mathematical modeling).  In the face of shifting researcher needs and, perhaps more importantly, shifting research methods, legacy e-resource renewals must be critically examined, questioned, and justified or rejected.

As librarians, we must continually reconfirm our commitment to supporting research and delivering collections as a service.  An ongoing database review process synchronizes assessment with the e-resource lifecycle, informing collection strategy while enhancing service efficiency.

Larger university libraries may find, as we did at the University of Houston, that reviewing the full slate of licensed e-resources is a challenging prospect in terms of scale.  With hundreds of databases renewing in different months of the year, we sought to match the database review process to the acquisitions calendar and to incorporate the process into preset monthly collections meetings.  By implementing a new process within an established framework, the database review process was easily applied and adopted by collection stakeholders.

This integration further helped to shape what has become a manageable but meaningful review system, one that lines up yearly usage information with renewal costs and presents long-term usage trends.  By establishing a clear framework for evaluation, the database review process provides an ongoing mechanism for assessing the online database collection, in keeping with best practices for e-resource management.  The database review process has improved collections agility by providing flexibility for cancellations and, more excitingly, substitutions.

To clearly detail the database review process, context is critical.  Each month, University of Houston library collections stakeholders meet.  During meetings, participants review product trials, report on assessment projects, discuss collection needs in terms of both development and management, and participate in the monthly database review process.  Members of this Collection Management Committee (CMC) include representatives from multiple departments in both public services and technical services roles.  CMC provides a standing monthly appointment for these representatives to convene around collections holistically, maintaining a necessary platform for collection-centric interdepartmental collaboration and dialogue.

For the health and well-being of a research collection, attention and cultivation are critical.  While more robust and nuanced assessment is pursued in coordinated projects by groups of CMC members, the database review process is meant to provide a set of simple metrics for at-a-glance review.  To this end, the current process focuses on providing both annual cost and usage information, with defined source reporting and notations for non-COUNTER-compliant resources.  In addition to these numbers, usage trends are noted when statistically significant.  The finished database review document is a single-page spreadsheet (front and back in heavier renewal months), with each renewal listed as a row, accompanied by columns of current and historic cost information, usage information as available, and usage trend information when it is statistically significant.  It is distributed as a printout and also circulated online through a collaborative web-based project management site.  The standardized format of the review provides CMC participants with the opportunity to compare resources within disciplines and subject area groups.
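To make the at-a-glance row concrete, the sketch below shows how such a row might be assembled in Python.  The field names, the year-keyed usage history, and the reading of “statistically significant” as an ordinary least-squares slope test at p < 0.05 (via SciPy) are illustrative assumptions, not details of the University of Houston process.

```python
# A minimal sketch of one review-sheet row: cost, usage, cost-per-use,
# and a trend note shown only when the usage trend tests as significant.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass
from scipy.stats import linregress


@dataclass
class Renewal:
    title: str
    annual_cost: float              # current renewal cost
    usage_by_year: dict[int, int]   # e.g., {2012: 410, 2013: 388, ...}
    counter_compliant: bool = True


def trend_note(usage_by_year: dict[int, int], alpha: float = 0.05) -> str:
    """Return 'rising' or 'falling' only when the usage trend is
    statistically significant (OLS slope test); otherwise no note."""
    if len(usage_by_year) < 3:
        return ""                    # too few points to test a trend
    years = sorted(usage_by_year)
    fit = linregress(years, [usage_by_year[y] for y in years])
    if fit.pvalue >= alpha:
        return ""
    return "rising" if fit.slope > 0 else "falling"


def review_row(r: Renewal) -> dict:
    """Build one spreadsheet row from a renewal record."""
    latest = r.usage_by_year[max(r.usage_by_year)]
    row = {
        "title": r.title,
        "annual_cost": r.annual_cost,
        "latest_usage": latest,
        "cost_per_use": round(r.annual_cost / latest, 2) if latest else None,
        "trend": trend_note(r.usage_by_year),
    }
    if not r.counter_compliant:
        row["note"] = "non-COUNTER source"   # flagged, per the review format
    return row


# Example: a database with steadily declining usage.
db = Renewal("Hypothetical Index", 5200.0,
             {2012: 900, 2013: 720, 2014: 560, 2015: 410, 2016: 300})
print(review_row(db))
```

Run on the steadily declining example above, the row comes back with a “falling” trend note; a noisier history with no significant slope would simply carry no note at all, keeping the sheet readable at a glance.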

The database review document is provided to CMC members for initial review four months prior to renewal, ensuring time for both reviewer investigation and acquisitions decision-making.  The document is circulated once more at the following CMC meeting, three months prior to renewal.  This aspect of the process relieves some of the stress of deselection by providing time to communicate with faculty and researchers well in advance of proposed changes, creating an opportunity for outreach and engagement.
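For those inclined to script the calendar math, a minimal sketch follows; the helper name and the python-dateutil dependency are illustrative choices, not part of the process as described.

```python
# Compute the review milestones described above: initial distribution
# four months before renewal, recirculation three months before.
# relativedelta handles month arithmetic cleanly across year boundaries.
from datetime import date
from dateutil.relativedelta import relativedelta


def review_milestones(renewal_date: date) -> dict[str, date]:
    return {
        "initial_review": renewal_date - relativedelta(months=4),
        "second_review": renewal_date - relativedelta(months=3),
        "renewal": renewal_date,
    }


# A January renewal is first reviewed the preceding September.
print(review_milestones(date(2017, 1, 1)))
```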

While in-person discussion occurs in the context of CMC meetings, ongoing review questions and thoughts are also shared and documented on the group website.  Through the group website and the review process, collection decision-making activities have been enhanced in terms of efficiency, transparency, and accountability.  In place of a previous system of automatic renewal is one of automatic review, and the database review process itself is subject to ongoing tweaks, improvements, and discussion.

Ultimately, the database review’s success is rooted in its simplicity and incorporation into preexisting processes.  It represents a logical and straightforward extension of e-resource collection management.  As libraries continue to grow collections and cultivate collection-centric outreach, a simple system for on-the-spot evaluation can be a powerful tool to enhance efficiency.  This is particularly true in larger libraries with sizable database collections.  In seizing the opportunity to develop an integrated database review system, librarians can create a collection culture that embraces rigorous evaluation without overcomplicating existing processes.

In conjunction with the ongoing database review, collection stakeholders at the University of Houston Libraries participate in an annual serials review project.  The purpose of the project is to review thousands of individual journal titles for continued inclusion in the collection.  Demand continues to outpace funding, so data-driven decision-making has become integral to the review process.  While the review process has strategically evolved over the past decade, the mission of delivering collections as a service remains constant.

Reviewing journal titles is more essential than ever as lean budgets force librarians to justify each dollar spent.  At the University of Houston Libraries, over half of the collections budget is dedicated to serials, and as stewards of the collection it is imperative that each title be critically examined for continued relevance.  As university priorities and interests change each year, so must the serials collection.

Introducing the new annual serials review project nearly a decade ago was not easy.  It took several years for the project to become embedded in the collections culture and part of the annual collections calendar.  Initial reviewer feedback highlighted the sheer enormity of the list to be reviewed, in our case over 8,000 titles.  In response, new collection development policies were enacted to reduce the number of reviewed titles to approximately 3,500.  Removing titles that could not be deselected, such as those wrapped up in big packages, noticeably shortened the list, and removing inactive gift and exchange titles reduced it further to a more manageable number.

While strictly adding or cancelling titles is familiar, the focus of most review projects has been replacing low-use titles with titles in new or evolving areas of campus research.  During the serials review project, statistics are critical.  They provide a starting point for public services librarians to begin working with their faculty on reviewing their subject area title lists.  Usage statistics are the most popular tool for making data-driven deselection decisions, followed by cost-per-use and price increases.  Besides talking with faculty, other ways of identifying titles for inclusion in the collection include assessing interlibrary loan borrowing statistics, publisher journal denial or turnaway reports, and the top journals by Impact Factor in each subject area.  Each selector has his or her own method of reviewing the current journal title list, and the most common method is to set review thresholds when evaluating titles for substitution.  In many cases, a selector may limit the titles under review to those that fall above a set cost-per-use figure or below a set usage figure.  A top-twenty list of recommended titles for potential inclusion in the collection is created for each subject area, thus focusing attention on the titles with the greatest usage potential.
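As a concrete illustration of threshold-based review, the following sketch flags current titles that cross a cost-per-use ceiling or fall below a usage floor, then ranks candidate titles by interlibrary loan demand to build the top-twenty list.  The specific thresholds, field names, and the ILL-based ranking are illustrative assumptions rather than figures from the project.

```python
# Selector-style threshold review: the figures below are assumed
# examples, not thresholds taken from the University of Houston review.
COST_PER_USE_CEILING = 25.00   # review titles costing more than this per use
USAGE_FLOOR = 50               # review titles used fewer times than this


def substitution_candidates(titles: list[dict]) -> list[dict]:
    """Flag current subscriptions whose cost-per-use or usage crosses
    a selector-chosen threshold, making them candidates for review."""
    flagged = []
    for t in titles:
        cpu = t["cost"] / t["uses"] if t["uses"] else float("inf")
        if cpu > COST_PER_USE_CEILING or t["uses"] < USAGE_FLOOR:
            flagged.append({**t, "cost_per_use": round(cpu, 2)})
    return flagged


def top_twenty(candidates: list[dict]) -> list[dict]:
    """Rank unowned titles by a demand signal (here, ILL borrowing
    requests) and keep the twenty with the greatest usage potential."""
    return sorted(candidates, key=lambda t: t["ill_requests"], reverse=True)[:20]


current = [
    {"title": "Journal A", "cost": 1800.0, "uses": 12},   # crosses both thresholds
    {"title": "Journal B", "cost": 950.0, "uses": 310},   # healthy usage, kept
]
wishlist = [{"title": "Journal C", "ill_requests": 41},
            {"title": "Journal D", "ill_requests": 7}]

print(substitution_candidates(current))
print(top_twenty(wishlist))
```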

A benefit of conducting an annual serials review project is that both librarians and faculty are habituated to reviewing titles at the same time of year, whether the goal is to cancel, add, or substitute titles.  Technical services librarians gather and assess statistics early in the calendar year, with the assumption that the review will begin in March.  Public services librarians and faculty expect to have the latter half of the spring semester to review titles and discuss changes to their title lists for the following year.  During the summer, technical services works on ordering, cancelling, and licensing the titles with the serials subscription service provider.  Title lists are finalized by the end of August, and all changes take effect in January of the following year.

As with all interdepartmental projects, effective and frequent communication is vital.  Monthly CMC meetings serve as a venue for in-person stakeholder discussions and training.

Beyond CMC, the primary platform for serials review documentation and statistics is a dedicated page on the library’s intranet site.  Statistics, title lists, submission forms, and links to past review projects are all featured on the site.  A blog dedicated to the serials review is the primary method for technical services to inform public services of newly uploaded statistics on the intranet site, changes in deadlines, or other pertinent serials-related information.  Internal technical services discussions regarding licensing and technical requirements are documented on a group site dedicated to technical services communications.

The annual serials review project undergoes constant improvement based on participant feedback.  One major concern last year was that new faculty usually arrive on campus in August, which traditionally has been too late to participate in the review of changes for the following year.  In response, a special review was held in October in which new faculty could select titles to add to the collection from a preselected list of publishers.  Given the shortened timeline, limiting the list to publishers with which we had existing licenses was essential so that licensing could be completed before the subscriptions began the following January.  This project garnered praise from new faculty, and the intention is to continue the special review for them.

It is challenging to commit to review as a default, but through the implementation of both the database review process and the serials review process, we can be confident that decisions are defensible and data-informed.  Documentation of these processes creates opportunities for internal assessment of collection efficiency, delivery time, and return on investment (ROI) that were never possible previously.  Both the database review and the annual serials review project offer openings to actively cultivate and assess the collection holistically.  The success of these processes is entirely dependent on interdepartmental collaboration, a culture of assessment, and a commitment to delivering collections as a service.  While ongoing review is occasionally challenging or demanding, the rewards are worth the work.

 
