ATG v29 #2 Data-Informed and Community-Driven: Using Data and Feedback Loops to Manage a Journal Review and Cancellation Project

by Hilary M. Davis  (Head, Collections & Research Strategy, North Carolina State University Libraries) 

and Gregory K. Raschke  (Associate Director for Collections and Scholarly Communication, North Carolina State University Libraries) 

Introduction

Many libraries are familiar with this scenario:  journal and database price inflation, combined with budget cuts, necessitates frequent reviews and subsequent cancellations of journals and databases.  The challenge is to sustain as much quality content as possible while getting through the process without damaging the library's credibility, and to maintain the relationships that keep the library front and center as a research and teaching partner.  We have conducted four public reviews since 2005.  In each case, we have learned best practices and strategies for engaging directly with our campus community — students, faculty, and staff — to involve them in difficult decisions that could affect their research and teaching capacity.  In this article, we describe strategies for creating data-informed, community-driven feedback loops and communication that foster deeper engagement with our campus community at all levels.

Context is Everything

Librarians are all too familiar with the need to manage the impact of inflation for scholarly content on collections budgets.  The combination of increasing annual unit costs for journal articles, a steady increase in the volume of articles published each year, and a relative decline in library funding from home organizations has created a toxic mix of limited funds and increasing costs.  This unsustainable mix periodically necessitates a comprehensive review of journal subscriptions, packages, and licenses.

The NCSU Libraries has conducted four public reviews since 2005, the most recent in 2014 (http://www.lib.ncsu.edu/collections/collectionsreview2014).  Creating a serials review plan that resonated with our stakeholders was key to collaborative decision-making and community buy-in.  Maintaining awareness of stakeholder priorities via consistent outreach, liaisonship, and communication conduits (such as institution-wide membership on library committees) provided the critical foundation for that plan.  Not only did we transparently provide the facts (the collections budget, the cost to sustain existing subscriptions, the dollar amount and the number of journals or databases to be cut, etc.), but we also provided context.  This came in the form of documenting recent serials cuts and, more importantly, describing the impact the cuts would have on teaching, learning, and research success.  Finally, interactive and transparent displays of data — such as usage, Eigenfactor, impact factor, and local citation behavior — built engagement and credibility with the university community.

During the most recent serials review, the NCSU Libraries illustrated the potential impact of the cuts by focusing on:

(1) a high-profile faculty cluster program whose success hinges on supporting recently hired world-class researchers and on successful interdisciplinary research;

(2) the journals and databases within the research interests of campus leaders that would be cut;

(3) the most important journals (measured by requests to retain and by usage) that would not be cut unless further budget reductions were enacted.

We knew that these cancellation decisions would have broad impact, so getting as much campus participation and buy-in as possible was critical.  Cutting journals would take us furthest toward meeting our budget cuts (we cut about $440,000 worth of journals).  Databases were more contentious because it would be harder, if not impossible, to provide alternate access to database content (one cannot interlibrary loan a database), but we were still able to cut about $130,000 worth of databases.  We aimed to minimize coverage duplication and to weigh the value of specialized indexing against the broad but non-exhaustive coverage of tools such as the Web of Science.  For databases, usage data and cost per search are data elements that inform, but they have to be taken in context with the scope of the database.  Finally, we cut standing orders and continuation resources, which saved an additional $48,000.
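To make the cost-per-search calculation concrete, here is a minimal sketch in Python, assuming hypothetical subscription costs and COUNTER-style search counts (none of these figures are NCSU's actual data):

```python
# Minimal sketch: cost per search for database review.
# All names, costs, and search counts are hypothetical.
databases = [
    {"name": "Specialized Index A", "annual_cost": 12000, "searches": 48000},
    {"name": "Specialized Index B", "annual_cost": 9500, "searches": 1900},
]

for db in databases:
    cps = db["annual_cost"] / db["searches"]
    # Cost per search informs the decision but must be weighed against
    # the scope of the database (specialized indexing vs. broad coverage).
    print(f"{db['name']}: ${cps:.2f} per search")
```

A low cost per search does not automatically save a database, nor does a high one condemn it; scope and uniqueness of coverage weigh heavily in the final call.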

To show the impact on the faculty cluster program and interdisciplinary research at NC State, we created infographics that featured the journals and databases most relevant to those key research areas and, in some cases, included journals in which those faculty had most recently published (see Figure 1).  Likewise, we created a memo for campus leaders listing the journals considered for cancellation that were most relevant to their research interests and in which they had published the majority of their scholarship (see Figure 2).

Select Interdisciplinary Journals Impacting Clusters on Cancellation List

Figure 1. Selection of journals with highest impact for interdisciplinary research at NC State that were considered for cancellation due to the budget cut.

Research Journals Considered for Cancellation with Most Top Priority Votes

Figure 2. Selection of journals noted as most relevant to campus leadership that were considered for cancellation due to the budget cut.

After multiple rounds of collecting campus-wide feedback, which included votes on top-priority serials, we presented to our university library committees and oversight group a view of the journals with the highest usage (measured as full-text journal downloads), illustrating the scale of use that would be lost if these journals were cancelled due to the budget cut (see Figure 3).

Most Downloaded Journals Considered for Cancellation

Figure 3. Most downloaded journals that were considered for cancellation due to the budget cut.

Getting our Ducks in a Row

Prior to inviting our campus to provide feedback on what should be cut, we spent approximately four months gathering and reviewing data about our entire portfolio of serials commitments in order to decide which journals and databases would be presented to campus for their feedback.  In addition to basic bibliographic data about the serials, we leveraged a mix of data to make decisions about which resources to propose for cancellation:  usage statistics, NCSU’s publication and citation patterns in our journal subscriptions, feedback from previous serials cuts, holdings across aggregators as well as publishers, and package/bundle dependencies.
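As a rough illustration of how such data elements might be assembled into a single review worksheet, here is a hedged sketch using pandas; the file names and column names are hypothetical stand-ins for the internal spreadsheets we actually used:

```python
import pandas as pd

# Hypothetical inputs; the real data came from several internal sources.
bib = pd.read_csv("serials_bibliographic.csv")         # issn, title, cost, package
usage = pd.read_csv("counter_downloads.csv")           # issn, downloads
citations = pd.read_csv("ncsu_pub_cite_patterns.csv")  # issn, ncsu_pubs, ncsu_cites
holdings = pd.read_csv("aggregator_holdings.csv")      # issn, aggregator_access

review = (
    bib.merge(usage, on="issn", how="left")
       .merge(citations, on="issn", how="left")
       .merge(holdings, on="issn", how="left")
)

# Flag titles with alternate aggregator access and below-median downloads
# as initial candidates; every flagged title still gets human review.
review["candidate"] = (
    review["aggregator_access"].fillna(False).astype(bool)
    & (review["downloads"].fillna(0) < review["downloads"].median())
)
review.to_csv("serials_review_worksheet.csv", index=False)
```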

All of this occurred across a one-year timeline that accounted for the academic calendar, the need to give campus multiple opportunities to provide feedback, and the need to deliver timely cancellation decisions to our serials vendors (see Table 1).

Table 1.  Summarized timeline of serials review and cancellation.

 

Communication Strategies

As we prepared our communication with campus, we started with our advocates — the University Library Committee and the Library Representatives (for each academic department), both of which are composed of faculty, students, staff, and campus administrators.

We also leveraged campus communication channels, including the student newspaper, The Technician, and the email distribution lists of the Faculty Senate, campus department heads, deans, and directors.  Our aim was to provide as many venues as possible to make our campus aware of the need for cuts and to give them opportunities to provide feedback.  In addition to campus communication venues, we launched a website dedicated to all aspects of the review process (http://www.lib.ncsu.edu/collections/collectionsreview2014), accompanied by an email campaign that included internal communication with library staff to ensure consistent messaging, and by one-pagers that showcased paraphrases and quotes from stakeholders.  Finally, we added record-level notes that would appear throughout our discovery systems whenever any of the journals proposed for cancellation was accessed via our website.

We were cognizant that campus stakeholders needed opportunities to provide feedback across multiple windows.  We structured our process around two main feedback windows and offered venues for discussion with our Library Representatives group and our University Library Committee.  The feedback we received was diverse — it consisted of “votes” to keep or cancel serials, personal notes reflecting the impact a cancellation would have on research and teaching, and questions and concerns about the need to cancel serials.  The value of crafting personal responses to these concerns was immeasurable.  Our email replies and face-to-face conversations had to be empathetic and respectful of differing values and perspectives.

Most of the concerns about the cuts were directed at library staff.  One lesson we learned is that we needed a direct channel of feedback from campus stakeholders to the campus administrators who oversee allocations to the library collections.  Because we led the campus communication about the budget cuts and the necessary journal and database cancellations, our campus channeled their concerns about increased funding for library collections to us instead of to the Provost's and/or Chancellor's offices.  For that feedback to be more impactful, we attempted to redirect it to the University Library Committee, which reports to the Provost.

Data-Informed (not Data-Driven)

Through our dedicated serials review website, we provided multiple data points to campus to consider during the feedback windows including an overview of how to interpret usage statistics, publication and citation patterns, impact factor, formats, and costs (http://www.lib.ncsu.edu/collections/collectionsreview2014/factors).

The publication and citation trends, in combination with the usage statistics, gave our campus a sense of the impact of serials on our community.  It was illustrative for faculty to see the wide range of uptake across the serials.  The impact factor data was less useful, but we included it because a segment of the faculty expects to see it.

Our final decisions to cut or keep serials that were appealed by campus stakeholders were based on a number of factors, including but not limited to:  anticipated use by new campus research foci, cost per use (CPU), faculty intentions to rely heavily on journals for teaching, multiple appeals from stakeholders, whether aggregator access would suffice for the intended use, and whether interlibrary loan (ILL) would support demand.  For the CPU metric, we did not institute a strict cut-off across the board; instead, we treated a CPU of $10 or more as generally unsustainable, a guideline that could be weighed against other qualitative and anecdotal feedback.
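A minimal sketch of how the $10 guideline might be applied in practice, with CPU treated as a flag to be weighed against qualitative factors rather than an automatic cut (the function, override logic, and example title are illustrative, not our production workflow):

```python
CPU_GUIDELINE = 10.00  # dollars per use; a guideline, not a hard cut-off

def review_title(cost, uses, appeals=0, teaching_dependency=False):
    """Recommend an action, weighing cost per use against qualitative factors."""
    cpu = cost / uses if uses else float("inf")
    if cpu < CPU_GUIDELINE:
        return "keep", cpu
    # Above the guideline, qualitative feedback can still justify keeping:
    # multiple stakeholder appeals or a stated teaching dependency.
    if appeals >= 2 or teaching_dependency:
        return "keep (override)", cpu
    return "cancellation candidate", cpu

# Hypothetical title: CPU of $30 but three stakeholder appeals.
print(review_title(cost=4500, uses=150, appeals=3))  # ('keep (override)', 30.0)
```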

For the large journal packages (e.g., Wiley, Springer, Elsevier), we took advantage of cancellation allowances where possible, as doing so enabled a small amount of flexibility and control over costs.  Likewise, we assessed the value of smaller packages and bundle dependencies, noting that in some cases breaking a bundle helped us realize cost savings, while in other cases it would have cost more to subscribe piecemeal to the serials we wanted to keep.  For one package, we negotiated cutting our spend in half by eliminating low-use journals, saving money and creating a package of journals more relevant to our stakeholders.
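The unbundling decision reduces to a break-even comparison: does the bundle price exceed the sum of piecemeal list prices for the titles we would actually keep?  A sketch with hypothetical prices:

```python
def should_unbundle(bundle_price, kept_title_list_prices):
    """Return (True, piecemeal_total) if subscribing piecemeal to only the
    titles we want is cheaper than keeping the bundle. Prices hypothetical."""
    piecemeal_total = sum(kept_title_list_prices)
    return piecemeal_total < bundle_price, piecemeal_total

# Case 1: breaking the bundle realizes savings.
print(should_unbundle(50000, [8000, 7500, 6000]))  # (True, 21500)
# Case 2: piecemeal costs more; keep the bundle (or renegotiate it).
print(should_unbundle(20000, [9000, 8500, 7000]))  # (False, 24500)
```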

Collecting Stakeholder Feedback

We notified our campus Library Representatives and department heads when the first list of potential cancellations was made available, and we invited them to disseminate the information to their colleagues so that anyone could give the Libraries feedback about which serials should be kept.

The list of potential cancellations was presented for review and ranking as an online webform or as a downloadable .csv (comma-separated values) file (source code for the webform is available at https://www.lib.ncsu.edu/collections/projects/collectionsreview/source).  This initial list contained approximately 900 serials from all subject areas.  Campus was asked to review and rank only the serials relevant to their disciplines and areas of interest.  We provided sorting features in the webform (and in the downloadable .csv file) to help stakeholders focus their review.  We used a three-tiered ranking system:  1 – Top Priority, 2 – Medium Priority, 3 – Low Priority.  The webform collected and stored the feedback for our review and analysis (see Figure 4).

Collection and Journals Review (screenshot)

Figure 4. Screenshot of webform used to collect campus feedback on serials proposed for cancellation.

 

After two windows of feedback, we heard from 1,183 people, including 610 faculty and 471 students (a 26% response rate for faculty and a 5% response rate for graduate students).  The feedback comprised 10,177 journal rankings, with 644 journals receiving two or more “Top priority to keep” votes.  Only 52 journals were ranked exclusively as “Low priority” or “Medium priority.”
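To illustrate how such rankings can be tallied, here is a sketch that counts “Top priority” votes per journal from a CSV export; the file layout is hypothetical (the actual webform stored feedback in its own backend):

```python
import csv
from collections import Counter

# Hypothetical export: one row per (respondent, journal, rank), where
# rank is 1 = Top, 2 = Medium, 3 = Low priority.
top_votes = Counter()
with open("rankings_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["rank"] == "1":
            top_votes[row["journal_title"]] += 1

# Journals with two or more "Top priority" votes (644 in our review).
multi_top = {title: n for title, n in top_votes.items() if n >= 2}
print(f"{len(multi_top)} journals received 2+ top-priority votes")
```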

Short- and Long-term Impacts

We shared these results with campus through memos to the University Library Committee and the Library Representatives for each academic department.  In the memo, we explained that cutting the 52 journals ranked only as “Low priority” or “Medium priority” would cover just $57,700 of the overall cut.  The 644 journals with multiple “Top priority” votes cost $654,800 in total.  The bottom line was that we would need to cut 62% of the “Top priority” journals to meet the budget target.  We also shared that, as a result of these deep cuts, we anticipated supporting increased interlibrary loan (ILL) for the cancelled journals at a cost of $10-$30 per article.
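The arithmetic behind that bottom line can be sketched as follows.  The journal-cut target below is an assumption for illustration (the overall cut was spread across journals, databases, and standing orders), so the computed share is approximate:

```python
# Figures from the memo; journal_target is assumed (~$440,000 of journal
# cuts were ultimately made, per the totals reported earlier).
journal_target = 440_000
low_medium_savings = 57_700    # the 52 journals ranked only low/medium priority
top_priority_total = 654_800   # total cost of the 644 multi-top-vote journals

remaining = journal_target - low_medium_savings
share = remaining / top_priority_total
# Prints ~58%; the memo's 62% likely reflects the exact target and/or
# counting by title rather than by dollars.
print(f"~{share:.0%} of top-priority journal spend still had to be cut")
```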

We ended up cutting 626 journals, 30 databases, and over 130 standing orders and continuations to meet the required cut to the collections budget.  Since the cuts were made, we have monitored ILL requests for the cancelled serials on a quarterly basis to determine whether we need to reinstate highly requested serials (as long as we have funding to do so).  To date, we have reinstated approximately 30 journals and one database.  Of the journals that were cancelled, 10-12% were requested via ILL.  We joined the RapidILL network, and nearly all requests for cancelled journals were fulfilled via RapidILL at no additional charge, greatly mitigating the impact of the cuts.
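A minimal sketch of the quarterly monitoring logic follows; the fixed request threshold is hypothetical, since in practice reinstatement also depended on available funding and title-by-title judgment:

```python
from collections import Counter

REINSTATE_THRESHOLD = 8  # hypothetical: quarterly ILL requests that trigger review

def quarterly_review(ill_requests, cancelled_titles):
    """ill_requests: iterable of journal titles requested via ILL this quarter.
    Returns cancelled titles whose demand suggests reinstatement."""
    counts = Counter(t for t in ill_requests if t in cancelled_titles)
    return [title for title, n in counts.items() if n >= REINSTATE_THRESHOLD]

cancelled = {"Journal of X", "Annals of Y"}
requests = ["Journal of X"] * 9 + ["Annals of Y"] * 3
print(quarterly_review(requests, cancelled))  # ['Journal of X']
```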

After the cancellations were fully enacted, our University Library Committee encouraged us to document and share data on where our library is positioned among its peers, along with the impact of inflationary increases.  The NCSU Libraries remains in the bottom third of our peer group for collections expenditures, $2.7 million below the peer average (see Figure 5).  We noted that future support is vital to managing annual inflationary cost increases (5-7% for journals and databases, and 3-5% for books) while sustaining resources to support growing research programs and faculty hiring.

NCSU Libraries and NC State Peer Group's Collections Expenditures

Figure 5.  Comparison of NCSU Libraries collections expenditures with those of peers, and estimates of the funding needed to bring our collections allocation to the average and median of these peers.

Pointing to the Future

The unsustainable mix of per-article cost increases, growth in the number of articles published, and a relative decline in library funding within home organization budgets shows no sign of abating in the near term.  Add to that the broadening conception of the collections budget as a potential source of support for non-traditional items such as digital scholarly communication tools, open scholarship funding, software applications, digital preservation, and organizational memberships — and the reality of ongoing pressure on collections budgets is clear.  As a result, the need for ongoing collections analytics to maximize the efficiency of collections expenditures, and for periodic large-scale reviews along the lines of the one described in this article, will continue.  Creative budgeting and advanced collections analytics only mitigate the problem while multiple communities in the scholarly communication ecosystem search for systemic solutions.  Solutions that create a more elastic market, in which price per unit and publication volume are both contained, offer long-term relief from the need for periodic reviews.  While hopeful that such long-term solutions can develop, we support medium-term efforts, such as evidence-based pricing and the Pay It Forward Project,1 to create more responsive pricing models.  We also intend to sustain investments in leading-edge collections analytics to position the NCSU Libraries to leverage emerging pricing models and prepare for future reviews.

Endnotes

  1.  “Findings and Other News from the Pay-It-Forward Project,” http://icis.ucdavis.edu/?page_id=713.  Last viewed January 18, 2017.

 
