As publishers shift their business strategies to meet higher education’s open access priorities, universities must continually reassess the extent to which their readers still require access to content behind paywalls, and by extension, whether the bundled subscription packages that provided a discount on that content still constitute a “big deal.” Understanding the costs of these subscriptions to institutions relative to the benefits to their readers is complicated by the uneven pace of open access uptake across disciplines, as well as by variations in how long older publications remain essential to research in different fields. And while much attention has focused on the growing business of open access agreements, as Rick Anderson recently reminded us in the Scholarly Kitchen, new “big deal”-style subscription arrangements are still being made between university systems and publishers.

How can libraries balance the needs of their university’s researchers, both as readers and authors, when analyzing the costs and benefits of various agreements with publishers?

In 2020–2021, Ithaka S+R conducted a novel study in partnership with 11 academic libraries to understand the effects of bundled journal package cancellations on researchers. Since then, there have been important efforts to collect new and different forms of evidence, building a detailed picture of how researcher perceptions and behaviors can be factored into library budgetary decision making around big deals. The Charleston Library Conference recently featured two such efforts, at Georgia Southern University and Cornell University. Researchers Jesse Koennecke (Director of Acquisitions and E-Resource Licensing, Cornell), Kizer Walker (Director of Collections, Cornell), and Jessica Rigg (Acquisitions Librarian, Georgia Southern) recently shared more information about these projects with us.

What motivated your research?

Georgia Southern: In the spring of 2021, the dean of the university libraries received a charge from the president of the university to cut $300,000 from the libraries’ budget. Because we wanted to make decisions that would have the smallest impact on faculty and students, we decided to focus on adjusting our electronic resource expenditures instead of making cuts to personnel and services. After carefully reviewing all of the libraries’ collections, we decided that by canceling our “big deal” packages and subscribing to the most highly used individual journal titles from those publishers, we could meet our money-saving goal without greatly disrupting scholarship and teaching.

Cornell: We moved away from a “big deal”-style arrangement with one of the large commercial publishers four years ago and shifted to individual subscriptions for a subset of the titles in the big package. We wanted to understand the impact of these changes on Cornell researchers and to improve our ability to make future changes in ways that meet researchers’ needs. Our study (still in progress, by the way) involves analysis of turnaway, interlibrary loan, and historical usage data, as well as interviews with researchers who we know have recently published in the canceled titles, cited them, or requested them via interlibrary loan. The interviews, in other words, were quite targeted.

It is important for institutions to understand broader trends in researcher behavior and perceptions alongside the specific conditions faced by researchers at their own institution. What is especially illuminating when you put the findings from the earlier Ithaka S+R study in dialogue with what you are finding through your own research?

Georgia Southern: Our experience supports the Ithaka S+R study’s finding of faculty “making do” with the resources available to them after we broke up our big deals. We also have not received any reports of faculty research being significantly impacted by these subscription changes.

Cornell: Cornell’s findings, too, hew pretty closely to the Ithaka S+R results. By and large, the researchers we spoke with indicated they accept that the library can’t afford to subscribe to everything and described ways of “making do,” as the Ithaka S+R report puts it. About half of the Cornell researchers we interviewed mentioned using interlibrary loan (ILL) for articles in journals for which the library does not provide immediate access (a rate of ILL use quite a bit higher than the Ithaka S+R study documented), and about a quarter volunteered that they are comfortable waiting for delayed access. But more than anything else, interviewees said they look for a copy of the article they need online. For many, Google Scholar was a starting place for discovering content the library subscribes to, and also where they turn, along with ResearchGate, Academia.edu, and other sites, to try to find publicly available PDFs of non-subscribed content. A fair number of researchers take advantage of personal networks: about a third request a copy from a colleague at another institution that does subscribe, and even more reach out to the author directly. Of the 24 researchers we spoke with, only three said they drop articles of interest when they are not immediately available via a library subscription.

Translating researcher perceptions and behaviors into concrete budgetary decisions is incredibly complicated. Tell us more about the methodological approaches you selected, and why.

Georgia Southern: We tried to balance the quantitative data we could collect about the actual usage of our resources with the qualitative feedback we received from departmental faculty about the importance of those resources. Time and time again, we discovered that journals faculty felt they had to have in order to participate in their field were not being used at all, were being used rarely, or were being used via aggregator databases such as those offered by EBSCO or ProQuest rather than via the publishers’ sites. The Use Adjusted Score metric developed during this project gave us a way to rank the individual journal titles under consideration: titles that faculty rated “essential” or “important” were given more weight than titles rated merely “desirable,” but “essential” titles showing little or no usage in the COUNTER usage reports still moved down the ranked list.
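The interview does not spell out how the Use Adjusted Score is calculated, but the general idea of weighting faculty ratings by observed usage can be sketched briefly. In this minimal sketch, the rating weights, titles, usage counts, and the normalization itself are all illustrative assumptions, not Georgia Southern’s actual metric:

```python
# Minimal sketch of a use-adjusted ranking. The weights, titles, usage
# counts, and formula are illustrative assumptions, not Georgia
# Southern's actual Use Adjusted Score.

RATING_WEIGHTS = {"essential": 3.0, "important": 2.0, "desirable": 1.0}

journals = [
    # (title, faculty rating, total uses from COUNTER reports)
    ("Journal A", "essential", 1250),
    ("Journal B", "essential", 4),      # rated essential but barely used
    ("Journal C", "important", 890),
    ("Journal D", "desirable", 2100),
]

max_uses = max(uses for _, _, uses in journals)

def use_adjusted_score(rating, uses):
    """Weight the faculty rating by normalized usage, so a highly
    rated but little-used title still falls down the ranking."""
    return RATING_WEIGHTS[rating] * (uses / max_uses)

for title, rating, uses in sorted(
        journals, key=lambda j: use_adjusted_score(j[1], j[2]), reverse=True):
    print(f"{title}  {rating:<10} uses={uses:<5} "
          f"score={use_adjusted_score(rating, uses):.2f}")
```

Under a scheme like this, a “desirable” title with heavy use can outrank an “essential” title that the COUNTER reports show is barely touched, which is the behavior described above.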

Cornell: We started by comparing a couple of years of Cornell’s historical usage of the journals we dropped when we left the big deal with attempts to access those same titles (i.e., turnaways) in the two years following the change, and we compared that with some preliminary data about interlibrary loan requests for articles from those journals. The initial data showed significantly fewer turnaways in the two years after the change than uses in the two years prior, and many fewer interlibrary loan requests than turnaways. In other words, the quantitative data suggested that the shift away from the big deal package might be having a relatively small impact on researchers. We’re cognizant, though, that the changes to our journal package with the publisher in question are still fairly recent, and we realize, as the Ithaka S+R study also points out, that less and less current content from these titles will be immediately accessible over time, which may result in a growing proportion of turnaway experiences. In any case, we wanted to understand our patrons’ decision making behind the numbers we were seeing. We wanted to conduct structured interviews and record the responses, and for this we needed to work through Cornell’s Institutional Review Board (IRB) for human subject research. We developed a protocol that was approved by the IRB, and the twelve library colleagues working with the interview material were required to get IRB training and certification.
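As a rough illustration of this kind of three-way comparison, the per-title measures might be lined up as below. The titles and counts are invented for the example; Cornell’s actual figures are not reported here:

```python
# Hypothetical per-title comparison of pre-cancellation usage,
# post-cancellation turnaways, and ILL requests. All numbers are
# invented for illustration; they are not Cornell's data.

measures = {
    # title: (uses, 2 yrs before; turnaways, 2 yrs after; ILL requests)
    "Journal X": (2400, 310, 22),
    "Journal Y": (980, 45, 3),
    "Journal Z": (150, 12, 0),
}

print(f"{'Title':<12}{'Uses':>8}{'Turnaways':>12}{'ILL':>6}{'TA/use':>10}{'ILL/TA':>10}")
for title, (uses, turnaways, ill) in measures.items():
    # Each measure is a small fraction of the previous one: the
    # pattern the Cornell team describes in their preliminary data.
    print(f"{title:<12}{uses:>8}{turnaways:>12}{ill:>6}"
          f"{turnaways / uses:>10.1%}{ill / turnaways:>10.1%}")
```

The pattern of interest is the one Cornell describes: turnaways amounting to a small fraction of historical uses, and ILL requests to a small fraction of turnaways.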

We wanted to talk with Cornell researchers who had a significant relationship with the titles we had dropped from the publisher package. We developed a pool of potential interviewees who had either recently published in one of the journals, cited one of them in a recent publication elsewhere, or requested one of the non-subscribed titles via ILL. We excluded undergraduate authors but included all other authors affiliated with Cornell’s Ithaca campus: faculty, graduate students, postdocs, and academic research staff. We sent 309 email invitations, which resulted in 24 scheduled Zoom interviews. Two members of the study team met with each researcher, with one team member asking most of the questions and the other taking notes. With the participants’ permission, we recorded the Zoom interviews and also retained a Zoom-generated transcript of each. The materials are kept in a secure folder and will be anonymized or destroyed at the end of the study period.

The interviews were completed last spring and, over the summer, each member of the research team reviewed each interview, identified themes in the conversations, and counted instances of the themes. Then we worked together to consolidate and describe the themes.

These kinds of assessment projects can take considerable labor and expertise. How did you structure your research teams, and what did you learn about what works best for doing this work along the way?

Georgia Southern: When we began the project, a few library faculty were called together by the dean and given the charge to evaluate the libraries’ collections and determine which cuts to our subscriptions would have the least impact on faculty research and teaching. Throughout the project this small group remained the main team, but we pulled in our colleagues on an ad hoc basis. Most of the initial work for our assessment project was performed by the Collection Services Department (CSD). Jeff Mortimore, the Discovery Services Librarian, who was also serving as the Interim Department Head of CSD at the time, was the lead on the project, and he, in consultation with the dean, developed the timeline and methodology for the project as well as our communication strategy. As the Acquisitions Librarian, I played a major role in assembling all of the information, and the Acquisitions staff assistant helped me collect usage data. At this point, all of the parties involved were performing tasks that were familiar to them. However, we had a short period of time to compile a large amount of data, and other members of the CSD staff had to be pulled in, including staff who do not usually work with electronic resources or acquisitions. We brought in cataloging and institutional repository staff to help with the Overlap Analysis phase of the project, as well as CSD staff who work on our other campus.

Once the overlap analyses were completed, our colleagues were free to return to their usual tasks, and the Liaison librarians were pulled in to complete Liaison rubrics evaluating each resource to determine which resources would continue to the next step in the assessment project. After the feedback request forms were sent to departmental faculty, completed, and returned, the original team that had received the charge from the dean compiled the feedback, ranked the title lists, and reported back on the findings. While a small team was involved in every step of the process, we would not have been able to complete this project without the aid of the two larger teams who completed specific tasks: the CSD staff who performed overlap analyses and the Liaison librarians who completed the Liaison rubrics.
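The overlap analysis step mentioned above is not described in detail in the interview, but in collection assessment it generally means checking which titles in a package are also available through aggregator databases, and on what terms. A minimal sketch of that check, using invented ISSNs, database names, and embargo periods in place of real title lists:

```python
# Simple title-overlap check: which journals in a publisher package
# are also covered in an aggregator database, and with what embargo.
# The ISSNs, database names, and embargo terms are invented.

publisher_package = {
    "1234-5678": "Journal A",
    "2345-6789": "Journal B",
    "3456-7890": "Journal C",
}

# Aggregator full-text coverage keyed by ISSN, e.g. parsed from an
# EBSCO or ProQuest title list export.
aggregator_coverage = {
    "1234-5678": ("Academic Search", 12),  # (database, embargo in months)
    "3456-7890": ("ProQuest Central", 0),
}

for issn, title in sorted(publisher_package.items(), key=lambda kv: kv[1]):
    if issn in aggregator_coverage:
        db, embargo = aggregator_coverage[issn]
        print(f"{title}: also in {db} (embargo: {embargo} months)")
    else:
        print(f"{title}: unique to the publisher package")
```

A real analysis would work from exported title lists and also account for coverage dates, but the core operation is this kind of keyed comparison between holdings.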

Cornell: The study team comprises 12 Cornell Library colleagues, including collection development and public service librarians covering various disciplines, the head of Interlibrary Services, the lead of our Collections Data Analysis group, our director of Assessment and Planning, Jesse as director of E-Resource Licensing, and me. We expanded the group from a smaller core when we moved into the interview phase. There has been some division of labor (some members of the group are particularly adept at working with quantitative data, others with qualitative analysis), but almost everyone on the study team was involved in the interview process, and, as we said, everyone working on the study was required to complete human subject research training with Cornell’s Institutional Review Board.

Now that the interviews are complete and we have presented some preliminary findings (at the Charleston Conference and elsewhere), we are setting our sights on more detailed and systematic work with the data toward a formal paper that we hope to submit for publication in the summer. So far, almost every member of the study team has expressed interest in continuing with this new phase of work on the project.

How has your research impacted decision making at your libraries?

Georgia Southern: We have learned that communicating with faculty is the most important part of the process. The crucial message has been assuring them that they are not completely losing access to the resources they need, but are instead freeing up funds to safeguard the resources they need most. They might lose immediate access, but in most cases we can help them find the same resource after a brief wait, or find a replacement resource that will meet their needs.

Cornell: Well, our primary goal in conducting the study has been to inform Cornell Library’s approach to future licensing decisions. As we have been reviewing other big deal packages recently, we have generally felt more confident that some reduction to immediate access may not have as significant an impact on researchers as we had previously thought. This has opened up potential paths that we had not seriously considered before. And actually, this goes beyond the question of mediated and unmediated access to journals—we have learned a lot about how local researchers discover content, how they interact with journals in their fields, how they think about journal reputation, as well as their concerns about the unsustainability of the prevailing academic publishing model. We’ll be exploring possible service improvements to reduce barriers to access for content Cornell researchers need.

How can those interested in your research learn more?

Georgia Southern: We have a public-facing website for our Assessment Project that everyone is welcome to explore. My colleague Jeff Mortimore (jmortimore@georgiasouthern.edu) and I (jminihan@georgiasouthern.edu) are also available to respond to any questions via email.

Cornell: As we said, our intention is to write up the study for publication and, if all goes well, we expect an article about Cornell’s experience will be available in late 2023. In the meantime, we are both happy to discuss the work further. You can reach Kizer at kw33@cornell.edu and Jesse at jtk1@cornell.edu.