v27 #5 STL: A Publisher’s Perspective

by Rebecca Seger  (Senior Director, Institutional Sales, Oxford University Press)

and Lenny Allen  (Director, Institutional Accounts, Oxford University Press)

 

“I saw the crescent, but you saw the whole of the moon.” — The Waterboys

 

The introduction of the Short-Term Loan (STL) into the range of models available for accessing electronic content has been the cause of much discussion over the past couple of years and, dare we say it, some contention in what is otherwise a generally cordial area of mutual mission-based endeavor.  A full year after changes to the STL rate structure were introduced, questions about the model remain, and, based on direct conversations with numerous librarians across the country, so does a lack of complete clarity about how and why the model has impacted the publishing of the scholarly monograph.

This presents us with an excellent opportunity, as publishers, to pull back the curtain, as it were, and look at the current monograph publishing model, how it developed, and how new and evolving models, not limited to STL, are impacting it.  Certainly much has been written about the current state of academic publishing, and the pages of ATG alone carry many such articles.  And while academic publishing covers many areas (the scope of both for-profit and not-for-profit presses is vast), we limit ourselves here to the scholarly monograph alone.

The scholarly monograph, in its current format, has existed for hundreds of years.  It is the peer-reviewed output of an academic press, and many of the costs associated with producing it, whether in print or as an eBook, have been in place for nearly as long.

What do those costs look like, and what do they cover?  Well, for an average monograph of 336 pages, with a trim size of 6-1/8″ x 9-1/4″, the average up-front cost to a publisher is approximately $10,000.  There are relatively wide variations, of course, depending on the profile of the author, permissions, the number of images, etc., but for the purposes of illustrating the business model, let’s continue with this particular specification.

These costs are made up primarily of expenses for copyediting, page composition, proofreading, and the author’s advance against royalties.  This may surprise many readers, but these costs do not go away in the digital world.  The actual “PPB” — paper, printing, and binding — makes up only one-third of the costs.  And you can imagine the buying power that OUP has compared to many smaller presses; even for us, a 10% increase in those costs can be significant.
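To put rough numbers on that split, here is a minimal sketch in Python.  The $10,000 total and the one-third PPB share come from the figures above; lumping everything else into a single “pre-press and advance” line is our simplification for illustration, not OUP’s actual accounting.

```python
# Illustrative split of the ~$10,000 up-front cost of a typical monograph.
# The one-third PPB share is from the article; the remainder lumps together
# copyediting, composition, proofreading, and the author's advance.
UPFRONT_COST = 10_000

ppb = UPFRONT_COST / 3            # paper, printing, and binding (~$3,333)
plant_costs = UPFRONT_COST - ppb  # pre-press work and advance (~$6,667)

print(f"PPB (print-only costs):  ${ppb:,.0f}")
print(f"Pre-press and advance:   ${plant_costs:,.0f}")
# Only the PPB third is avoided for an eBook-only edition; the other
# two-thirds remain whatever the format.
```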

On top of that are indirect costs that you may not even think about: warehousing, for one, in both physical and digital formats.  There is a cost associated with warehousing a digital object in perpetuity, and in fact, given the variety of platforms, publishers actually have to produce multiple forms of each digital object.  At Oxford we are producing XML for our own platform, PDF for the institutional eBook aggregators, and EPUB3 for the consumer eBook sellers.  All of this requires resources: the processes, the people, and the third-party vendors needed to create all these formats.

We also need metadata warehousing and distribution.  If we wish to remain viable, we are all now required to send our metadata to discoverability services and to the eBook aggregators and suppliers.  On the sales side we have to manage the relationships with the resellers and work closely with the library community to ensure that our business models, content, and services meet the needs of libraries and their patrons.

Our marketing team has to work closely with authors to ensure that those in the discipline are made aware of new titles of relevance to them.  And this is far more important than ever before; if we are to rely on demand from users to drive purchasing, we have to make sure potential readers know a book is available, and what it’s about.  We work very closely with the author of every single book, each of which at OUP must be approved by the Delegates to the Press, senior scholars around the world who are tasked with the simple mission of ensuring that OUP publishes scholarship of the highest quality.  And then there is royalties management: while royalties on sales of 500 copies may not make a significant difference in the life of an academic, they still have to be managed, and we have a responsibility to the author to keep managing that payment for as long as a work continues to sell; there is a cost in stewarding that.  And when you produce a few thousand titles a year, or even a few hundred, that cost mounts rapidly.

And let’s be very clear:  not a single one of these publishing functions has disappeared, or is likely to disappear, in the digital era.

So what does the profile of a typical academic monograph look like?  Though the number has been slowly declining for years — again, see any one of numerous recent articles in these very pages — the lifetime sales of a monograph range from roughly 350 units to 700 at the very high and, we might add, rare end.  A full 75-80% of those sales occur in the first year.  These are not considered profitable titles.  For profit, university presses rely on course-adopted titles or the ones that end up with a healthy “trade/consumer” profile.  We need those to support the otherwise low-margin monographs.
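Setting those lifetime figures against the roughly $10,000 in up-front costs gives a back-of-the-envelope sense of what each copy must recover before PPB, royalties, reseller discounts, and overhead are even considered.  The 500-copy midpoint and the 77.5% first-year share below are our own interpolations from the ranges quoted above:

```python
# Back-of-the-envelope: up-front cost to be recovered per copy, across
# the lifetime-sales range quoted in the article.
UPFRONT_COST = 10_000
FIRST_YEAR_SHARE = 0.775  # midpoint of the 75-80% quoted above

for lifetime_units in (350, 500, 700):
    per_copy = UPFRONT_COST / lifetime_units
    first_year_units = round(lifetime_units * FIRST_YEAR_SHARE)
    print(f"{lifetime_units} lifetime copies: ${per_copy:.2f} of up-front "
          f"cost per copy; ~{first_year_units} of them sold in year one")
```

Even at the rare 700-copy high end, each copy must carry more than $14 of up-front cost alone.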

In the past, we had the predictability of approval plans to help guide our decisions.  We knew that a high percentage of those few hundred sales would go through approval plans, and we could predict it by discipline.  This helped to reduce the risk of book publishing, which is very different from journal publishing in that we lay out our investment on the book with no guarantee of sales.  Approval plans, while in no way guaranteeing the sale of any given title, certainly helped to make the sale of monographs more consistent and predictable.

In the old world, our profile for an individual title would include, on the profit and loss statement, many of the costs noted above.  Hopefully, if we’ve done everything right, we earn back the majority of those up-front costs.  If not, we take a hit on the bottom line for money we’ve invested that hasn’t been returned — anything invested that hasn’t yet been earned back is a loss against our profitability until it earns itself back.  “Unearned royalties,” where we haven’t yet sold enough books to cover the advance we have paid our authors, are a very significant negative against our annual performance.  In the demand-driven acquisition (DDA) world, there is zero predictability, and we have no sense of how long it will take us to recoup our costs.  And while we completely understand the reasons for this business model, it introduces risk beyond anything we’ve seen before.  In short, with the old business model we had a high degree of predictability in costs and sales; now, with newly developed and evolving models, the only predictable thing remaining is our costs.

We have, by and large as a community, understood the need for libraries to focus spending on what actually gets used, versus the old world of speculative buying, which was designed to ease the burden on libraries of selecting books from thousands of publishers.  We all know that a print monograph is not the most heavily used item in the library.  Why is that?  Why do 30-40% of newly acquired monographs not circulate, according to numerous studies we have all seen?  One key factor is that this content has been locked away in a container on the shelves of libraries, the vast information within it discoverable only by a few mechanisms — the limited metadata in online catalogs (MARC record, author, publication year, LC classes, and so on), the recommendation of faculty or librarians, book reviews, citations, reading lists, or serendipity while browsing the shelves.  How could potential readers ever know, for example, that a monograph on farming in the 20th century had important information about the Irish famine in a particular chapter?  They simply couldn’t, and thus its utility was limited.  But that doesn’t mean it couldn’t have been more useful.  It was just tucked away where the reader couldn’t be expected to find it.

Enter the digital era, and we’re presented with vast new opportunities to discover precisely relevant content.  We are bullish about the opportunity to help long-form scholarship survive, and indeed thrive, in a world where we can far better expose users to what’s in the pages of books they never would have found before.  That is the primary reason OUP invested in the XML platform for University Press Scholarship Online, and it is why we have shared that investment with the wider university press community: we take our role as the largest university press seriously, and we understand that smaller university presses don’t have the resources to do this on their own.  One of the key components of the platform is having authors write abstracts for every single chapter and create keywords for each chapter that are connected across the whole database, so users can see what the authors themselves feel are the core ideas of their writing.  The fact that these are written by the authors, and not simply generated by a software program, is incredibly valuable to long-form scholarship.  The issue isn’t getting the book you want electronically; it’s finding the book you want in the first place and then having the ability to access it in whatever format you choose.

Now that we’ve walked through the economics of monograph publishing and the issues surrounding usage, let’s look more closely at the impact of DDA and STL.

So what does DDA look like at OUP right now?  DDA and STL currently represent 24% of our eBook sales; eBooks, in turn, are about 20% of our total book sales (so DDA and STL amount to roughly 5% of the total), with a full 80% remaining in print.  eBook sales, of course, continue to grow as print sales continue to decline — this is expected — and OUP has held up fairly well with print, suffering only a single-digit decline.  As a direct result of implementing DDA as a model, OUP has provided libraries, via the aggregators, with over $15M worth of content in the form of DDA records loaded into library catalogs, allowing this content to appear as if owned without requiring any purchase or investment from the library.

And yet we, and numerous other publishers, supported this model.  We heard the rallying cry, and we saw the sense of the approach: not paying for content until it demonstrates use, use now being the ultimate indicator of value.  We have continued to experiment with DDA, and it is now fully embedded in our approach to the market.

Short-Term Loan first presented itself as a further evolution of DDA: an option for other content — for example, high-priced items beyond the scope of an approval plan — or, even more usefully, as a faster, cheaper, more effective replacement for interlibrary loan (ILL).  And again, publishers were willing to experiment.  We were willing to pilot this to see how it would work, to analyze the sales data as it became available, and then to determine whether the model was feasible and whether the rates were right and sustainable.  That worked fine in the very early days of the model.

What happened within a relatively short period of time, however, is that STL became much more the initial means of acquisition and, in some cases, served as a replacement for both approval plans and regular DDA programs.  Publishers did not anticipate this, nor did our aggregator partners.

This pattern of adoption of this particular model is really what led to the changes we’ve seen in rates and to the more cautious approach in general on the part of publishers.  With DDA, we understood and accepted the idea that libraries do not want to pay for content until it is used.  With the integration of STL into many DDA programs, that message evolved into something closer to “we only want to pay a small fraction of the digital list price (DLP) for each use.”  So if use is truly the ultimate indicator of value to the library, the content itself has been devalued.

As a publisher who now sees real, strongly demonstrated usage of electronic content — and who has invested heavily in driving that usage via discoverability and other means — we are not seeing that usage translate into full-title purchases for the vast majority of STLs.  This represents a seismic shift from existing business models; it is not only unpredictable, but it also further shrinks the revenue earned from real use of scholarly content.  Given the fixed nature of costs for this type of publishing, this is not sustainable, and it has the potential to impact scholarly publishing more broadly.

To reiterate a very basic fact, albeit one that is often overlooked in discussions of this issue: many publishers are still not charging anything for what is not used, which is the core premise of this model.  But actual purchasing has been impacted, and the resulting drop in sales has been precipitous for many presses.

So why were STL rates adjusted?  There was enough clear data about purchasing to see that real demonstrated usage selling at 10% of list price was not sustainable and was, in fact, having a significant impact on many presses.  At a now-annual Charleston gathering of publishers and libraries convened to discuss this issue, some university presses reported sales declines of several hundred thousand dollars.  That represents real jobs and real losses, and it poses a very real threat to sustainability — and that’s for content that is actually getting used.  As major stakeholders in the scholarly ecosystem, we would be irresponsible not to work on adjusting the model in a way that improves the long-term sustainability of scholarly publishing.  This is not a question of whether or not these books should be published — these are books that are getting used.

Industry media were swift to pick up on accusations of “price increases and price gouging.”  The reality is that most publishers, including OUP, simply shifted their discounts; for Oxford specifically, the STL discount moved from 90% to 75%, and, again, this is for books that are actually getting used.  Our eBook prices are exactly the same right now, and we have stayed with the model.  Yes, some publishers pulled out entirely rather than change their discount percentages, but that too is a completely understandable response when your sustainability as a business is at risk.

Publishers, by and large, have made a commitment to supporting libraries in their desire not to pay for content until there is use.  But receiving 10% of the cost of a book when it is used, and then having to share that 10% with both the aggregator and the author, simply cannot make financial sense to anyone.
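A worked example makes the arithmetic concrete.  The 10% and 25% STL rates correspond to the 90% and 75% discounts described above; the $100 list price, the aggregator’s share, and the royalty rate are hypothetical placeholders for illustration, not actual contract terms.

```python
# Publisher's net receipts from a single STL under the old and adjusted
# rates. The aggregator share and author royalty are hypothetical
# placeholders, not actual contract terms.
LIST_PRICE = 100.00      # hypothetical digital list price (DLP)
AGGREGATOR_SHARE = 0.30  # assumed cut retained by the aggregator
AUTHOR_ROYALTY = 0.10    # assumed royalty paid on net receipts

for label, rate in (("old rate, 90% discount", 0.10),
                    ("new rate, 75% discount", 0.25)):
    stl_fee = LIST_PRICE * rate
    after_aggregator = stl_fee * (1 - AGGREGATOR_SHARE)
    to_publisher = after_aggregator * (1 - AUTHOR_ROYALTY)
    print(f"{label}: ${stl_fee:.2f} per loan, "
          f"~${to_publisher:.2f} reaching the publisher")
```

Under these assumptions, a loan of a $100 title nets the publisher roughly $6.30 at the old rate and $15.75 at the adjusted rate, which is why the 10% rate proved untenable for heavily used content.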

We need to find a way forward that works for all the parties involved: library, publisher, author, and end user.  The core principle of not paying for content until it is used remains agreed upon; it may eventually mean that less is published in certain disciplines, but that makes sense.  Grant that this has been an experiment on all sides, and that sometimes adjustments are necessary, just as libraries have made adjustments to their own programs, including what actually triggers a purchase.

We, as publishers, are still in this with libraries and librarians, and we want to continue to work with you.  We’ll continue to engage in and encourage experimentation and collaboration, and we’ll find something that works for everyone.  There may be even better solutions for all of us, and we’re committed to working together to discover just what those might be.

 
