by Donald T. Hawkins, Freelance Conference Blogger and Editor <email@example.com>
Note: A summary of this article appeared in the September issue of Against The Grain v. 30 #4 on Page 74.
The Society for Scholarly Publishing (SSP) met at the Sheraton Grand Hotel in Chicago on May 30 – June 1, 2018 for its 40th annual meeting.
The meeting featured the usual array of plenary and concurrent sessions as well as a day of pre-conference and sponsored sessions on topics of special interest and a review of new and noteworthy products on the last day. An exhibit hall drew about 60 companies. The meeting theme was “Scholarly Publishing at the Crossroads: What’s Working, What’s Holding Us Back, Where Do We Go from Here?” The meeting was well attended, with over 930 attendees.
A special 40th anniversary celebration with music, food, and concluding fireworks was held on the first evening at the Navy Pier. Early arrivals were treated to a beautiful double rainbow, which was followed by a spectacular sunset.
Humans, AI, and Decision Making
This pre-conference seminar (one of four presented concurrently) was moderated by Phill Jones, Chief Technology Officer, Emerald Publishing, who noted that journal impact factors are widely used to compare the reputation and quality of journals, but they are misused when comparing the quality of the research reported in those journals. Impact factors continue to be useful, but once the internet arrived, alternative metrics became widespread. One problem is that simply counting things gives no indication of underlying quality; for that, we must turn impacts into narratives, so it is necessary to look beyond the numbers. Tools that can be used for this purpose are now appearing: SciVal from Elsevier, Dimensions from Digital Science, Unsilo, and Meta.
Isobel Thompson, Senior Strategy Analyst, Holtzbrinck Publishing Group, began with Amazon’s definition of AI: “The field of computer science dedicated to solving cognitive problems commonly associated with human intelligence…” and noted that AI-based systems are intuition machines, in contrast to traditional computers which are logic machines. She said that scholarly communication does not have an AI problem; it has a strategy problem. Businesses are struggling over what to do about AI, which is composed of foundational, cognitive, behavioral, and sensory layers and is capable of operating in a wide range of technologies, including natural language, audio and speech, machine vision, navigation, and visualization.
Thompson stressed that AI is a tool; we should not invest in AI as such, but rather in solving business problems. In the publishing world, using AI can result in increased efficiencies in journal creation. No single organization can have all the information it needs, so partnerships are important. Most AI talent resides with vendors because only large corporations have the talent necessary to create large and complex systems. McKinsey, the management consultancy, has listed these five aspects of a successful AI implementation:
- Have a business case,
- Build a data ecosystem,
- Get the right tools and techniques,
- Integrate throughout your workflows, and
- Adopt an open culture and reskill the workforce.
Thompson concluded that AI can do many things, but business cases are necessary, and AI is not always the answer. We need to think differently about competitive advantage, partnerships, and vendors and change our organizations. Preparing now will allow us to take advantage of future advances.
John Sack, Founding Director, HighWire Press, reiterated some of the points that Thompson made and said that we must start with the people, not the technology. We must ask the right people for help; AI experts are not editorial experts. Authors, editors, and publishers must all be involved in the journal production process: setting editorial goals, attracting the right authors, handling submitted papers, and monitoring the results. He listed four challenges that must be met to produce actionable results:
- Visualize the performance of a journal, its context, and its competitors.
- Engage the community to attract and retain the best authors and papers before submission. Leverage preprints to find the best articles for the journal.
- Use tools to see ahead; many new things go beyond known vocabularies. Identify emerging concepts and researchers.
- Strive for workflow efficiency and productivity. Meta’s “bibliometric intelligence” tool can be used to help editors suggest appropriate journals for submitted articles.
Jon Stroll, Research Director, Clarivate Analytics, said that we must determine the challenges we are trying to solve. The “holy grail” of AI is a general system that can learn and solve any problem, but we are still a long way from this. AI tries to do what the human brain does intuitively: think, learn, reason, and apply data science that affects behavior. Extensions include robotics to perform a task and smart machines that can decide and act within a set of rules.
Machine learning is showing the most promise in terms of advancing and developing AI for today’s needs because it can take data, turn it into intelligence, and deliver predictions. It is dependent upon data for training the algorithm, getting it into the system, and providing feedback data to improve accuracy. One problem with natural language systems is that you may not be able to see what the algorithm is doing. Important trends contributing to the recent explosion in AI include cloud computing, big data, APIs, open source systems, and the Internet of Things.
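Stroll described machine learning as a loop: train on data, deliver predictions, then feed corrections back in to improve accuracy. That loop can be sketched in a few lines of Python; the toy word-count classifier below is invented purely for illustration and is not anything Clarivate uses.

```python
from collections import Counter, defaultdict

class TopicClassifier:
    """Toy text classifier illustrating the train -> predict -> feedback loop."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies

    def train(self, text, label):
        # Training data is turned into per-label word statistics
        self.word_counts[label].update(text.lower().split())

    def predict(self, text):
        # Score each label by how often its training words appear in the input
        words = text.lower().split()
        scores = {label: sum(counts[w] for w in words)
                  for label, counts in self.word_counts.items()}
        return max(scores, key=scores.get)

    def feedback(self, text, correct_label):
        # Feedback data is folded back into training, improving future accuracy
        self.train(text, correct_label)

clf = TopicClassifier()
clf.train("neural networks deep learning", "ai")
clf.train("crispr gene editing genome", "biology")
print(clf.predict("a survey of deep learning"))   # prints 'ai'
clf.feedback("transformer models for genomes", "biology")
```

The sketch also shows Stroll's caveat in miniature: the prediction comes out of aggregate word statistics, so inspecting exactly why the algorithm chose a label is already non-trivial.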
Pauline Crépel, Business Development Manager, MyScienceWork (MSW), provided a description of the system, which is a suite of solutions for research institutions, scientific publishers, and R&D companies. It has indexed and created a database of 70 million publications from over 500 sources, which is used to analyze content, foster innovation, and drive strategic research decisions. According to Crépel, the following trends will become products and services in the future:
- Data management: big data vs. actionable data and standardization on a uniform format,
- Relevant peer review methods, especially using blockchain technologies, and
- More open data and repositories leading to visibility and discovery of the data as well as powerful semantics research tools.
Chandi Perera, CEO of Typefi, presented case studies of six organizations from Typefi’s impressive list of customers that have used Typefi to develop different types of content products. He likened the process to the “Magic Roundabout” in Swindon, England (five mini-roundabouts around one large central one, creating one of the most dangerous intersections in England).
Typefi is a single-source publishing platform that automates the production of content from source to output using the user’s data in its original format; there is no “Typefi format”. Here are the six case studies:
- American Academy of Orthopaedic Surgeons (AAOS)
Publishes books with many diagrams and photos, with corrections formerly entered in a manual process that was very error-prone. Typefi allowed AAOS to increase its output within 6 months to 3 textbooks of up to 800 pages and repurpose the content for online publishing.
- IGI Global
Books, journals, and cases were produced in separate processes using a variety of platforms and manual typesetting. In 2009, IGI made a strategic decision to develop an in-house system and tripled its output with only 12 staff plus a specialist designer to create templates.
- The Institutes
Publications were produced as standalone products one at a time, and books needed to be customized for different markets. With Typefi, production speed was greatly increased, and the use of XML files added flexibility.
- FASS (Federation of Animal Science Societies)
A proprietary system was being used for creating publications. When a staff member retired, a system using Typefi increased automation and cut production time from 60 to 30 days.
- ISSA (International Social Security Association)
Only two staff members managed production of publications in up to 7 languages. Typefi allowed editors and authors to work in MS Word using templates and to process documents through the Typefi Cloud system to produce HTML and PDF for online and print production.
- CFA (Chartered Financial Analysts)
About 200,000 people in 166 countries registered for one of their exams every year. Production of exams used to take 12 months, but it was brought in-house, providing control of the process and the ability to produce more content faster and at lower cost.
As an example of the use of Typefi to produce a book with a complex layout of photos, maps, and typefaces, attendees were given a complimentary copy of the Moon Guide to Chicago, published by Avalon Travel.
Sadia Shahid, Head of Business Development, wizdom.ai, said that there is more research produced today than ever, and more data than ever before. We are in the 4th industrial revolution, which is powered by intelligence; AI is the new electricity, and data is the new oil. Wizdom.ai scans the research ecosystem and has compiled a data set of 93 million publications from 63,000 journals, which are interconnected using AI and natural language technologies. Output from the system is presented in a series of dashboards that can be tailored to the user’s needs.
Wizdom.ai has the following implications for publishers:
- Ability to monitor and analyze journals,
- See how a journal’s scope has changed over time and analyze it in the context of the publisher’s portfolio,
- Find emerging areas and gaps in which to launch new journals,
- See the direction of a field based on funding, citations, and patent activity,
- Find star authors that are entering or leaving a field, and see where else they are publishing,
- Find peer reviewers, editors, and authors related to the scope of a journal.
Paige Shaklee, Executive Publisher, Reviews and Partner Journals, Cell Press, described how her organization met the needs of readers who are challenged by complicated figures in research articles. Short, concise videos (Figure360) of two minutes or less explaining the figures were embedded into articles after they had been accepted for publication. Editage’s medical writers were hired to produce the videos, and they were able to complete 72 of them in 2017 in only 10 days. Figure360 enables authors to describe a figure with narration and animation and discuss key points of their article, just as they would in a conference presentation. Feedback from authors and editors on the quality and service provided by Editage has been positive. Because of the success of this effort, the Figure360 program is being expanded to Cell Press’s entire line of journals.
Patricia Baskin, Executive Editor, Neurology Journals, American Academy of Neurology (AAN), described how Neurology is highly read and cited, so AAN launched specialty subsets of it. The online version of the journal is the version of record and contains videos, errata, and links to references in PubMed. But many readers prefer their journals in print. In addition, print ads are more lucrative for publishers than online ads. So AAN has begun producing the journal with full articles in the online version and short versions in print (exactly like “Don’s Conference Notes” in Against The Grain). Editage writers produce the short articles, so there is no burden on authors. Surveys and focus groups found that more people were reading the issues cover-to-cover, and there was greater interaction with the online version, which has cut production costs.
Satyajit Rout, Head, Research Services, Editage, concluded the session by discussing new approaches to content beyond the journal article. New formats bring better experiences to readers; for example, visual abstracts increase article dissemination, infographics provide summaries at a glance, and audio summaries replicate the conference experience. Publishers see video as a future-facing medium, but the challenge is in producing a high-quality product.
Other formats beyond video may not be appropriate for all disciplines. It is important to give readers something that does not disrupt their habits. They want immediate engagement and mobile-optimized digital content. Future directions include driving product innovations, solving new problems, and maximizing digital content possibilities.
Opening Keynote: Toward an Ethic of Social Justice in Information
SSP President Jennifer Pesanelli Opens the Keynote Session
Opening Keynote Speaker Dr. Safiya Umoja Noble
Dr. Safiya Umoja Noble is an Assistant Professor at the University of Southern California (USC) Annenberg School of Communication and author of Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press, February 2018). Her research focuses on the design of digital media platforms on the internet and their impact on society, particularly in the area of social justice. She began her keynote address by noting that publishers are the critical gatekeepers of knowledge and are important in clarifying issues of race and gender as they intersect with information. In her opinion, the field of social justice should broaden to include library and information science. We like to think of ourselves as apolitical or neutral, but our work is deeply political even if we do not think it is. Over the last 10 years, many values have been highly contested, and race remains one of several blind spots in our field.
Racism, oppression and similar concepts were formerly markers of exclusion. They have now been transformed by “diversity” or “cognitive diversity” as new catchall phrases. A Google search brings up this definition:
“The power of having cognitive diversity in the workplace is the same power that companies try to attain through strong leadership and great communication. It’s a more inclusive, collaborative, and open space where people feel empowered to create and implement ideas.”
One person thinking differently from another is not necessarily diversity. No two people have the same thoughts (not even twins). We need engaged academics and practitioners who can enact social justice on campuses by including a greater diversity of courses. Can the library catalog be truly a neutral space? One study found that LC catalog systems are explicitly racist. If your knowledge is not published it cannot be classified, cataloged and disseminated.
Knowledge classification and hierarchies are not new. Classification themes are part of the process of understanding people and societies. We are at the epicenter of knowledge diffusion in our societies and need to think about how information is disseminated in our society. Power relations are embedded in text. Racial classification emerged in the 18th and 19th centuries in North America. It became codified to deny citizenship to classes of people, especially indigenous people.
We have implemented the organization of people. Knowledge organization is also part of the landscape. Many people do not understand the implications of the history of racial classification of people. Publishers and librarians have an opportunity to provide more knowledge. In the confusion of the “post-racial” moment, we must look more closely at knowledge dissemination. Noble said that diversity, in the so-called “post-racial” era, has been deployed in calls for the “inclusion of everyone”, in ways that undermine and erase the specific naming of social injustices that are intersectionally and inextricably racialized, gendered, or based in class.
We are missing the mark and are underperforming in creating remedies for social justice. We can publish more work that helps us. What role will we play in sharing knowledge and research about what is happening? For example, the replacement of the LC subject heading “illegal aliens” was the result of a two-year project by Dartmouth students, who gathered documentation that “illegal alien” was not a preferred term. It was replaced by “non-citizens”. Control over identity is political: information provided to a user is deeply contextualized. These concepts are fundamental to our field of information science.
Noble concluded with this list of things we can do:
- Make scholarly research visible to the public, faster, and more broadly.
- Build repositories and platforms that belong to the public and don’t put everything behind a paywall.
- Re-train ourselves on how to resist color blind/racist/sexist policies. There is much work to do.
- Never give up hope that education can serve as a powerful intervention in society.
Cialdini’s Six Principles of Persuasion: Thursday Keynote
Steve Mirsky, Sr. Editor, Scientific American, opened the second day of the conference with a fascinating look at the six principles of persuasion that were promulgated by Robert Cialdini, Professor Emeritus of Psychology and Marketing, Arizona State University in his book Influence: The Psychology of Persuasion. Cialdini developed his principles to manage a team. Publishers want to persuade people so they can sell more books and magazines. Mirsky writes a monthly humor column in Scientific American (the only purposely funny thing in the magazine) entitled “Antigravity”, and commissioned an article by Cialdini in the February 2001 issue.
Here are Cialdini’s six principles:
- Reciprocity is one of the glues that keep the social contract going. (Free samples indebt the consumer to reciprocate.) A sense of fairness stems from reciprocity. It can be manipulated.
- Consistency: The public commitment involved in agreeing to a petition influences the signer to behave consistently with that position in the future. For example, marriage is still popular because you get together with friends and make a commitment in front of them.
- Liking. If you like people, you are more likely to comply with their wishes. Liking is important; we want to be around people we like.
- Authority. Would you want to buy from a supposed authority who does not know what they are talking about?
- Scarcity is context-dependent, which is why supermarket shelves are always kept fully stocked. For example, before a hurricane, milk becomes a highly desirable commodity and supplies become extremely limited. If there is no hurricane coming and the shelf is almost bare, people tend to think “what’s wrong with this”. Scarcity can also be manufactured with limited offers to make a product more desirable.
- Social proof. People tend to do whatever others around them are doing. For example, if a few people on the street are looking up, others will soon join them and also look up, even if there is nothing specific to see.
Rules of persuasion can help us in whatever we are doing, especially in a world of distracted attention. These rules are based on our history and are more or less constant.
Using Data and Customer Insights to Evaluate Tools and Services
The three speakers in this session represented publishers who have evaluated third-party tools for their organizations. Jill Treby, Director of Marketing, International Association for the Study of Pain (IASP), said that IASP saw the need to provide digital resources for its members and increase the value of the association to them. Its strategy was to develop use cases and content-specific objectives across operational units and get feedback from members. Next steps will be to establish a framework for user needs and select tools to acquire.
According to Daniel Griffin, Web Presence Manager, Duke University Press, the Digital Strategy Group works across all units of the organization and develops roadmaps for each system. The roadmaps consist primarily of enhancements to existing programs and new additions. Drivers for innovation include:
- Strategic directions that are reexamined every few years and form the primary basis for innovation. They focus on technology and benefits, not vendors, and fit with business drivers.
- Process improvements make work of staff easier through integrations between systems. They often overlap with strategic directions. The major decision is whether the investment makes the time saved worth the effort.
- Editorial office needs arise out of necessity. They are often developed far in advance of the anticipated date.
- Value-added propositions support business goals, user experience enhancements, reporting tools, and author services. Major metrics for evaluation are whether trials are available and how the proposition can be justified at budget time.
- Mission projects are not in the strategic plan but enhance the mission of the Press or the University. Cases for such projects must often be made with no prospect of return. Questions to be asked include: Does it serve users or fit into an existing workflow? Is there a university partner? Does it support general directions in scholarly publishing by adding value or filling a need?
Paul Gee, Digital Product Manager, JAMA Network, American Medical Association (AMA), noted that the JAMA Network comprises the AMA’s 13 peer-reviewed journals, which publish 5,500 articles a year. Several full-text journals were released online with HTML5 as a pilot in 2012. Managing such projects requires undivided attention, so an R&D segment of the network staff was created to do new product development. Lessons learned from this effort include:
- Put “Research” before “Development”.
- Physicians like innovations but not too fast.
- Releasing a product as a conceptual test is a bad idea. Don’t look for success; look for failure and learn from it.
- Marketing and sales are as important as developing good technology. Bring good ideas to life slowly and then hand them off to marketing, sales, and product management.
- It is hard to launch a new product, but it is even harder to kill one.
- Our business is content, not medium, format, or technology.
- Success depends on strategy; know why you are doing work or don’t do it.
- Think about things that will drive author interest in articles.
Bound Together in a Common Cause
Representatives from a university, university press, e-book vendor, and academic library discussed the common bond they share as intermediaries and enablers of quality research. Matt Ragas, Associate Professor, College of Communications, DePaul University, said that our common purpose is to publish great work that makes an impact on society. We must ensure that peer-reviewed e-books are discoverable by researchers, which will help raise the level of business literacy for non-business people. Here are some common challenges:
- Discoverability is critical. Some publishers have packages to make e-books available through the library. Graduate students are not predominantly e-book purchasers; they want to have a “real” (i.e. printed) book so they are not distracted by the device, technology, etc.
- Standardization and normalization of data. We are focused on data-driven decision making. We need access to more data, but we are not getting access to it as much as we want.
- Content value: higher education is currently in an era of cost cutting. Authors must therefore think about their platform.
Leila Salisbury, Director, University Press of Kentucky, said that in marketing, you do not have the luxury of keeping a closed in-house view. A common role for university presses is acquiring manuscripts from scholars. We must start by defining the audience for a book and then see if the Press is the right publisher to get the book to the intended audience. We are dependent on authors for content development and also on aggregators for discovery.
Discoverability is a challenge because of the flood of content facing us. We are the gatekeepers; what do we bring to it? Most university presses do not have specialized employees or technology infrastructures, so they rely on authors or junior staff members to write key abstracts. We do not rely on word of mouth or reviews so much, so we have to solve the problem of how to compete.
Enhanced discovery depends on standards and metadata, which involves huge challenges. Most e-book use is at the chapter level, so the chapter metadata must get into distribution channels. We wish we could get more data and must figure out who owns what. Granular user data is most helpful.
Ken Breen, VP, Product Management, EBSCO E-books, said that our common goal is discoverability. Aggregators must ensure that e-books are discoverable in acquisitions portals and then make sure that the books are chosen by librarians once they are discovered. Metadata is the vehicle to make a book discoverable. ONIX for Books is an XML-based standard that provides a consistent way for publishers, retailers, and their supply chain partners to communicate rich information about their products, but it is very complicated.
Authors and publishers are assuming that abstracts are making their way to the library. There are vendors between aggregators and publishers, and between aggregators and libraries, so the first thing necessary is to talk to the next level downstream and ask what they can and cannot use, then ask the publisher to turn that data on. The result of adding value to content will be to bring it to the top of search results.
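To make concrete why ONIX is both rich and complicated, the sketch below parses a minimal ONIX-style record and pulls out the title and descriptive text that downstream systems need for discovery. The element names follow ONIX 3.0 conventions (Product, DescriptiveDetail, CollateralDetail), but the fragment is simplified for illustration; a real feed carries namespaces, codelists, and many more composites.

```python
import xml.etree.ElementTree as ET

# Illustrative ONIX-style record (simplified; not a validated ONIX 3.0 feed)
onix = """
<Product>
  <RecordReference>com.example.9780000000001</RecordReference>
  <DescriptiveDetail>
    <TitleDetail>
      <TitleElement>
        <TitleText>Scholarly Publishing at the Crossroads</TitleText>
      </TitleElement>
    </TitleDetail>
  </DescriptiveDetail>
  <CollateralDetail>
    <TextContent>
      <TextType>03</TextType>
      <Text>A short descriptive abstract for discovery systems.</Text>
    </TextContent>
  </CollateralDetail>
</Product>
"""

product = ET.fromstring(onix)
# Pull out the fields a discovery system or acquisitions portal would index
title = product.findtext(".//TitleText")
abstract = product.findtext(".//TextContent/Text")
print(title)
print(abstract)
```

Each intermediary in the chain performs some version of this extraction, which is why Breen's advice holds: ask the next level downstream which fields they can actually ingest before asking the publisher to populate them.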
Gregory Heald, Associate Dean of Libraries, University of Northern Colorado, noted that discoverability from the library point of view has shifted with the advent of heterogeneous discovery systems. Questions have become more complex. E-books are increasingly being used at the chapter level, particularly in STEM fields. Other paths to discovery include exposure through Google, OCLC, Bibframe, and similar systems. Users frequently find book content through unexpected paths.
Libraries are seeking data for user experience optimization, promotion and tenure, and selection. They need interoperability to be able to provide such data to faculty members. For selection, we need data that exposes user needs, such as data on circulation, as well as user feedback. We are increasingly relying on the data received from Google Analytics. Changes now being experienced in e-book content are similar to those formerly experienced with journals. Books are being received as large packages, which tell us what parts of the LC classification system are popular.
Libraries recognize that scholarly presses play an essential role in the promotion and tenure processes. We must continue to support university presses and learned societies and make sure that their content shows up in discovery systems so that we do not wind up with non-representative collections.
The session concluded with these suggested actions for the future:
- Authors: Better educate authors about discoverability (metadata and keywords) and be prepared for ROI-driven decision making. There is a societal value to niche knowledge.
- Publishers: Encourage vendors to expand ONIX field ingestion and explore better access to data.
- Aggregators: Recognize that data feeds do not replace human communication. They must be delivered accurately and in a timely fashion. Audit e-book metadata to ensure that what the press sent was received, and then audit acquisition portals to ensure the same data is in bibliographic records.
- Librarians: Communicate with candor to all parties and articulate the shape of the library’s budget.
Integrating Scholarly and Non-Scholarly Content to Increase Visibility and Value
This session began with an appropriate quotation from Walter Isaacson’s biography of Leonardo Da Vinci:
“…the ability to make connections across disciplines—arts and humanities, science and technology—is a key to innovation, imagination, and genius.”
We can apply this concept to scholarly communication and note that one way to increase the visibility of content is to broaden its subject coverage and integrate scholarly content with other types of non-academic content, such as blogs, videos, or news feeds.
Jackie Prince, General Manager of the New England Journal of Medicine (NEJM), described NEJM’s revitalized website, NEJM.org. On the former website, innovative features were often trapped within articles, and the site design had become dated. It was difficult to find content beyond the current issue of NEJM. Objectives of the redesign were to make digital objects searchable, discoverable, and monetizable, which was achieved by giving each object its own DOI. The user experience was improved and modernized to enhance discoverability and provide an optimal reading experience. New multimedia and learning pages were added. A roadmap was developed to plan for future developments. The entire project took two years and received a very positive response from users.
Michael Roy, Executive Editor, American Journal of Psychiatry and Editorial Director, Journals, American Psychiatric Association (APA) Publishing, spoke about promotion at all stages of the submission cycle to raise awareness. Although APA’s journals are respected, authoritative, widely read, highly ranked, and indispensable, they were not connected. APA had no social media presence and no press office. Here are four stages of the submission process that were developed to raise journal awareness:
- Pre-submission: Attract authors and promote awareness. A list of APA’s resources was prepared for authors’ guidance.
- Peer review: Be welcoming and make it as friendly as possible. Provide a plain-language summary of the article. Authors feel most welcomed when their article is accepted and should be encouraged to sign up with Kudos.
- Production: Ask authors to help. The best time to ask is while author is highly engaged with the content during review of page proofs or when the author receives notice of publication.
- Post-publication: Remind authors to stay engaged. Don’t just publish and forget.
Paul Guinnessy, Online Director, Physics Today, American Institute of Physics (AIP), noted that AIP, a federation of 10 societies, has existed since 1931 and publishes 28 journals and 4 magazines. Its flagship publication, Physics Today, has 3.3 million Facebook followers, over 200,000 web visitors per month, and a print run of 100,000.
Journals must attract their audiences because time is a rare commodity. The amount of time spent reading articles has remained fairly constant over the years, but the number of articles “read” has tripled, primarily because reading now includes not only article abstracts but also tables of contents. It is therefore necessary to get additional coverage in a magazine, press release, or on social media to get articles noticed. Physics Today is a “browsing” journal and must use the capabilities of the web to attract readers. Based on its experience, AIP has found that:
- A direct link to the reader is worth more than 1,000 Google searches.
- Email or mobile notifications to users about new content works well.
- Facebook is dying as a source of traffic, but Twitter is effective with the right content.
- It is important not to put barriers in the way of subscribers reading content. Give them hints of what they will find behind the login and password.
- Press releases increase traffic to research articles and time spent on them.
Using these techniques, Physics Today as hosted on an Atypon platform has become a benchmark for other journals. It is well known and drives traffic to articles it references.
Bill Kasdorf, Principal, Kasdorf & Associates said that we must stop regarding accessibility as a burden because it makes content more discoverable. Most of what accessibility requires is already being done with a good workflow because it is based on known standards, HTML or EPUB markup, meaningful structure, and a logical reading order. Most STM publications have some math in them and therefore use MathML markup, which can be created by Word’s equation editor. Most scholarly publication is close to these requirements and is therefore simple to navigate. Areas that cause extra work are image descriptions and accessibility metadata. For images, the authors should supply image descriptions, not alt text statements. The metadata documents compliance to accessibility standards; today’s scholarly publications are close to being uniformly accessible.
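One basic accessibility check, catching images that ship without any alternative text, is cheap to automate in the kind of standards-based workflow Kasdorf describes. The sketch below uses Python's standard html.parser and is illustrative only; a production audit would also check description quality, MathML markup, accessibility metadata, and reading order.

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Flags <img> tags that lack an alt attribute entirely."""

    def __init__(self):
        super().__init__()
        self.missing = []  # src values of images with no alt attribute

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

# A toy article fragment: one image described, one not
page = """
<article>
  <img src="figure1.png" alt="Line chart of citations per year, 2010-2018">
  <img src="figure2.png">
</article>
"""
audit = AltTextAudit()
audit.feed(page)
print(audit.missing)  # prints ['figure2.png']
```

A check like this catches only the mechanical gap; as Kasdorf notes, the harder work is getting authors to supply genuine image descriptions rather than perfunctory alt text.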
Mike Groth, Senior Brand Manager, Emerald Publishing, moderated this session and introduced Emerald’s Real Impact Manifesto, in which Emerald commits to supporting meaningful real-world impact by:
- Supporting the community of practice to overcome barriers to impact,
- Challenging simplistic and outdated approaches to impact, and
- Driving impact literacy in the research sector.
This session featured four prominent executives of scholarly publishing presenting their thoughts on impact. David Crotty, Editorial Director, Journals Policy, Oxford University Press and Editor of SSP’s Scholarly Kitchen blog, questioned whether we are still caught in the same traps we have been in for some time. We still rely on numerical metrics because of scale and lack of deep subject knowledge. Problems with metrics are that they are subtle, slow, and vary from one field to another, and they encourage short-term thinking instead of long-term gain. Altmetrics do have some advantages, however: they allow us to track connections between research papers and patents, and we can see how societies’ journals are effecting change. But do we really want to lock up more research results behind paywalls (i.e., patents)? Focusing on patents as a goal goes against everything we are working towards in open science.
Exerting pressure on research to run like a business and create short-term results is bad for research. Everything we do today is based on previous decades of research. There is valuable information in the citation record, and we should not dismiss it simply because we do not like a journal’s impact factor.
Kent Anderson, CEO, Redlink, discussed how some of the problems with social media affect information. People often put quasi-scientific information out to validate good or bad theories (frequently bad ones). We often overlook the fact that our distributors are no longer trustworthy. Social media are not neutral distributors of information, and their biases have long-term effects. For example, the US distribution of Facebook is larger than the circulation of all major newspapers combined; 80% of altmetrics measurements come from Twitter activity; and Google is every journal’s dominant search engine. Social media reflect biases that most of us would agree are wrong. These biased sources are making their money off misinformation; therefore, any metric or search you run must be treated as if it is affected by some algorithm that you do not understand.
Ann Gabriel, VP, Academic and Research Relations, Elsevier, suggested that the RELX group (Risk, Elsevier, Lexis/Nexis, and eXhibitions), the parent of Elsevier, is generating data that could be used to develop metrics to measure the impact of science on policy. Metrics have a wide impact across the research lifecycle, as she illustrated.
We still have a long way to go to convince taxpayers that the research they are supporting is of benefit to them. Some new methods of measuring impact are CiteScore, a simple metric for all journals indexed in Scopus, and those captured by Plum Analytics. Elsevier is developing some metrics for research data, such as SciVal, which can show not only a researcher’s collaborators but who they are not collaborating with (which could be an indication of potential new markets). And science in legislative data is being addressed by adding Federal Register references to hearing data.
Patti Davis, Publisher, Emerald Publishing, asserted that many professors have little interest in the courses they teach; their main purpose is to do research, which may not be relevant to their courses. As a result, scholars have been saying for 25 years that they need to make their research relevant.
Academia and practitioners are on parallel roads, so there is a gap between research and the communication of it. Researchers tend to pursue academic impact over societal impact; they choose the journals in which to publish their work based on impact factor. It is difficult to get professionals to collaborate; many of them work alone. We need to fundamentally restructure how academics are incentivized. Organizations playing an important role include business schools, professional societies, and publishers. We add value to research by commissioning summaries and recruiting practitioners to serve on editorial advisory boards, so that they become an integral part of the communication between researchers and publishers. In a survey of 1,000 Emerald authors, 97% said they believe their research has relevance outside academia, and 65% agreed that collaboration with non-academics can produce highly cited research. But only 36% said they are incentivized to work with non-academics, and only 15% said that their organization’s incentives were a motivation to enter into a collaboration.
New and Noteworthy Product Presentations
The final day of the conference featured SSP’s first Product Previews session followed by an audience vote to choose a favorite innovation for this year. The session was moderated by David Myers, Principal, Dmedia Associates, who noted that the most successful organizations are those that offer innovation and do things that others do not. Innovation is not a luxury; it is essential in today’s environment. If you are not a disruptor, you will be disrupted.
Here are brief descriptions of the 13 products and services that were presented.
- AAMC Convey: Submitting financial disclosures is onerous. Convey provides a web-based repository to enter and maintain records of financial interests and allow them to be disclosed directly to any organization. Publishers can tailor the system to collect only information relevant to a journal.
- Atypon: Scitrus, the future of science alerts. Content is personalized and delivered not as an email alert service but in a magazine-like interface.
- Copyright Clearance Center (CCC): The costs of managing Author Publication Charges (APCs) are increasing as OA grows. CCC has built an “OA Agreement Manager” that notifies parties to join the payment process. It enables publishers, funders, and institutions to excel by cutting operational costs, increasing transparency, reducing errors, and creating a quick and easy experience for authors.
- Delta Think: An OA data network tool normalizes publicly available OA data and produces a living report to read, interact with, and download. Examples of questions the data can answer: How do my APCs compare with others in the industry? What are the common price points? How do impact factors influence pricing? Delta Think also publishes a free newsletter, “OA News & Views”.
- Digital Science: Dimensions. Challenges in the research ecosystem lead to wasted time. Dimensions, a multi-tool for modern research, has a database of 128 million research documents with over 4 billion links between them. Publishers can get the full picture and support business intelligence, find experts (reviewers, rising stars, potential members), and track impact (patents, citations, policy, altmetrics).
- Edanz Author Path: Researchers, especially ESL authors, are challenged by the long time it takes them to write articles. Author Path educates them on writing, submission, and peer review.
- eLife Sciences and Hypothesis: Hypothesis, a mission-driven nonprofit, has 3.2 million annotations. A publisher group has been established to apply standards to their annotations. Authors and readers are able to interact in new ways.
- JSTOR Text Analyzer, by JSTOR Labs, is a new way of searching: it uses natural language processing on a whole document to find related documents in all databases, even when the actual subject word is not present. Documents can be processed in 15 different languages.
- Kudos: Helps publishers benefit from and report on PDF-based author sharing. A new approach encourages shareable PDFs, maximizes the usage of full text, and increases subscription revenue. Kudos has become the most popular mechanism for sharing. Trackable links enable usage counting, which supports more informed decisions.
- PaperHive: lifting the curtain with in-document metrics. PDFs are a challenge for established metrics: are users reading the papers they have downloaded? Readers click the PaperHive widget on a publisher’s article page, then go to the reading environment and see discussions. PDFs stay on the publisher’s website, so discussion happens on the version of record. Metrics generated include: discussion activity, to plan content and identify new authors and reviewers; completion rate and “hot” paragraphs, showing what is read beyond the abstract and which chapters are interesting; and views and recurring behavior, to justify ongoing library subscriptions and secure new ones.
- Quark Intelligent Systems: AI-powered value-added services. Professionals and practitioners are frozen in time. Practitioners don’t read articles; this market segment is immune to OA proliferation. The biggest problem is keyword search; practitioners want answers, not lists of articles. Why search when customers can simply ask? Automated Q&A surpassed human performance for the first time this year. Older technology does not work well. What do you need to make this happen?
- sci.AI: Publishers are sitting on a gold mine. Terminology is variable and ambiguous, so facts are hidden from readers. Whether a publication will be part of future research and funding depends on its machine-readiness. sci.AI converts unstructured plain text into machine-readable publications and adds machine-ready, detailed metadata to publications.
- MIT Press: PubPub—a collaborative community that publishes to socialize the process of knowledge creation. How can publishers promote the increasingly open and collaborative nature of research? How can they help make published works accessible?
Closing Plenary: The Scholarly Kitchen Chefs Look Back Over 10 Years
The closing plenary session featured the “chefs” of the Scholarly Kitchen (SSP’s blog, now in its 10th year of publication) looking back over the past ten years and making some predictions of what the future might bring for scholarly publishing. Alice Meadows, Chair of the Kitchen Cabinet Task Force on Diversity, reviewed the results of a readership survey. Most of the readers of the Kitchen are located in the U.S., Canada, and Europe and are publishers or university librarians of a wide variety of ages. They are loyal; many have been reading the Kitchen for five years or more, and they feel that it is meeting many of SSP’s goals. Some areas for improvement were expressed: a need for a wider range of opinions instead of just those of publishers, a concerted effort to reach students, a discussion of practical successes in the field instead of only discussing research results, and increased gender diversity of the chefs. In general, the Kitchen is viewed positively and was regarded as a valuable source of information. The SSP Board plans to revisit and possibly revise SSP’s goals for the Kitchen by increasing diversity in all its forms, ensuring that comments are more respectful in tone, and making the Kitchen more relevant to early career publishing professionals.
Following Meadows’ review, seven of the chefs discussed various aspects of the Kitchen:
- The Kitchen as originally proposed by Kent Anderson, CEO of Redlink, was envisioned as a newswire publication. Ideas and suggestions were received, and the Kitchen developed into the format we know now. Anderson noted that being a chef is a double-edged sword because one’s opinions are exposed to the world, and readers will form their opinions about you. The Kitchen has far exceeded Anderson’s expectations, and it has been a very positive experience for him.
- David Crotty, Editorial Director, Oxford University Press and current editor of the Kitchen, commented that although the scholarly publishing landscape is facing major changes, many of the subjects discussed have endured for the entire ten years of the Kitchen’s existence, which is evidence not only of the slow evolution of academia but also of the fact that many of the problems are not easy to solve. For example, are we any closer to getting rid of the disadvantages of the impact factor or questioning the meaning of citations, especially in the humanities?
- Robert Harington, Associate Executive Director of Publishing at the American Mathematical Society, said that peer review is not just an evaluation of the quality of what an author writes; it is part of an ecosystem that comments on the validity of an author’s work, from which an author can learn and improve the article. Currently, peer review is evolving. The reason it should be examined now is implicit bias in areas such as gender or the region where the author resides, about which reviewers might make assumptions. Should we be discussing different models of peer review (single-blind vs. double-blind, for example) in an effort to remove implicit biases? (Not to have this conversation is a form of bias in itself.) One interesting possibility is to keep the reviewers blinded but publish the content of the reviews as part of the public record, which might help remove some predatory practices that now exist.
- Tim Vines, Consultant, Origin Editorial, said that discussions of data sharing policies must shift towards the viewpoint that unless data is provided, an article is not very good. Publishing an article claiming to move science forward without furnishing the numbers on which the research was based is ridiculous. It is amazing that we have reached the point where we must provide the evidence that what an article says is correct. There must be a shift away from compliance to quality; articles that provide data are better than those that do not.
- Judy Luther, President, Informed Strategies, said that the biggest shift in searching we currently see is that despite our focus on discovery (which means that we are finding something), we are headed towards an environment where what we want finds us, so that we do not have to go and search for it. The volume of content is growing to the point where it is increasingly challenging to find what we want. When information was scarce, we had it all well organized; now, we hardly get past the first page of Google search responses. In five or 15 years, we may decide that this is not an efficient way to stay informed. We are enhancing our content, by trying to personalize the data, so that the tools we have will find it more readily. Serendipity has been lost in our focus on search; we are relying on our networks to help us find what we need. If we can bring our informed networks together with smart content, we will have a basis for informed artificial intelligence. But we have some way to go before we arrive at this.
- Much of our industry is embedded in the technology that we are using. Todd Carpenter, Executive Director of NISO, noted that despite beliefs that technology is changing every day, it takes a long time to be broadly adopted. The culture and our expectations must change, which requires significant energy. For example, ten years ago, there was a significant effort to get e-publication established, but today, most of us are still publishing using PDF. We have done much in ten years to expand our expectations, but we need to support that infrastructure in our communities. The challenge is agreeing on the culture, not the standards. The technology community has serious problems with diversity; how do we bring the community together to address some of these problems?
- Lisa Hinchliffe of the University of Illinois Library addressed the role of librarians and information scientists in the next five to ten years. Librarianship is evolving in its relationship with publishers; there is a shift towards everyone in research libraries moving to the services side of the business, which is resulting in a focus on economics, purchase decisions, big deals, etc. Librarians are being asked, “Why do we need you?” because they have been disintermediated. They must provide value in a way that is related to information, not gatekeeping and purchasing. Research libraries are beginning to look like scholarly publishers, and many of the author services that publishers are striving to establish are like library services. It will therefore be interesting to see, in ten years, whether we are still two communities.
SSP’s 41st Annual Meeting will take place May 29-31, 2019, at the Marriott Marquis Hotel, San Diego Marina, San Diego, CA.
Donald T. Hawkins is an information industry freelance writer based in Pennsylvania. In addition to blogging and writing about conferences for Against the Grain, he blogs the Computers in Libraries and Internet Librarian conferences for Information Today, Inc. (ITI) and maintains the Conference Calendar on the ITI website (http://www.infotoday.com/calendar.asp). He is the Editor of Personal Archiving: Preserving Our Digital Heritage, (Information Today, 2013) and Co-Editor of Public Knowledge: Access and Benefits (Information Today, 2016). He holds a Ph.D. degree from the University of California, Berkeley and has worked in the online information industry for over 45 years.
 See https://en.wikipedia.org/wiki/Magic_Roundabout_(Swindon)
 See https://www.cell.com/figure360
 See https://en.wikipedia.org/wiki/Robert_Cialdini
 Features of the new website are described at NEJM.org/revitalized, where there is also a link to an editorial discussing its development.
 See author-path.com