
v30 #1 Optimizing Library Services — How Information Scientists Can Help Fix the Broken Peer Review Process

Apr 12, 2018

by Jeremy Horne  (International Institute of Informatics and Systemics)

Column Editors:  Caroline J. Campbell  (Promotions Coordinator, IGI Global) 

and Lindsay Wertman  (Managing Director, IGI Global) www.igi-global.com

The Search For Knowledge Quality Standards

Until the 1970s, a person would go to a building known as a “library,” walk in, and make their way to a cabinet with many drawers.  These drawers contained cards with information about the books and other printed material in that building, all of which were available to the public.  In larger libraries, there would be a main desk providing general information on how to use the library, and a smaller reference desk staffed with individuals able to assist researchers in locating information, in particular “references,” or general sources like atlases, dictionaries, and encyclopedias.  Very few people questioned the integrity of what was given to them or the advice given by the staff.  For students, reliance on that information was critical, as it was foundational to the integrity of the knowledge they were acquiring to proceed with their studies.  It was all about truth, knowledge, and learning.

When computers became more sophisticated, there was a corresponding development in the type and mode of information dispensed.  Card catalogues became computer based, and in colleges and universities the libraries became multimedia centers.  Librarians consequently realized that microfilm, reel-to-reel moving pictures, and records should also be included as knowledge sources.  In the schools themselves, attention was paid to source integrity: it was common to include some critical thinking in high school and college English courses, as part of the initial instruction on assembling a “theme” and then a term paper.  Still, there was little concern about the quality of the source itself, whether librarian, academic journal, or otherwise.  Information repositories thus have a long tradition of being trusted as reliable sources of knowledge.

Today, with the information and technology explosion, detailed discoveries are being vetted across an increasing number of areas of study.  Greater skill is required to winnow out what has relevance and quality; the ability of the average person is being challenged as technology and discovery force more specialization.  Everyone, therefore, has to be versed in systematic methodologies for evaluating and selecting quality sources, especially the librarian.  Libraries in essence are no longer mere repositories of information; the librarian must instead become a scientist, with all the attendant responsibilities of being educated in knowledge quality, hence the appellation “information scientist.”

The Problem, Its Background, and Some Reasons

Information scientists must know how to recognize knowledge quality and how to handle it.  More specifically, what does an information scientist do about the dubious quality emerging in ostensible journals?  Journals that have few, poor, or no peer review criteria, and that often charge authors to publish, are deemed “fake,” “false,” or “predatory,” hence the use of the word “ostensible.”  The “predatory” designation refers to taking advantage of those who need to publish or risk compromising their job status.  Academics as individuals are upset with predatory journals, but publishers are as well (Khosrow-Pour, 2017;  Sorokowski et al, 2017;  Kolata, 2017;  Shen and Björk, 2015;  Bohannon, 2013;  Cook, 2006;  Csiszar, 2016;  Kaplan, 2005;  Preston, 2017;  Retraction Watch, 2017;  Callaos, 2011).  Nature, in particular, has been very pointed, saying that these journals take a severe toll on research efforts in terms of time, money, resources, and even animals (Moher et al, 2017).  Information scientists (the former “librarians”) are directly affected by these predatory practices and have developed efforts to alert others to the problem.  After all, it is they who often are in the middle of conveying information as knowledge in their collections of printed and other media.  In fact, it is their responsibility to help the public ensure that it is of quality, both as object and process.  The reference “librarian” in particular has, in the past, been relied upon as the one who can direct a person to quality sources, i.e., references.  Her/his role is more critical now than ever.

Origins and Rise of Predatory Publishing

While it is not the major purpose here to discuss how and why predatory and fake publishing came about, it still is important to think about these factors.  

  • As more information is produced and distributed than existing journals can accommodate, more and more of it is left out, thus generating a demand for more publications.  
  • More PhDs are being graduated, but there are not enough vacancies to hire them, and the positions that do exist are destructively competitive.  There is not enough support for education generally to create more faculty positions.
  • Journal subscriptions are costly, and libraries cannot afford them.  The response has been to have authors pay for articles — which is a questionable practice to begin with.  The ostensible intent is to have the articles “open source” (available to the public without cost).
  • How does one discern legitimate journals from fake ones just by looking at them?  There may be well-researched articles dispersed among poor quality ones.
  • School quality has been declining, such that not only is half the U.S. population unable to read past the eighth-grade level, but only half of high school graduates meet minimal academic standards (NAAL, 2003, 2005).  In addition, National Science Foundation (2014) data, as well as data from the National Center for Educational Statistics (NCES, 2009), paint an equally dismal picture, with 25 percent of U.S. adults thinking the Sun orbits the Earth.  
  • It may be argued that the “publish or perish” goad is an unsustainable quantity-over-quality ideology.  Where there is no peer review, a veritable minefield of misleading or false research and data is created, contaminating legitimate research, especially when researchers rely upon what is published in the fake journals.  

Each of these points can be debated, and more reasons can be added.  Information scientists do not have direct control over some of these variables, such as the “publish or perish” model, but there are factors they can greatly influence, such as acting in an individual capacity to help improve school quality by counseling students in research methodologies and in knowledge quality.  

Information Scientists’ Solutions

In 2010, in response to predatory publishing, Jeffrey Beall, who holds a Master of Science in Library Science (MSLS) degree and was a librarian at the Auraria Library, University of Colorado Denver (Beall-CV, 2017), started issuing his “list of predatory publishers.”  An extensive amount of controversy and history has surrounded this event and its motivations, but there has been an underlying legitimacy.  Numerous publications were listed in “Beall’s List of potential, possible, or probable predatory scholarly open-access publishers,” but no specific reasons were given for why each was included.  He did, however, publish a general set of guidelines (Beall – criteria, 2017), such as these publications not only accepting payment for articles but — more serious — having little or no peer review.  It seems that it did not take much to land on Beall’s “hit list”;  even a typographical error would do.  The Journal of Financial Education (JFE, 2017) appeared on the list.  In 2016, Timothy Michael, Associate Professor of Finance and Business at the University of Houston at Clear Lake, Texas and the executive director of JFE, erroneously referred to “Dr. Beall” (Beall has no such degree) and requested that the JFE be removed from the list.  Beall refused to do so, as the typo supposedly had not been corrected (JFE-complaint, 2017).  

More and more scholars, while sympathizing with Beall about the problems, began questioning his authority, asking, “Who set you up as judge, jury, and executioner?” and “Who is peer reviewing Beall?”  While his specific reasons are not known, it is reasonable to think that the threat of lawsuits (Flaherty, 2013;  Silver, 2017) and heightened criticism from academics motivated Beall to stop publishing the list.  

The Associate Dean for Scholarly Resources and Collections at Marriott Library, University of Utah, Rick Anderson (Anderson – CV, 2017), ironically holding the same degree and sharing the same concerns about poor quality journals as Beall, set forth guidelines for assessing whether a journal is fake or predatory, such as:

  • Falsely claiming to provide peer review and meaningful editorial oversight of submissions
  • Lying about affiliations with prestigious scholarly/scientific organizations
  • Claiming affiliation with a non-existent organization
  • Naming reputable scholars to editorial boards without their permission (and refusing to remove them)
  • Falsely claiming to have a high Journal Impact Factor
  • Hiding information about APCs until after the author has completed submission
  • Falsely claiming to be included in prestigious indices.  (Anderson – Scholarly, 2017)

What Are the Problems with These Solutions?

Anderson links to a summary critique of Beall (Crawford, 2017), as in “…there’s no stated reason for things to be included.”  He refers to Walt Crawford, another retired librarian (Crawford – background, 2017), who states not only that it is “grotesquely unfair to blacklist a journal or publisher without giving any reason why,” but also that Beall’s original list of 1,604 entries can be pared down to “53 journals and 177 publishers” (Crawford, 2017).  

In Beall’s absence, Cabell’s list of presumably predatory journals has appeared, essentially reflecting Beall’s criteria (Cabell – criteria, 2017).  Besides the critique of who set Cabell’s organization up as arbiter, and of why specific journals were listed, there is the fact that Cabell’s sells the list (Cabell – Charges, 2017).  

Others, like the Directory of Open Access Journals (DOAJ), have their own rules (DOAJ Standards, 2017), but there is no detailed set of criteria for peer reviewing content, i.e., for quality control.  For example, how thoroughly were sources validated?  Do they in fact exist?  What about basic standards, such as dated information, financial interests of the author, or overreliance on one source or even a few?  Still others, like Anderson, have attempted to set forth standards for rating knowledge quality, as Beall apparently tried to do (Anderson – Scholarly, 2017;  IIIS, 2017).  Many others have their own standards and methods of peer review (Peer review methods and standards, 2017).

Unfortunately, because there has been no non-profit organization with agreed criteria and metrics, academics flocked to Beall and then to Cabell’s.  While there may be basic standards for quality, in the end, “Different journals have different aims and individual titles can be seen as ‘brands.’  The editorial position of the journal influences the criteria used to make decisions on whether to publish a paper” (MacGill, 2017).  In essence, a state of anarchy exists in the world of knowledge quality assessment.

What Can Information Scientists Do To Help Solve The Knowledge Quality Problems?

Fake and predatory journals most likely will not disappear as long as the profit motive and factors giving rise to them (such as publish or perish problems) continue.  There are two basic areas in which the information scientist has a responsibility: discerning and promoting knowledge quality and addressing the blacklists.

Aside from critical thinking (2017), information scientists would do well to be grounded in epistemology, the study of justified belief: how do we know that something is true?  Familiarity with the course content taught in philosophy departments is a good start.  Applying epistemology to questionable information often will raise the needed red flags.  Bringing philosophy department faculty into the process will help, as will teaching others how to spot poor quality.

As for blacklists themselves, an information scientist needs to know not only what criteria are being used to judge a publication or article, but exactly how those criteria were applied and why.  Here, one should read the journal first, arrive at a conclusion, and only then see what the blacklist says.  If a journal is thought to be important or worthy, a colleague should participate, reading the article blindly and independently; both should compare notes afterward.  In all cases, one should be conversant with the journal or article’s subject matter, aside from composition, organization, grammar, style, typography, etc., not unlike grading a term paper or thesis.  Then the questions arise: Who created the blacklist?  Was it a dispassionate non-profit effort to identify those who have no regard for the truth?  How were the judgment criteria created and applied?  Was there any interaction between those managing the blacklist and the targeted publications?  Was there any blind reviewing of the journal?  Was there a well-defined procedure to allow the journal to come in line with the blacklist’s standards?
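The vetting sequence described above — read independently, compare notes with a colleague, and only then consult the blacklist — can be sketched in code. This is a minimal illustration by way of example; the question list and the function name are the author's hypothetical constructs, not part of any published standard.

```python
# Questions one should ask of any blacklist before trusting its verdicts.
BLACKLIST_QUESTIONS = [
    "Who created the blacklist, and was it a dispassionate non-profit effort?",
    "How were the judgment criteria created and applied?",
    "Was there interaction between the blacklist's managers and the targeted publication?",
    "Was there any blind reviewing of the journal itself?",
    "Is there a well-defined procedure for the journal to come into compliance?",
]

def vet_blacklist_entry(own_verdict: str, colleague_verdict: str,
                        blacklist_verdict: str) -> str:
    """Read the journal first (independently, with a colleague), then
    compare conclusions with the blacklist rather than trusting it outright."""
    if own_verdict == colleague_verdict == blacklist_verdict:
        return "consensus"           # independent reads agree with the list
    if own_verdict == colleague_verdict:
        return "question-blacklist"  # two blind reads disagree with the list
    return "needs-more-review"       # the readers themselves disagree
```

The point of the sketch is the ordering: the blacklist's verdict is the last input consulted, never the first.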

While it is critical that individual information scientists have the ability to identify quality information, it is perhaps more important that they work with others to promote knowledge quality through the creation of a knowledge quality institute (KQI).  This institute may have activities comparable to those of standards organizations like the Institute of Electrical and Electronics Engineers (IEEE), the American National Standards Institute (ANSI), or the International Organization for Standardization (ISO).

A common core of standards and recommended practices should be developed and applied appropriately to the various disciplines, for all peer reviewers in their respective areas to use.  These would undergo periodic review at meetings, as standards do at other standards organizations.

Analogous to other scientific and standards organizations, the KQI would hold workshops in specialty areas and a “trade fair” component, where representatives of publishers, universities/colleges, research organizations, and other information-vetting outlets present their findings on knowledge quality.  In addition, a journal of knowledge quality would tie together persons interested in this activity and would serve as a paradigm case of excellent peer review, the “gold standard.”  Through this collaboration, librarians can not only assist in creating the KQI, but also identify ethical publishers with which to partner and in whose titles to invest for their collections.

Investing in reputable publishers will diminish the presence of predatory publishers, as many academic publishing houses are in direct competition with these “vampire presses.”

Additionally, librarians can identify the publishers that have a rigorous and transparent peer review process.  For example, IGI Global describes its peer review process:

“As a part of the commitment to maintain the best ethical practices and the highest levels of transparency, the peer review process is the driving force behind all IGI Global publications” (IGI-peer review, 2017).

Meaningful and successful peer review depends upon the competence of the reviewer, the elimination of bias as much as possible, “cross-checking” through multiple reviewers, an “audit trail” of the review process for accountability, and thoroughness.  In addition, the reviewing criteria should be known to all in the review process, as well as to the reader.  IGI Global’s peer review process is conducted transparently through the eEditorial Discovery (EED) online submission system.  Their criteria for the journal process are below:

  • There must be at least 3-5 Editorial Review Board Members in a double-blind peer review process (note that two reviewers are the standard for many publishers);
  • The Editor-in-Chief will blind the manuscript and assign it to 3-5 members of the Editorial Review Board.  Once at least three Editorial Review Board Members have completed their reviews, the Editor(s)-in-Chief will then send the manuscript and its reviews to an Associate Editor for review;
  • Then, once the Associate Editor’s review commentary is received, the Editor(s)-in-Chief will take all commentary into consideration, and make their formal assessment of the manuscript.
  • After the overall assessment by the Editor(s)-in-Chief is complete, the author (if the manuscript is not rejected outright) will have the opportunity to make either major or minor revisions to their work per the reviewer’s commentary.  Once the revisions are received from the author, the Editor(s)-in-Chief will either accept or reject the manuscript outright, send the revised manuscript to the Associate Editor for their additional commentary, or the Editor-in-Chief may request additional revisions from the author before a final decision can be made.
  • Note that the revision process may repeat itself several times to ensure that the author’s work is of the highest quality.
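The journal workflow above can be sketched as a simple state machine. This is an illustrative model only: the class, function, and status names are the author's placeholders, not part of IGI Global's actual eEditorial Discovery system.

```python
from dataclasses import dataclass, field

@dataclass
class Manuscript:
    """Hypothetical model of a submission moving through the workflow above."""
    title: str
    reviews: list = field(default_factory=list)
    status: str = "submitted"

MIN_REVIEWS = 3  # at least 3 of the 3-5 assigned board members must report

def record_review(ms: Manuscript, recommendation: str) -> None:
    """Collect one double-blind review ('accept', 'minor', 'major', or 'reject')."""
    ms.reviews.append(recommendation)
    if len(ms.reviews) >= MIN_REVIEWS:
        ms.status = "with_associate_editor"  # forwarded for additional commentary

def editor_decision(ms: Manuscript) -> str:
    """Editor(s)-in-Chief weigh all commentary; revision rounds may repeat."""
    if ms.status != "with_associate_editor":
        return ms.status  # not enough reviews collected yet
    if "reject" in ms.reviews:
        ms.status = "rejected"
    elif "major" in ms.reviews or "minor" in ms.reviews:
        ms.status = "revisions_requested"  # author revises; process may loop
    else:
        ms.status = "accepted"
    return ms.status
```

The sketch captures the key structural property of the described process: no editorial decision is possible until the minimum number of independent blind reviews has been collected.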

Before the double-blind peer review, each reviewer must submit their curriculum vitae to help ensure that they are not reviewing outside their subject area.  Then, “each review board member is evaluated every six months.” This practice is not as common as one would typically assume in the publishing field.

Additionally, through researching publishers, librarians are able to see which publisher best fits their needs.  Through the collaborative efforts with librarians, IGI Global has created customized solutions through the award-winning InfoSci Database Platform that contain over 4,500 peer-reviewed books and more than 175 quality peer-reviewed journals, allowing unlimited simultaneous users to download full-text in XML and PDF with no digital rights management (DRM).

They, and many other publishers and librarians, are active in “Peer Review Week,” an international campaign to combat unethical publishing practices, and write informative posts and guides highlighting the threat of predatory publishing.  It is no accident that information scientists (“librarians”) and publishers, going beyond the outcry of academics, initiated efforts to track academic journals for their quality and started to establish mechanisms to address peer review issues.  Now a concerted effort by all those concerned about knowledge quality is needed.  This article is such a call to action, following the outline of suggested activities above.  Contact this author, who currently is coordinating an effort to establish the KQI.



(all links accessed 25 December 2017)

AAAS (2017).  Science: Editorial Policies.  Science.  http://www.sciencemag.org/authors/science-editorial-policies#conflict-of-interest  

Anderson – CV (2017).  Rick Anderson.  http://faculty.utah.edu/bytes/curriculumVitae.hml?id

Anderson – Scholarly (2017).  Cabell’s New Predatory Journal Blacklist: A review.  The Scholarly Kitchen.  25 July 2017.  https://scholarlykitchen.sspnet.org/2017/07/25/cabells-new-predatory-journal-blacklist-review/

APA – standards (2017).  Standards of Accreditation for Health Service Psychology.  American Psychological Association.  https://www.apa.org/ed/accreditation/about/policies/standards-of-accreditation.pdf

Beall (2017).  Beall List.  http://beallslist.weebly.com/  

Beall – criteria (2017)  http://beallslist.weebly.com/uploads/3/0/9/5/30958339/criteria-2015.pdf  https://en.wikipedia.org/wiki/Jeffrey_Beall

Beall – CV (2017).  Jeffrey Beall – curriculum vitae.  https://web.archive.org/web/20131102231551/http://people.auraria.edu/Jeffrey_Beall/sites/people.auraria.edu.Jeffrey_Beall/files/cv/Jeffrey-Beall-CV.pdf

Bohannon, J. (2013).  Who is afraid of peer review?  Science, 342(6154), pp. 60-65.  DOI: 10.1126/science.342.6154.60.  http://science.sciencemag.org/content/342/6154/60

Cabell (2017).  Cabell’s Blacklist violations.  Cabell’s Scholarly Analytics. https://cabells.com/blacklist-criteria

Cabell – Charges (2017).  The Journal Blacklist.  https://cabells.com/about-blacklist  https://cabells.com/get-quote

Callaos, N. (2011).  Peer Reviewing: Weaknesses and proposed solutions.  Unpublished paper, available upon request from Jeremy Horne.

Cabell – criteria (2017).  Blacklist criteria. https://cabells.com/blacklist-criteria

Cook, A. (2006).  Is peer review broken?  The Scientist, 1 February 2006.  http://www.the-scientist.com/?articles.view/articleNo/23672/title/Is-Peer-Review-Broken-/

Crawford, W. (2017)  Walt at Random.  The library voice of the radical middle.  https://walt.lishost.org/2016/01/trust-me-the-other-problem-with-87-of-bealls-lists/

Crawford – background (2017).  Walt at Random – About.  https://walt.lishost.org/about/

Critical thinking (2017).  Two excellent sources for critical thinking material:  www.criticalthinking.org and www.criticalthinking.com

Csiszar, A. (2016).  Peer review: Troubled from the start.  Nature, 532(7599), 19 April 2016.  http://www.nature.com/news/peer-review-troubled-from-the-start-1.19763

DOAJ Standards (2017)  Directory of Open Access Journals.  https://doaj.org/application/new#review_process-container

Flaherty, C. (2013).  Librarians and lawyers.  Inside Higher Ed.  15 February 2013.  https://www.insidehighered.com/news/2013/02/15/another-publisher-accuses-librarian-libel

IGI-peer review (2017).  For more information on partnership opportunities please contact: <E-Resources@igi-global.com>.  https://www.igi-global.com/publish/peer-review-process/

IIIS (2017).  Information to Contributors.  Journal of Systemics, Cybernetics and Informatics.  http://www.iiisci.org/Journal/SCI/PeerReviewMeth.asp?var=  Integrating Reviewing Processes.  http://www.iiisci.org/Journal/SCI/IntegRevProcesses.asp?var=

JFE (2017).  Financial Education Association.  http://jfedweb.org/feaboard.html

JFE-complaint (2017).  Journal of Financial Education.  http://jfedweb.org/Beall2016.htm

Kaplan, D. (2005).  How to fix peer review.  The Scientist, 19(1), p. 10, 6 June 2005.

Khosrow-Pour, M. (2017).  “Maintaining the Integrity of Scientific Knowledge Content.”  https://www.youtube.com/watch?v=z93mj0alKEg&feature=youtu.be

Kolata, G. (2017).  Many Academics Are Eager to Publish in Worthless Journals.  New York Times, Science.  30 October 2017.  https://www.nytimes.com/2017/10/30/science/predatory-journals-academics.html?module=WatchingPortal&region=c-column-middle-span-region&pgType=Homepage&action=click&mediaId=thumb_square&state=standard&contentPlacement=5&version=internal&contentCollection=www.nytimes.com&contentId=https%3A%2F%2Fwww.nytimes.com%2F2017%2F10%2F30%2Fscience%2Fpredatory-journals-academics.html&eventName=Watching-article-click

MacGill, M. (2017).  Peer review: What is it and why do we do it?  Medical News Today, 10 August 2017.  https://www.medicalnewstoday.com/articles/281528.php

Moher, D., Shamseer, L., and Cobey, K.  (2017).  Stop this waste of people, animals and money.  Nature, 549: 23-25.

NAAL (2003).  National Assessment of Adult Literacy.  https://nces.ed.gov/naal/kf_demographics.asp  

NAAL (2005)  Key Concepts and Features of the 2003 National Assessment of Adult Literacy;  NCES 2006-471  U.S. Department of Education, Institute of Education Sciences, National Center for Educational Statistics (NCES)  https://nces.ed.gov/NAAL/PDF/2006471_1.PDF

National Science Foundation (2014).  “Science and Engineering Indicators 2014”  Chapter 7 Science and Technology: Public Attitudes and Understanding  P. 7-23

NCES, (2009).  Basic Reading Skills and the Literacy of America’s Least Literate Adults. National Center for Educational Statistics.  https://nces.ed.gov/pubs2009/2009481.pdf  

Peer review methods and standards (2017).  Welcome to the Peer Review Week resources page where you can find How-tos & Tutorials, Best Practices & Guidelines and Research related to peer review.  Peer Review Week, 2017. American Association for the Advancement of Science (AAAS). http://www.pre-val.org/prw/

Peer review week (2017).  https://peerreviewweek.files.wordpress.com/2016/06/prw-press-release-2017.pdf ;  and https://peerreviewweek.wordpress.com/

Preston, A. (2017).  The future of peer review.  Scientific American, 9 August 2017.  https://blogs.scientificamerican.com/observations/the-future-of-peer-review/;  and https://publons.com/home/

Publons (2017).  Publons. https://publons.com/home/

Retraction Watch (2017).  “Weekend reads: A flawed paper makes it into Nature; is science in big trouble?; a reproducibility crisis history.”  Retraction Watch. http://retractionwatch.com/2016/12/10/weekend-reads-flawed-paper-makes-nature-science-big-trouble-reproducibility-crisis-history/

Shen, Cenyu;  Björk, Bo-Christer (2015).  ‘Predatory’ open access: a longitudinal study of article volumes and market characteristics.  BMC Medicine, 13(1): 230.  ISSN 1741-7015.  doi:10.1186/s12916-015-0469-2.  https://bmcmedicine.biomedcentral.com/articles/10.1186/s12916-015-0469-2

Silver, A. (2017).  Controversial website that lists “predatory” publishers shuts down.  Nature, 18 January 2017.  ISSN: 0028-0836 EISSN: 1476-4687  https://www.nature.com/news/controversial-website-that-lists-predatory-publishers-shuts-down-1.21328

Sorokowski, P., Kulczycki, E., Sorokowska, A., and Pisanski, K. (2017).  Predatory journals recruit fake editor.  Nature, 543: 481-483 (23 March 2017).  doi:10.1038/543481a.  http://www.nature.com/news/predatory-journals-recruit-fake-editor-1.21662#fake;  https://www.nature.com/polopoly_fs/1.21662!/menu/main/topColumns/topLeftColumn/pdf/543481a.pdf

Recommended Readings Published by IGI Global:

Al-Suqri, M. N., Al-Kindi, A. K., AlKindi, S. S., & Saleem, N. E. (2018). Promoting Interdisciplinarity in Knowledge Generation and Problem Solving (pp. 1-324). Hershey, PA: IGI Global.  doi:10.4018/978-1-5225-3878-3

Baran, M. L., & Jones, J. E. (2016). Mixed Methods Research for Improved Scientific Study (pp. 1-335). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0007-0

Esposito, A. (2017). Research 2.0 and the Impact of Digital Technologies on Scholarly Inquiry (pp. 1-343). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0830-4

Hsu, J. (2017). International Journal of Ethics in Digital Research and Scholarship (IJEDRS).  doi:10.4018/IJEDRS

Jeyasekar, J. J., & Saravanan, P. (2018). Innovations in Measuring and Evaluating Scientific Information (pp. 1-315). Hershey, PA: IGI Global.  doi:10.4018/978-1-5225-3457-0

Munigal, A. (2017). Scholarly Communication and the Publish or Perish Pressures of Academia (pp. 1-375). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-1697-2

Sibinga, C. T. (2018). Ensuring Research Integrity and the Ethical Management of Data (pp. 1-303). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-2730-5

Storey, V. A., & Hesbol, K. A. (2016). Contemporary Approaches to Dissertation Development and Research Methods (pp. 1-360). Hershey, PA: IGI Global. doi:10.4018/978-1-5225-0445-0

Wang, V. C. (2015). Handbook of Research on Scholarly Publishing and Research Methods (pp. 1-582). Hershey, PA: IGI Global. doi:10.4018/978-1-4666-7409-7

Wang, V. C., & Reio Jr., T. G. (2018). Handbook of Research on Innovative Techniques, Trends, and Analysis for Optimized Research Methods (pp. 1-445). Hershey, PA: IGI Global.  doi:10.4018/978-1-5225-5164-5  


