ATG Article of the Week: Could this be the start of a new era in scholarly communication?

Jul 12, 2020

Could this be the start of a new era in scholarly communication? is by Rebecca Lawrence, Managing Director of F1000 Research Ltd, and it recently appeared in the F1000 Research Blog. (However, it was first posted on the website Elephant in the Lab.) In this post, Ms. Lawrence “discusses how we need to learn from, and harness, the new approaches employed during the COVID-19 pandemic to shape the ‘new normal’ of scholarly communication.”

“For all the devastating impact that the coronavirus pandemic has had on us all, it has also shown how it is possible to swiftly change previously entrenched cultures and mindsets. Take for example the ability of a large proportion of the working population to now work just as effectively from home, without the usual significant travel and environmental impact caused by commuting and long haul flights to meetings and conferences around the world.

Meanwhile in scholarly research, there has been increasing pressure over the past couple of decades to rethink how we review and disseminate new research: to move away from the traditional journal model that has been the central pillar for much scholarly communication over the past 300+ years, and maximise the opportunities and potential that new technologies and approaches can bring. Some progress has been made but the pace of change has been much slower than many had hoped. Could this pandemic be the trigger that finally enables a wholesale change in the way we conduct, communicate and discuss new research?

Accelerating research and its communication

The research world has shown that it is possible to make progress on drug and vaccine discovery [1] during this pandemic at speeds many multiples faster than previously achieved through the use of infrastructures to support sharing, reuse and collaboration around data; rapid publication and discussion/review tools (as discussed below); and through much greater transparency and accessibility of new research.

We have seen a big upsurge in the use of preprint servers [2] such as bioRxiv [3], medRxiv [4] and others, to rapidly share new insights into coronavirus. Such preprint servers (e.g. arXiv [5]) were originally developed to enable researchers to receive early feedback on articles prior to journal submission and to claim priority on the findings. During the coronavirus outbreak, preprints have been increasingly used as a way to quickly share new research prior to going through peer review so that other researchers in the field can quickly assess the outputs and, where appropriate, start to build on them without the normal delay (often months) awaiting formal journal publication.

Another new approach has been the development of Outbreak Science [6] (funded by Wellcome), a platform to enable researchers (public or anonymous) to provide a structured review of a preprint with the aim of providing initial community triage. A number of publishers and related groups have also come together to look at how they can maximise efficiency in peer review [7] for coronavirus-related research, to minimize direct peer review requests from already overworked coronavirus experts through effective triage prior to peer review, and to reduce re-review between publication venues to bring cost and efficiency gains to the system.

Speed must not come at the expense of trust

Some experiences of rapid publication during the pandemic have, however, highlighted a number of inherent flaws in these processes that simply cannot be ignored. Although most preprint servers include clear warnings that the preprints they host have not been peer reviewed, there have been a few unfortunate cases where lower quality preprints have been used to fuel fake news and to derail public debate [8]. Given the potential impact of this research on health, many preprint servers have now introduced additional checks [9] to help increase trust in the content and to minimize the potential for misuse and misinterpretation of the findings being reported.

Furthermore, most research on preprint servers does not include sharing of the underlying data, code and materials, limiting the ability for full scrutiny (including peer review) of findings to support their reuse. Indeed, the problems with this have been brought into sharp focus with the recent retractions of two COVID-19 papers in the Lancet [10] and NEJM [11], as well as of a further preprint, all of which had started to influence treatment regimes. In each case the lack of access to the underlying data [12] to enable independent verification of the results ultimately led to their significant claims being brought into question. This highlights not only how crucial it is that such underlying data are made available for review, but also that peer review of research needs to be open and transparent, so that it is clear to all what level of expert review has taken place, by whom, and their comments…”

Please click here to continue reading the entire article.
