The appeal of simple metrics isn't going away any time soon. This piece covers a number of metrics, including the Impact Factor, Altmetrics and Scopus data, among others. It looks at the ways these metrics benefit the West to the detriment of other regions, especially developing regions, and the impact this can have on researchers and on research more broadly. With pieces from:
- Xin Bi – DOAJ Ambassador for China
- Ivonne Lujano – DOAJ Ambassador for Latin America
Dr. Xin Bi is currently the Director of Library and Museum and Director of University Marketing and Communications at Xi'an Jiaotong-Liverpool University. Since May 2016, he has been a DOAJ (Directory of Open Access Journals) editor and Ambassador for China, helping to improve the quality of open access journals and raise awareness of best practice in academic publishing in China and East Asia.
Publish or perish is a common theme in academic circles worldwide, and this is no different in China. With a booming economy, the Chinese government has significantly invested in scientific research, which has boosted research outputs.
Statistics compiled by the US National Science Foundation (NSF) last year showed that China has overtaken the United States in the total number of science publications. This significant shift has attracted analysis from many different fields.
Quality of academic publishing
The Chinese government takes the quality of academic journals published in the country very seriously. So-called predatory journals are rarely seen in China, and the 5,000-plus academic journals now published are generally owned by government bodies, state-owned publishers, universities or institutes, and are mostly published in Chinese. Though most of these journals have rather small editorial offices, their quality is generally assured.
With ever increasing research collaborations between China and other countries, the number of research articles published by Chinese scholars in English has also increased dramatically in recent years.
Metrics as evaluation criteria for researchers
Metrics are commonly used by Chinese universities, research institutes and funding bodies. The Impact Factor (IF), a measure of the yearly average number of citations to recent articles published in a given journal, is heavily used in China (as in other countries) to evaluate researchers, so researchers are keen to publish in journals with a high Impact Factor. Because the IF was developed in the West and focuses on journals published in English, this incentivises many Chinese researchers to publish their best research in international journals.
Web of Science and Scopus are both scientific citation databases that index thousands of journals across many subject areas. Web of Science contains sub-collections covering different subjects: the SCI (Science Citation Index), the SSCI (Social Sciences Citation Index) and the A&HCI (Arts & Humanities Citation Index). Publishing in a journal listed in the SCI, SSCI, A&HCI or Scopus has been adopted by universities and funding bodies as a measure of research quality, and even as a criterion for postgraduate students to qualify for graduation.
In China, researchers in science and technology are increasingly keen to publish in international journals, while in the social sciences and humanities, partly due to limited English-language proficiency, researchers publish in local journals in Chinese. The CSSCI (Chinese Social Sciences Citation Index) was therefore established, and it too serves as a metric for judging the work of researchers and Ph.D. students.
Similarly, although Altmetrics take into account more than just citations, their data sources mainly come from English-language publications, so they too encourage local researchers to publish in international journals.
In conclusion, metrics make it easy to evaluate researchers for promotion and funding, among many other things, and they may seem to provide fair criteria for everyone. Examined in depth, however, the practice has severely affected local research in China, as the best research outcomes are often published in English. Frequently, these papers cannot be accessed and used by a wide range of readers in China, and this in turn holds back the growth of China's local journals.
Ivonne Lujano-Vilchis holds a Master's in Social Sciences with an emphasis in Education from FLACSO-Argentina. She teaches at the Universidad Autónoma del Estado de México (UAEMex) and is the DOAJ (Directory of Open Access Journals) Ambassador for Latin America.
Traditional metrics in scholarly communication (the Impact Factor, the SCImago Journal Rank indicator) are not neutral; they reflect the political, economic and cultural conditions that have historically built an unequal structure in which many forms of discrimination still operate. For that reason, scholars from regions of the so-called Global South have identified this as a social justice problem to be solved.
Both of these indicators were created by private companies to measure citations, which gradually became associated with the impact of research. The criteria these metrics are based on include requirements that only some journals can meet: topics of publication established by top institutions, specific types of contributions (mainly research articles), and language of publication (mainly English), among others. These metrics allow us to make inferences about how academic communities are hierarchically structured and how prestige is attributed. In that sense, many research outputs, disciplines and languages are excluded.
Thus, these metrics are embedded in a geopolitical context characterized by forms of structural exclusion and inequality in research and education systems that have prevailed worldwide for many decades. Furthermore, these metrics are controversial when used to determine a journal's quality, because being cited is not necessarily synonymous with high quality or best editorial practices. The problem is that in many countries, governments and institutions use these metrics uncritically to design research assessment policies and to make decisions on promotion, tenure, resource allocation, science and technology policy agendas, etc.
This is the case in some Latin American countries, a region that has paradoxically been underrepresented in those rankings. In recent years, there have been changes to journal assessment policies in countries like Mexico and Colombia, two of the countries with the largest research systems in the region. Those changes adopt the IF as the main measure of journal quality; in fact, in both countries journals are now classified and ranked according to Web of Science quartiles. In Colombia, for instance, 583 journals were evaluated under this criterion. The result is dramatic: 339 journals were rejected and 280 downgraded (only two Colombian journals are in Q3, and 15 in Q4, of the Science Citation Index and Social Sciences Citation Index). The new national index (Publindex) now includes only 244 journals.
For researchers, this represents a setback for scholarly communication in the region, because for more than 20 years very important initiatives have been developed in Latin America to encourage local publications to disseminate research outputs that meet international publishing standards and to make the region's science visible. These initiatives are Latindex, SciELO and Redalyc. First, Latindex is a referral service launched in 1997 that provides information on journals from Latin America, the Caribbean, Spain and Portugal. It is hosted and supported by the National Autonomous University of Mexico (UNAM). Second, the Scientific Electronic Library Online (SciELO) is a network that provides an indexing methodology and infrastructure to support journals from 15 countries in Latin America. SciELO has its own metrics, including indicators of citations and co-authorship. Finally, Redalyc is an information system created and supported by the Autonomous University of the State of Mexico (UAEMex). It is centered on Social Sciences and Humanities journals and has also created its own metrics.
However, policy makers prefer 'international' metrics and databases to legitimize and evaluate journals. Official documents refer to internationalization as a priority and generally define it as synonymous with being indexed in WoS or Scopus. Other databases, like the three Latin American initiatives cited above or the Directory of Open Access Journals (DOAJ), which provides open access to high-quality journals from more than 125 countries, are simply considered secondary or less important. Although the DOAJ does not provide journal metrics, it collects data that could be relevant for policy design. For instance, the DOAJ provides information on types of peer review, licensing, APCs and the formats available for full-text articles. These elements are indicators of quality and best editorial practices, which are ultimately fundamental for the dissemination of knowledge.
References

Vasen, F., & Lujano Vilchis, I. (2017). Sistemas nacionales de clasificación de revistas científicas en América Latina: tendencias recientes e implicaciones para la evaluación académica en ciencias sociales. Revista Mexicana de Ciencias Políticas y Sociales, 62(231). ISSN 2448-492X. Available at: http://www.revistas.unam.mx/index.php/rmcpys/article/view/58652. doi:10.1016/S0185-1918(17)30043-0

HermesIS.com (2018, April 4). La recomposición del sistema de evaluación de revistas en latinoamérica: más de 300 revistas excluye Publindex de la evaluación académica en Colombia. Revista Nativa Digital. Available at: http://www.hermes-is.com/blog/articulo/la-recomposicion-del-sistema-de-evaluacion-de-revistas-en-latinoamerica-mas-de-300-revistas-excluye-publindex-de-la-evaluacion-academica-en-colombia/