Scientists in the US, the UK and other global research powers enjoy higher citations for their publications than scholars in less developed countries doing broadly similar research, a new study has argued.
Based on analysis of the text and citations given to nearly 20 million scientific papers across 150 fields, researchers conclude that those from countries with more established scientific communities – including the US, UK and China – are “overcited” compared with those from other nations, even if they cover similar subject matter.
The study tracked publications over more than three decades, between 1980 and 2012, and found that the gap between “over- and under-cited countries grew over time”.
The study, conducted by researchers from CUNY Queens College, Stanford University and the University of California, Los Angeles, used a measure designed to capture the difference between the number of citations a paper would be expected to receive based on its text and the number it actually received.
Based on their algorithm, they found that “citational bias” may be distorting the flow of information because “research papers from some countries are systematically overlooked in scientific research”.
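The general shape of such a measure can be illustrated in a few lines. This is a hypothetical sketch, not the researchers' actual algorithm: it assumes each paper already has an "expected" citation count predicted from its text, and averages the gap between actual and expected citations by country, with a positive average suggesting overcitation and a negative one undercitation.

```python
# Hypothetical illustration of a "citational bias" measure.
# Assumption: each paper record carries an 'expected' citation count,
# predicted elsewhere from the paper's text (the prediction step is
# not shown and is not the study's actual method).
from collections import defaultdict

def citational_bias(papers):
    """Average (actual - expected) citation gap per country.

    papers: iterable of dicts with 'country', 'actual', 'expected' keys.
    Returns {country: mean gap}; positive means overcited on average.
    """
    totals = defaultdict(lambda: [0.0, 0])  # country -> [gap sum, paper count]
    for p in papers:
        gap = p["actual"] - p["expected"]
        totals[p["country"]][0] += gap
        totals[p["country"]][1] += 1
    return {c: s / n for c, (s, n) in totals.items()}

# Toy data (invented numbers, for illustration only)
papers = [
    {"country": "US", "actual": 50, "expected": 40},
    {"country": "US", "actual": 30, "expected": 25},
    {"country": "Brazil", "actual": 10, "expected": 18},
]
print(citational_bias(papers))  # US positive (overcited), Brazil negative
```

In this toy run the US papers average 7.5 more citations than the text-based expectation while the Brazilian paper falls 8 short, mirroring the over- and under-citation pattern the study describes.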
The US is described as “highly overcited”, while other countries such as Germany, Switzerland and the Netherlands are also classed as “overcited”, though not to the same degree as America, according to the paper published on 30 May.
Brazil and Mexico are among those nations losing out from systematic bias that “impede[s] global scientific growth”, the paper claims.
Recognising this bias was important because “knowledge production is [already] overwhelmingly skewed towards resource-wealthy countries such as the United States and those in western Europe and east Asia (to name a few) that house the best universities, Nobel winners and journal editors”.
Without examining this kind of bias through what the authors term “citational lensing”, it was difficult to “track the effectiveness of national science policies” or evaluate “the relative importance of various national factors in nurturing a highly cited scientific community”, given that research bibliometrics might not tell the whole story.
“Better identifying who is undercited not only promotes the inclusion of often excluded perspectives but also enhances knowledge production,” the authors conclude.