
Are Journal Rankings Distorting Science?

Date:
March 16, 2007
Source:
BMJ-British Medical Journal

This week's British Medical Journal raises concerns over whether journal rankings (known as impact factors) are distorting publishing and science.

The impact factor is a measure of the citations to papers in scientific journals. It was developed as a simple measure of quality and has become a proxy for the importance of a journal to its field.
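(The report does not spell out the arithmetic, but under the standard two-year definition a journal's impact factor for a given year is the number of citations that year to papers the journal published in the previous two years, divided by the number of citable items it published in those two years. As a purely illustrative example: a journal that published 250 citable items across 2004 and 2005, and whose papers from those years attracted 1,000 citations during 2006, would have a 2006 impact factor of 1,000 / 250 = 4.0.)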

But a report by the BMJ this week warns that the popularity of this ranking is distorting the fundamental character of journals, forcing them to focus more and more on citations and less on readers.

Concerns include the fact that a bad paper may be heavily cited precisely because its errors are notorious, and that a journal's rank has no bearing on the quality of the individual papers it publishes. More worrying still is the trend towards using impact factors to guide decisions on research funding. This has been particularly noticeable in the UK, where universities now prioritise scientific fields that produce research published in the highest impact factor journals, causing substantial damage to the clinical research base.

In an accompanying article, two researchers discuss whether impact factors should be ditched.

Gareth Williams of Bristol University believes that the academic community should consign the impact factor to the dustbin. He sees the measure as fatally flawed and highly damaging to the academic community.

"The impact factor is a pointless waste of time, energy, and money, and a powerful driver of perverse behaviours in people who should know better," he writes. "It should be killed off, and the sooner the better."

But Richard Hobbs of Birmingham University thinks that rather than simply discarding impact factors, we should consider solutions to the problems. He suggests, for example, extending the citation surveillance period, applying weightings to adjust for the average number of references across journals, or scoring journals on only their most important papers.
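(To illustrate the weighting proposal with made-up numbers: if papers in one field cite an average of 40 references each while papers in another cite an average of 10, citations flow far more freely in the first field. Dividing each journal's citation rate by its field's average reference count would put the two on a common scale, so a raw impact factor of 8 in the first field and of 2 in the second would both adjust to 8/40 = 2/10 = 0.2.)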

It is easy to criticise bibliometrics, he concludes, but we should attempt to refine them, and debate in parallel how to track academic careers and how to encourage fewer, but better, studies that affect the wider community.


Story Source:

Materials provided by BMJ-British Medical Journal. Note: Content may be edited for style and length.


Cite This Page:

BMJ-British Medical Journal. (2007, March 16). Are Journal Rankings Distorting Science? ScienceDaily. Retrieved November 21, 2024 from www.sciencedaily.com/releases/2007/03/070315210038.htm
