Academic Publishing: Scientific Journals Need to Open Up - Part 4

The impact factor is a stupid concept. Blatantly stupid. It may have been useful in its original context (as a basis for library purchasing decisions), but its application to academic tenure seems like an overreach.

First, the historical context:

One key concept in Garfield’s 1955 article was what he called the “impact factor,” a measure of influence based on how frequently an article was cited by others. In 1972 he unveiled his first ranking of journals by impact factor, with the Journal of the American Chemical Society coming in first. Such a metric, he mused at the time, might help librarians in deciding which journals to subscribe to and journal editors in looking for “objective and timely measures of their success.”

(Source: “Viewpoint: Covid-19 Shows That Scientific Journals Need to Open Up”)

Now consider this:

A journal’s impact factor is simply an average of the citations that papers published in the previous two years attracted that year. In the majority of journals indexed in 2016, some 70% of papers were cited fewer times than their journal’s average.

Éric Archambault, founder of the bibliometrics firms Science-Metrix and 1science in Montreal, agrees that a few papers can create a “false impression” that a journal is highly cited. “Proper safeguards” should be in place when using the metric to compare journals, so as not to conflate the average with the typical score of articles, he says.
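To make the mean-versus-median point concrete, here is a small sketch with entirely made-up citation counts (the 20-paper journal and its numbers are purely hypothetical). A handful of heavily cited papers drags the average well above what a typical paper in the journal actually receives:

```python
import statistics

# Hypothetical citation counts for 20 papers a journal published over two years.
# A few highly cited papers dominate; most papers collect only a handful of citations.
citations = [0, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 5, 5, 6, 7, 8, 12, 40, 95, 160]

# The impact factor is just the mean: citations received this year by papers from
# the previous two years, divided by the number of citable items.
impact_factor = sum(citations) / len(citations)

median_citations = statistics.median(citations)
below_average = sum(1 for c in citations if c < impact_factor)

print(f"Impact factor (mean): {impact_factor:.1f}")   # ~18.2
print(f"Median citations:     {median_citations}")    # 4.0
print(f"Papers cited below the average: {below_average} of {len(citations)}")  # 17 of 20
```

In this toy example the “impact factor” lands around 18 even though the median paper collects only 4 citations, and 17 of the 20 papers sit below the average. That is exactly the conflation of the average with the typical score that Archambault warns about.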

You can read the entire paper here.

Citation and article metrics are flawed.

Add “social metrics” on top and it becomes an unholy matrix. Twitter, for example, is an “advertising platform”, and I have argued here repeatedly that it is a useless medium for scientific debate and discussion, or even for sharing ideas.

The steady rise of the impact factor

This is explained well by the following quote:

In 1975, the top-scoring journal was the Journal of Experimental Medicine, with a JIF of 11.874. By 2016, CA: A Cancer Journal for Clinicians received the highest score of 187.040. Several factors have contributed to the progressive rise, says Archambault, including an increase in the number of scholarly journals and in references per paper. “But this is a reflection of growth in quantity, not a growth in quality,” he adds.

I call this the "evergreening of the benefits".
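A deliberately crude sketch of the arithmetic behind that point, with entirely made-up numbers: if reference lists get longer while a journal's share of the overall citation pool stays fixed, its average rises mechanically, with no change in quality at all.

```python
# Illustrative-only model: two hypothetical journals each capture a fixed share of
# all references made in the literature. Longer reference lists inflate every
# journal's average citations per paper in lockstep.
def toy_jif(refs_per_paper, papers_in_literature, journal_share, journal_papers):
    """Average citations per paper for a journal receiving a fixed share of all
    references made in the literature (a deliberately crude assumption)."""
    total_citations = papers_in_literature * refs_per_paper
    return (total_citations * journal_share) / journal_papers

for refs in (15, 45):  # hypothetical average reference-list lengths, then vs now
    top = toy_jif(refs, papers_in_literature=1_000_000,
                  journal_share=0.001, journal_papers=200)
    mid = toy_jif(refs, papers_in_literature=1_000_000,
                  journal_share=0.0001, journal_papers=500)
    print(f"refs/paper={refs}: top journal JIF ~ {top:.0f}, mid journal JIF ~ {mid:.0f}")
```

Tripling the hypothetical reference-list length triples both journals' averages, which is the "growth in quantity, not a growth in quality" that Archambault describes.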