
Impact factor or other “better metrics”?

August 8, 2014

In a recent post, I collected the 2013 impact factors (IF) of journals in our field of research (heterogeneous catalysis) and in neighboring fields (chemistry, materials science, etc.), and I discussed the ranking of the journals and its evolution with respect to 2012.

Some colleagues of mine pointed out the flaws of these indicators and argued that today "there are other, better indicators, for example SJR". So I sat back and asked myself whether my use of impact factors was reasonable or not.

My answer, in three points.


1. Impact Factor (Thomson Reuters) vs. SJR (Scopus)

Both metrics are based – roughly speaking – on the number of citations a journal receives, normalized by the number of papers it publishes. At this stage, let's just say that the point is to compare journals in terms of "impact", "prestige" or "audience". The most famous journals attract more citations (per article) and are expected to collect a high IF or SJR score (is it because they publish the best science or because they do excellent marketing? That's another story…).
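To make that "citations over papers" ratio concrete, here is a minimal sketch of the shape of the two-year impact factor. The numbers are invented for illustration; the real calculation has extra subtleties (notably the "citable items" denominator discussed below):

```python
def impact_factor(citations, citable_items):
    """Rough shape of the 2-year IF: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of "citable items"
    the journal published in those same two years."""
    return citations / citable_items

# Invented numbers: 5000 citations to 1000 citable items gives an IF of 5.0.
print(impact_factor(5000, 1000))
```

Note how directly the score depends on the denominator: shrink the count of "citable items" and the IF rises, which is exactly the loophole criticized below.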

I collected the 2013 SJR scores (from here) for journals in my research field (this precision is important) and plotted them as a function of their 2013 IF.

[Figure: SJR (2013) vs. IF (2013) for catalysis journals]

Both metrics rank the journals focused on catalysis in pretty much the same way, and the correlation between the two indicators is statistically excellent. Even if you expand the range of journals in the analysis to also cover the very-high-impact and review-only journals, the correlation remains excellent.
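As a sketch of how such an agreement check can be done, here is a Pearson correlation computed on invented (IF, SJR) pairs. These numbers are placeholders, not the 2013 data shown in the plots:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(var_x * var_y)

# Invented (IF, SJR) pairs for five hypothetical journals.
if_scores  = [2.1, 3.5, 5.7, 6.9, 9.3]
sjr_scores = [0.9, 1.4, 2.2, 2.6, 3.8]

print(pearson(if_scores, sjr_scores))
```

A value close to 1 means the two metrics order the journals almost identically, which is the situation in the plots: knowing one score, you can predict the other well within the same field.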

[Figure: SJR vs. IF (2013), extended range including very-high-impact and review-only journals]

Therefore, it cannot be claimed that SJR is a better metric than IF, as long as the purpose is to assess the "audience" I can expect for my next submissions (which was indeed my purpose).


2. The flaws of the metrics

“Citable items”

IFs are criticized because Thomson Reuters normalizes all the citations received by a journal by the number of "citable items" the journal published. The problem is that the definition of this denominator is obscure and easy to rig, as exposed in this article. Journals can try to take some of their published content out of the denominator in order to get a higher IF. I don't really understand what can be excluded. Editorials? Job offers? Reviewer acknowledgements? As far as I am concerned, as long as they take into account all scientific content (full papers, communications, reviews, etc.) and do it the same way for all journals, that's fine. Nevertheless, it is fair to ask Thomson Reuters to be fully consistent and transparent on that point.

“Include reviews”

Reviews attract more citations. That is very clear, and review-only journals score very high. But people in the field know that! And still, when I prepare a large, comprehensive review, I am happy to know that Chem. Rev. scores higher than Chem. Soc. Rev., for example. So again, IFs are useful to me. However, it is true that "regular" journals can be tempted to publish more reviews alongside their original research papers; that is indeed a way to game the indicator. But they can only do so to a limited extent if they still want to be recognized by the community as regular journals publishing mainly original research articles. Besides, this issue is common to SJR and IF.

“Importance or prestige of the journal”

While the IF uses a relatively basic formula, SJR includes a correcting factor that they call the "importance or prestige of the journal". Now that is something really obscure (to me)! More explanation can be found on their website: "SJR assigns relative scores to all of the sources in a citation network. Its methodology is inspired by the Google PageRank algorithm, in that not all citations are equal. A source transfers its own 'prestige', or status, to another source through the act of citing it. A citation from a source with a relatively high SJR is worth more than a citation from a source with a lower SJR". I don't really like maths and I didn't take the time to understand this parameter further (maybe you can find some useful info in their FAQ). But two points: (i) I prefer a very simple metric with well-known weaknesses over a complicated metric with obscure calculations, and (ii) as shown above, even with this more advanced calculation, SJR gives an equivalent ranking (for journals in the same field).
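For readers curious about what "PageRank-inspired" means in practice, here is a toy prestige calculation on an invented four-journal citation network. This is only the generic PageRank idea, not SJR's actual formula (which adds field normalization and per-paper scaling, among other things):

```python
def prestige(links, damping=0.85, iters=100):
    """links[j] = list of journals that journal j cites. Each journal spreads
    its current score evenly over the journals it cites, so a citation from a
    high-score journal is worth more than one from a low-score journal."""
    nodes = list(links)
    score = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for citing, cited in links.items():
            if cited:
                share = damping * score[citing] / len(cited)
                for target in cited:
                    new[target] += share
        score = new
    return score

# Invented network: everyone cites C, so C ends up with the highest prestige.
toy = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(prestige(toy))
```

The key difference from the raw citation counting behind the IF: here, a citation carries the weight of whoever makes it, so being cited by a prestigious journal boosts your score more than being cited by an obscure one.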


Just a reminder: the IF or SJR of the journal in which you just published your latest article IS NOT an indication of the quality of your work. If papers have to be compared, look at their content or at altmetrics! If scientists have to be put in competition and compared, use other metrics (the h-index, the i10-index, etc.) or, better, a full CV taking into account all the other aspects of the academic life of a scientist (teaching, mentoring, blogging, impact on society, scientific popularization, etc.).


3. The interest of SJR

Apparently, SJR, thanks to its more sophisticated algorithm, is able to rank journals from different fields. Clearly, a very good article in climate science or neurobiology will attract more citations than a very good article in organic chemistry, for example. It is thus impossible to compare articles, journals, scientists, groups or departments on the basis of the IF of the journals in which they publish if they belong to very different fields. SJR is therefore interesting for those who need to compare different fields. That is not my case, so I leave the discussion there.


I want to conclude by saying that no indicator is perfect. It is important to understand their underlying limitations and not to over-interpret them. It should also be said that my view of the IF is very restrictive: one researcher in one relatively well-defined field using the IF for a simple purpose. No doubt librarians and other bibliometrics fans will want to go further. I also acknowledge that my analysis is incomplete, since I only looked at SJR and IF. Most probably, some improved indicators are already on the market somewhere. Please feel free to share.




