
Research Impact Metrics

Bibliometrics vs. Altmetrics: Competitors or Complementary?

Bibliometrics are metrics based on works citing other works, and are most often associated with journal articles. Times cited, journal impact factors, and the H-index all derive from citation data. These are the traditional data points, used for a long time, and they can be obtained through Web of Science, Google Scholar, and more recently, Dimensions. In particular, times cited data provides feedback on the scholarly use of one's work. Bibliometrics can also be obtained for conference proceedings, books, and book chapters, though tracking down book citation data can be a challenge.
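To make one of these metrics concrete, here is a minimal sketch that computes an H-index from a list of per-paper citation counts. The counts below are invented purely for illustration; in practice you would pull them from a source such as Web of Science, Google Scholar, or Dimensions.

```python
def h_index(citation_counts):
    """Return the largest h such that h papers each have at least h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical per-paper citation counts for one researcher
papers = [42, 18, 11, 9, 7, 4, 3, 1, 0]
print(h_index(papers))  # 5: five papers have at least 5 citations each
```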

Alternative metrics (altmetrics) are metrics based on engagement beyond citations, usually online engagement. This includes news coverage, social media activity, and other data such as views and downloads. Altmetrics can provide data for outputs that bibliometrics do not cover well, can do so as soon as those outputs are available online, and can reflect a wider audience than the scholars in a narrow field. They can demonstrate engagement by a variety of people, from the general public to news agencies to other researchers and scholars. With altmetrics, you can often see who is talking about your work as well as what they are saying. Altmetrics can be useful for disciplines outside of the hard sciences and for situating impact beyond citation counts. The most prominent altmetrics product is Altmetric, but websites and publishers may also provide additional alternative metric data. Dimensions includes citation data and links to Altmetric.
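Because Altmetric tracks outputs by persistent identifiers such as DOIs, its data can also be retrieved programmatically. The sketch below assumes Altmetric's free public details endpoint (api.altmetric.com/v1/doi/...); the DOI is a placeholder and the specific field names are assumptions that should be checked against the current API documentation.

```python
import requests

def altmetric_summary(doi):
    """Fetch Altmetric details for a DOI and return a small summary dictionary.

    Assumes the free, unauthenticated endpoint at api.altmetric.com/v1/doi/<doi>,
    which returns 404 when Altmetric has tracked no attention for that DOI.
    """
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None  # no online attention tracked for this output
    resp.raise_for_status()
    data = resp.json()
    # Field names are assumptions based on the public API; verify against the docs.
    return {
        "attention_score": data.get("score"),
        "news_mentions": data.get("cited_by_msm_count"),
        "tweets": data.get("cited_by_tweeters_count"),
    }

print(altmetric_summary("10.1000/example-doi"))  # hypothetical DOI
```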

Bibliometrics and altmetrics can be complementary: altmetrics may provide more immediate engagement data, while bibliometrics accrue only after works citing your work are published and indexed.


When Using Bibliometrics and Altmetrics, Consider the Following:

Immediacy - How soon can I get some metrics?
  • Bibliometrics: When the works that cite your work are online and/or indexed.
  • Altmetrics: As soon as the work is online, you can view downloads, shares, likes, and an Altmetric Attention Score if there is one.

Coverage - Which types of sources have these metrics?
  • Bibliometrics: Primarily journal articles, conference papers, books, and book chapters: works that have cited references and are indexed themselves. Other sources, such as data and software/code, are not cited as consistently.
  • Altmetrics: Any online source can have data like shares, likes, and downloads. Resources tracked by Altmetric need a unique identifier such as a DOI.

Disciplinary Differences - What can I compare across different fields? Or in the same field?
  • Bibliometrics: There are disciplinary (and even sub-disciplinary) differences, and comparisons should not cross them; biomedical journal articles will have different citation patterns than books in the humanities and social sciences. Comparisons within the same field should also be approached cautiously: are the researchers in the same sub-discipline, and at similar places in their careers?
  • Altmetrics: Along with the standard disciplinary differences, works in disciplines that get a lot of news coverage (for example, health and medicine, climate change and other environmental topics, internet and technology) may have metrics that reflect that coverage.

Knowledge - How well known are the metrics to others?
  • Bibliometrics: Bibliometrics have been around for quite a while, so times cited and journal impact factors should be familiar, even if many people do not know exactly how the JIF is calculated (a simple worked example follows this table). The H-index is a more recent metric. However, familiarity does not mean these metrics are never misused.
  • Altmetrics: Altmetrics are a more recent tool, and as such may not be as well known. Some may assume altmetrics are based solely on social media activity such as tweets, rather than a broader scope of online engagement.
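For readers curious about the JIF calculation mentioned above, the two-year Journal Impact Factor is essentially citations per citable item over a two-year window. A rough sketch, with invented numbers, looks like this:

```python
# Back-of-the-envelope sketch of a two-year Journal Impact Factor.
# All numbers are invented purely for illustration.
cites_in_2024_to_2022_2023_items = 1500  # citations received in 2024 to 2022-2023 items
citable_items_2022_2023 = 500            # articles and reviews published in 2022-2023
jif_2024 = cites_in_2024_to_2022_2023_items / citable_items_2022_2023
print(f"2024 JIF: {jif_2024:.1f}")       # 2024 JIF: 3.0
```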

Other considerations for both bibliometrics and altmetrics

  • The numbers themselves don't tell you anything about how the source was used, cited, or engaged with. Was the source cited in passing as part of the prior literature, lumped in with 10 other references like this1-10, or was there a deeper discussion or engagement? Was the online attention a dozen tweets, or was it covered in more depth in a source like the New York Times?
  • Attention can be positive or negative. Again, the numbers won't reveal that.
  • Metrics can be misused. 
  • Metrics can be biased. Works in some disciplines and types of sources (biomedical journal articles, for example) are cited more heavily than others. Review articles are heavily cited, leading to very high impact factors for some review journals. There may also be other factors at play, including gender biases.
  • Metrics can be gamed. Clarivate monitors citation patterns in Web of Science to look for journals whose publishers or editors may be trying to boost their impact factors. Altmetric weights attention sources differently, so news coverage counts for more than tweets, which makes scores harder to inflate through social media activity alone.
  • There may be a relationship between high altmetric attention and subsequent citations in the literature, but, like so many of these factors, this can depend largely on the discipline and type of source.