A recent post introduced a survey examining mentions of Clarivate Analytics Web of Science in the scientific literature. The papers culled for the survey – numbering nearly 20,000 – primarily derived from the fields of scientometrics or bibliometrics. That is, the selected papers examined, in some aspect, the dimensions and dynamics of research itself.
Here, we present one particular example of a bibliometric study harnessing the Web of Science – an investigation into the evolving methods for assessing impact.
The headlong rush of digital technology, particularly in the last decade or so, has accelerated the transformation of science and scholarship, dramatically expanding the ways in which research is communicated, archived, and assessed. A key development in this progression has been the rise of new evaluative measures known collectively as altmetrics. Reflecting the ubiquity and reach of today’s social media, the altmetrics movement seeks new means of quantifying the impact of research – means that go beyond such traditional measures as citation analysis.
For research analysts, and the institutional decision makers who depend on their findings, these new tools present an important question: How closely, and how usefully, do altmetrics and the older, “classical” measures correspond?
Using data from Clarivate Analytics Web of Science, a trio of analysts set out to examine this question. The team recently reported the results in the journal Scientometrics.
The Web of Science is an apt test bed for this inquiry. In addition to an authoritative store of publication and citation data extending back more than a century, the Web of Science now offers metrics that address the newer dynamics of research impact and influence.
Specifically, each record in the Web of Science Core Collection, in addition to featuring citation data for the given source item, also includes “Usage” counts. These measures, updated daily, record instances in which an article has been singled out for extra attention by a user. Such indications include a user clicking the available Web of Science links for the optimal, legitimate full-text version of the article, or taking steps to save the item in a bibliographic management tool.
For each article, the Web of Science Core Collection monitors and reports two usage totals: a moving window covering the previous 180 days, and a cumulative measure reflecting usage activity since 2013.
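The two totals described above amount to counting events in two different time ranges. The sketch below tallies a list of hypothetical usage-event dates into a 180-day moving window and a cumulative count from 2013 onward; the event data and the function name are illustrative only, not the Web of Science's internal logic:

```python
from datetime import date, timedelta

def usage_totals(event_dates, today):
    """Tally usage events into the two windows described above:
    a 180-day moving window and a cumulative count since 2013."""
    window_start = today - timedelta(days=180)
    last_180 = sum(1 for d in event_dates if window_start <= d <= today)
    since_2013 = sum(1 for d in event_dates if d >= date(2013, 1, 1))
    return last_180, since_2013

# Invented events for one article: one old, two recent.
events = [date(2014, 5, 1), date(2017, 1, 10), date(2017, 3, 2)]
print(usage_totals(events, today=date(2017, 4, 1)))  # (2, 3)
```

As of the chosen reference date, only the two 2017 events fall inside the 180-day window, while all three count toward the cumulative "since 2013" total.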
For their Scientometrics report, the trio of Moscow-based analysts used the Web of Science to examine a selection of Russian research (that is, papers whose author lists include at least one Russia-based affiliation). In all, upwards of 37,000 articles from the year 2015, representing more than 200 Web of Science subject categories, were extracted and analyzed.
The authors amassed a range of data on these articles and their respective fields, tracking usage for both the “last 180 days” and “since 2013” measures. The overall share of these articles, out of Russia’s total research output during 2015, was calculated. Meanwhile, to capture and compare the “classical” metrics, the authors also tracked citation figures, including a mean-citation calculation, along with data incorporating journal impact and expected citation rates for different fields.
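Two of the simpler quantities mentioned above – a selection's share of total output, and its mean citation rate – reduce to straightforward arithmetic. A minimal sketch, using invented placeholder counts rather than figures from the study:

```python
def mean_citations(citation_counts):
    # Arithmetic mean of citations across a set of articles.
    return sum(citation_counts) / len(citation_counts)

def share_of_output(selected, total):
    # Percentage of a country's total output captured by a selection.
    return 100.0 * selected / total

# Placeholder values for illustration only.
print(mean_citations([4, 0, 9, 2]))       # 3.75
print(share_of_output(37000, 45000))      # roughly 82.2
```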
In their conclusion, the researchers note that a relatively high percentage of papers featuring Russia-based authors registered usage activity – more than 60% of total research output (RO) for the “last 180 days” measure, and more than 80% for the “since 2013” metric. The survey also found that these percentages, of both total RO and the two usage measures, varied across individual Web of Science subject categories: microbiology and condensed matter physics, for example, showed higher shares, while mathematics and medicine showed the lowest.
In all, the authors observed a significant correlation between usage metrics and citation figures at the article level in their slice of Russia’s output, while the correlation between usage and journal-impact measures proved weaker.
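Article-level comparisons of this kind are often computed as rank correlations, since usage and citation counts are heavily skewed. The sketch below implements Spearman's rank correlation in plain Python over invented counts; both the data and the choice of Spearman here are assumptions for illustration, and the paper's exact method may differ:

```python
def rank(values):
    # Assign ranks (1 = smallest), averaging ranks over ties.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    # Standard Pearson correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def spearman(x, y):
    # Spearman correlation = Pearson correlation of the ranks.
    return pearson(rank(x), rank(y))

# Invented per-article counts: usage and citations, same five articles.
usage = [55, 12, 80, 30, 5]
citations = [9, 1, 14, 4, 0]
print(round(spearman(usage, citations), 3))  # 1.0: sample ranks match exactly
```

A coefficient near 1 would indicate that articles viewed or saved more often also tend to be cited more often, which is the article-level pattern the authors report.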
The authors’ ultimate conclusions, and their plans for further comparative study, indicate that the intersection of altmetrics and citation analysis remains fertile ground for investigation.
To read the complete Scientometrics report, please click here.