In the complex matter of assessing research and researchers, simple metrics have their place. For example, when used advisedly in properly limited and matched comparisons, the h-index – the largest number h such that an author has published h papers each cited at least h times, a single figure combining productivity and citation impact – can identify influential researchers in a precisely defined field. Similarly, the Journal Impact Factor, recorded annually in Journal Citation Reports from Clarivate Analytics, provides a marker of journal usage and influence that can be useful to publishers and librarians.
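The h-index definition above is simple enough to state in a few lines of code. The sketch below is a minimal illustration of that definition only, using made-up citation counts, not an implementation of any Clarivate product.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    # With counts in descending order, the condition counts[i-1] >= i
    # holds for exactly the first h positions.
    return sum(1 for i, c in enumerate(counts, start=1) if c >= i)

# Hypothetical author with five papers:
print(h_index([10, 8, 5, 4, 3]))  # 4 papers have at least 4 citations
```

Note how two very different publication records can share one h-index score, which is precisely the loss of information the report argues against.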
The problem with simple metrics, however, is that they are prone to being misused or mistakenly invested with excessive significance, to the exclusion of more complete, nuanced, and meaningful data. Compared to single-point metrics, the preferable alternative is a multiplicity of complementary data showing trends and other specifics – ideally, represented in a lucid visualization.
This advocacy for the use of more complete data – “profiles, not metrics” – is the thesis and title of the inaugural Global Research Report just released by the Institute for Scientific Information (ISI). Recently reformed within Clarivate Analytics, and carrying on the name originally given to the company under its founder, information-science pioneer Eugene Garfield, ISI serves as the base for knowledge, education, and thought leadership within the Web of Science Group at Clarivate Analytics.
On this occasion, the Global Research Reports themselves also mark a relaunch. Beginning in 2009, the reports presented detailed examinations of research activity, collaboration, and impact in various nations and regions. Under the new auspices of ISI, subsequent reports in 2019 will profile research in the G20 nations and discuss methodological refinements for analyzing researchers, journals, and subject areas, among other topics.
For now, the new report, co-written by ISI director Jonathan Adams and ISI analysts Marie McVeigh, David Pendlebury and Martin Szomszor, makes the case for metrics that go deeper than a single data point. For example, in lieu of a single h-index score for an author, analysis is better served by a beam plot. This visualization incorporates far more information and context, including normalized citation data for each of the author’s papers.
Specifically, each paper’s citation count is compared against the average for its specific journal and year of publication. These values are converted to percentile scores, and the author’s overall average percentile is drawn as a vertical benchmark line spanning the years covered. Thus, the author’s impact relative to the benchmark can be visually tracked over multiple years.
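The normalization step described above can be sketched as follows. This is a simplified illustration with invented citation counts and reference distributions, and a midpoint convention for ties; the report’s actual beam-plot methodology may differ in details.

```python
from statistics import mean

def citation_percentile(count, reference_counts):
    """Percentile of `count` within a reference distribution of
    citation counts for papers from the same journal and year."""
    below = sum(1 for c in reference_counts if c < count)
    equal = sum(1 for c in reference_counts if c == count)
    # Midpoint convention: tied papers contribute half their weight.
    return 100.0 * (below + 0.5 * equal) / len(reference_counts)

# Hypothetical author record: (year, citation count, citation counts of
# comparison papers from the same journal and publication year).
papers = [
    (2015, 12, [0, 2, 3, 5, 8, 12, 20, 31]),
    (2016, 4,  [0, 1, 2, 4, 6, 9, 15]),
    (2017, 25, [1, 2, 5, 7, 10, 14, 25, 40, 60]),
]

percentiles = [(year, citation_percentile(c, ref)) for year, c, ref in papers]
# The vertical benchmark line of the beam plot: the author's mean percentile.
benchmark = mean(p for _, p in percentiles)

for year, p in percentiles:
    print(f"{year}: {p:.1f}th percentile")
print(f"benchmark: {benchmark:.1f}")
```

Plotting each (year, percentile) point with a vertical line at the benchmark value yields the beam-plot view: individual papers remain visible, rather than being collapsed into one number.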
Similarly, the report presents enhanced metrics for assessing not only individuals but journals, research institutes, and universities. Each method surpasses simple “league table” rankings and other products of single-point metrics, thereby better reflecting the complexity and variability of the research enterprise.
To learn more about these metrics, please download the new Global Research Report from ISI.