Research metrics in practice

The nature of analysis is changing, and there are more metrics available than ever before. Many librarians and research professionals are seeing their roles shift, increasingly requiring them to deliver recommendations and reports that inform strategic decision making at their organizations. Metrics can provide powerful insight into research trends, but understanding which indicators to use and how to most appropriately apply and interpret them is critical to ensuring an accurate assessment.

Clarivate Analytics solutions promote the use of responsible metrics in research evaluation for researchers, administrators, and policy makers, while also supporting the Leiden Manifesto* initiative that calls for 10 principles to guide research evaluation.

USE QUANTITATIVE EVALUATION TO SUPPORT EXPERT EVALUATION

Institutions can no longer rely solely on peer assessment and past reputation; they must be able to quantifiably account for current performance. The objective nature of metrics serves as an effective complement to the variety of qualitative and quantitative measures already at hand for assessment of institutional performance. Yet it’s still important to remember: Don’t rely on numbers alone. Quantitative data should support qualitative expert assessment.

MEASURE PERFORMANCE IN ACCORDANCE WITH THE RESEARCH MISSION OF THE INSTITUTION

One key to obtaining useful insights into research management from an evaluation framework is to pose the right questions at the outset. For example, what are the precise objectives of conducting the evaluation? What information and specific data points need to be collected, and how? What are the quantifiable benchmarks or achievements, both in the short and long term, that will enable conclusions regarding whether the evaluation was successful in meeting the specified goals?

An evaluation framework might be applied to answer a range of situations and needs, such as advocating on behalf of increased support for research, addressing accountability in showing that resources have been used efficiently, deepening the understanding of ways in which research is effective, or determining where and how to allocate funds in the future. No single evaluation model applies to all contexts. That’s why the Web of Science and InCites Benchmark & Analytics allow users to apply data to a variety of contexts, e.g., institutional and regional.

PROTECT EXCELLENCE IN LOCALLY RELEVANT RESEARCH

Metrics based on high-quality, non-English literature serve to identify and reward excellence in locally relevant research.

The Web of Science (covering 42 languages) continuously expands its coverage with regional content in its regional citation indices.

KEEP DATA COLLECTION AND ANALYTICAL PROCESSES OPEN, TRANSPARENT & SIMPLE

The Web of Science is built on strict content-selection criteria. Our editorial team works full-time on evaluations and collection management and has done so for many decades, with no other professional commitments or conflicts of interest. InCites metrics are simple, and it is easy to drill down to the article level.

ALLOW THOSE EVALUATED TO VERIFY DATA AND ANALYSIS

Tools such as Converis and Web of Science Profiles allow the use of citation metrics based on content that can be verified by both the assessors and the assessed.

ACCOUNT FOR VARIATION BY FIELD IN PUBLICATION AND CITATION PRACTICES

Keep in mind that different academic fields have different patterns of citation activity. Therefore, what can be considered a high score in one subject category might differ from what may be a high score in another category. We produce normalized metrics to account for varying citation patterns across subjects, time and document types.
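As a sketch of how such normalization is commonly done (the baseline figures and function below are illustrative assumptions, not the proprietary InCites implementation), a normalized citation impact divides a document's actual citations by the average citations received by comparable documents of the same subject category, publication year, and document type:

```python
def normalized_citation_impact(citations: int, expected_citations: float) -> float:
    """Ratio of actual citations to a field/year/document-type baseline.

    A value of 1.0 means the document performs at the average for
    comparable documents; 2.0 means twice that average. The baseline
    (expected_citations) is assumed to come from an external dataset.
    """
    if expected_citations <= 0:
        raise ValueError("baseline must be positive")
    return citations / expected_citations

# Hypothetical example: an article with 24 citations, measured against
# an assumed baseline of 12 citations for its field, year and type.
print(normalized_citation_impact(24, 12))  # 2.0
```

Because the baseline is specific to the field, year, and document type, the same raw citation count can yield very different normalized scores in, say, mathematics versus molecular biology.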

ASSESS INDIVIDUAL RESEARCHERS USING QUALITATIVE JUDGMENT OF THEIR PORTFOLIO

A CRIS system like Converis or a research management solution like Web of Science Profiles allows an individual’s expertise, experience, activities and influence to be considered holistically on the basis of a portfolio of research-related activities.

AVOID MISPLACED CONCRETENESS AND FALSE PRECISION

Citation data are not meant to replace informed peer review, and Clarivate Analytics recommends paying careful attention to the many conditions that can influence citations. These include language, journal history, format, publication schedule and subject specialty. In the Journal Citation Reports, we provide a range of metrics that offer more comprehensive perspectives. However, we advise against extrapolating their artificial precision into the real world.

RECOGNIZE SYSTEMIC EFFECTS OF ASSESSMENT AND INDICATORS

We support a broad framework of contextual citation-based indicators that provide a wide variety of angles for assessing research performance, and thus reduce the potential for gaming any single measure.

SCRUTINIZE INDICATORS REGULARLY AND UPDATE THEM

We constantly strive to be responsive to new needs for metrics as the research ecosystem evolves. Clarivate Analytics products and services address the concerns and needs voiced by the research community by delivering high-quality data designed for accurate measurement, benchmarking, and customized analysis.

*Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I., "The Leiden Manifesto for research metrics," Nature, 520 (7548): 429-431, 2015. Also, see video: https://vimeo.com/133683418