Evaluating the societal impact of research: Insights from the global community

A global survey of research office staff and researchers reveals how research offices worldwide are prioritizing and measuring the societal impact of research.

Understanding and demonstrating the societal impact of research is now a strategic priority for universities and research institutions worldwide. Research Professional News* recently conducted a global survey – Research Offices of the Future 2025 – of more than 1,100 research office staff and 1,400 researchers. It explores how research offices are evaluating societal impact, which research outputs they prioritize, and the key challenges institutions face when measuring real‑world influence at scale.

Key findings from these responses include significant regional differences in the main drivers of societal impact evaluation, the continued dominance of traditional academic publications as evidence of impact globally, and the identification of policy citations and media mentions as the most valued proxy measures.

Why institutions are evaluating societal impact

Overall, research office staff identified public accountability as the primary reason for evaluating the societal impact of research (67%), followed by funding requirements (53%) and strategic differentiation (50%). Respondents from the U.K. and Australia/New Zealand broadly agreed with this overall pattern.

However, in Europe**, funding requirements were prioritized as the primary reason (61% of respondents). Interestingly, strategic differentiation emerged as the leading motivation in the Middle East (67%) and surpassed funding requirements to become the second most common reason in North America, Asia, and Africa.

These regional differences suggest that institutions use societal impact evaluation for varying purposes. A recent report by the Institute for Scientific Information further explores how data can be used to suit these and other evaluation goals.

Which research outputs matter in evaluation

Research can take the form of both traditional outputs – such as academic publications – and non-traditional outputs, including patents and patent applications, datasets, policy documents, and professional publications (e.g. in magazines or online media). In some disciplines, particularly in the Arts and Humanities, research is also delivered through activities such as performances and exhibitions.

The table below provides examples of Clarivate data sources where relevant research-related outputs can be identified.

 

Output – Example data source
Academic publications – Web of Science Core Collection, Research Commons, Preprint Citation Index
Patents and patent applications – Derwent Innovations Index on the Web of Science platform
Policy documents – Policy Citation Index on the Web of Science platform
Datasets and software code – Data Citation Index on the Web of Science platform
Medical guidelines – MEDLINE on the Web of Science platform
Dissertations and theses – ProQuest Dissertations & Theses Citation Index on the Web of Science platform
Clinical trials – Cortellis Clinical Trials Intelligence
Professional publications – ProQuest databases
Learning objects (if created as an outcome of research) – ProQuest databases
Visual arts, performances, designs and architectural plans – ProQuest databases

 

The Research Offices of the Future survey asked research offices about their most important institutional outputs and activities when evaluating the societal impact of research. Their responses are summarized in Figure 1.


Figure 1. Most important outputs and activities in the evaluation of the societal impact of research: research office perspective, share of respondents by country/region. Source: Research Offices of the Future 2025

 

Many national research assessment exercises provide useful illustrations of the use of traditional vs non-traditional outputs. Impact case studies submitted by institutions must reference the research outputs “underpinning” real-world impact. For example, in the U.K.’s Research Excellence Framework 2021, only 2.4% of all submitted outputs were non-traditional, although an ambitious target of up to 5% has been set for REF 2029.

U.K. and Australia/New Zealand research offices clearly highlighted this need in their responses to Research Professional News, placing policy documents at the top of the list of institutional outputs most important for evaluating societal impact of research. Australia/New Zealand also valued patents as highly as academic publications.

In contrast, traditional scholarly publications remain the dominant signal of societal impact elsewhere. They are prioritized by the majority of research offices in North America (74%), Africa (70%), Asia (66%), and Europe (65%). In several of these regions — notably North America and Europe — patents are viewed as more important than policy documents, reinforcing the role of applied and market-facing outcomes.

A diverse mix of non-traditional outputs contributes to impact evaluation globally, including professional publications, clinical trials, datasets, software, and learning objects in some regions.

This divergence among respondents about which output types matter most means that institutions need flexible, context-aware solutions for impact assessment that reflect how research delivers value in different systems, sectors, and regions.

How research offices are measuring societal impact

The real-world impact of research is difficult to measure directly at scale – e.g. in terms of lives saved, wellbeing improved, or costs reduced. However, the research outputs and activities generated by institutions can provide signals (proxies) of societal impact.

In this survey, research office staff and researchers were asked about the importance and difficulty of measuring such proxies, covering both forward-looking measures – such as relevance of research to UN Sustainable Development Goals (SDGs) or page views/readership statistics – and retrospective measures including non-academic citations or media mentions. The responses from research offices are summarized in interactive Figure 2.


Figure 2. Proxy measures in the evaluation of the societal impact of research: importance today (x-axis), importance in the future (y-axis), and difficulty to measure (bubble size); research office perspective, share of respondents by country/region. Source: Research Offices of the Future 2025

 

  • Policy citations and media mentions: Important but hard to measure

Research office staff identified two proxy measures as currently the most important: citations from policy documents and media mentions (approximately 41% each). Looking five years ahead, citations from policy documents were considered more important than media mentions (48% vs 37% respectively). This longer-term prioritization of policy citations as a key proxy of societal impact was largely consistent across all regions, except for North America and the Middle East, where media mentions were ranked higher.

At the same time, research office staff ranked citations from policy documents as the second hardest proxy to measure and media mentions as the fourth hardest.

Researchers broadly agreed with research office staff on the longer-term importance of policy citations, but showed weaker support for media mentions. Unsurprisingly, policy citations are not a one-size-fits-all measure: they received limited support from STEM researchers, whereas around half of social scientists prioritized them.

  • Non-academic co-authored research citations: A feasible way forward

Both research office staff and researchers placed citations from research co-authored by non-academics among the top four most important proxies of societal impact, both now and in the future, with broad agreement across regions. Notably, U.K. research offices assigned this proxy even greater long-term importance than media mentions, while for U.K. researchers it ranked as the single most important measure.

The importance of this proxy was confirmed by researchers across most fields, except for the Arts and Humanities where research citations are generally less frequent. Despite its purely bibliometric nature, non-academic co-authored research citations appeared in the middle of the list in terms of how hard research offices find it to measure. Research office staff in Asia even placed it among the two hardest proxies of societal impact to measure.

  • Relevance to SDGs: (Still) the hardest proxy to measure

A decade after the adoption of the UN’s 2030 Agenda for Sustainable Development, one in three research office staff ranked the relevance of research to SDGs among their top three hardest proxies of societal impact to measure. This was particularly common among respondents from the U.K., Europe, and North America – together accounting for approximately three quarters of all survey participants – as well as among senior leadership staff across all regions.

This difficulty may stem from the challenge of assessing SDG relevance for non-traditional research outputs or research projects (such as grant applications), or from the need for a more granular focus on specific SDG targets and indicators rather than high-level goals. Building on the well-established SDG classification schema in Web of Science Core Collection, the AI-native Web of Science Research Intelligence platform is designed to help address this gap.

Implications for research offices

These survey findings underscore that evaluating societal impact is no longer optional — it is a strategic necessity. Research offices increasingly need to:

  • balance traditional and non-traditional outputs in evaluation frameworks
  • invest in tools that track hard-to-measure proxies such as policy citations and SDG relevance
  • prepare for greater accountability to funders and the public
  • adopt more granular, data-driven, and responsible approaches to identifying signals of impact

Institutions that embrace these practices will be better positioned to demonstrate the societal impact of their research and differentiate themselves in an increasingly competitive research landscape.

Ready to learn more? Read our Research Offices of the Future 2025 report, check out Web of Science Research Intelligence, and meet us at the EARMA Conference in Utrecht, the Netherlands, 5-7 May 2026 to explore our findings and engage in a broader discussion on the challenges and opportunities of evaluating the societal impact of research.

 

*Research Professional News is an editorially independent part of Clarivate

**In this blog post, as well as in the original survey, Europe does not include the U.K.
