Contact and FAQ

The Highly Cited Researchers list recognizes the true pioneers in their fields over the last decade: researchers who have demonstrated significant and broad influence in their field(s) of research. Each researcher selected has authored multiple Highly Cited Papers™, which rank in the top 1% by citations for their field(s) and publication year in the Web of Science™ over the past decade. However, citation activity is not the sole selection indicator. A preliminary list based on citation activity is then refined using qualitative analysis and expert judgement. Of the world’s scientists and social scientists, Highly Cited Researchers™ truly are one in 1,000.

Resources

2023 Analysis

Experts from the Institute for Scientific Information™ provide detailed insight into the list of Highly Cited Researchers 2023, including the researchers’ geographical locations, primary research institutions and a breakdown of their fields of research.

Read our analysis of the 2023 list.

Highly Cited Researchers 2023 Program

Our Highly Cited Researchers program identifies individuals who have demonstrated significant and broad influence in their field(s) of research.

Of the world’s population of scientists and social scientists, Highly Cited Researchers are 1 in 1,000.

Our 2023 list contains 7,125 Highly Cited Researcher designations in total: 3,793 in 20 fields of the sciences and social sciences, and 3,332 awarded to individuals identified as having exceptional performance across several fields. For deeper insight into the list, including the researchers, their fields of research, institutions and home countries or regions, please read our Analysis.

Each year, the Institute for Scientific Information (ISI)™ creates a fresh list of preliminary candidates. Their names are drawn from the publications that rank in the top 1% by citations in one or more fields in Essential Science Indicators (ESI). This preliminary list is then refined using the underlying publication and citation data, together with qualitative analysis and expert judgement, to create the final annual list of Highly Cited Researchers.
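To make the quantitative first step concrete, here is a minimal sketch of how a “top 1% by citations for field and year” cut-off could be applied. The paper records, field names and simplified percentile rule below are hypothetical illustrations, not our actual data, thresholds or code.

    from collections import defaultdict

    # Hypothetical paper records: (paper_id, ESI field, publication year, citations, authors).
    papers = [
        ("P1", "Chemistry", 2015, 420, ["A. Researcher", "B. Colleague"]),
        ("P2", "Chemistry", 2015, 12, ["C. Author"]),
        ("P3", "Geosciences", 2018, 95, ["A. Researcher"]),
    ]

    def top_1pct_threshold(citation_counts):
        # Citation count of the last paper inside the top 1% (simplified nearest-rank rule).
        ranked = sorted(citation_counts, reverse=True)
        k = max(1, round(len(ranked) * 0.01))
        return ranked[k - 1]

    # Group citation counts by (field, publication year) and compute a threshold per cell.
    by_cell = defaultdict(list)
    for _, field, year, cites, _ in papers:
        by_cell[(field, year)].append(cites)
    thresholds = {cell: top_1pct_threshold(counts) for cell, counts in by_cell.items()}

    # A paper counts as highly cited if it meets its cell's threshold;
    # its authors then enter the pool of preliminary candidates.
    preliminary_candidates = {
        author
        for _, field, year, cites, authors in papers
        if cites >= thresholds[(field, year)]
        for author in authors
    }
    print(sorted(preliminary_candidates))  # -> ['A. Researcher', 'B. Colleague']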

Please take careful note of our Evaluation and selection guidance for 2023, which clearly outlines our process, to fully understand how the final list has been compiled. We also recommend you read our Disclaimer section to understand the limitations of any analytical approach.

This year, 7,125 Highly Cited Researcher 2023 designations were issued to 6,849 individuals. The number of awards exceeds the number of unique individuals because some researchers are recognized in more than one Essential Science Indicators™ (ESI) field of research. Our Analysis of countries/regions and institutions counts designated awards and is thus based on the total of 7,125.
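As a purely illustrative sketch of this counting distinction (the names and fields below are invented), a researcher recognized in two ESI fields contributes two designations but only one unique individual:

    # Hypothetical award records: one entry per (researcher, field) designation.
    awards = [
        ("R. Lee", "Chemistry"),
        ("R. Lee", "Materials Science"),  # same person recognized in a second field
        ("S. Kim", "Cross-Field"),
        ("T. Okoye", "Clinical Medicine"),
    ]

    designations = len(awards)                       # every (person, field) award counts
    individuals = len({name for name, _ in awards})  # each person counts once

    print(designations, individuals)  # -> 4 3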

Our global external communications team has created a toolkit of materials to assist with the promotion of the researchers at your institution. Please contact them directly with your request.
e: newsroom@clarivate.com

These lists are freely available as Excel files in the Past Lists section of our website.

Please contact our external communications team; they will be happy to assist you.

e: newsroom@clarivate.com

We greatly value the support of research institutions in maintaining the accuracy of our list. We encourage you to share your findings with both our organization and the designated researchers at your institution. Before implementing any changes to the public list, we kindly ask for a researcher’s personal authorization via email, in line with the information provided in Guidance for researchers, below.

A Highly Cited Researcher (HCR) badge and quick filter are now available to aid discovery of influential experts on the Web of Science platform. Web of Science users can locate researcher profiles using the Researchers search tab, which supports searching by author name, identifier or affiliated organization. When browsing profiles, the new HCR quick filter helps users quickly refine their results to awardees. Researchers who have claimed their Web of Science Researcher Profile and have an HCR award from 2014 onward associated with their profile are included in the filter. An HCR badge is also displayed on these profiles to help users identify these influential experts while searching the platform. Note that it may take a few days for 2023 data to be reflected in the HCR quick filter.

Guidance for researchers

You can access your personal certificate via the ‘Claim profile’ link next to your name on the 2023 recipient list. It is free to access and does not require a Web of Science subscription. Certificates can take up to 48 hours to appear post-launch.

From November 15, 2023 to December 15, 2023, we will accept update requests from individual researchers. Please email ISI@clarivate.com from your primary institutional email address, stating your name as you would like us to recognize you, along with the name of one primary affiliated research institution. You may also add any secondary affiliations, if relevant.

Please include the following statement:

  • I agree to the processing of my personal information for publication in the Highly Cited Researchers list.

[Please refer to the Clarivate Privacy Notice for more information.]

Updates received during this one-month window will be completed by the end of December 2023. Both the public list and our public file will be updated.

We also kindly request that individuals who transition to a new primary institution keep us informed at their earliest convenience, using the above guidance. This ensures that our central records remain current and accessible for future reference.

You did not have a sufficient number of highly cited papers and total citations to publications assigned to the ESI field of Geosciences. Instead, a tally of all your highly cited papers, not only in Geosciences but also in Engineering and in Environment/Ecology, revealed a publication and citation record equivalent to that of researchers selected in any one ESI field. In other words, you qualified for selection through the combination of highly cited papers in several fields, demonstrating superior multidisciplinary or interdisciplinary influence.

To list you in Geosciences would misrepresent the measure by which those selected for that field were chosen and the manner in which you qualified for selection in a different class.
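One way to picture this kind of cross-field qualification, using invented thresholds and paper counts rather than our actual figures or calculation, is to express a researcher’s highly cited papers in each field as a fraction of that field’s selection threshold and ask whether the fractions combine to at least one full field’s worth of output:

    # Hypothetical per-field selection thresholds (highly cited papers required).
    field_thresholds = {
        "Geosciences": 8,
        "Engineering": 10,
        "Environment/Ecology": 9,
    }

    # Hypothetical researcher: short of the threshold in every single field.
    researcher_hcp = {
        "Geosciences": 4,
        "Engineering": 3,
        "Environment/Ecology": 3,
    }

    # Each field contributes a fraction of its threshold; a combined score of at
    # least 1.0 is treated here as output equivalent to one field's selection level.
    cross_field_score = sum(
        researcher_hcp.get(field, 0) / threshold
        for field, threshold in field_thresholds.items()
    )
    print(round(cross_field_score, 2), cross_field_score >= 1.0)  # -> 1.13 True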

Evaluation and selection

As the need for high-quality data from rigorously selected sources grows ever more pressing, ISI has had to adapt and respond to technological advances and changes in the publishing landscape when selecting individuals for inclusion.

Just as we have applied stringent standards and selection criteria to identify trusted journals, we have evolved our evaluation and selection policies for our annual Highly Cited Researchers program to address the challenges of an increasingly complex and polluted scholarly record.

Please take careful note of our updated Evaluation and selection guidance for 2023, which clearly outlines our process, to fully understand how the final list has been compiled.

This is the sixth year we have sought to identify researchers with cross-field impact: those who contribute multiple highly cited papers across several different fields but would not register enough highly cited papers in any single ESI field to be considered for selection.

The recognition of these researchers keeps our list contemporary and relevant as it tends to capture younger researchers and those who work at the intersection of different scientific or scholarly domains.

This is by far the most frequent question we receive. Typically, we are contacted by individuals who seek inclusion and ask that we review their personal publication and citation data.

It is inevitable that some individuals will be disappointed that their name does not appear. We cannot respond to requests to review individual cases, nor can we share our records of the thresholds for inclusion in each field of research. Please take careful note of our new Evaluation and selection guidance for 2023, which clearly outlines our process, to fully understand how the final list has been compiled.

Each year, Clarivate creates a new, dynamic Highly Cited Researchers list, based on a rolling eleven-year window of citation evaluation. As our evaluation and selection policies evolve to respond to changes in the publishing landscape, it is to be expected that new names enter the list each year. A return to the list in consecutive years is in no way guaranteed. There is no practice of continuation, so a researcher who is not selected this year should not consider themselves ‘removed.’

It is not unusual for researchers of this calibre to have affiliations with several different research organizations. Given this complexity and the high mobility of many researchers, Clarivate asks preliminary candidates for the Highly Cited Researchers program to verify their affiliations to us each year prior to launch.

Our published list then reflects the information available from the scholarly record (i.e., the contact details on their Highly Cited Papers in ESI), combined with any requested updates from the researchers themselves.

Clarivate welcomes and expects accuracy and clarity in researchers’ own claims of primary and secondary affiliations. Many of the individuals named to our list have genuine, complex research affiliations, which is why we ask them to verify their primary affiliation to us. We remind researchers that we expect them to specify as their primary affiliation the home institution that reflects their permanent, tenured position.

An asterisk accompanying a primary affiliation indicates that the Highly Cited Researcher is associated with this institution as part of a fellowship or associate program. This rarely occurs since most researchers follow the established tradition of using the secondary affiliation for such appointments and reserve the primary affiliation slot for their main employer. The affiliations of these individuals are not counted in our own analysis of nations or institutions.

To ensure correct attribution of Highly Cited Papers to authors, we use a combination of algorithmic disambiguation of author information and manual inspection. Manual review involves examination of author identifiers, email addresses, the subject of the papers and the journals in which they were published, institutional addresses, and co-authorships. Usually this is sufficient to resolve questions of authorship for a unique individual.

Occasionally we examine original papers to obtain a full name not present in the Web of Science™ bibliographic record (some journals do not publish full author names). If questions remain, we refer to researchers’ own websites and curricula vitae; this sometimes happens when a researcher has changed institutional affiliations several times during the period surveyed.
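As a toy illustration of the kinds of metadata signals such a review can weigh, the sketch below scores two author mentions on shared identifiers, email addresses, affiliations and co-authors. The signals, weights and decision threshold are invented and are not our actual disambiguation algorithm.

    def match_score(a, b):
        # Score the evidence that two author mentions refer to the same person.
        score = 0.0
        if a.get("orcid") and a.get("orcid") == b.get("orcid"):
            score += 0.6                                # shared identifier: strong evidence
        if a.get("email") and a.get("email") == b.get("email"):
            score += 0.3
        if a.get("affiliation") == b.get("affiliation"):
            score += 0.2
        shared = set(a.get("coauthors", [])) & set(b.get("coauthors", []))
        score += min(0.2, 0.1 * len(shared))            # overlapping co-author network
        return score

    mention_1 = {"orcid": "0000-0001-2345-6789", "email": "j.doe@uni.example",
                 "affiliation": "Example University", "coauthors": ["A. Smith", "B. Chen"]}
    mention_2 = {"orcid": "0000-0001-2345-6789", "email": None,
                 "affiliation": "Example Institute", "coauthors": ["B. Chen"]}

    # A high score suggests the same person; borderline cases go to manual review.
    print(round(match_score(mention_1, mention_2), 2))  # -> 0.7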

We work hard to resolve authorship questions and will make adjustments where required.

You should contact our customer care team with any concerns about your Web of Science profile / ResearcherID. However, this has no bearing on our selection criteria. Please read our evaluation and selection criteria carefully to understand how the list is compiled. They outline how we approach this task using both quantitative and qualitative indicators; we begin with Essential Science Indicators to identify Highly Cited Papers and use the associated metadata to create author sets, not your Web of Science profile.

We do not count highly cited papers that have been retracted from the Web of Science when creating our list of preliminary candidates. In addition, in 2022, with the assistance of Retraction Watch and its unparalleled database of retractions, we expanded our qualitative analysis. We exclude preliminary candidates when there is evidence that their publications have been retracted for reasons of misconduct (such as plagiarism, image manipulation or fake peer review).
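A minimal sketch of these two retraction checks, on invented data: retracted papers are dropped before any counting, and a candidate is flagged for exclusion when a retraction is linked to misconduct.

    # Hypothetical publication records for one preliminary candidate.
    papers = [
        {"id": "P1", "retracted": False, "retraction_reason": None},
        {"id": "P2", "retracted": True, "retraction_reason": "honest error"},
        {"id": "P3", "retracted": True, "retraction_reason": "plagiarism"},
    ]
    MISCONDUCT_REASONS = {"plagiarism", "image manipulation", "fake peer review"}

    countable = [p for p in papers if not p["retracted"]]  # retracted papers are not counted
    exclude_candidate = any(p["retraction_reason"] in MISCONDUCT_REASONS for p in papers)

    print([p["id"] for p in countable], exclude_candidate)  # -> ['P1'] True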

As part of our final-stage validation for this program, we alerted individuals that they had been identified as potential candidates on our preliminary list, in order to verify those individuals and their affiliated research institutions. The communication clearly stated that inclusion in the preliminary list does not guarantee inclusion in the final launch list. We also stated that the list can and does change in this final validation stage.

During this final validation phase our broader analysis continued; some researchers were identified and added, and others were removed. Only when our final data validation and qualitative analysis were complete did we lock the final lists for publication.

Please take careful note of our new Evaluation and selection guidance for 2023, which clearly outlines our process, to fully understand how the final list has been compiled.

The specific methodology used in generating the new list can turn up researchers, even more junior researchers, who have actively contributed to multiple highly cited papers during 2012-2022, whereas more senior and even more highly cited scientists may not have been identified because they did not publish as many Highly Cited Papers in a field (as we defined it) or across fields during the relevant eleven-year period.

We do not gather gender information alongside author name and affiliation from our researchers during the HCR validation process, and we do not seek to assume ‘apparent’ gender of any researcher, based on their first name. We are careful not to presume how any individual researcher chooses to self-identify.

The list is diverse in terms of geography, nationality and even age, since we have added the cross-field category. But we do recognize that it is very male-heavy, as it reflects the systemic biases present in the scholarly communication system. We continue to stress that any metrics gathered from our platform should be used responsibly, which is especially relevant in researcher evaluation. The Web of Science can be a useful tool to highlight how prevalent these biases are, to show how they vary across regions and disciplines, and ultimately to assist policy makers in tracking the effectiveness of policy changes designed to address systemic biases.

At the Institute for Scientific Information, we must make difficult choices in our commitment to respond to threats to research integrity across many fields. We have chosen to exclude the Mathematics category from our analysis for this year and we will afford this category additional analysis and judgement to identify individuals with significant and broad influence in the field in 2024.

The field of Mathematics differs from other categories in ESI: it is a highly fractionated research domain, with small numbers of individuals working on many specialty topics. The average rate of publication and citation in Mathematics is relatively low, so small increases in publication and citation tend to distort the representation and analysis of the overall field. This makes the field of Mathematics more vulnerable to strategies that seek to optimize status and rewards through publication and citation manipulation, which can misrepresent influential papers and people and obscure those who would otherwise have qualified for recognition.

Our response to this concern has been to take advice from external experts and consult with leading bibliometricians and mathematicians to discuss our future approach to the analysis of this field. We plan to provide an update on this work in 2024.