Evaluation and selection

The Highly Cited Researchers™ list from Clarivate™ seeks to identify individual researchers in the sciences and social sciences who have demonstrated significant and broad influence in their field(s) of research.

This small fraction of the global researcher population contributes disproportionately to extending the frontiers of knowledge and to delivering for society the innovations that make the world healthier, more sustainable and more secure.

2023 Analysis

Experts from the Institute for Scientific Information™ provide their detailed insight into the list of Highly Cited Researchers 2023, including their geographical locations, primary tenured research institutes and a breakdown of their fields of research.

Read our analysis of the 2023 list.

Our evaluation and selection strategy is not one-dimensional: the process is complex, combining the inter-related quantitative and qualitative information available to us.

Each year, our in-house team at the Institute for Scientific Information (ISI) creates a fresh list of preliminary candidates, selected for exceptional performance in one or more fields in Essential Science Indicators (ESI)™ or for cross-field performance spanning several fields. This preliminary list is then refined using the underlying data, qualitative analysis and expert judgement to create the annual list of Highly Cited Researchers.

Evaluation and selection

Please read the full evaluation and selection process with care to understand the limitations of any analytical approach.

See our disclaimer

Highly Cited Researchers from Clarivate is an annual list recognizing influential researchers in the sciences and social sciences from around the world, who have demonstrated significant and broad influence in their field(s) of research.

Of the world’s population of scientists and social scientists, Highly Cited Researchers are 1 in 1,000.

The robust evaluation and curation of our data ensure that the Web of Science Core Collection™ remains the world’s most trusted publisher-independent global citation database. Our experts at the Institute for Scientific Information (ISI) rely on this robust publication data when selecting individuals who have published multiple Highly Cited Papers™ in the body of trusted journals in the Web of Science.

As the need for high-quality data from rigorously selected sources becomes ever more important, the ISI has had to adapt and respond to technological advances and changes in the publishing landscape as we identify individuals for inclusion.

Just as we have applied stringent standards and transparent selection criteria to identify trusted journals, we have evolved our evaluation and selection policies for our annual Highly Cited Researchers program as we address the challenges of an increasingly complex and polluted scholarly record.

The 2023 list contains about 3,800 Highly Cited Researchers in 20 fields of the sciences and social sciences and about 3,300 Highly Cited Researchers identified as having exceptional performance across several fields.

The list focuses on contemporary research achievement as we survey Highly Cited Papers in trusted science and social sciences journals indexed in Science Citation Index Expanded™ and Social Sciences Citation Index™ during the 11-year period 2012 to 2022.

The data derives from Essential Science Indicators™ (ESI), a component of InCites™.

For our 2023 analysis we reviewed 188,500 Highly Cited Papers from 20 broad fields in ESI. These fields are defined by sets of journals and, exceptionally in the case of multidisciplinary journals such as Nature and Science, by a paper-by-paper assignment to a field based on an analysis of the cited references in each paper. Data are taken from article and review papers only – we do not count citations to letters, correction notices and other items. We have excluded the Mathematics category from our analysis this year; see our Exceptions and exclusions section for further information.

Our first stage of analysis begins with a citation triage of records to identify a list of preliminary candidates – authors with the greatest number of Highly Cited Papers in an ESI field at the threshold for inclusion and above.

For the Highly Cited Researchers 2023 analysis, the papers surveyed were those published from 2012 to 2022 that then ranked in the top 1% by citations for their ESI field and year (the definition of a Highly Cited Paper).

Researchers who, within an ESI-defined field, publish Highly Cited Papers are judged to be influential, so the authorship of multiple top 1% papers is interpreted as a mark of exceptional impact. Relatively younger researchers are more likely to emerge in such an analysis than in one dependent on total citations over many years. To recognize more junior and mid-career researchers is one of our goals in generating the list of Highly Cited Researchers.

The determination of how many researchers to examine for each field is based on the population of each field, as represented by the number of disambiguated author names on all Highly Cited Papers in that field, 2012 to 2022.

The square root of the number of authors in each field determines the number of individuals selected, so the number of researchers identified per ESI field varies – Clinical Medicine being the largest and Space Science the smallest in 2023. When authors are ranked by paper count, the number of papers associated with the author at the square-root position becomes the field paper threshold.
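The square-root rule above can be sketched as follows. This is an illustrative sketch only: the rounding of the square root to an integer rank is an assumption, and the author counts are invented rather than real ESI data.

```python
import math

def field_paper_threshold(paper_counts_ranked):
    """Given authors' Highly Cited Paper counts sorted in descending
    order for one ESI field, return the paper count of the author at
    the square-root position, i.e. the field paper threshold.

    Rounding the square root down to an integer rank (math.isqrt) is
    an assumption made for this sketch."""
    n_authors = len(paper_counts_ranked)
    position = math.isqrt(n_authors)        # e.g. 10,000 authors -> rank 100
    return paper_counts_ranked[position - 1]  # ranks are 1-indexed

# Invented example: 10,000 disambiguated authors; the author ranked
# 100th (the square-root position) sets the field paper threshold.
counts = list(range(10000, 0, -1))
print(field_paper_threshold(counts))  # 9901
```

In a field with 10,000 authors on Highly Cited Papers, roughly the top 100 by paper count would be examined, and the 100th author's paper count becomes the field paper threshold.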

Another criterion for selection is that the researcher must have enough citations to their Highly Cited Papers to meet the author field citation threshold found within ESI. All who published Highly Cited Papers and received citations at the field threshold level are considered at this stage – even if the final list then exceeds the number given by the square root calculation.

In addition, any researcher with one fewer Highly Cited Paper than the field paper threshold is also admitted to the list if total citations to their Highly Cited Papers rank that individual in the top 50%, by total citations, of those at the threshold level or higher.


Example – fictional authors

| ESI field | First name | Last name | Highly Cited Papers | Citations to Highly Cited Papers | Field paper threshold | Author field citation threshold | Citation threshold with one fewer paper | Status |
|---|---|---|---|---|---|---|---|---|
| Field 9 | Mary | Pandit | 17 | 2,838 | 11 | 1,112 | 2,920 | Considered |
| Field 9 | William | Clever | 10 | 3,677 | 11 | 1,112 | 2,920 | Considered |
| Field 9 | Judith | Sage | 10 | 1,338 | 11 | 1,112 | 2,920 | Not considered |
  • Mary Pandit meets both the field paper threshold and the author field citation threshold, so she is considered for selection
  • William Clever has one fewer paper than the field paper threshold but meets the one-fewer-paper citation threshold, so he is considered for selection
  • Judith Sage also has one fewer paper than the field paper threshold but does not meet the one-fewer-paper citation threshold, so she is not considered for selection
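The consideration rules illustrated by the fictional authors above can be sketched in a few lines. The thresholds below are the document's illustrative Field 9 values, not real ESI data, and the decision logic is a simplified reading of the stated rules.

```python
# Illustrative Field 9 thresholds from the fictional example above.
FIELD_PAPER_THRESHOLD = 11
CITATION_THRESHOLD = 1112            # author field citation threshold
ONE_FEWER_CITATION_THRESHOLD = 2920  # cutoff for authors with one fewer paper

def is_considered(papers, citations):
    """Return True if an author passes the citation-triage stage."""
    # At or above the paper threshold: must meet the field citation threshold.
    if papers >= FIELD_PAPER_THRESHOLD and citations >= CITATION_THRESHOLD:
        return True
    # One fewer paper is allowed if citations rank in the top 50% of
    # those at or above the paper threshold (the 2,920 cutoff here).
    if papers == FIELD_PAPER_THRESHOLD - 1 and citations >= ONE_FEWER_CITATION_THRESHOLD:
        return True
    return False

print(is_considered(17, 2838))  # Mary Pandit    -> True
print(is_considered(10, 3677))  # William Clever -> True
print(is_considered(10, 1338))  # Judith Sage    -> False
```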

This is the sixth year we have sought to identify researchers with cross-field impact – those who contribute multiple Highly Cited Papers across several different fields but would not register enough Highly Cited Papers in any single ESI field to be considered for selection.

The recognition of these researchers keeps our list contemporary and relevant as it tends to capture younger researchers and those who work at the intersection of different scientific or scholarly domains.

To identify researchers with cross-field influence, Highly Cited Paper counts and citation counts are normalized by fractionating them according to the thresholds required for each field (thus, each Clinical Medicine paper carries a smaller unit fraction than one in Space Science). Citation counts are fractionated in the same manner. If both the sum of the fractional paper counts and the sum of the fractional citation counts for a researcher equal 1.0 or more, the individual exhibits influence equivalent to a researcher selected in an ESI-defined field and is therefore selected as a candidate for exceptional cross-field performance.

Example

| ESI field | First name | Last name | Highly Cited Papers | Citations to Highly Cited Papers | Field citation threshold | Field paper threshold | Field paper score | Field citation score | Cross-field paper score | Cross-field citation score |
|---|---|---|---|---|---|---|---|---|---|---|
| Field 3 | Joseph | Savant | 1 | 98 | 1,857 | 22 | 0.045 | 0.053 | 1.670 | 5.666 |
| Field 6 | Joseph | Savant | 7 | 2,937 | 946 | 8 | 0.875 | 3.105 | 1.670 | 5.666 |
| Field 14 | Joseph | Savant | 3 | 663 | 676 | 6 | 0.500 | 0.981 | 1.670 | 5.666 |
| Field 16 | Joseph | Savant | 4 | 3,397 | 2,223 | 16 | 0.250 | 1.528 | 1.670 | 5.666 |

The fictional researcher Joseph Savant published 15 Highly Cited Papers in four ESI fields from 2012 to 2022. Seven papers in Field 6, with a threshold of eight for selection, earned Savant a credit of 0.875 (or 7/8). Three papers in Field 14, with a threshold of six, were worth 0.5. The sum of the fractional paper counts across fields yielded a total cross-field paper score of 1.670. A score of 1.0 or more indicates that the individual achieved impact equivalent to a researcher chosen in a specific ESI field. The second criterion for consideration as a Highly Cited Researcher is enough citations to rank in the top 1% by citations for a field; citations in different fields were fractionated in the same manner as papers. In the example above, Professor Savant earned more than five times the number of citations needed for selection as an influential cross-field researcher.
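The fractionation in the Savant example can be reproduced directly: each field's paper count is divided by that field's paper threshold, each citation count by the field citation threshold, and the fractions are summed. The data below are the document's fictional values.

```python
# Fictional Joseph Savant data from the example table:
# (Highly Cited Papers, citations, field citation threshold, field paper threshold)
fields = {
    "Field 3":  (1,   98, 1857, 22),
    "Field 6":  (7, 2937,  946,  8),
    "Field 14": (3,  663,  676,  6),
    "Field 16": (4, 3397, 2223, 16),
}

# Fractionate papers and citations by the per-field thresholds, then sum.
paper_score = sum(p / pt for p, c, ct, pt in fields.values())
citation_score = sum(c / ct for p, c, ct, pt in fields.values())

# Both fractional sums must reach 1.0 for cross-field candidacy.
qualifies = paper_score >= 1.0 and citation_score >= 1.0

print(round(paper_score, 3))     # 1.67
print(round(citation_score, 3))  # 5.666
print(qualifies)                 # True
```

Note that a single criterion is not enough: a researcher whose fractional paper score reaches 1.0 but whose fractional citation score does not (or vice versa) would not be selected as a cross-field candidate.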

Clarivate is trusted by many organizations involved in research evaluation and assessment – including universities, governments, research assessment and ranking organizations globally to provide accurate, verifiable and trustworthy data.

As we identify individuals who show significant and broad influence in their chosen field or fields, we have added more filters and checks to our analysis. Our evaluation and selection strategy is not one-dimensional: the process is more complex than ever, combining the inter-related information available to us.

Some decisions are straightforward: to award credit to a single author among many tens or hundreds listed on a paper strains reason. Therefore, we eliminate from our analysis any Highly Cited Paper with more than 30 authors, or with explicit group authorship as defined by the publisher. Beyond this, researchers found to have committed scientific misconduct in formal proceedings conducted by a researcher's institution, a government agency, a funding agency or a publisher cannot be selected as a Highly Cited Researcher.

ESI field of Mathematics

We have chosen to exclude the Mathematics category from our analysis for this year.

The field of Mathematics differs from other categories in ESI. It is a highly fractionated research domain, with small numbers of individuals working on many specialty topics. The average rate of publication and citation in Mathematics is relatively low, so small increases in publication and citation can distort the representation and analysis of the overall field. Because of this, Mathematics is more vulnerable to strategies that optimize status and rewards through publication and citation manipulation, especially through targeted citation of very recently published papers, which can more easily become highly cited (top 1% by citations). This not only misrepresents influential papers and people; it also obscures the influential publications and researchers that would otherwise have qualified for recognition. The responsible approach now is to give this category additional analysis and judgement to identify individuals with significant and broad influence in the field.

Because Clarivate is trusted by global organizations for research evaluation and assessment, we have a responsibility to provide accurate, verifiable, and trustworthy data. At the Institute for Scientific Information, we must make difficult choices in our commitment to respond to threats to research integrity across many fields. Our response to this concern has been to take advice from experts and consult with leading bibliometricians and mathematicians to discuss our future approach to the analysis of this field.

Upholding research integrity

Together with our community partners, we need to play our part in responding to a rise in threats to research integrity in many areas. We therefore examine the scholarly record for anomalies that may seriously undermine the validity of the data analyzed for Highly Cited Researchers. Such activities may represent efforts to game the system and create self-generated status.

In 2022, with the assistance of Retraction Watch and its unparalleled database of retractions, we extended our qualitative analysis to all retracted papers to detect cases in which a potential preliminary candidate's publications may have been retracted for reasons of misconduct (such as plagiarism, image manipulation or fake peer review). We searched for evidence of publication anomalies among the individuals on the preliminary list of Highly Cited Researchers. This extended analysis proved valuable in identifying researchers who do not demonstrate true, community-wide research influence.

We also receive expressions of concern from identified representatives of research institutes, national research managers and our institutional customers, along with information shared with us by other community groups, e.g. For Better Science and PubPeer. Some of these resources include anonymous or 'whistleblower' sources. We also consider these, where we can verify claims through direct observation.

Our response evolves each year, and we now look at a growing number of factors when evaluating papers including, but not limited to:

  • Extreme levels of hyper-authorship. We expect that an author has made a meaningful contribution to any paper that bears their name; the publication of multiple papers per week over long periods strains our understanding of normative standards of authorship and credit.
  • Excessive self-citation. We exclude papers that reveal unusually high levels of self-citation. For each ESI field, a distribution of self-citation is obtained, and extreme outliers (a very small fraction) are identified and evaluated. We also look for evidence of prodigious, very recent publication of papers of incremental value, accompanied by high levels of author self-citation. For a description of the methodology used to exclude authors with very high levels of self-citation, see: Adams, J., Pendlebury, D. and Szomszor, M., "How much is too much? The difference between research influence and self-citation excess," Scientometrics, 123(2): 1119–1147, May 2020.
  • Unusual patterns of collaborative group citation activity and anomalous levels of citations from co-authors. The identification of networks of co-authors raises the possibility that an individual’s high citation counts may be highly reliant on citations from this network; if more than half of a researcher’s citations derive from co-authors, we consider this to be narrow influence, rather than the broad community influence we seek to reflect.
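The co-author citation check in the last bullet can be sketched simply. This is a simplified illustration of the stated rule (more than half of citations deriving from co-authors indicates narrow influence), not Clarivate's actual implementation, and all data below are hypothetical.

```python
def narrow_influence(citing_authors, coauthor_network):
    """Return True if more than half of the citations a researcher
    received come from their own co-author network.

    citing_authors: one author ID per citation received.
    coauthor_network: set of the researcher's co-author IDs.
    """
    if not citing_authors:
        return False
    from_network = sum(1 for a in citing_authors if a in coauthor_network)
    return from_network / len(citing_authors) > 0.5

# Hypothetical data: 6 of 10 citations come from co-authors a1, a2, a3,
# so the influence is flagged as narrow rather than community-wide.
citations = ["a1", "a1", "a2", "a3", "a2", "a1", "b1", "b2", "b3", "b4"]
print(narrow_influence(citations, {"a1", "a2", "a3"}))  # True
```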

ISI analysts use other filters to identify anomalous publishing activities. We can report that, with the implementation of additional filters this year, the number of potential preliminary candidates excluded from our final list increased from 500 in 2022 to more than 1,000 this year.

We explicitly call for the research community to police itself through thorough peer review and other internationally recognized procedures to ensure integrity in research and its publication.

Clarification of how we identify, request and publish primary researcher affiliations in the Highly Cited Researchers program

Our published Highly Cited Researchers list reflects the information available from the scholarly record (i.e., the contact details on their Highly Cited Papers in ESI), combined with any requested updates from the researchers themselves.

While we acknowledge that many of the individuals named to our list have genuine, complex research affiliations, Clarivate welcomes and expects accuracy and clarity in researchers' own claims of primary and secondary affiliations. We remind researchers that a primary affiliation is typically regarded as their permanent, tenured position.

It is not unusual for researchers of this calibre to have affiliations with several different research organizations – academic and commercial – or to move to new positions in other institutions in other countries and regions during their career. Due to this complexity and high levels of mobility for many researchers, Clarivate asks preliminary candidates of the Highly Cited Researchers program to verify their affiliations to us during the September validation window, when we attempt to contact researchers using the contact details provided on their Highly Cited Papers.

We send a personalized survey link to each preliminary candidate to request that they verify how they would like their name to appear on our list, along with the locations of their primary and, if applicable, secondary research affiliations. This enables us to accurately track and record all institutional affiliation requests. Researchers may also contact us directly with updates at any time of year by email at ISI@clarivate.com.

Research fellowships

The incentives to achieve Highly Cited Researcher status are quite high in some nations and research systems, and occasionally researchers are invited to become affiliated researchers at other institutions as part of a fellowship program.

In 2022 we extended the identification of these affiliated or guest researchers, designating these as Research Fellows or Associates. These individuals are not counted in our own ranking of nations or institutions.

Clarivate endorses the actions of universities and research organizations to monitor and manage the activities and behaviors of their employees with respect to specifying correct home institutions which reflect their permanent, tenured positions. See our April 2023 statement on this topic here.

There is no unique or universally agreed concept of what constitutes exceptional research performance and elite status in the sciences and social sciences, and there are many highly accomplished and influential researchers who may not be recognized by our chosen method of evaluation and selection.

The only reasonable approach to interpreting a list of researchers such as ours is to fully understand our chosen method of evaluation and selection.

Consequently, no list can satisfy all expectations or requirements: a different basis or formula for selection would generate a different (though likely overlapping) list of names, and the absence of a name from our 2023 list should not be interpreted as inferior performance or stature in comparison with those selected.

With that knowledge, the results may be judged by users as relevant or irrelevant to their needs or interests.