In September of 2018, Publons, a Clarivate Analytics company, released its inaugural “Global State of Peer Review” report, the largest ever report to examine how the institution of peer review is responding to changes in journal publishing.
Publons is ideally positioned to produce this report, having taken the lead in organizing reviewers by providing a forum and platform by which scientists and scholars can have their peer-review activities tracked, verified, and recorded. Since the company’s launch in 2011, more than 500,000 reviewers have created Publons profiles documenting their contributions. Furthermore, via its “academy,” Publons offers instruction in the techniques and procedures of peer reviewing.
For the “Global State of Peer Review” report, Publons analysts drew upon publication data from the Web of Science, along with rates of article submission, review completion, and other specifics as reported in another Clarivate resource, ScholarOne, a workflow-management tool for scholarly journals, books, and conferences. In addition, nearly 12,000 researchers worldwide responded to a detailed survey about their individual processes and challenges in creating reviews.
Assembled in the report, these sources characterize peer review as an institution that is indeed confronting challenges. One of these is the ever-increasing, global scope of science itself. Since 2013, the Web of Science has recorded an annual increase of 2.6% in the volume of published papers, while the number of papers submitted to journals for publication has grown even faster, up 6.1% per year.
This ongoing torrent of publication has strained the peer-review system. One problem is “reviewer fatigue”: experienced reviewers, in some instances, have elected to curtail their activities because of the excessive demands on their time. As a result, journal editors must expend more effort locating and securing experts to review manuscripts. In 2013, for example, editors sent an average of 1.9 reviewer invitations per completed review, hoping to engage a researcher who had the proper expertise and whose schedule allowed time to complete the assignment. By 2017, the average had increased to 2.4 invitations.
Nevertheless, there are also signs that peer review is scaling up to meet the increase. China, in particular – mirroring its surge in research output in recent decades – now accounts for a corresponding uptick in reviewer activity.
Whatever the challenges, peer review does enjoy one substantial advantage: abiding support in the research community. Of respondents to the Publons survey, for example, 98% said that peer review is either “important” (31.2%) or “very important” (66.8%) for “ensuring the general quality and integrity of scholarly communication.”
The Publons survey also identifies measures aimed at increasing participation in peer review. One such step is to provide incentives and rewards – for example, instruction in peer reviewing, as well as official certification – in order to maintain peer review as a bulwark against sloppy, erroneous, or even fraudulent science.
A different kind of peer review
In a sense, every scientist and scholar who publishes in the journal literature acts as a kind of peer reviewer. In deciding which previous, foundational work to explicitly acknowledge in footnotes, authors ultimately confer visibility and esteem on the experts whose published work they judge to be the most useful and consequential. In effect, citations constitute a form of peer review.
Over the years, Clarivate Analytics has documented and chronicled authors whose outsize citation records, as tracked in the Web of Science, indicate markedly high influence and significance in their fields. Since 2002, the true outliers – those whose high citation totals are associated with a significant discovery or advance – have been designated Citation Laureates, quite possibly in line for a Nobel Prize someday.
More recently, since 2014, Clarivate has identified an annual cohort of researchers whose citation records, although perhaps not as elite as those of the Citation Laureates, rise to the level of “world class” within their respective specialties. These are the Highly Cited Researchers (HCRs).
Rather than an overall, total-citation count, the main criterion for selection as an HCR is an author’s tally of Highly Cited Papers – papers ranking in the top 1% by citations for their field and year of publication – over an 11-year period.
In previous years, authors were identified as HCRs only if their quantity of Highly Cited Papers met the given threshold for a specific field (that is, one of 21 main subject areas covered in Essential Science Indicators, part of the Web of Science). By contrast, the new listing of HCRs will include researchers who have registered strongly in more than one discipline. This “cross-field” coverage is new for 2018.
As the newest HCR listing nears its official unveiling in mid-November, Clarivate will continue to examine aspects of peer review – particularly the distinction conferred by peers in the form of a high citation rate, and what this recognition means to a researcher. Our upcoming “World-class Researcher” webinar series will offer deeper insights, including perspectives shared directly by some of the 2018 HCRs themselves.
By their accomplishments as well as by their example, the groups of highly cited scientists featured by Clarivate embody and advance the pursuit of world-class research.