Ideas to Innovation - Season Two
Jonathan Adams: Papers that are referenced more often by later literature are having a greater influence on the development of the research to which they link. And so that means that we can use citation counts to identify important papers. So how do we then implement that?
David Pendlebury: I became fascinated by how this index worked, linking related papers together through their citation linkages, or their footnotes in the paper.
Intro: Ideas to Innovation from Clarivate.
Neville Hobson: Globalisation is a word we’re accustomed to hearing a lot in the news. It’s most often used in describing the increasing economic interdependence of national economies across the world through the cross-border movement of goods, services, technology and capital. While it’s sometimes used negatively, especially in the context of global business, finance and politics, its value as a useful and positive term is clear when it comes to talking about scientific research and reporting its scale.
This is very much the case with the latest global research report from the Institute for Scientific Information, or ISI, at Clarivate, published in April. The report examines research trends in the United States, the dominant country in global science since the end of the Second World War, and considers the impact of globalization and collaboration by examining the trajectory of US research over the past 15 years. The report also addresses a significant shift in the global science order since the beginning of the 21st century, and focuses on the balance of domestic and collaborative research and its policy implications.
It considers strong challenges in the research landscape from rapidly developing nations such as Brazil, India, South Korea and especially mainland China. These countries characterise a different geography of research, one in which the US risks falling behind new science-based economies.
Hello and welcome to season two of Ideas to Innovation, a podcast from Clarivate, with information and insight from conversations that explore how innovation spurs incredible outcomes by passionate people in many areas of science, business, academia, sport and more. I’m Neville Hobson.
In this episode, we are fortunate to have two of the authors of the report joining us as guests. Jonathan Adams is Chief Scientist at ISI, based in London, and David Pendlebury is Head of Research Analysis at ISI, based in Oregon in the United States. Both have deep expertise in bibliometrics, or the science of science, and their foundational work in citation analysis and research policy has benefited Clarivate and the broader scholarly community for many years. Welcome, gentlemen, thank you both for being with us today.
David Pendlebury: Thank you.
Jonathan Adams: Pleasure to be here.
Neville Hobson: So before we get into our discussion, would you tell our listeners a little about your careers and your current roles at Clarivate just to set the scene for our conversation? Jonathan, would you care to start?
Jonathan Adams: My background is originally as a scientist who became interested in science policy. Through science policy I was introduced to ISI, the Institute for Scientific Information, and to David Pendlebury in Philadelphia in the early 1990s, when we realized the data would be incredibly useful to our work on understanding the best way of deploying the UK science budget. That then led me into higher education and research management, subsequently into founding a company using bibliometrics and science metrics to improve research management information, and eventually to that company being acquired by Thomson Reuters, the forerunner of Clarivate, which owns the Web of Science and ISI today.
Neville Hobson: Got it. That’s a great starting point. And that connection with David, I think, you two have been connected, let’s say, for quite some time. You mentioned the early 90s there. So maybe we might touch on that again in a minute. But David, how about you? What brings you into all of this?
David Pendlebury: Well, thank you Neville. Unlike Jonathan, I was not trained as a scientist, but rather as a historian. And in the early 1980s, I joined this organization in Philadelphia called the Institute for Scientific Information, which indexed the scientific and scholarly literature. I became fascinated by how this index worked, linking related papers together through their citation linkages, or their footnotes in the paper. And so as a consequence of that, through the years I've studied, using quantitative and qualitative techniques, the structure and dynamics of science itself: how it grows, what it looks like, what shape it takes. And these connections are used not only for search, for finding the articles you need, but also for evaluation, since the citations stand as markers or indicators of impact, significance and attention.
Neville Hobson: Okay, I think that's a really key point to stress in this age we're in, where there is so much, I guess, mistrust and disinformation, all these words we keep hearing that signify a lack of trust, which extends into what we read, see and hear online in particular. And this clearly has an impact in your areas of expertise, in terms of how you work. You've even touched on that, actually, David. I'm interested to know a little about how you do what you do. You both work together in very complementary ways, it seems to me. You've outlined the idea of citation, and I guess people listening to this conversation may not realize what's involved in scientific research. So you mentioned citations: can you talk a little about what exactly a citation is, and a trustworthy citation? That's my word. How do you differentiate when something that's supposedly valid, trustworthy information actually isn't? And let's not even mention artificial intelligence and the stuff we don't know is true or not. So citations, why are they so key?
David Pendlebury: I’m actually going to let Jonathan, as the scientist, describe how the scientific process works and how the information that is disseminated is so essential, as you mentioned Neville, that the scientists are able to trust the information that they’re seeing.
Jonathan Adams: Thanks, David. What scientists do is obviously research, and they then publish in academic journals in order to disseminate that information and tell others. The history of academic journals goes back to the days of the Royal Society's foundation, and Newton and others, and that's the key route to communication. It establishes your priority as the person who discovered some new piece of information, and it also enables others to see what you've done. But in doing that, you need to acknowledge the authorities on which your work is based. And of course, as time has gone on, there is more and more material available to refer to. Scientists have deemed it appropriate to ensure that you give due acknowledgment to those who have underpinned your research, whose prior work establishes the background of what you're doing and is relevant to what you're now proposing. And so the network that David indicated builds up over time. This is also really the key reason why I contacted ISI and David in the first place. Working with the UK science budget, we established how we spent the money. We then wanted to know what we got for our expenditure, and what all the scientists we talked to told us was: what you're buying are publications. Because that's the corpus, the body of knowledge, that represents the investment you've made in research. The intellectual property that comes out is set there in the academic library, and that's what's then available to others to continue to build on, innovate and create new products and processes. But David knows more about the history of citation analysis than I do, so I'll hand it back to him at this point.
David Pendlebury: Sure. Well, the Institute for Scientific Information, which was the original name of the company that we are now members of, was founded in 1960 in Philadelphia by Eugene Garfield. He was somebody trained in science and chemistry, but then in library science. And he realized in the 1950s, and especially after the challenge of Sputnik, that the scientific literature was growing and yet it was harder and harder to deal with. It seems a little bit strange now, but the volume of paper, so small compared to today, was overwhelming the system, and people talked about information overload. He realized that the footnotes, the references, what we're calling citations here, at the end of the papers were the most efficient way to index the literature, to connect one paper to another that was related. So the insight of Garfield was that the scientists themselves represented an army of indexers whose expert knowledge was embodied in the references that they appended to their papers. And if he could organize an index to the scientific literature based upon those linkages, that citation network, then that would be a very efficient way to find the articles that researchers needed. And in fact, when the Science Citation Index was first published in 1964, it was a printed index and took up many, many inches of shelf space in a library. What he imagined in his mind, what he visualized, actually became the World Wide Web, because when I talked to people in the 1980s about the citation linkages, cited and citing papers, there was trouble understanding what I was saying or why it was important. Then, when people began to use the internet and hyperlinks, all the lights went on and people understood what a citation index was. So Garfield was a genius who anticipated what technology only later enabled. Until then, it was the print format that he was using; later, it migrated to tape, CDs, and so on.
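Garfield's linking idea can be sketched in a few lines of code. The sketch below uses hypothetical paper IDs rather than the actual ISI data structures: it simply inverts each paper's reference list, so that for any paper you can look up the later papers that cite it, and a citation count falls out as the length of that list.

```python
from collections import defaultdict

# Hypothetical papers: each one lists the earlier papers it references.
references = {
    "paper_A": [],
    "paper_B": ["paper_A"],
    "paper_C": ["paper_A", "paper_B"],
    "paper_D": ["paper_B", "paper_C"],
}

def build_citation_index(refs):
    """Invert the reference lists: cited paper -> list of citing papers."""
    cited_by = defaultdict(list)
    for citing, cited_list in refs.items():
        for cited in cited_list:
            cited_by[cited].append(citing)
    return dict(cited_by)

index = build_citation_index(references)
print(index["paper_A"])       # papers citing paper_A: ['paper_B', 'paper_C']
print(len(index["paper_B"]))  # citation count for paper_B: 2
```

The inversion is the whole trick: the forward direction (a paper's footnotes) is what authors supply, and the backward direction (who later cited this paper) is what the citation index makes searchable.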
Jonathan Adams: Yeah, just to add to what David’s saying there, the significance of citation counts, I think is critical here, that what Garfield saw was that the papers that are referenced more often by later literature are having a greater influence on the development of the research to which they link. And so that means that we can use citation counts to identify important papers. So how do we then implement that? Well, we look at the number of citations per field per year, and we can see that there are very, very typical patterns that citations rise over time at a rate that’s dependent upon the particular field. It’s faster in biology than it is in engineering, for example.
So that's the key point that Garfield established, and it was then used by Brin and Page in the PageRank algorithm that Google depends on. But we were also able, in the 90s, to show a pretty clear relationship between the relative number of citations that papers got, looking at, for example, a set of chemistry departments across UK universities, and the peer rating that those same departments got when they went through the UK's research assessment exercise. So there is further support there for Garfield's original concept that greater relative citation counts are a sound guide to research excellence, at least for groups. At the individual level it becomes more complex. But the key issue underpinning this is, first, that we have a network of papers linked by citations, and secondly, that higher citation counts point to work of greater significance.
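The per-field, per-year comparison Jonathan describes can be sketched as follows, using invented citation counts rather than real Web of Science data: dividing each paper's raw count by the average for its field and year puts a fast-citing field like biology and a slower one like engineering on the same scale.

```python
# Hypothetical papers with invented citation counts.
papers = [
    {"id": "p1", "field": "biology",     "year": 2015, "citations": 40},
    {"id": "p2", "field": "biology",     "year": 2015, "citations": 10},
    {"id": "p3", "field": "engineering", "year": 2015, "citations": 8},
    {"id": "p4", "field": "engineering", "year": 2015, "citations": 2},
]

def field_baselines(papers):
    """Mean citations per (field, year) group."""
    totals, counts = {}, {}
    for p in papers:
        key = (p["field"], p["year"])
        totals[key] = totals.get(key, 0) + p["citations"]
        counts[key] = counts.get(key, 0) + 1
    return {k: totals[k] / counts[k] for k in totals}

def normalized_impact(paper, baselines):
    """Citations relative to the field/year average (1.0 = average)."""
    return paper["citations"] / baselines[(paper["field"], paper["year"])]

baselines = field_baselines(papers)           # biology: 25.0, engineering: 5.0
print(normalized_impact(papers[0], baselines))  # 40 / 25 = 1.6
print(normalized_impact(papers[2], baselines))  # 8 / 5 = 1.6
```

Here the top biology paper (40 citations against a field average of 25) and the top engineering paper (8 against an average of 5) both land at 1.6 times their field baseline, exactly the comparison that raw counts alone would get wrong.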
Neville Hobson: Got it. That's a great foundation you've described of how it all works and how it all started. And I'm actually thinking, if Tim Berners-Lee hadn't invented the World Wide Web, well, we wouldn't be having this conversation today. But it makes it easier to understand, particularly for a lay listener who's not versed in the detail of all this history, how the international element of research is so critical to the dissemination of the outputs, or the outcomes, of that research, and how collaboration is able to be fostered and grown through the very things we take for granted today that you couldn't imagine even back in the late 80s and early 90s.
One thing I'm interested to hear your thoughts about, which will probably lead us into the discussion of your work in this latest report, is the volume of work and the fact that the United States, as I mentioned in my introduction, has been the dominant force in science since the end of the Second World War. And what we've seen happening since the beginning of this century, that's 20-plus years, not a huge amount of time in the overall scheme of things, is that the US share of indexed documents, if I can call it that, is actually decreasing as other countries, such as the ones I mentioned, come to the fore, most notably mainland China.
And so your report talks about how the US's world share of engineering papers, for instance, has been cut quite dramatically, from nearly 40% in 1981 to just 15% a handful of years ago. The EU27 has now surpassed the US in output. And so now we're looking at the rise of China, for instance. So I think this is probably a good time to talk about that research. And maybe to start with, I could ask you both twin questions; they're hard to separate, but I think they're quite connected.
So the first one, to me, is an obvious one: how can the US maintain its position in global scientific research, even though that position is actually diminishing, in the face of globalization and its impact? You might think that's not the way to describe this, perhaps, but it seems to me that it probably is. Nevertheless, I'm keen to know what you think.
And the second related question is directly related to mainland China, which struck me as an interesting element from some of the content in the report that talks about how it’s presenting a serious challenge to the US dominance in global science. But how does that fit with the view the report takes that mainland China is the United States’ most frequent collaboration partner in technology research? If it’s kind of out-competing the US in this, how does that gel with the thought of all this going forward? Will that continue, I guess, is the actual question in that. So how do you two see that as a kind of starting point for this conversation?
David Pendlebury: Well, I'll be happy to take on the question of share. Of course, share is a zero-sum game. And the United States is a mature country in terms of science, as is Europe, as is Japan. So the growth in publications coming from these research enterprises is much slower than in the new countries that, through globalization, are entering the scientific realm, especially in Asia and especially mainland China. Their rate of production is increasing faster than ours, which means that their shares are increasing and ours are decreasing. But that does not necessarily lead to the conclusion that there is a decline in U.S. research; it's only a smaller portion of the world's output.
Jonathan Adams: I think one of the things that is worth considering is the pace of change. When we go back to the 1980s, we're looking at a very stable world system. The US became the dominant research economy after 1945 with a huge investment. The Russians, the USSR as it was then, were also investing heavily but were slightly more isolated from the world system. So the US played a key, pivotal role, and the links across to Western Europe, the transatlantic research networks, also linking the other way to Japan, were dominant in the system. And that dominance produced a stability that continued pretty much through the 1990s. We started producing reports for the UK government, annual performance indicators, in the early 2000s, and after two or three cycles it was decided that annual reports were pointless because the system changed so slowly, so we went down to biennial reports. So there was an expectation that next year would look pretty much like last year in the system. And as you said, Neville, China emerged from that background, rapidly changed its economy and disrupted that world system.
But at the same time, as you also noted, we saw the emergence of Brazil as the green research power. India was becoming more and more apparent, but we also need to talk about Korea, which transformed its economy, and the Asia-Pacific network, and the contribution of Australia should not be forgotten either. So we're seeing new networks emerging around the world, and that, Neville, refers back to the point you made about the influence of the World Wide Web, because what was happening during the 1990s was the development of international collaboration. That meant that people, instead of going to domestic conferences, were increasingly going to international conferences, because it was cheaper and easier to travel. They were making connections. That meant that people were able to do things that they weren't able to do before, and that helped boost some of the emergent economies. And then, as the internet became a key tool, that further accelerated the process of change; international collaboration shifted from bilateral towards multilateral collaboration, and that further brought on these other economies. And as David said, it's a zero-sum game.
So it's not that the US has necessarily suffered badly in terms of its scientific productivity, so much as that it has been joined on the stage by many other players, and they are all making their contribution now. Looking at China: how did China transform so rapidly? It had a very strong technology economy, but that was a command economy that underpinned its industries and particularly its defence system. When China opened up, post-Mao, it began to transform that economy, and it transformed the research and higher education system as well. And so what was already a very strong research machine now became a publicly accessible research machine. And of course it was strong in those areas of research which underpinned industry, and that meant the technologies, much less so the biomedical areas, in which the US had invested very heavily. So they were in a very strong position immediately to begin to build up that side. And being, like Brazil, like Korea, like India and others, a fresh research economy, they had greenfield sites, as it were, from which to create these new institutions. The US was already heavily invested in the areas that it was working in.
David Pendlebury: Well, I was just going to follow on something that you said, Jonathan, which was the traditional or historical emphasis that China had in the physical sciences in terms of investments in physics, chemistry, and engineering in particular.
And now China is expanding its funding and its research output in the biomedical sciences as well. And making a mark at the top levels in terms of particular institutions and particular topics. The United States has over the years overfunded, I would say, and overemphasized investment in the biological sciences and in medicine. And they’ve done so, the United States has done so of course, because it’s always politically advantageous to fund biomedical research that gets a lot of public support. I think that something that the report brings out, especially in discussing how we are so intertwined in collaboration with China in these important technological fields, that the United States may need to think, and it is thinking, the government leaders are now thinking, about a rebalancing and more investment in physical sciences and technology.
Jonathan Adams: In life sciences research, the UK and the US have done absolutely amazing things in the last few decades. You know, our knowledge of genetics, and now of proteomics, has transformed our understanding of how systems work and revealed enormous potential for biomedical approaches. We would not have been able to respond to the challenge of COVID without the platform that all of that work provided. And there was no anticipation that there would be a huge viral outbreak, but the work was all there that enabled us to understand and then to respond to it. So we don't want to say that that work was in any way unnecessary. But, as David says, it's an issue of balance.
And what happened with the US research economy is that it became somewhat unbalanced. Partly, I think, for political reasons, as David noted: nobody ever got thrown out of office for putting money into healthcare and medicine. But also, I think, it was backing winners to an excessive degree. The huge successes of the work in the fundamental biological sciences and in medicine meant that there was continued growth and investment in those areas, without paying attention to the need for balance across the economy: core production and processes, core industries, remain significant and necessary, and improvements in them are highly desirable.
Neville Hobson: Well, that's actually a very interesting assessment you've both given, which is for me a great jumping-off point to talk about the look-ahead element. I think it's highly relevant in light of what you said: we're seeing a still rapidly evolving landscape, with new actors on the stage, let's say, who are bringing their approaches to scientific research to the party, if I can put it that way. You are the experts here, not me, but it seems clear to me that this is not a situation where the United States is suddenly going to diminish beyond all recognition; the fact of the matter, though, is that it is declining. And so this idea of collaboration in the context of globalization becomes ever more important. That, I think, is really the heart of my question to you both on this big-picture point: where are we going to be in 10 years, or even less, in terms of globalization, collaboration and scientific research overall? If I were answering the question, I'd clearly start with "well, it depends", right? But really, how do you see things? What should we be expecting from scientific research generally in the coming five to 10 years? And put that in the context of political upheaval everywhere you look in the world these days, and the differing objectives parties have. What are we looking at in the coming years?
Jonathan Adams: I think the first thing I'd say is that you can't even begin to think about where you're going to be if you don't know where you are now. And I would say that one of the reasons why the US has bigger challenges to face than it needed to is because it has not paid due attention to properly understanding, examining and unpacking where it stood and where it's been traveling.
So we are able to produce a report like this fairly readily from data that we have in the Web of Science, the publication and citation data. Those are publicly available. They're available to thousands of scientists all over the US, hundreds of thousands of scientists all over the world, and they're available to public policy organizations and others. So the NSF and others could have looked at those data at any time. And they would have recognized, if they had done so and asked the questions, that there had been an emerging dependency of US technology research on international partnerships, not just with China but with others as well. I mean, we don't want to overplay the China role, but China has in some areas, like nanotechnology, communications and innovative materials, been a very, very important partner for the US in its recent development. But those data were always there. And so we're looking ahead now because we've looked at these data.
But I think what we've got to say is: if you want to know where the world's going to go, you must examine where the world is now and where it's coming from, and see that trajectory of change. And the Web of Science data is fantastically powerful in enabling us to get those views ahead. Not just for backward-looking research assessment, which it's often used for, and a lot of people think, oh, well, that's what it's about, but for management purposes, for examining the present state of play, and then, by bringing in other knowledge, which clearly organizations like the NSF have, thinking about what the next steps would be. David, would you agree with that?
David Pendlebury: Yes, I would agree. Ten years is a relatively short period of time. As Jonathan mentioned earlier, things do not change as rapidly in the global science system as in other realms, certainly not like technology. So I think more of the same is the answer in the short term, which means China as a dominant producer of scientific research, the United States second, the European community as a group strongly third, and the continuing emergence of new players. This is not explicitly covered in the report, but one thing I'm concerned about over the next ten years comes back to a point we raised initially in our discussion, and that's trust in the scientific system and in the literature that is being used. The question of research integrity in science has come more and more to the fore in the last year, as we're finding more papers in the literature that are retracted for scientific misconduct. We're finding papers that are made up, fictional papers that find their way into the literature, so-called paper-mill papers, that are purchased and inserted into the system not because of their contribution to the scientific system, but because individual researchers are trying to meet specific performance goals to optimize their appearance and then receive higher pay or promotion or things like that.
So trust in the scientific system is essential, it’s foundational, and yet perverse incentives, new technology, other things are starting to seriously distort the science system. And I think I’m very worried about that in the next 10 years.
Neville Hobson: Okay, there's clearly a focus on trust. I think you've mentioned a couple of things there, David, that we see in mainstream media a lot these days around this whole notion of fakery, misinformation, disinformation and lack of trust; it all boils back down to that again. So trust is almost a USP for any organization that is trustworthy and is regarded as trusted by others. And clearly that's something that you're focused on. I think you're right that the next 10 years isn't a long period in science. Unlike, for instance, automotive, where we talk about phasing out petrol cars within 10 years (is that ever going to happen?), that's pretty fast. Tech, as you said, is moving at a seriously rapid pace; look at artificial intelligence, for instance. And indeed that touches on this whole notion of fakery, does it not, from what we see even in the newspapers? Right. So, in the time we have, what I want to say is that you've both outlined a very interesting landscape that we all inhabit. And from a scientific point of view, I think this report has some significant insights for anyone interested in this general topic. So my concluding point would be to ask both of you: if someone wanted to get hold of a copy of this report, where would they get it from?
Jonathan Adams: Well, it’s available for download on the ISI website, along with a lot of other reports that we’ve produced. And it’s very easy to access. And we welcome people reading these things and giving us feedback as well when they have interesting questions to pose that go beyond what’s in the report.
Neville Hobson: So they can maybe read your blog, put comments in, or perhaps some other method to talk to you online, social networks, et cetera. Anything you, final point you want to mention, David?
David Pendlebury: A search for "global research report US research trends" will pull it up on the Google machine, of course.
Neville Hobson: Excellent. Well, that's great. I'd like to thank you both very much indeed for sharing your knowledge and insight on this big topic that we've touched on; we've literally just scratched the surface, I think. Thank you both very much indeed.
Jonathan Adams: Thank you for your time.
David Pendlebury: Thank you now.
Neville Hobson: So you've been listening to a conversation with Jonathan Adams, Chief Scientist at ISI, and David Pendlebury, Head of Research Analysis at ISI. For information about ISI, the Institute for Scientific Information at Clarivate, and to download a copy of the global research report discussed in this podcast, the best way, as Jonathan mentioned, is to visit clarivate.com/isi. Ideas to Innovation continues with our next episode in a few weeks' time. Visit clarivate.com/podcasts for information. Thanks for listening.
Outro: Ideas to innovation from Clarivate.