
‘Hacking the Status Quo to Pieces’: Stephen Curry on the San Francisco Declaration on Research Assessment at Age 10

October 11, 2022

Almost exactly a decade ago, a group of academic journal editors and publishers gathered during the annual meeting of the American Society for Cell Biology and developed a set of recommendations to combat the use of literature-based metrics, like the journal impact factor, to determine the quality of individual scholars (or scholarship), and to develop alternative and more honest ways of assessing academic impact. 

The result of their discussion was the San Francisco Declaration on Research Assessment, or DORA. The declaration in its original form offered 18 specific recommendations for researchers, research funders, institutions, publishers, and organizations that supply metrics, and was signed by a smattering of all of those at the time. As of today, some 22,168 individuals and organizations in 159 countries have signed DORA, and a body named DORA has grown up to promote the goals of the declaration and “catalyze change” in areas such as research assessment, open science, and human equity. 

Stephen Curry’s research career included working as a structural biologist studying protein-drug interactions and the replication of RNA viruses

To review DORA’s first decade, Social Science Space spoke with the Chair of DORA, Stephen Curry. He is a professor of structural biology at Imperial College London and the school’s assistant provost for equality, diversity and inclusion. As his institutional website notes, “He has keen interests in science policy, particularly in R&D funding, in research evaluation (and the use and misuse of metrics), and in scholarly publication.” 

This interview has been edited for length and clarity. 

Social Science Space: Stephen, could you give us your elevator speech to explain what DORA is? 

Stephen Curry: DORA stands for the Declaration on Research Assessment. It’s words on a page – words on a web page – that ask people to think about research assessment and to avoid misusing journal impact factors as a proxy for the quality of an individual’s research or of individual research contributions.

In the last five years or so, in addition to just being words on a website, we’ve become much more active. We have support from a number of international organizations that allowed us to recruit a very small staff. But with that small staff, we have been much more proactive as an organization in moving the debate along through hosting meetings and turning up at other conferences and collaborating with other organizations, learned societies and funders to develop alternative, more robust approaches to deliver a more holistic approach to research assessment. 

 
S3: You’ve somewhat prefigured what was going to be my follow-up, which is why do we need an organization? Needing DORA as an idea is one thing, while DORA as an entity is another. 

Stephen Curry: It’s largely because old habits die hard.  

The critiques of the journal impact factor predate the arrival of DORA. And even after 10 years of work, we still see that. Our reliance on aggregate measures is still embedded in many parts of the academy and the research landscape. Many still think that it is a useful shorthand and that’s a difficult concept to tackle because it seems to make sense and it works according to the law of averages. A journal impact factor is an indicator for a particular journal and gives you an idea about the average number of citations that papers in that journal accrue, and therefore it’s seen as a measure that can be applied to anybody or any paper that’s published in that journal. But the law of averages doesn’t work well when you get down to individual people and individual papers. The citation performance of papers in any one journal, whatever its impact factor, varies over two or three orders of magnitude.  

Our appeal is to get people to think that if you are assessing research or a researcher, then you really need to be assessing the particular contribution and the particular individual. Aggregate measures are the problem, and the impact factor is one of them; the h-index is another problematic aggregate. It boils down a lifetime’s work to one number. It’s a terribly crude measure, but it has still got traction.  

Academics and scholars are all incredibly busy, and so people naturally look for shortcuts. If you’re not going to use the impact factor, what else shall we do? There is a tremendous demand for the development of alternative ideas, and DORA has pivoted to be much more proactive, to develop and promote the tools.  

We’re not the only game in town. We’re very happy to collaborate with other organizations. We’ve done a lot of work with various different funders and learned societies to promote and develop the use of narrative CVs, which provide a more structured and concise summary of somebody’s contributions and allow one to map the questions to the mission of the institution. We are well aware that there are many other organizations around the world who have picked up on this agenda. There’s a big initiative launched just this year in Europe, which is about building a coalition of universities and other organizations to really dig into the practical aspects of reforming their research assessment practices. There is widespread acceptance that this is work that really needs to be done by the academy. Many people have picked up the baton or the challenge that DORA has thrown down. 

S3: In the 10 years since the declaration was published, what do you feel is the greatest success? Both the greatest success in this ecosystem, and the greatest success that DORA has its fingerprints on?

Stephen Curry: Establishing precise causality is [difficult]. As I say, for the first five years, it was a pretty shoestring operation. There were no dedicated staff. It relied on a bit of volunteer effort from members of the staff at the American Society for Cell Biology, which was one of the progenitor organizations. But in the last five years, we now have two staff and two half-time interns as our full complement. I think we’ve really put the conversation on the global stage.  

The declaration originated in San Francisco in North America, and very much had North American and European fingerprints on it. But one of the things we’ve done in the last five years is to ensure that we are capturing the global conversation around the need to reform research assessment, because science and scholarship are international. There’s no point changing the assessment rules in only one part of the world, because they pertain everywhere else as well. Our steering committee now, if you look on our website, has truly global representation. 

I think you do see DORA’s fingerprints more and more around the world. It has been very influential in the UK; the Wellcome Trust, which is a huge medical funder, now has a research policy whereby you can’t even apply for funding from them unless you have signed and implemented the DORA principles or adopted practices that are consistent with those principles. The European initiative that I just talked about very much references DORA as the kind of leading light that it’s inspired by. And there was a very influential report last year from UNESCO, which clearly has an international perspective, on their recommendations for open scholarship and open science. There’s widespread recognition that you can’t really push the open science agenda without at the same time addressing the research assessment agenda.  

I think we have established ourselves as an authoritative and thoughtful body that people look to and hopefully draw inspiration from. 

S3: What has been the great white whale in that 10 years? What’s been tantalizing just out of reach? 

Stephen Curry: We know that many organizations have signed DORA and have done a lot of work internally to try and translate that into changes in policy and practice. We would like to see more evidence of that actually happening, so that signing DORA isn’t just performative and for good PR. I think we’re maybe vulnerable to that accusation because we don’t police the declaration. We cannot — we are a very small organization. We are relying on building a community of practice that has a shared goal. We will be looking to put more of an emphasis on surfacing the success stories, the departments and the universities that have signed or that have changed the way that they promote people and recruit people and which have created healthier and more vibrant departments as a result.  

For example, Sandra Schmid, one of the original authors on the declaration, was at the University of Texas Southwestern Medical Center. When she was head of department, she pioneered a completely different approach to hiring that was much more collaborative between the department and the applicants.  

But we would like more documentary evidence because I’m a great believer that nothing succeeds like success. If other universities see good things happening when you adopt a more responsible approach to research assessment, then everybody will jump on board. At the minute we’re still trying to hack the status quo to pieces. But it’s a tough old bird. 

S3: You mentioned what you expect from signers. Our parent, SAGE Publishing, just signed DORA. So I’m asking you, Stephen, what has SAGE committed to, in a good faith and not a performative sense, as a publisher? 

Stephen Curry: So obviously publishers are important players in this scholarly landscape and for many publishers, the impact factor is a useful marketing tool. And for them, I think it’s a legitimate marketing tool. What I would like to see is publishers demonstrating that they understand that the impact factor is a crude measure, even for the journal, and to display the citation distributions on which their impact factor is determined. Clarivate actually provides that to any journal that wants it.  

And to provide more — and maybe you already do this — indicators of journal performance: How many referees do you get per paper? How long does it take you to get it turned around in terms of getting a decision made? If it’s scientific, what sort of checks and balances do you have on statistical analysis or checking images for malpractice? And have messages on your website: “We use the impact factor as a marketing tool. We’re hoping to attract the brightest and the best authors to publish in our journal. But we’re signatories of DORA and firmly believe that the impact factor should not be used [beyond that].” So you’re going to help transmit the message to your communities and authors. 

I’ll be interested to look in six or 12 months’ time at SAGE’s website to see what it actually means. Hopefully there’s not just a badge – “We signed DORA” — and that’s the only information that’s there. Hopefully, your commitment to DORA means that you actually want to help transmit and enable the message that we are trying to get out there. 

S3: So SAGE, as an example, signed a decade in. Is that still a victory? 

Stephen Curry: I wasn’t involved in DORA right at the beginning in 2012, but I knew some of the principles and I was writing about open access and research assessment at the time, and so I got wind of it. As an individual, I was one of the very first people to sign. And when I would go around and talk about open access and research assessment in those early years, I would always ask, “Who’s heard of DORA?” And it would be 1 percent of the audience max. 

That’s partly why a group of us got together to try and raise some funds and get DORA more active. It’s much better known now, but I’m sure it’s still not known in all corners of the globe or all corners of academia. I think there is a view that change is glacially slow, and that’s certainly been the case for the open access movement, for example.  

I’m very pleased that SAGE, yet another publisher, has signed on and is committing itself to its obligations under the declaration. So yes, I see it as a victory. Would it have been nice if you’d signed five years before? Well, yes, it would. But you know, it is what it is.  

We’ve seen that quite a fraction of the major players in scholarly publishing have now signed DORA. I hope it’s not just that they want the badge and want to wear the cloak, as it were, but that they are cognizant of the problem that we’re trying to address. And certainly, the people that I’ve talked to at Elsevier, Springer Nature and whatnot are very well aware of the dangers that have accrued to this.  

You know, this isn’t a problem that the publishers created; it wasn’t even publishers who created the impact factor. It was Eugene Garfield, who was interested in information science. It’s just that over the years it has taken on a life that it was never meant to have. And so now we have to hopefully put the patient to sleep as quickly and as effectively as we can. 

S3: You are from the biomedical world, as is DORA itself. But Social Science Space represents social science. How does DORA affect social and behavioral science differently from the way that it might affect natural, physical or medical sciences? Or is that an invalid concern? 

Stephen Curry: You’re absolutely right. DORA came from the cell and molecular biology field. And I have to say that, in the life sciences, in the biomedical sciences, we are the worst offenders when it comes to misusing the impact factor to try to quantify people’s achievement. Yet I think many perverse incentives are still at play across the scholarly landscape. Publishing decisions are used as a shorthand to indicate quality. So having talked to people in other disciplines, we do want to make DORA relevant across all scholarly disciplines. 

But if SAGE, as a social science publisher, has thought, “OK, we need to sign DORA,” then presumably SAGE has identified that there is an issue that does need to be addressed.  


I don’t care who signs DORA. What I care about is what people are doing to reform research assessment and to solve the problem that DORA is trying to solve.

— Stephen Curry

S3: One thing SAGE has done is promote 10-year impact, which is still literature based, of course, but which we feel reflects the less-frantic pace of idea infusion in social and behavioral science. [SAGE honors the most cited papers of a rolling 10-year period; it does not create a 10-year impact factor measure akin to the 5-year impact factor.]

Stephen Curry: Well, I mean, that’s another interesting number to throw into the mix. But I would again encourage you to publish the distribution on which the 10-year impact factor is based. 

Some of my own papers are still being cited healthily even after 10 years. And the impact factor is skewed, I guess, in favor of the articles that grab a lot of attention for two years and then maybe die away. And so that information is not captured. So once again, it’s important that researchers and scholars are given the opportunity, when they’re applying for promotions or jobs, to identify “this is my best work and this is why,” and hopefully they have a narrative about changing thinking in a field, or introducing a new technique, or whatever. But they shouldn’t just say it’s important because it was published in a journal with an impact factor of 20 or whatever. You see that in cover letters and CVs: “I published in these journals. Therefore, I’m brilliant.” If you’ve published in very well-known journals, that’s an interesting piece of information, and I don’t imagine you’d want to discard a candidate like that. But I would like to hear what it was that you actually did that caught the interest of these journals, and whether the citation performance of the work supports the decision of the editors and the peer reviewers that it really was top notch. 

You know, most papers in Nature get many fewer citations than Nature‘s impact factor, and that’s true of every single journal. It’s a mean of a skewed distribution, so it’s an inappropriate mathematical tool. 

S3: DORA is doing more than addressing the impact factor. Tell me a little bit about TARA. 

Stephen Curry: It’s yet another acronym! We’ve kind of envisaged TARA as being DORA’s sister. TARA is Tools to Advance Research Assessment. This is a project funded by the Arcadia Fund, which is a charitable organization.  

There are a number of different elements to the project. One is that we haven’t had big numbers of institutions sign up to DORA from the United States. So we are doing some survey and interview work to try and pick apart the reasons. The culture of science funding and research policy in the United States is quite different from what one sees in Europe, where there’s more centralization, a bit more direction. We wanted to understand that a bit more. So that’s one part of it.  

And then the other is we are compiling a data dashboard to identify documentary evidence from institutions around the world about the changes in policies that they’ve introduced. 

We started gathering that data and organizing it and mapping it. Early next year, we should be able to launch the first iteration of the dashboard, and this will show where change is happening around the world. We hope it’ll be a resource, not just an inspiration showing people that things are moving. We will keep the dashboard updated and hopefully we’ll show progress, increasing the number of institutions in different countries that are reforming their research assessment practices and maybe introducing new ideas. That will then obviously be a resource for the wider community about where they should go and look if they want to copy some good practice. 

We already do this to some degree on our website, but we’re hoping that this will be a kind of worldwide resource. I’m hoping once we launch it and get a bit of buzz behind it, institutions will be saying to themselves, “Oh, why aren’t we on the TARA dashboard?” And if they’ve signed DORA but they’re not on the dashboard, it’s because they don’t have a public document that shows what they’ve actually done. It’s kind of a multipurpose mechanism that we hope will stimulate and enable the pace of change. 

S3: You mentioned open access earlier as being an allied concern for DORA. Are you heartened by trends in open access, such as the recent memo from the United States’ Office of Science and Technology Policy mandating free access to publicly funded research?  

Stephen Curry: Yes, yes, I am. I was aware that a huge and quite long lobbying operation had gone on there (and probably a counter-lobbying operation from some quarters, as well). I got into this whole debate around research assessment because I was initially interested in the debates around open access, and there’s a very obvious link between the two. People are judged very often on the journals that they publish in, and that meant it was very difficult for new open-access publishers or journals or titles to establish themselves in the market. If we move to a world where you are judged on what you do and not on where you publish it, then that creates space to stimulate the drive for open access. 

Within DORA itself, some of our organizational supporters and funders are publishers themselves, and so we have a lively debate about open access and the intersection. Clearly there are different challenges raised by open access for different parts of the world and for different disciplines. We know that STEM is relatively well funded but the humanities and social sciences, on average, are less well funded than those other disciplines, so the publication costs do become an issue. 

And then there’s this whole issue of the APC [Article Processing Charge] model. I think there are increasing questions around whether that really is the best business model for the long-term sustainability of the endeavor.  

At the same time, there’s a kind of third wheel here: equity within the academy. This is opened up both by thinking about research assessment, because research assessment is very often perturbed by biases that operate against women, against researchers from ethnic minorities, against disabled researchers, and by the very idea of open science, which is about open publishing. This very naturally raises questions about who the academy is open to. Who within the academy has the voice and the authority to say what research questions we should be addressing?  

As I just mentioned, the debate on open access has raised, again, questions about the disparity between the Global North and the Global South. It’s great for the Global South if all the research is free to access, but it’s not great for them if they can’t then publish because they can’t afford to pay the APCs. That economic inequity preexisted open access and it applies equally to the subscription model, but the open access debate has stirred it up. We do try and look at the system holistically, and a reform of research assessment has to include reforms that address the structural inequities that we know operate within the academy. 

 
S3: You’re the DEI Provost at Imperial? 

Stephen Curry: EDI is the way we order it: equality, diversity and inclusion. So yes, that’s my main responsibility at Imperial College. 

S3: But you come to this terrain naturally? 

Stephen Curry: Yes. I’ve had that position [assistant provost for equality, diversity and inclusion] for five years now, but my interest in it arose from a burgeoning interest, predating that, in thinking about research assessment, open access, all the meta stuff around being in the academy and how our culture operates. I still have a great belief in the idea of the university as an important civic institution, but I see that my own university — I’ve been there a very long time — is predominantly very white and very male, and that is because of historical exclusions that operate within our own institution but of course also arise from the way society operates. Most scholars in my view, whatever their field, choose a life in scholarship because they are fascinated by the world and they want to help other people understand it better and hopefully also make it a better place. And so I think there is a strong sense of natural justice that goes along with the academic calling, if I can put it in such highfalutin terms. 

But many people readily see that that idea of natural justice doesn’t play out in reality in many places, for very complicated reasons, because of many different interacting factors — which include research assessment, thinking about how open science should be, and who should be in the academy. 

S3: I wonder if there’s something about either DORA or some of the things that we’ve talked about that you feel should be acknowledged or mentioned that we haven’t addressed? 

Stephen Curry: I’ve always said, even before I was chair of DORA, I don’t care who signs DORA. What I care about is what people are doing to reform research assessment and to solve the problem that DORA is trying to solve. And so if you do it by your own lights and through your own pathway without mentioning DORA at all or by signing the declaration, that’s perfectly fine by me. 

And better still, if you come up with a better idea on how to do it, then do please tell the world and DORA will pick that message up and we’ll hopefully transmit it to other people. I don’t care who does the solving of the problem – what I care about is that the problem gets solved. 

Social Science Space editor Michael Todd is a long-time newspaper editor and reporter whose beats included the U.S. military, primary and secondary education, government, and business. He entered the magazine world in 2006 as the managing editor of Hispanic Business. He joined the Miller-McCune Center for Research, Media and Public Policy and its magazine Miller-McCune (renamed Pacific Standard in 2012), where he served as web editor and later as senior staff writer focusing on covering the environmental and social sciences. During his time with the Miller-McCune Center, he regularly participated in media training courses for scientists in collaboration with the Communication Partnership for Science and the Sea (COMPASS), Stanford’s Aldo Leopold Leadership Institute, and individual research institutions.

