In the face of this crisis, we need research to be shared faster.
If you were a medical researcher studying infectious disease three months ago and you had an idea for a project, getting that project funded, off the ground, in the lab, and into a journal would have taken you many, many months. Chances are, you wouldn’t see your work in a peer-reviewed publication until this summer, or even a year from now.
The coronavirus pandemic has changed all that.
One of the more uplifting developments of the bleak past several weeks has been witnessing science rise to the occasion in the face of coronavirus. As the virus has spread across the globe, scientific research has sped up to keep pace with it. The urgency of coronavirus has jolted scientific research, normally a sclerotic process. Studies that once took months to execute and get to the public now take weeks, even days. In the process, we’ve been given a glimpse of what science might look like after the pandemic.
How is the new, faster science manifesting itself? Use of preprint servers — where scientists post research that has not yet been peer reviewed — has spiked dramatically. Views and downloads are both up more than a hundredfold on medRxiv, a preprint server for medical papers. The number of new papers uploaded is up at least fivefold as well.
Journals, too, are seeing an unprecedented surge in submissions. The New England Journal of Medicine receives 110 to 150 coronavirus submissions a day, a spokesperson told me, and on occasion has seen more than 200 in a day.
Many journals have revamped their processes to get those papers peer reviewed and published at a vastly expedited pace. “A process that can take weeks has been condensed to 48 hours or less in many cases,” Jennifer Zeis, a director of communications and media relations at the NEJM, told me. One preprint posted to bioRxiv in April looked at 14 journals and found that turnaround times had, on average, been halved.
Meanwhile, the National Institutes of Health and private actors have accelerated the grant-making process for coronavirus research, in an effort to get researchers the funding they need to study treatments, vaccines, hospital care, transmission, and testing.
All of this is great news. Acting faster on the coronavirus could save hundreds of thousands of lives. Faster turnaround means that scientists learn more quickly which projects are promising, policymakers get key updates faster, and treatments that can help patients make it into hospitals sooner.
There are, of course, complications. Quality control could end up being a casualty, for one. Preprint servers have long been part of the scientific process, but they’re getting vastly more use and vastly more media attention. The newer, faster pace could mean that badly flawed preprints get widely shared and covered in the media, fueling the spread of misinformation and forcing other scientists to waste valuable time by publicly debunking papers that would ordinarily be rejected in the peer review process.
But those harms are more than possible to mitigate, and the benefits of a faster scientific process are enormous. Developing and rigorously evaluating medications and treatments faster doesn’t just save lives during a pandemic — it saves lives all the time. We ought to be thinking about which components of our new warp-speed scientific ecosystem we can keep for good.
How the research process works
To understand just how much the process of biomedical research has changed in the span of several weeks, it’s important to know how research worked in the Before Times.
Before the coronavirus crisis, it could take half a year to write a grant application and months more to learn whether you got the grant. Once you conducted your research, you would usually write it up as an article to submit to a journal.
You might submit a draft to a preprint server such as bioRxiv or medRxiv. But before the coronavirus crisis, many researchers preferred not to, and some journals (including a majority of biology journals, at least as of 2017) have policies prohibiting submissions of articles that have already been publicly posted elsewhere.
If you did submit your draft to a preprint server, it probably wouldn’t be widely read (though that has been changing in recent years, as posting to a preprint server has become more common, some grant funders require it, and journals have increasingly accepted such submissions). But preprints do get ideas out faster, keep them from being locked behind paywalls, and allow for feedback and collaboration. Even before the coronavirus hit, they were a growing part of where science happens, and media coverage of preprints has been getting more common, too.
When you submit to a journal, your paper is evaluated to see if it has enough promise to kick off the peer review process. “Manuscripts rejected at this stage are insufficiently original, have serious conceptual and/or methodological flaws, have poor grammar or English language, or are outside the aims and scope of the journal,” Social Science & Medicine explains in its submission guide. Rejection at this stage is called a “desk rejection.”
Papers that pass that standard get sent out to several other scientists in the field for peer review. This process usually takes months. Social Science & Medicine says it’ll typically be “within 80 days,” and that’s better than average — one review of thousands of paper submissions found that the average “first response time” across many journals is 13 weeks.
Then, if the article is accepted, Social Science & Medicine states “it currently takes 1 week to get a citable, uncorrected draft of the article online, another 4-5 weeks to get the final corrected article online, and a few weeks later this is compiled into an online volume and issue. The print copy follows 2-3 weeks later.” Often, of course, an article is not accepted as is but is sent back with suggested revisions, resetting the clock.
To give you a sense of how long that is, imagine that you submitted an article about the coronavirus to Social Science & Medicine in mid-January, when China acknowledged that there was person-to-person transmission of the virus, and it went through the normal peer review process. Assuming that the article was accepted without revisions, you’d probably hear that your article was accepted in early April. There’d be a citable, uncorrected draft online a week later, in mid-April. The final corrected article would be online four or five weeks later, so you’d still be waiting now for that to happen! The print copy with your article would go out in July.
Even before the coronavirus, people were raising concerns about this process. “In an era known for the great speed and availability of information — where we could choose to blog our results rather than submit them to journals — publishing papers seems slower and more painful than ever before,” Vivian Siegel, the editor in chief of Cell, argued back in 2008.
“The scientific peer review process is one of the weakest links in the process of scientific knowledge production,” researchers Janine Huisman and Jeroen Smits argued in Scientometrics in 2017. “While the actual time it takes to write a referee report may vary between a few hours and a day, reviewers tend to take several weeks to several months to submit their reports.”
That means that peer review takes months not because there are months of work to do — there’s about a day of work, and no one gets around to it for months. That should be unacceptable even outside an emergency.
How coronavirus is changing things for the better
The coronavirus crisis has pushed scientific research to change the way it does things.
To be clear, this isn’t a first — it’s standard for journals to work a bit differently in a crisis. The peer-review process being used for SARS-CoV-2 (the novel coronavirus) was also used when SARS and Ebola erupted in the last couple of decades, a spokesperson for the NEJM told me. Those experiences tell us that peer review doesn’t need to take months — it can happen faster just by virtue of having a list of peer reviewers willing to take an immediate look at the papers they’re sent.
From the journals’ perspective, it’s a shift that shouldn’t compromise their standards. “We keep our standards as high for breaking stories as we do for anything we publish,” Zeis told me. “That means that the research articles we publish are reviewed and go through our careful editing procedure.” And yet “everything is expedited tremendously” — faster review by editors, faster responses by peer reviewers, faster work by the “manuscript editors, illustrators, proofreaders, and production staff.”
This paper in Science, describing a key protein in the coronavirus that will be targeted in developing treatments and vaccines, was reportedly published nine days after it was submitted. “It’s the same process going extremely fast,” Holden Thorp, the journal’s editor in chief, told the New York Times.
That’s one way that science is happening faster. But it’s not the only way. In addition to getting responses for journals faster, more and more scientists are using preprint servers to share their research before it is peer reviewed. Preprint servers already existed before the pandemic, but they have been used much more, and there’s a feedback loop: Scientists are now more likely to expect useful feedback and engagement, so they are more likely to post on these servers and to engage with other articles.
On the funding side, too, there has been an effort to speed up the pace at which science happens. Grant-writing, just like scientific publishing, is full of frustrating delays and wasted time. It takes months to prepare a grant request, and can take months for the grants to be reviewed and approved or rejected.
Nonprofit programs like Fast Grants, which I’ve written about, are trying to fix this by offering a one-hour application process and a 48-hour turnaround. “It was very much rigorous peer review, it was just accelerated,” Stanford biochemistry professor Silvana Konermann, who led the grant review process, told me.
This sort of approach — giving out lots of money, very fast, with a very streamlined process for understanding what makes a grant opportunity valuable — is called rapid-response grant making. It can be a great way to put money in the hands of those who need it fast and without bureaucracy.
Skeptics of rapid-response grant-making argue that cutting down the approval process typically means that reviewers are forced to rely on vague signals of research quality instead of deeply digging into the relevant medicine and evaluating projects on their merits. They might, for example, approve all applications from prestigious researchers or universities, excluding important research done by a less-established researcher.
There’s some merit to this criticism, but it misses an important point: Standard grant making also has this problem, despite the months-long delays in the process at various points. In fact, studies show that above some threshold of grant quality, there is virtually no agreement among reviewers about which projects are the best ones.
There’s also almost no correlation between how projects “scored” and how often the research that resulted from the projects was cited (an imperfect measure of how influential the research was, but still an indicator that grant evaluators can’t predict which research will ultimately advance science most). That suggests that lots of review time is effectively wasted.
That said, the grant process has so far been slower than the publication process to adapt to the crisis. Fast Grants is not the norm, and many researchers doing critical coronavirus research are still waiting on funds. We gain the most from fast science when every step of the process — grants, approvals to conduct the research, peer review, and publication — is sped up. Changes are happening on all of those fronts, but we are not yet systematically supporting researchers in getting their work done and their results published as quickly as possible.
When faster doesn’t mean better
To be clear, making science go faster won’t be a perfect process. Some journal articles are wrong, even during periods when there’s less scientific uncertainty and no rush. Now, being wrong is a normal and healthy part of the scientific process. But right now, journals are under more pressure than ever to get things right. “We have a responsibility to publish reliable information quickly for a public health emergency of this magnitude,” Zeis told me.
“We feel very much that we are publishing research that is literally day by day guiding the national and global response to this virus. And that is both daunting and full of considerable responsibility, because if we make a mistake in judgment about what we publish, that could have a dangerous impact on the course of the pandemic,” Richard Horton, the editor in chief of the British medical journal The Lancet, told the New York Times. With the stakes higher than ever, making sure peer-reviewed papers are right while moving the process along at warp speed is hugely challenging.
The challenges are even bigger with preprints. To be clear, many peer-reviewed papers turn out to be wrong — the peer review process doesn’t catch all errors, and it sometimes misses big and serious ones. The fact that a paper has been published doesn’t make it reliable. But preprints are, of course, even likelier to have serious flaws, including ones that would have been fixed during peer review or would have caused the paper to be rejected. The majority of preprints do go on to become published papers, often with no or minimal changes, but a substantial percentage (between 25 and 50 percent, depending on the preprint archive and the time window studied) don’t, often because of serious issues.
Take an early February paper that argued that the similarities between the genome of the novel coronavirus and the genome of HIV suggested that the virus had been genetically engineered. Researchers quickly debunked it, but not before the conspiracy theories had already taken off.
Or take an April serology study in Santa Clara County, California, that claimed to find that 2.5 percent or more of the population had already been infected — and that the infection fatality rate was therefore much, much lower than previously estimated. The preprint was widely covered in the media. Many researchers raised methodological concerns, and some pointed out math errors, but the initial statistics had already spread widely. The media coverage might have spurred the corrections from peers, but plenty of people came away with misapprehensions about widespread immunity in the meantime.
The same team that published the Santa Clara serology study also conducted a study in Los Angeles. From the Los Angeles study, they initially published less than a preprint — just a press release about their results. That, too, was widely covered, often in a fashion that obscured that no study had yet been published.
Just last week, researchers condemned an LA Times article based on a preprint about how the coronavirus had “mutated” to become more transmissible, arguing that the mutation is actually fairly likely to be nonfunctional (as nearly all mutations are) and that the paper ignored better hypotheses about the spread of the variant virus.
The problem in all these cases wasn’t necessarily that the studies turned out to be wrong, or that they were published as preprints. Flawed papers were being published before the pandemic, as were preprints. The difference this time is that, because of the circumstances, preprints with eye-grabbing results about the coronavirus get amplified, when in an earlier time they would have been ignored or discussed only by scientists.
Preprint servers are scrambling to add more of a review process to prevent these episodes, but of course any kind of review process complicates their mission of letting scientists share a ‘first draft’ without onerous review.
There’s a potential fix here: The media should be very thoughtful about how to cover preprints. Journalism is an essential piece of the scientific process, but science communications should be cautious.
One piece of advice that scientists have given to reporters is to be sure, when writing about a preprint, to talk to several unaffiliated scientists about their impression of the research, effectively getting an unofficial “peer review” of the research.
Other critics advise a much stronger measure for researchers themselves: Don’t put speculative conclusions in your preprints. Sharing data is almost always valuable to other researchers, while conclusions are more likely to be widely spread and misinterpreted, and should arguably wait until there’s been consultation with other researchers in the field.
For that reason, preprint servers like bioRxiv have started screening out papers that make claims based on computational models (rather than experiments in the real world). If we went down that route, preprints would be commonplace for some kinds of research while others that are too speculative would have to wait for peer review.
How to keep fast science for good
Certainly, there’s a lot of room for improvement as researchers, media outlets, and individuals figure out how to engage with a new, faster-paced science that relies more on preprints. But the fact that there’s room for improvement shouldn’t obscure how much good a faster scientific process is doing.
Faster publication of virus genomes has allowed researchers to build on each other’s work. Faster publication of clinical trial results has helped us better understand how to treat the disease. Research has been used to inform public health recommendations, like allowing states to reopen outdoor facilities first in light of evidence that outdoor transmission is rare, and encouraging face masks in light of evidence of asymptomatic transmission.
Making science happen faster has saved a lot of lives. And even when the crisis is over, making science happen faster will save lives — by speeding up research into cancer treatments, air pollution, climate change, malaria vaccines, and more. The crisis has brought to the forefront the critical role that scientists play in making our world a better place, but that role is by no means unique to the crisis.
Which should leave us asking: How much of this faster scientific process can we keep after the crisis? Can some grants continue to be made available with a short application process and extremely fast approval process? Can researchers stay in the habit of posting and engaging with preprints? Can papers stick with streamlined processes, so that it doesn’t take months to get a paper published?
The answer is almost certainly yes. It’ll be challenging — it’ll require changing how publishing works and doing more to combat misinformation as more and more research bypasses traditional peer review channels. New publishing models (like preregistration of studies and pre-acceptance by journals based on those preregistrations, or “overlay journals” built on open peer review at preprint servers) will likely be part of the solution.
But now that scientific researchers have seen what they’re capable of, we shouldn’t just accept returning to a “normal” that was slowing down essential progress.