After Plan S, Here's Plan U: Funders Should Require All Research To Be Posted First As A Preprint
from the instant-open-access? dept
Preprints are emerging as a way to get research out to everyone free of charge, without needing to pay page charges to appear in a traditional open access title. Their growing popularity is due in part to research showing that the published versions of papers in costly academic titles add almost nothing to the freely available preprints they are based on. People are now starting to think about ways to put preprints at the heart of academic publishing and research. In the wake of the EU's "Plan S" to make more research available as open access, there is now a proposal for "Plan U":
If all research funders required their grantees to post their manuscripts first on preprint servers -- an approach we refer to as "Plan U" -- the widespread desire to provide immediate free access to the world's scientific output would be achieved with minimal effort and expense. As noted above, mathematicians, physicists and computer scientists have been relying on arXiv as their primary means of communication for decades. The biomedical sciences were slower to adopt preprinting, but bioRxiv is undergoing exponential growth and several million readers access articles on bioRxiv every month. Depositing preprints is thus increasingly common among scientists, and mandating it would simply accelerate adoption of a process many predict will become universal in the near future.
There is a precedent for mandating preprint deposition: since 2017, the Chan Zuckerberg Initiative (CZI) has mandated that all grantees deposit preprints prior to or at submission for formal publication. This requirement has been accepted by CZI-funded investigators, many of whom were already routinely depositing manuscripts on bioRxiv.
The proposal goes on to consider some of the practical issues involved, such as how it would fit with peer review, and what the requirements for preprint servers might be, as well as deeper questions about guaranteed long-term preservation strategies -- a crucial issue that is often overlooked. The Plan U proposal concludes:
because it sidesteps the complexities and uncertainties of attempting to manipulate the economics of a $10B/year industry, Plan U could literally be mandated by funders tomorrow with minimal expense, achieving immediate free access to research and the significant benefits to the academic community and public this entails. Funders and other stakeholders could then focus their investment and innovation energies on the critical task of building and supporting robust and effective systems of peer review and research evaluation.
Those are all attractive features of the Plan U idea, although Egon Willighagen has rightly pointed out that using the right license for the preprints is an important issue. At the time of writing, the Plan U Web site is rather minimalist. It currently consists of just one page; there are no links to who wrote the proposal, what the future plans might be, or how to get involved. I asked around on Twitter, and it seems that three well-known figures in the open science world -- Michael Eisen, John Inglis, and Richard Sever -- are the people behind it. Eisen has been one of the leading figures in the open access world since its earliest days, while Inglis and Sever are co-founders of the increasingly popular bioRxiv preprint server, which serves the biology community. That augurs well for the idea, but it would still be good to have the details fleshed out on a more informative Web site -- something that Sever told Techdirt will be coming in due course.
Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+
Filed Under: academic research, open access, plan s, plan u, preprint, replication crisis, research, studies
Reader Comments
The thing with peer review is that it is mostly not verification anyway -- that is left for replication. I don't see why potential reviewers, who are not compensated as it is, could not leave review comments on a preprint server.
Almost anyone truly interested in a paper is going to be doing their own mental review anyway. Idiots who want to "cite" papers for bullshit idiot reasons are going to do it regardless of peer review status, the quality of the research, or any of the facts involved. Those sorts will make expansive claims, or cite things that don't say what they claim they say, regardless.
Broad mirroring please
Yes, the licenses are always important. It would be very unfortunate if an Elsevier could set fire to the bioRxiv server and destroy it all, as happened thousands of years ago when the feared Elsevirus conquered Alexandria.
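In practical terms, broad mirroring could start with nothing more than periodically harvesting a preprint server's public metadata feed. Here is a minimal sketch in Python, assuming a cursor-paged endpoint shaped like bioRxiv's api.biorxiv.org details service (the exact URL layout, paging behavior, and response fields are assumptions, not a definitive client):

```python
# Minimal mirroring sketch: harvest preprint metadata for a date range
# and append it to a local JSON Lines file, one record per line.
# Assumes a bioRxiv-style endpoint: /details/<server>/<start>/<end>/<cursor>
# returning {"collection": [ {...preprint metadata...}, ... ]}.
import json
import urllib.request

BASE = "https://api.biorxiv.org/details/biorxiv"  # assumed endpoint layout


def fetch_page(start, end, cursor):
    """Fetch one page of preprint metadata for the given date interval."""
    url = f"{BASE}/{start}/{end}/{cursor}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def mirror(start, end, out_path):
    """Walk the cursor-paged results, writing each record to out_path."""
    cursor = 0
    with open(out_path, "a", encoding="utf-8") as out:
        while True:
            records = fetch_page(start, end, cursor).get("collection", [])
            if not records:
                break  # no more pages for this interval
            for rec in records:
                out.write(json.dumps(rec) + "\n")
            cursor += len(records)  # pages are offset-based blocks


if __name__ == "__main__":
    mirror("2019-06-01", "2019-06-07", "biorxiv_mirror.jsonl")
```

A mirror built this way only captures metadata, of course; mirroring the full text is only safe when the license on each preprint explicitly allows redistribution, which is exactly why the licensing question matters.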
Techdirt readers might like this new article, which takes a more thorough look at the potential of preprints: https://link.growkudos.com/1qbg6yps9hc
Re:
That raises the question of what "pre"print will come to mean. If nobody is ever going to see the "final" version, there's little reason to make changes during that process, which leaves little reason for the process to exist. Kind of like certain ISO standards, where everyone works from the latest draft version because the "real" version is locked up. (I'm told my company has actually purchased the C language standard, but I'm not going to waste time tracking that down when I can bring up a draft instantly.)
I'm waiting for the next 4 plans that are all aimed at undermining Elsevier and making science free.
Plan C, Plan K, Plan I, Plan T
Re: Re:
Not really. Final versions are generally edited for format and grammar, not for results. Bad or faked research makes its way into journals anyway. The only things journals add are cost and a cap on the number of papers published -- a purely practical limitation of physical paper journals -- plus editorial taste as to what they think is important or interesting this month or quarter.
None of this stops "final" versions (whatever that means in the scientific realm) from being published as well, if people want to keep paying ridiculous sums for access to esteemed journals for whatever added value they expect to get from them.
Re:
Seems like it might be an interesting way to track citation impact. New to me.
License: CC0 - Public Domain
Simple: require that all preprints be public domain. Simplest and easiest!