Elsevier Will Monitor Open Science In EU Using Measurement System That Favors Its Own Titles
from the conflict-of-interest?-I've-heard-of-it dept
Back in April, we wrote about a curious decision to give the widely hated publisher Elsevier the job of monitoring open science in the EU. That would include open access too, an area where the company has major investments. The fact that the European Commission seemed untroubled by that clear conflict of interest stunned supporters of open access. Now one of them -- the paleontologist Jon Tennant -- is calling on the European Commission to remove Elsevier and to find another company with no conflicts of interest. As Tennant writes in the Guardian:
How is it reasonable for a multi-billion dollar publishing corporation to not only produce metrics that evaluate publishing impact [of scientific articles], but also to use them to monitor Open Science and help to define its future direction? Elsevier will be providing data through the monitor that will be used to help facilitate future policy making in the EU that it inevitably will benefit from. That's like having McDonald's monitor the eating habits of a nation and then using that to guide policy decisions.
Elsevier responded with a blog post challenging what it calls "misinformation" in Tennant's article:
We are one of the leading open access publishers, and we make more articles openly available than any other publisher. We make freely available open science products and services we have developed and acquired to enable scientists to collaborate, post their early findings, store their data and showcase their output.
It added:
We have co-developed CiteScore and Snowball Metrics with the research community -- all of which are open, transparent, and free indicators.
CiteScore may be "open, transparent, and free", but Tennant writes:
Consider Elsevier's CiteScore metric, a measure of the apparent impact of journals that competes with the impact factor based on citation data from Scopus. An independent analysis showed that titles owned by Springer Nature, perhaps Elsevier’s biggest competitor, scored 40% lower and Elsevier titles 25% higher when using CiteScore rather than previous journal impact factors.
In other words, one of the core metrics that Elsevier will be applying as part of the Open Science Monitor appears to show bias in favor of Elsevier's own titles. One result of that bias could be that when the Open Science Monitor publishes results based on Elsevier's metrics, the European Commission and other institutions will start favoring Elsevier's academic journals over those of its competitors. The use of CiteScore thus creates yet another conflict of interest for Elsevier.
As well as writing about his concerns, Tennant is also making a formal complaint to the European Commission Ombudsman regarding the relationship between Elsevier and the Open Science Monitor:
The reason we are pursuing this route is due to the fact that the opportunity to raise a formal appeal was denied to us. In the tender award statement, it states that "Within 2 months of the notification of the award decision you may lodge an appeal to the body referred to in VI.4.1.", which is the General Court in Luxembourg. The notification of the award was on January 11, 2018, and it was exactly 2 months and 1 day later when the role of Elsevier as subcontractor was first publicly disclosed. Due to this timing, we were unable to lodge an appeal.
In other words, it was only revealed that Elsevier was the sub-contractor when it was too late to appeal against that choice. A cynic might almost think those behind the move knew people would object, and kept it quiet until it was impossible under the rules to appeal. Open science? Not so much…
Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+
Filed Under: citescore, eu, eu commission, open access, open science
Companies: elsevier
Reader Comments
The First Word
“Re: Techdirt puts out half-assed piece about corporations
Hey, that's not fair! Not all scientists are marine scientists! Some of them are landlubbers and therefore brigands.
Techdirt puts out half-assed piece about corporations
You can't escape this site! But I could leave it. But I choose not to. Because all of you are PIRATES.
Re: Techdirt puts out half-assed piece about corporations
Who hated the process of due
Each film that he'd paid
Was DMCAed
And shoved up his ass with a screw
What a coincidence
Fuck Elsevier.
Re: Techdirt puts out half-assed piece about corporations
Not all scientists are marine scientists! Some of them are landlubbers and therefore brigands.
Re: Techdirt puts out half-assed piece about corporations
Some, or all, of TD's readers don't seem to realize the danger in all of this, or they are not consuming enough scientific articles.
Fox... meet the hen house!
That's it!
Cheers, oliver
Some incorrect facts
Elsevier does not benefit from a 25% boost in scores; in fact, other publishers fare much better than Elsevier, but that was left out altogether. The original Eigenfactor articles are still online:
http://eigenfactor.org/projects/posts/citescore.php#COI
Correcting the above misleading statement
A simple reading of that COI statement from Eigenfactor says nothing about a correction of the data or the resulting statistics at all. It makes clear that their main concern was the COI of having Elsevier manage CiteScore in the first place, which is the key point emphasised over and over again, and which defenders of Elsevier seem to miss. Reading that article as favourable to Elsevier in light of the present circumstances is a demonstration of selective reading.
Furthermore, why I would comment on other publishers when the focus of the article is Elsevier is, for obvious reasons, not a point worth addressing.
So the facts remain: the comment above is false, and serves only as a distraction from the numerous real issues at hand, which have now been reinforced by the global research community, and without any sort of real response from any of the parties challenged. Comments like this are generally insulting to anyone with the ability to read.
For those interested, I have now responded to Elsevier, and to a personal press release from the President of the Lisbon Council, in full: http://fossilsandshit.com/response-to-president-paul-hofheinz-of-the-lisbon-council-regarding-elsevier-and-the-open-science-monitor/ (excuse the URL..)
For those who don't want to read the whole thing, this was my final comment on the matter:
This is twice now, including the response by Elsevier, that I have had assaults made on my character over this matter, which look like strategic attempts to discredit me, rather than the substance of the posts. Terms like ‘misleading’ and ‘misinformation’ have been used repeatedly, without any substantial evidence, and detracting from addressing the numerous issues that I have raised. These issues have been co-signed by more than 600 [now 800] members of the global research community in a formal complaint to the EU Ombudsman, and not treated with the respect that they deserve by Elsevier or the Council. As a result of this, I will no longer respond to such comments, which are not the sort of critical, granular responses I was expecting as part of a professional, critical, and courteous discourse on this matter. However, if members of the consortium, and Elsevier, wish to directly address the points I have raised, then I am available.
A bit confusing
Elsevier is in charge of monitoring the effects of Open Science research. (They also invest heavily in the space, so this is the conflict of interest.)
To help them figure out which papers are having the most impact, they're using CiteScore. They helped develop CiteScore's methodology, so it wasn't developed independently. CiteScore's ratings rank Elsevier's titles higher than a competitor's, which isn't surprising.
The methodology for CiteScore is freely available.
So the solution is just that Elsevier shouldn't be monitoring the system, it should be some other third party? And if this 3rd party uses CiteScore, then that's fine?
So why is it a problem that Elsevier uses it...?
Re: Correcting the above misleading statement
http://eigenfactor.org/projects/posts/citescore.php#Lancet
All I was trying to highlight is that the numbers in your article for this particular point were wrong.
Re: Re: Correcting the above misleading statement
However, irrespective of this, it is still an issue for two reasons. Firstly, there is the aforementioned COI. Secondly, Elsevier titles are still getting a clear boost from Elsevier's own metric, which should be setting off alarm bells all around.
I also think these data need to be scrutinised a bit more carefully. Elsevier owns something like 2,700 journals, across an incredibly heterogeneous landscape. To get any real understanding of the metric, we need to analyse things in more detail. However, again, the COI and the CiteScore boost should both be the major points of concern at the moment.