My birthday fell earlier this week, and brought with it the usual delightful overflow of Facebook greetings. It was always my favorite part of that platform, and it managed to draw me out of the semi-boycott I’ve been conducting to say thanks to everyone.
By semi-boycott, I mean that while I haven’t deleted my account (I still lurk a bit to see pictures of my nieces and nephews and so forth), I no longer post or interact with others’ posts. This decision has largely been driven by the enormous damage the platform has wrought in recent years, at levels from national politics to individual health and well-being. I miss some of my interactions there, but the nausea I feel when I consider contributing my time and attention to that company is just too much to squelch.
I bring this up here because this morning we discovered a new CORE deposit filled with highly problematic claims about the spread of COVID-19 and the effects of vaccines. We don’t — and can’t — review deposits for soundness, and yet we bill ourselves as a scholarly network, where work with some measure of academic authority can be found. Because of this, I believe we have a responsibility to prevent, or at a bare minimum not contribute to, the spread of harmful misinformation.
As a result, we’ve removed the deposit, and we will remove any similar deposits that we uncover.
We want, however, to develop the best policies and processes we can in order to ensure our network — whose openness we are committed to — does not risk becoming another vector for damaging misinformation. We’d very much appreciate your thoughts about this work; please leave your ideas and concerns in the comments. We’ll keep you posted as our work continues.
Thank you for your swift and decisive action. I think you made the right decision: this repository cannot become a dumping ground for misinformation.
Concur. That outcome is depressingly probable.
Thank you for sharing this with the community and for asking us for feedback. I agree you did the right thing, and agree that repositories should not enable misinformation. There is, of course, a serious challenge here, perhaps akin to the one newspapers faced a decade or so ago with comments (some online newspapers, unable to resource moderation, disabled comments altogether). The challenge is one of consistency, and it makes us wonder what will happen when the issues are not as straightforward, or as noticeable. In this case the output was identified, but when it’s not possible to review each submission, how can we know there are no other problematic outputs already deposited? Perhaps it could help to look into what SocArXiv Papers does, or to seek their advice directly? It would be a shame if reviewing every submission became necessary because of the risk of dodgy outputs, slowing the repository’s growth and reducing its attractiveness and its difference from peer-reviewed venues. Something that could be considered is a form for authors to complete, with a declaration on the trustworthiness/soundness/robustness of the methodology and conclusions of any outputs, as well as publishing outputs with a caveat of the type “This output has not been peer-reviewed and therefore it should not… etc.”, as we see in medical preprints. I’m sorry I don’t have anything more concrete or useful to suggest! Keep up the great work, and belated happy birthday!
Thanks for this, Ernesto. Talking to SocArXiv and other repositories is a very good idea. I can also see the possibility of some kind of author declaration and/or caveat. But your comment now has me wondering about ways of relying on the community itself to help us with this kind of review — a means by which users can flag content as problematic or in need of evaluation?
I think your decision to remove a dodgy post was absolutely the right one, but I share some of Ernesto’s anxieties about having to review each submission. One way, as he suggests, is to talk to repositories and have an author declaration. The community, as you suggest, could serve as a means of checking on content that is problematic. There is so much misinformation in the public space that it’s vital to keep some spaces both open and reliable.