Breaking the review system

We hear that it’s increasingly difficult to find reviewers for journal articles. Peer review is a hallmark of science, but the incentives around it are not exactly working out. Despite efforts to counter this (e.g., DORA, slow science), we still have plenty of incentives to publish articles beyond the desire to share our findings with the research community (e.g., job applications where we are asked to count publications, reputation drawn from publishing in a certain journal).

While open access is undoubtedly a good thing, I’ve always had some reservations about so-called gold open access: research teams pay publishers to have an article published. Obviously the idea is that rigorous peer review stays in place, but the incentives are stacked differently. We’ve seen the incredible growth of open-access publishers like Frontiers and MDPI, at times with questionable practices like spamming researchers the way fraudulent journals do. It’s a grey area.

Even though publishers like MDPI engage in peer review, we frequently hear about questionable papers getting published. To be fair, that can happen to any publisher. MDPI are incredibly fast — but a pre-print will still be faster! — and they are actively unpleasant from the perspective of a reviewer: they put reviewers under a lot of time pressure, which increases the chances of a rushed review.

But ever since I reviewed for one of their journals once, they keep spamming me with invitations to review. I say ‘spamming’ because of the frequency, and because these invitations to review concern work that has absolutely nothing to do with the work I do. This is not what a serious publisher does, irrespective of what we might think of article ‘processing’ charges and commercial profits. So definitely a dark shade of grey, this.

We’ve seen great work in terms of diamond or platinum open access, but for it to catch on, we also need senior colleagues to come aboard (e.g., by clearly defining how junior colleagues are selected and evaluated, by submitting their work there) — ideally before commercial interests break the system completely…

https://magazin.nzz.ch/nzz-am-sonntag/wissen/profit-statt-wissenschaftliche-qualitaet-ld.1710205 (German, paywalled)

… submit any type of the article based upon your research interest …

Academic spam can be funny sometimes. Who on earth is going to fall for this one?

Respected Doctor,

I had a glance at your profile online and was extremely amazed with your work. I feel you will be an ideal person who helps us for the progress of our Journal. Hence, I am approaching you through this email.

All the authors around the globe are cordially invited to submit any type of the article based upon your research interest for the upcoming edition.

I hope you will consider my request and I wish to have your speedy response in 24 hrs.

Await your cheerful comeback.

👉 So, please, all the authors around the globe, quickly submit any article! I’m sure it’s going to be great, and you’ll have plenty of readers… but note that you’ll have to respond within 24 hours…

Academic Spam

I guess I’ve got used to academic spam — invitations to publish in predatory journals. They typically scrape conference programmes, but today I got a surprising one:

I congratulate you on the paper “<paper title>”, published in the “3rd ISA Forum of SOCIOLOGY 2016”. Observing the relevance and contribution that the paper has in the field of study addressed, after analyzed by our editorial board, I invite you to publish it in <journal paper>


Yes, it took them five years to “read” the conference paper. We also get the usual nonsense of a flattering invitation and the pretence that the journal is important, but the time lag… I’m sorry, you’re simply too late.

Should I review this?

I have just received an invitation to review an article from a publisher that’s — let’s say — “less established”. Given that they have been accused of being a predatory publisher in the past, I was at first positively surprised: there was none of the silly flattery about being a leading expert, and they apparently did try to get a proper review. Then came the title and the abstract. It had “public attitudes” in it, and a “scoping review” — so if you allow for synonyms in the keyword search, I can see how their machine picked me. But if no human is involved, neither am I (irrespective of the fact that this was utterly outside my expertise). Maybe we should respond with automated reviews — a fork of SciGen, perhaps?

A gem from the spam folder

I got this today…

Cooperating with 6 other guest editors […] the Lead Guest Editor, has proposed a special issue titled Society, Culture and Politics in Contemporary Africa

wow, I count 7 editors in total, that must be a big special issue…

gather together researchers in order to spread their academic experience and research findings on all topics in relation to Africa

I see, all topics in relation to Africa. Now I wonder whether they can manage with 7 editors, I mean all topics in relation to Africa.

Unfortunately, this is followed by this table:

Topics of interest include, but are not limited to:
  1. Socioeconomic dynamics
  2. Culture
  3. Social mobility
  4. Tradition
  5. Politics
  6. Society
That’s a real shame — not all topics after all. Now I’m not so sure anymore; I mean, they do narrow it down quite a bit (there’s hope, though, in that desperate “not limited to”).

Image credit: CC-BY-NC AJC1