Breaking the review system

We hear that it’s increasingly difficult to find reviewers for journal articles. Peer review is probably a hallmark of science, but the incentives are not exactly working out. Despite efforts to counter this (e.g., DORA, slow science), we still have plenty of incentives to publish articles beyond the desire to share our findings with the research community (e.g., publication counts in job applications, the reputation drawn from publishing in a certain journal).

While open access is undoubtedly a good thing, I’ve always had some reservations about so-called gold open access: research teams pay publishers to have an article published. Obviously the idea is that rigorous peer review stays in place, but the incentives are stacked differently. We’ve seen the incredible growth of open-access publishers like Frontiers and MDPI, at times with questionable practices like spamming researchers the way fraudulent journals do. It’s a grey area.

Even though publishers like MDPI engage in peer review, we frequently hear about questionable papers getting published. To be fair, that can happen to any publisher. MDPI are incredibly fast (though a pre-print will still be faster!), and they are actively unpleasant from the perspective of a reviewer: they put a lot of time pressure on reviewers, which increases the chances of a rushed review.

But having reviewed for one of their journals once, I am now being spammed with invitations to review. I use ‘spamming’ because of the frequency, and because these invitations to review all concern work that has absolutely nothing to do with the work I do. This is not what a serious publisher does, irrespective of what we might think of article ‘processing’ charges and commercial profits. So definitely a dark shade of grey, this.

We’ve seen great work in terms of diamond or platinum open access, but for it to catch on, we also need senior colleagues to come aboard (e.g., by clearly defining how junior colleagues are selected and evaluated, by submitting their work there) — ideally before commercial interests break the system completely…

https://magazin.nzz.ch/nzz-am-sonntag/wissen/profit-statt-wissenschaftliche-qualitaet-ld.1710205 (German, paywalled)

The PRIO Guide to Migration Journals

This deserves more attention than ‘just’ a tweet! The PRIO guide to migration journals is now live: https://migration.prio.org/Journals/

It’s a guide to 29 migration journals, worth consulting once in a while if you are considering where to publish work on migration.

What do you get?

The first thing you’ll notice is a list of (currently) 29 migration journals, with a relatively broad understanding of ‘migration’. As is probably inevitable, we can quibble about which journals are included in such a list, but in my view the PRIO guide provides a pretty good overview of the publishing options. Having such a list is in itself very useful.

For a slightly different list of migration journals, you can consult the excellent list provided by our Documentation Centre: http://www.unine.ch/sfm/home/library/revues-liees-a-la-migration.html

It doesn’t stop here, though, far from it! For each of these 29 journals, you get a detailed portrait that should help you decide whether the journal is a suitable outlet for your research. The headings included are relevant for researchers, and I really like how they managed to provide information about impact without listing the impact factor or similar measures (unlike my blunt summary here).

Perhaps the most useful part (but also the most difficult one, and thus possibly the one where we might not always agree) is at the end, where they have picked typical articles. On the one hand, this saves you a trip to the journal website to check recent publications. On the other hand, it doesn’t entirely answer the question of what kind of research they typically publish. I guess that’s the question we’re asking, but it is also very difficult to answer when the common factor is the topic (migration) rather than the methodology or something like that. In that sense, three articles can never do justice to the diversity of articles in IMR or JEMS, for example.

If open access is a concern for you, the end of the guide nicely summarizes the open access status. This doesn’t include (how could it possibly?) national agreements with publishers.

Because impact is probably one of your concerns, there’s a nice summary at the end. I really like how they avoided impact factors or Scimago rankings, yet still provide you with a general idea of ‘impact’, and with that of ‘prestige’.

What don’t you get?

You don’t get journals that publish a lot on migration but are not focused on migration, like some demography journals. The selection of journals is nicely documented, so no quibbles there! You also don’t get journals without peer review — but that’s definitely a good thing!

You don’t get impact factors (that’s probably a good thing), but you also don’t get information about the peer-review process, a factor many early career researchers (have to) take into consideration. Luckily, we have SciRev for this. While journals hold the relevant information about turn-around times or rejection rates, they tend not to publish it in a systematic way; it’s more like advertising, with journals highlighting those aspects they do ‘well’. With SciRev, everyone can review the review process, and there are also short comments that can be quite insightful. There are other such guides, like some wiki pages, but SciRev is the only one I know with a systematic procedure and, importantly for migration journals, the only one that spans different disciplines!

One thing that a generic guide like the PRIO guide will struggle to do is capture the prestige of journals in different circles of researchers. This is linked to the question of what kind of research typically gets published in the journals, and can be quite different from impact factors or Scimago rankings… not that a Q4 journal in Scimago will be considered high prestige by anyone, though. I guess there’s still value in ‘asking around’ a bit.

If you need more information about ‘green’ open access, there’s still https://v2.sherpa.ac.uk/romeo/

Open Access Options for Migration Studies

Today we’ve discussed open access options for migration studies. Here’s an attempt to provide an overview. In this list, a journal is “compliant” if it allows publishing a post-print within 6 months of publication in a non-profit or institutional repository (green road). This includes fully open access journals. Payments in hybrid journals are not considered compliant. Information on compliance as of 31 October 2019, taken from http://sherpa.ac.uk/romeo/index.php; impact factors as listed on the journal websites, SJR from Scimago. All information is provided without warranty.
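To make the rule explicit, here is a minimal sketch of the compliance criterion as a function; the field names are mine for illustration, not taken from SHERPA/RoMEO:

```python
# A minimal sketch of the "compliant" rule used in this list;
# the field names are hypothetical, not SHERPA/RoMEO's.
from typing import Optional

def is_compliant(fully_open_access: bool,
                 green_embargo_months: Optional[int]) -> bool:
    """Green-road compliance: a post-print may be deposited in a
    non-profit or institutional repository within 6 months of
    publication. Fully open access journals count as compliant;
    paying for open access in a hybrid journal does not."""
    if fully_open_access:
        return True
    # None means the journal does not allow green-road deposits.
    return green_embargo_months is not None and green_embargo_months <= 6
```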

You may also consider peer-review experiences on https://scirev.org/

I am not covering disciplinary journals here (e.g., Social Inclusion, Sociological Science, Politics and Governance, Research & Politics, or the innovative OLH). Don’t hesitate to mention omissions and errors in the comments.

Eaten E-Mails

We all dislike junk mail, but this week I learned there’s potentially something worse: e-mails that get ‘eaten’ by the internet and never arrive. I’m usually annoyed when people ask “Did you get my e-mail?”, because we always do; it’s more a question of whether we have read it, or why we did not react to it. I don’t think this will change, but this week I have learned about e-mails that disappear without a trace.

I have been using a ‘for-life’ forwarding service from Oxford because I thought this would be a good way to ensure I could be reached irrespective of my current academic affiliation. Unfortunately, on several occasions I have not received e-mails that I know were sent to my alumni account. These include a confirmation mail from COST actions (cost.eu; brilliant support from their IT), information on changing contact details (sent to both the ‘old’ and the ‘new’ address, but received at only one), and, more seriously, decision letters from journals (which I can check on the journal website). There was no trace of these mails (not in the spam folder, not in the spam quarantine). Once I figured out this was not an isolated case, I checked with IT at Oxford, only to learn that this is a ‘known problem’ (it’s just that nobody had told me about it):

Some domains have chosen to publish a policy that says that if mail is relayed (that is, if recipients receive it from servers other than the ones the domains specify) it should be rejected. Providers are obeying this policy, and thus are rejecting the mail, because it comes via our servers rather than from the source servers they specify.

It’s apparently a generic problem with e-mail forwarding, where the policies of sending domains are to ‘blame’. This means there is nothing I can do about this, except for (largely) abandoning the forwarding service.
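For what it’s worth, the policies described above sound like SPF and DMARC records; that’s my reading of the explanation, not something Oxford IT spelled out. Here’s a minimal sketch of how you could look up what a sender domain publishes, assuming the dnspython package; the domain is only a placeholder:

```python
# Minimal sketch: look up a sender domain's SPF and DMARC policies.
# Assumes the dnspython package (pip install dnspython); the domain
# below is only a placeholder.
import dns.resolver

def txt_records(name: str) -> list[str]:
    """Return all TXT records for a DNS name, or an empty list."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []
    return [b"".join(rdata.strings).decode() for rdata in answers]

domain = "example.org"  # placeholder sender domain

# SPF: a record ending in "-all" tells receivers to reject mail from
# servers the domain has not listed; a forwarding server is not listed.
for record in txt_records(domain):
    if record.startswith("v=spf1"):
        print("SPF:  ", record)

# DMARC: "p=reject" asks receivers to reject mail that fails
# SPF/DKIM alignment, which plain forwarding typically breaks.
for record in txt_records("_dmarc." + domain):
    if record.startswith("v=DMARC1"):
        print("DMARC:", record)
```

If a domain publishes a strict SPF record (ending in ‘-all’) together with a DMARC policy of ‘p=reject’, a receiving server that only sees the forwarder’s address has every reason to drop the message, which would match what I observed.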