I think it’s worth a read, but I struggled with peer review being pitched as “an experiment”, and especially with the extrapolation from one “I posted this on PsyArXiv and got a lot of feedback” to this is what we should be doing. Would it scale? Would it be better? Would it be fairer or simply give even more weight to “prestige” and those in stable jobs with all the resources? Would we encourage even more hyperbole and select on eloquence?
Would there still be journals (or other recommendation services), and do we want to give more decision power to individual editors (and specific algorithms)? I’m just asking a lot of questions here, but I think that the answers need a careful distinction between journals, peer review, and for-profit publishers.
We hear that it’s increasingly difficult to find reviewers for journal articles. Peer review is probably a hallmark of science, but the incentives are not exactly working out. Despite efforts to counter this (e.g., DORA, slow science), we still have plenty of incentives to publish articles other than the desire to share our findings with the research community (e.g., job applications when we are asked to count the number of publications, reputation drawn from publishing in a certain journal).
While open access is undoubtedly a good thing, I’ve always had some reservations about so-called gold open access: research teams pay publishers to have an article published. Obviously the idea is that we keep rigorous peer review in place, but the incentives are stacked differently. We’ve seen the incredible growth of open-access publishers like Frontiers and MDPI, at times with questionable efforts like spamming researchers the way fraudulent journals do. It’s a grey area.
Even though publishers like MDPI engage in peer review, we frequently hear about questionable papers getting published. To be fair, that’s something that can happen to any publisher. MDPI are incredibly fast (but a pre-print will still be faster!), and they are actively unpleasant from the perspective of a reviewer. They put reviewers under a lot of time pressure, which increases the chances of a rushed review.
But having reviewed for one of their journals once, I now find they keep spamming me with invitations to review. I use ‘spamming’ because of the frequency, and because these invitations to review are all about work that has absolutely nothing to do with the work I do. This is not what a serious publisher does, irrespective of what we might think of article ‘processing’ charges and commercial profits. So definitely a dark shade of grey, this.
We’ve seen great work in terms of diamond or platinum open access, but for it to catch on, we also need senior colleagues to come aboard (e.g., by clearly defining how junior colleagues are selected and evaluated, by submitting their work there) — ideally before commercial interests break the system completely…
Journal article accepted? Sure, we want the world to know about it. In this case, the journal throws in 50 e-prints to share with colleagues:
When the article has published, you will receive 50 eprints to share with colleagues. This will enable you to give 50 friends, colleagues, or contacts free access to an electronic version of your article.
Source: Acceptance Mail
But you know what, it’s going to be open access anyway — thanks to publisher agreements and taxpayer money. Well, I’m not complaining about getting free access to something that’s free to access anyway…
It’s a guide to 29 migration journals you might want to consult once in a while if you’re considering publishing in the field.
What do you get?
The first thing you’ll notice is a list of (currently) 29 migration journals — with a relatively broad understanding of ‘migration’. As is probably inevitable, we can quibble about the inclusion of journals in such a list, but in my view the PRIO guide provides a pretty good overview of the publishing options. Having such a list is in itself greatly useful.
It doesn’t stop here, though, far from it! For each of these 29 journals, you get a detailed portrait that should help you decide whether the journal is a suitable outlet for your research. The headings included are relevant for researchers, and I really like how they managed to provide information about the impact factor without listing it or other similar measures (unlike my blunt summary here).
Perhaps the most useful part (but also the most difficult one, and thus possibly the one where we might not always agree) is at the end, where they have picked typical articles. On the one hand, this saves you a trip to the journal website to check recent publications. On the other hand, it doesn’t entirely answer the question of what kind of research they typically publish. I guess that’s the question we’re asking, but it’s also one that is very difficult to answer when the common factor is the topic (migration) rather than the methodology or something like that. In that sense, three articles can never do justice to the diversity of articles in IMR or JEMS, for example.
If open access is a concern for you, the end of the guide nicely summarizes the open access status. This doesn’t include (how could it possibly?) national agreements with publishers.
Because impact is probably one of your concerns, there’s a nice summary at the end. I really like how they avoided impact factors or Scimago rankings, yet still provide you with a general idea of ‘impact’ — and with that, ‘prestige’.
What don’t you get?
You don’t get journals that publish a lot on migration but are not focused on migration, like some demography journals. The selection of journals is nicely documented, so no quibbles there! You also don’t get journals without peer review — but that’s definitely a good thing!
You don’t get impact factors (that’s probably a good thing), but you also don’t get information about the peer review process — a factor many early career researchers (have to) take into consideration. Luckily, we have SciRev for this. While journals have the relevant information about turnaround times or rejection rates, they tend not to publish it in a systematic way — it’s more like advertising: journals often highlight those aspects they do ‘well’. With SciRev, everyone can review the review process, and there are also short comments that can be quite insightful. There are other such guides, like some wiki pages, but SciRev is the only one I know with a systematic procedure, and speaking of migration journals, the only one that spans different disciplines!
One thing that a generic guide like the PRIO guide will struggle to do is capture the prestige of journals in different circles of researchers. This is linked to the question of what kind of research typically gets published in the journals, and can be quite different from impact factors or Scimago rankings — though that’s not to say a Q4 journal on Scimago will be considered high prestige by some. I guess there’s still value in ‘asking around’ a bit.
I had a glance at your profile online and was extremely amazed with your work. I feel you will be an ideal person who helps us for the progress of our Journal. Hence, I am approaching you through this email.
All the authors around the globe are cordially invited to submit any type of the article based upon your research interest for the upcoming edition.
I hope you will consider my request and I wish to have your speedy response in 24 hrs.
Await your cheerful comeback.
👉 So, please, all the authors around the globe, quickly submit any article! I’m sure it’s going to be great, and you’ll have plenty of readers… but note that you’ll have to respond within 24 hours…