Journal article accepted; of course we want the world to know about it. In this case, the journal throws in 50 e-prints to share with colleagues:
When the article has published, you will receive 50 eprints to share with colleagues. This will enable you to give 50 friends, colleagues, or contacts free access to an electronic version of your article.
Source: Acceptance Mail
But you know what, it’s going to be open access anyway, thanks to publisher agreements and taxpayer money. Well, I’m not complaining about getting free access to something that’s free to access anyway…
It’s a guide to 29 migration journals that you might want to consult once in a while if you’re considering publishing in one of them.
What do you get?
The first thing you’ll notice is a list of (currently) 29 migration journals, with a relatively broad understanding of ‘migration’. As is probably inevitable, we can quibble about which journals are included in such a list, but in my view the PRIO guide provides a pretty good overview of the publishing options. Having such a list is in itself hugely useful.
It doesn’t stop there, though, far from it! For each of these 29 journals, you get a detailed portrait that should help you decide whether the journal is a suitable outlet for your research. The headings included are relevant for researchers, and I really like how they managed to provide information about impact without listing the impact factor (or other similar measures), unlike my blunt summary here.
Perhaps the most useful part (but also the most difficult one, and thus possibly the one where we might not always agree) comes at the end, where they have picked typical articles. On the one hand, this saves you a trip to the journal website to check recent publications. On the other hand, it doesn’t entirely answer the question of what kind of research they typically publish. I guess that’s the question we’re asking, but it’s also one that is very difficult to answer when the common factor is the topic (migration) rather than, say, the methodology. In that sense, three articles can never do justice to the diversity of articles in IMR or JEMS, for example.
If open access is a concern for you, the end of the guide nicely summarizes the open access status. This doesn’t include (how could it possibly?) national agreements with publishers.
Because impact is probably one of your concerns, there’s a nice summary at the end. I really like how they avoided impact factors or Scimago rankings, yet still provide you with a general idea of ‘impact’, and with that ‘prestige’.
What don’t you get?
You don’t get journals that publish a lot on migration but are not focused on migration, like some demography journals. The selection of journals is nicely documented, so no quibbles there! You also don’t get journals without peer review — but that’s definitely a good thing!
You don’t get impact factors (that’s probably a good thing), but you also don’t get information about the peer review process, a factor many early career researchers (have to) take into consideration. Luckily, we have SciRev for this. While journals have the relevant information about turnaround times or rejection rates, they tend not to publish it in a systematic way; it’s more like advertising: journals often highlight those aspects they do ‘well’. With SciRev, everyone can review the review process, and there are also short comments that can be quite insightful. There are other such guides, like some wiki pages, but SciRev is the only one I know of with a systematic procedure and, speaking of migration journals, the only one that spans different disciplines!
One thing a generic guide like the PRIO guide will struggle to do is capture the prestige of journals in different circles of researchers. This is linked to the question of what kind of research typically gets published in a journal, and it can differ quite a bit from impact factors or Scimago rankings… though a Q4 journal in Scimago is unlikely to be considered high prestige by anyone. I guess there’s still value in ‘asking around’ a bit.
I had a glance at your profile online and was extremely amazed with your work. I feel you will be an ideal person who helps us for the progress of our Journal. Hence, I am approaching you through this email.
All the authors around the globe are cordially invited to submit any type of the article based upon your research interest for the upcoming edition.
I hope you will consider my request and I wish to have your speedy response in 24 hrs.
Await your cheerful comeback.
👉 So, please, all the authors around the globe, quickly submit any article! I’m sure it’s going to be great, and you’ll have plenty of readers… but note that you’ll have to respond within 24 hours…
The instructions go: “You’re a social scientist with a hunch: The U.S. economy is affected by whether Republicans or Democrats are in office. Try to show that a connection exists, using real data going back to 1948. For your results to be publishable in an academic journal, you’ll need to prove that they are “statistically significant” by achieving a low enough p-value.”
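The mechanism behind this exercise is worth spelling out: if you try enough specifications, some will cross p < 0.05 by chance alone, even when there is no effect at all. This is not the tool from the quoted instructions, just a minimal stand-alone simulation (using only the standard library, with a normal approximation for the p-value) of what happens when you test 20 pure-noise ‘specifications’ against a pure-noise outcome:

```python
import math
import random

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def p_value(r, n):
    """Two-sided p-value for r, using a normal approximation
    to the t distribution (adequate for large n)."""
    t = r * math.sqrt((n - 2) / (1 - r * r))
    return 2 * (1 - 0.5 * (1 + math.erf(abs(t) / math.sqrt(2))))

random.seed(1)
n, n_specs = 100, 20
outcome = [random.gauss(0, 1) for _ in range(n)]  # pure noise

hits = 0
for _ in range(n_specs):
    predictor = [random.gauss(0, 1) for _ in range(n)]  # also pure noise
    if p_value(pearson_r(predictor, outcome), n) < 0.05:
        hits += 1

print(f"{hits} of {n_specs} noise 'specifications' reached p < 0.05")
```

With 20 independent tests at the 5% level, you expect about one ‘significant’ finding, and the chance of at least one is roughly 64%; run the loop over enough seeds and a publishable-looking p-value appears reliably, which is exactly the point of the exercise.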
I’m just going through some reviewer comments on a paper I have no stake in at all, and came across this gem:
The study finds support in favour of their hypothesis.
This was highlighted as a key strength of the study. Let’s not quibble about hypotheses here, but let’s focus on the explicit value for a “positive” result. This matters, because it’s peer review, and it’s the standards we have as reviewers that shape what gets published (and where). This focus on positive results does not help us move forward with actually understanding what’s going on — but then a cynic would see a quite different role for publications anyway.