Friday, 3 September 2021

Shots fired at MDPI for publishing predatory journals

I just received a revise-and-resubmit decision for a journal article submission (I won't name the journal though, for reasons that will become apparent). That in itself is not interesting. However, in the comments from the reviewers, the editor had suppressed references to some of the literature that the reviewers had suggested my co-authors and I should cite. That is very unusual. In their comments, the editor linked to this new article by Angeles Oviedo-Garcia (University of Seville), published open access in the journal Research Evaluation.

Oviedo-Garcia takes aim at the publisher MDPI, arguing that it is a publisher of predatory journals, which she identifies based on a number of criteria, including:

...journal names may be very similar to prestigious journals; the web page may contain spelling errors and questionable grammatical constructions and/or low quality images; the language on the journal webpage may resemble a ‘hard sell’ that targets academic authors; the journal may include articles outside its stated scope or may have a very broad scope; submission can be by email instead of a manuscript management system; the editor-in-chief might also act as the editor-in-chief of another journal with a widely different scope; predominance of editorial board members from developing countries; time-lines for publication and fast-track peer-review processes might appear unrealistic; APCs can be low; impact-factor metrics may be unknown; spam emails may invite academics to submit papers; despite the open-access approach, transfer of copyright may be required; and, finally, non-professional or non-journal affiliated contact information may be given for the editorial office...

That set of criteria seems awfully broad, but when you think about them individually, each criterion characterises a practice that is somewhat dodgy. Taken together, or where many of them are present, we should be very wary. Oviedo-Garcia then analyses data from the 53 MDPI-published journals that are indexed in Journal Citation Reports (which produces the Impact Factors that are most often reported), focusing on the number of special issues published in each journal, the amount of within-journal self-citation, and the length of time each article spends undergoing peer review. In terms of the first, she notes:

...in January 2020, the number of special issues scheduled for 2020 with respect to those in 2019 skyrocketed in all the journals under study to levels as surprisingly high as 788 special issues in Sustainability, 830 in Applied Sciences, and 846 in Materials. From December 2019 to January 2020, almost all MDPI-journals (94.33%) scheduled more than one special issue per week during 2020 while, as previously mentioned, the number of regular issues per year was in all cases 12...

Yikes! Credible journals don't typically publish 800 or more special issues per year. In terms of self-citation, Oviedo-Garcia compares the MDPI journals with the top journal in each field, and notes that:

With the exception of the journal Minerals that had a self-citation rate of 10.75% in 2018, compared to 12.98% for the leading journal (International Journal of Rock Mechanics and Mining Sciences) in that category, all the self-citation rates of all the other MDPI-journals were above those of the leading journals within each category. In 2018, the case of the International Journal of Molecular Sciences and Sustainability, with self-citation rates several times higher than those of the leading journals within the same category stands out (a difference of 25.17% and 16.98%, respectively).

In terms of peer reviewing time, again comparing the MDPI journals with top journals, Oviedo-Garcia notes that:

...the average number of days from submission to acceptance fluctuated between 81 days (Nature Structural and Molecular Biology) and 258 days (Nature Neuroscience). The surprisingly short time lapse from submission to acceptance (39 days) of the manuscripts for all 218 MDPI-journals in 2019 is astounding. All the more so, if it is taken into account that, in addition, the editorial staff of MDPI is formed of researchers who have to organize their time for revision among their other professional activities (research, teaching, dissemination, evaluation, grant applications, etc.), rather than professional editors (as with the journals of Nature Research).

In other words, the MDPI journals take 39 days on average from submission to acceptance, compared with between 81 and 258 days for the comparison group of journals. Now, you might argue that the comparison group is simply taking too long (and many researchers would agree with you). However, it is possible for the review process to be too quick as well. Oviedo-Garcia is essentially arguing that 39 days is far too short for a genuine review process: for reviewers to read and comment on the manuscript, for the authors to respond to the comments and submit a revised article, and for the revision to then go through a thorough review. If all of those steps happen within 39 days, each one must be completed in under two weeks on average.

From experience, [*] MDPI journals ask all of their reviewers to turn around reviews within seven days. This leads to two problems. First, the short time window puts pressure on reviewers, who may not be able to undertake as thorough a review as they could under less binding time constraints. Second, a review assignment with such a short time window probably selects for academics with more time on their hands, perhaps because they are less busy with their own research. Having less active researchers as reviewers plausibly leads to lower-quality reviews. Both of those problems suggest that the peer reviewing for MDPI journals may be problematic simply because of the impossibly short deadlines, without even considering whether the review system might be compromised through unwritten editorial policy, etc.

Oviedo-Garcia concludes that:

...the constant and quite exceptional increase in the number of articles published in MDPI-journals between 2018 and 2019, reinforced by an exponential increase in the number of special issues, which easily outweigh the number of regular publications (above all in view of the previsions for 2020), together with an opportune increase in APC fees all raises serious concerns over the legitimacy of MDPI as a publisher, at the very least because its 'APC-based business model alters the economic and scientific incentives in academic publishing'...

...so as not to contribute to the continuance of malpractice: (1) researchers should neither send papers for their publication, nor cite them, nor act as reviewers for them, nor form part of their editorial committees; (2) research institutions should inform researchers of the reality of predatory journals and their iniquitous consequences at an individual and general level; and, (3) evaluation agencies and committees should ignore the registers that refer to predatory journals.

Now, there are some issues with Oviedo-Garcia's analysis. Comparing the MDPI journals with the top journals in each field is not a fair comparison. It would be better to apply some form of matching to identify an appropriate comparison group. At the very least, comparing with other journals that have similar impact factors would improve the credibility of the comparisons. However, even putting the self-citation results aside, the short reviewing cycle (which would no doubt show up regardless of which journals MDPI is compared with) and the sheer volume of publishing would both remain as red flags.
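To make the matching idea concrete, here is a minimal sketch (in Python) of what nearest-neighbour matching on impact factor might look like. The journal names and impact factor values are invented purely for illustration; a real analysis would draw them from Journal Citation Reports.

    # A minimal sketch of nearest-neighbour matching on impact factor.
    # All journal names and impact factors below are hypothetical.
    def match_on_impact_factor(mdpi_journals, candidate_journals):
        """For each MDPI journal, find the non-MDPI journal with the
        closest impact factor (nearest neighbour, with replacement)."""
        matches = {}
        for name, jif in mdpi_journals.items():
            best = min(candidate_journals,
                       key=lambda c: abs(candidate_journals[c] - jif))
            matches[name] = best
        return matches

    # Hypothetical impact factors, for illustration only
    mdpi_journals = {"MDPI Journal A": 3.25, "MDPI Journal B": 2.58}
    candidates = {"Journal X": 3.19, "Journal Y": 2.60, "Journal Z": 5.72}

    print(match_on_impact_factor(mdpi_journals, candidates))
    # {'MDPI Journal A': 'Journal X', 'MDPI Journal B': 'Journal Y'}

One could then compare self-citation rates and review times between each MDPI journal and its matched counterpart, rather than against the field leader.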

In her conclusion, Oviedo-Garcia is clearly calling for the blacklisting of all MDPI journals, and that explains the reaction of the editor of the journal that returned me a revise-and-resubmit with suppressed references. It will be interesting to see where this goes from here. Personally, I have already written off Hindawi and Frontiers as two publishers that I won't have anything to do with. The constant spam emails that they send long ago gave me cause for concern. Up to this point, I've been more agnostic about MDPI, and have even published a comment in one of their journals (Education Sciences) in 2018. However, this article by Oviedo-Garcia is not the only recent warning I've received about MDPI. One of my colleagues sent a warning earlier this year about the rapid increase in the number of special issues published in the MDPI journal Sustainability, advising early career researchers in particular to be wary. If the blacklisting of MDPI gains momentum, it would be wise for all researchers to heed that warning, since publishing, reviewing, or guest editing there may well be a negative signal in the future.

*****

[*] I've completed a number of reviews for MDPI journals over the last few years, but stopped around 18 months ago, mainly because I could not commit to turning around a review within the seven-day time limit.
