Edit April 20th, 2021: thanks to Christos Petrou I found a bug in my code. I was considering both “Section” and “Collection” articles as Special Issues. The whole analysis has been changed to accommodate the new data. I also acknowledged in the text the arguments of Volker Beckmann, who develops a coherent defense of MDPI practices and disagrees with my overall take; and inserted, at the end of the piece, references to what MDPI (and traditional publishers) are doing for the Global South, thanks to input from Mister Sew, Ethiopia.
This post is about MDPI, the Multidisciplinary Digital Publishing Institute, an Open-Access only scientific publisher.
The post aims to answer the question in the title: “Is MDPI a predatory publisher?” with some data I scraped from the MDPI website, and some personal opinions.
Tl;dr: main message
So, is MDPI predatory or not? I think it has elements of both. I would call their methods aggressive rent extraction, rather than predatory. And I also think that their current methods and growth rate are likely to make them shift towards the predatory end over time.
MDPI publishes good papers in good journals, but it also employs some strategies that are typical of predatory publishers. I think that the success of MDPI in recent years is due to the creative combination of these two apparently contradictory strategies. One — the good journals with high quality — creates a rent that the other — spamming hundreds of colleagues to solicit papers, an astonishing increase in Special Issues, publishing papers as fast as possible — exploits. This strategy makes a lot of sense for MDPI, which shows strong growth rates and is en route to becoming the largest open access publisher in the world. But I don’t think it is a sustainable strategy. It suffers from basic collective action problems that might deal a lot of damage to MDPI first and, most importantly, to scientific publishing in general.
So that’s the punchline. Care to see where it stems from? In the following I will
- focus on the terms of the problem;
- develop an argument as to how the MDPI model works;
- try to give some elements as to why the model was so successful;
- explain why I think the model is not sustainable and is bound to get worse over time.
I’ll do so using some intuitions from social dilemmas and econ 101, a handful of personal ideas, and scraped data from MDPI’s website. The data cover the 74 MDPI journals that have an Impact Factor. They represent about 90% of all MDPI articles published in 2020 (somewhat less for the previous years, as MDPI growth has concentrated in its bigger journals). You can find the data & scripts to reproduce the analysis on the dedicated github page.
Ready? Let’s go.
Scientists are by and large puzzled by MDPI.
On the one hand, MDPI publishes journals with high impact factor (18 journals have an IF higher than 4) many of which are indexed in Web of Science. Many, if not most papers are good. Several distinguished colleagues in nearly all fields served as Guest Editors or as Editors for their journals, often reporting positive assessments. MDPI is Open Access, so it does not contribute to the very lucrative rent-extraction at the base of Elsevier & other traditional publishers. MDPI’s editing is fast, reliable, professional; publication on the website is swift, efficient and smooth — all things that are hard to say of other, traditional, publishers. Several MDPI journals are included in the rankings used by different states to evaluate research and grant promotions to academics, for instance Sustainability is “classe A”, the highest possible rank, in Italy (source: ANVUR).
On the other hand, MDPI is known for aggressively spamming academics to edit special issues, often in fields that are far from the expertise of the recipient of the frequent and insistent emails. Twitter is full of colleagues complaining that they get several invitations per week to contribute to journals they didn’t know existed and that lie outside of their domains, for instance here, here or here. MDPI even asked Jeffrey Beall, the author of Beall’s list of predatory publishers, to edit a Special Issue in a field that is not his own. It gets further than annoying emails, though. In 2018 the whole editorial board of Nutrients, one of the most prestigious MDPI journals, resigned en masse, lamenting pressure from the publisher to lower the quality bar and let in more papers.
This duality has generated debates in several different places, among others in two posts by Dan Brockington here and here, in a post on the ideas4sustainability blog by Joern Fischer, and in the Scholarly Kitchen blog.
A predatory publisher is one that would publish anything — usually in return for money. MDPI’s rejection rates make this charge hard to sustain. Yet MDPI uses some of the same techniques as predatory journals. So the question is simple: if you are a scientist, should you work with MDPI? Submit your paper? Review? (Guest) edit for them? Is MDPI predatory?
MDPI’s growth: how?
MDPI has had an impressive growth rate in recent years. It went from publishing 36 thousand articles in 2017 to 167 thousand in 2020. MDPI follows the APC publishing model, whereby authors of accepted articles pay an Article Processing Charge (APC) before publication. The APC has increased over time at MDPI. It can go up to more than 2000 CHF — MDPI is based in Switzerland — but there are several waivers and discounts. MDPI reports that the average APC per article in 2020 was €1180. Calculations by Dan Brockington show their revenue increasing from 14 mln $ in 2015 to 191 mln $ in 2020.
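As a rough sanity check, multiplying the 2020 article count by the reported average APC lands in the same ballpark as Brockington’s revenue estimate (currency conversions, waivers and discounts make this approximate, and the numbers below are the rounded figures quoted above):

```python
# Back-of-the-envelope revenue cross-check, using the rounded figures
# quoted in the text (approximate: currencies and waivers are ignored).
articles_2020 = 167_000   # articles published in 2020
avg_apc_eur = 1180        # average APC per article reported by MDPI

revenue_estimate = articles_2020 * avg_apc_eur
print(revenue_estimate)   # -> 197060000, i.e. ~197 mln EUR
```

About €197 mln, the same order of magnitude as the 191 mln $ Brockington calculates.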
To know more about them, see their Annual Report 2020.
How did MDPI reach such high levels of growth? By cleverly exploiting the publish or perish policy widespread in academia, the fact that several countries mandate or suggest Open Access publications, and the rise of formal requirements for tenure and promotion within academia. But this state of affairs is independent of MDPI, and anyone could have profited from it, yet no one else did. So how?
As far as I can see, the success of MDPI relies on two key pillars: a lot of special issues and a very fast turnaround.
An explosion of Special Issues
Traditional journals have a fixed number of issues per year — say, 12 — and then a low to very low number of special issues, which can cover a particular topic, celebrate the work of an important scholar, or collect papers from conferences. MDPI journals tend to follow the same model, except that the number of special issues has increased over time, to the point of equaling, surpassing, and finally dwarfing the number of normal issues. Moreover, special issues are usually proposed by a group of scientists to the editors of the journal, who accept or reject the offer. At MDPI, it is the publisher who sends out invitations for Special Issues, and it is unclear what role, if any, the editorial board of the normal issues has in the process.
Virtually all of MDPI’s growth in the last years can be traced back to Special Issues.
The figure below shows the growth in articles for the 74 journals with an IF at MDPI, dividing them between articles published in normal issues, special issues, collections and sections. Sections are a way to create several distinct branches of a single journal. Collections seem more similar to special issues, since they have their own collection editor. Special issues already accounted for the majority of papers in 2017 (it was not so earlier on, but I have article data from 2017 only), and grew rapidly from then on. While the number of normal issue articles increased 2.6 times between 2016 and 2020, the number of SI articles increased 7.5 times. At the same time, the number of articles in Sections increased 9.6 times, while Collections increased by 1.4 times. Articles in SI now account for 68.3% of all articles published in these 74 journals.
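The two statistics behind this figure are simple: a growth multiple per issue type and a per-year SI share. A minimal sketch (function names and the counts below are illustrative placeholders of mine, not the real scraped values):

```python
# Sketch of the growth-multiple and SI-share calculations used above.
# The counts are illustrative placeholders, not the scraped data.
def growth_factor(counts_by_year, start, end):
    """How many times larger the article count is in `end` vs `start`."""
    return counts_by_year[end] / counts_by_year[start]

def si_share(articles_by_type):
    """Share of one year's articles that appeared in Special Issues."""
    return articles_by_type["special_issue"] / sum(articles_by_type.values())

normal_issue = {2016: 10_000, 2020: 26_000}  # illustrative counts
print(round(growth_factor(normal_issue, 2016, 2020), 1))  # -> 2.6

mix_2020 = {"special_issue": 683, "normal": 200, "section": 100, "collection": 17}
print(round(si_share(mix_2020), 3))  # -> 0.683
```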
MDPI journals are becoming more differentiated, through the use of Sections, and they rely more and more on special issues.
The explosive SI growth is reflected also in the number of special issues, overall (table) and by journal (figure).
Across the 74 journals, there were 388 Special Issues in 2013, about five per journal. In 2020, there were 6756 SIs, somewhat less than a hundred per journal. The provisional data for March 2021 count 39687 SIs that are open and awaiting papers — about 500 per journal. Not all of them will go through — many will fail to attract papers, others will be abandoned by the Guest Editors — but in all likelihood SIs in 2021 will be much more numerous than in 2020.
SIs increase at all journals, in some cases exponentially. Some have unbelievably high numbers of SIs. In March 2021, Sustainability had 3303 open Special Issues (compared to 24 normal issues). That is 9 SIs per day, just for Sustainability. In 2021, 32 MDPI journals have more than one open SI per day, Saturdays and Sundays included.
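The per-journal and per-day rates above are simple divisions over the 74 journals and 365 days; a quick check with the figures quoted in the text:

```python
# Quick check of the per-journal and per-day SI rates quoted above.
n_journals = 74

per_journal_2013 = 388 / n_journals      # ~5.2 SIs per journal in 2013
per_journal_2020 = 6756 / n_journals     # ~91 SIs per journal in 2020
per_journal_2021 = 39687 / n_journals    # ~536 open SIs per journal, March 2021
sustainability_per_day = 3303 / 365      # ~9 open SIs per day at Sustainability
```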
The “Journal Growth” table in the data appendix at the end of this post reports the growth of articles and number of SIs for each MDPI journal that published at least 100 articles in 2020. It also shows the share of articles that appear in SI rather than in the normal issues. This share has followed different paths in different journals, mainly because of the rise of Sections and Collections, but is still very high for virtually all journals.
Are Special Issues a problem?
SIs are good because they pack together similar articles, increasing the readability of an ever-growing literature. They can contribute to the birth or growth of research teams, consolidate networks or help build new ones, and be a place to carry out interdisciplinary research, which is often squeezed out of traditional disciplinary journals.
But they ought to be special, as the name says, and they ought to be under the control of the original editorial boards. In most (if not all) non-MDPI journals, the SIs are managed by the journal’s editorial board, together with the guest editors. Not so at MDPI. It is the publisher that sends out the invites (often mass-sending them without much regard for the appropriateness of the invitations). This, coupled with the exponential explosion of SIs, marginalises the editors of the original journal. The people who created the reputation of the journal in the first place are sidestepped by an army of MDPI-invited Guest Editors.
While I will discuss later the implications of the SI model adopted by MDPI, I think the data prove beyond doubt that the most important MDPI journals are turning into collections of sometimes loosely related Special Issues at an accelerating pace. Normal issues are disappearing.
A coordinated reduction of turnaround times
Traditional publishers can be extremely sluggish in their turnaround. Scientists share horror stories about papers stuck in review for years. The situation is particularly bad in some fields (economics: I’m looking at you) but it is generally less than optimal.
MDPI prides itself on its very fast turnaround times. In the Annual Report 2020 MDPI reports an average time to a first decision of 20 days. This is extremely fast. A paper, after submission, must be assigned to an editor; this editor has to find an associate editor (or not), and then find referees. It is hard to find the right people to review a paper, and they might not be available. Once the referees have been found and have accepted, they need time to write their reports. Then the editor has to read the reports and make a decision. 20 days is really fast.
But MDPI does not provide aggregate statistics on the time from submission to acceptance. This includes revisions, and is crucial to understanding how the editorial process works. To get these data, I scraped MDPI’s website. The information is public — for each paper, we know the submission date, the date when the revised version was received, and the acceptance date. The aggregate results are shown in the figure below. Three main takeaways:
- there is not much difference between normal issues, special issues, sections and collections;
- MDPI managed to halve the turnaround times from 2016 to 2020;
- the variance across journals has gone down at the same time as the mean.
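The scraping step boils down to pulling three dates off each article page and differencing them. MDPI pages list the editorial history as plain text, e.g. “Received: 5 March 2020 / Revised: 20 March 2020 / Accepted: 25 March 2020”; the exact markup may differ, so treat the regex below as an assumption, and note that the page fetching itself (e.g. with `requests`) is omitted:

```python
# Hedged sketch of extracting turnaround times from an article page's
# text. The "Received: ... / Accepted: ..." pattern is an assumption
# about the page layout; fetching the pages is omitted.
import re
from datetime import datetime

DATE = r"(\d{1,2} [A-Z][a-z]+ \d{4})"  # e.g. "5 March 2020"
FMT = "%d %B %Y"

def turnaround_days(page_text):
    """Days from submission to acceptance, or None if dates are missing."""
    received = re.search("Received: " + DATE, page_text)
    accepted = re.search("Accepted: " + DATE, page_text)
    if not (received and accepted):
        return None
    delta = (datetime.strptime(accepted.group(1), FMT)
             - datetime.strptime(received.group(1), FMT))
    return delta.days

sample = "Received: 5 March 2020 / Revised: 20 March 2020 / Accepted: 25 March 2020"
print(turnaround_days(sample))  # -> 20
```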
But these are just means. Surely there is a lot of heterogeneity in the turnaround, and some papers will take their time. There could also be hidden heterogeneity by field — economists have been shown to have different reviewing times and practices than, say, virologists. Let’s have a look.
Below is the raincloud plot of the overall distribution (cut at 150 days for the sake of visualisation; this leaves out about 3% of the papers in 2016 but, in a further indication of shrinking turnaround times, only 0.3% of papers in 2020). On the left, each point is a paper. On the right, you see the kernel density estimation. There is heterogeneity, but it is rather low, and it is being dramatically reduced. The rather flat distribution of 2016 has been replaced by a very concentrated distribution in 2020. The distributions for normal and special issues are similar, with somewhat more variance for SI articles.
Are there differences by journal, or field? After all, we are talking here of different people, from different fields, and, thinking back to the SI explosion, of an army of heterogeneous and uncoordinated guest editors. Below you find the distribution of turnaround times for the main MDPI journals (cut at 150 days).
The really striking finding here is that there is virtually no heterogeneity left. The picture from 2016 is as you’d expect: fields differ, journals differ. Some journals have faster, others slower, turnaround times. Distributions are quite flat, and anything can happen, from a very fast acceptance to a very long time spent in review. Fast forward to 2020, and it is a very different world. Each and every journal’s turnaround-time distribution shrinks and hits an upper bound around 60 days. The mean is similar everywhere, and the variance is not far off.
This convergence cannot happen without strong, effective coordination. The data unambiguously suggest that MDPI was very effective at imposing fast turnaround times at every one of its leading journals. And this despite the fact that, given the Special Issue model, the vast majority of papers are edited by a heterogeneous set of hundreds of different guest editors, which should increase heterogeneity.
So: hundreds of different editorial teams, on dozens of different topics, and one common feature: about 35 days from submission to acceptance, including revisions. The fact that revisions are included makes the result even more striking. I am surely slow, but for none of my papers would I have been able to read and understand the reports, run additional analyses, read further papers, change the wording and/or the tables, and resubmit within one week of receiving the reports — unless the revisions were really minor. Can revisions be minor for so many papers?
About 17% of all papers at MDPI in 2020 — that is 25k papers — are accepted within 20 days of submission, including revisions. 45% — 66k papers — within 30 days. An (overly) detailed table in the data appendix at the end of the post shows turnaround times for all top MDPI journals. Its main point is to provide full information for particular journals, but also to highlight how little variance there is.
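These shares are just the fraction of scraped turnaround times falling under each threshold; a minimal sketch (the list of times below is illustrative, not the scraped data):

```python
# Sketch of the share-within-N-days computation over a list of
# per-paper turnaround times. The list here is illustrative.
def share_within(turnaround_times, threshold):
    """Fraction of papers accepted within `threshold` days of submission."""
    return sum(d <= threshold for d in turnaround_times) / len(turnaround_times)

times = [12, 18, 19, 25, 28, 30, 33, 41, 55, 90]  # illustrative, in days
print(share_within(times, 20))  # -> 0.3
print(share_within(times, 30))  # -> 0.6
```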
Is it bad to have very fast turnaround times?
Per se, no. Fast turnaround is clearly an asset, and MDPI shines here. They have cut all the slack given to referees and editors, and this could be a good thing — papers usually spend most of their editorial lives, unread, in the drawers of busy referees and editors (economists, I am again looking at you).
But cutting editorial and referee times is good for science only if the quality of the peer review can be kept up even at this very fast pace. A well written referee report might take several days. People are busy with their lives and do not have just referee reports to write; good scientists’ time is scarce. Editorial decisions can also take time. All but the smallest revisions require substantial amounts of time.
Again, I will expand below on the implications of these extremely short lags; for now, let’s note that lags have decreased for all journals in a coordinated way and are now at the lower limit of what is compatible with quality reviews plus revisions.
MDPI’s growth: why?
How could MDPI grow so much?
Whenever something sells so fast, the answer is trivial: demand. What MDPI sells is obviously in high and growing demand. It wouldn’t be enough for MDPI to try to sell it if there weren’t scientists picking it up. But what do they sell that competitors don’t?
MDPI sells high acceptance rates and very fast turnaround times, in special issues that are very likely to match your specific field and run by a colleague you know, within journals that have a decent-to-high impact factor and that count in the official rankings of several universities and public agencies in the western world. Exactly what scientists all over the planet, and especially in developing economies, need to keep competing in the publish or perish landscape of modern academia.
Getting into the details: is acceptance high? It sits at around 50% at most MDPI journals. While this is far from predatory practice (that would be 100%) and shows that MDPI journals do reject papers, acceptance rates are an order of magnitude higher than in most, say, economics journals.
Does MDPI cover all niches? The very fine grid of Special Issues guarantees that you’ll find an SI that seems tailored to you — if not, you can always accept MDPI’s invitation and tailor one to suit yourself. The Impact Factor is high, and the fact that the journals count in national rankings covers your back. It’s a winning game.
The Special Issue model is key in providing incentives to contribute effort and to submit papers. Doing some serious editing activity is a requirement for tenure in many countries, so mid-career scientists will find the offer appealing; it also offers an easy way to publish papers from the multi-disciplinary groups often required to obtain research funds. At the same time, you are more likely to send your paper to a previously unknown journal if a colleague you know runs the Special Issue. Overall, the Special Issue model generates incentives and increases trust in the system. It also motivates editors to convince fellow scientists to contribute.
The second key element is the high Impact Factor of the top MDPI journals. The system wouldn’t work if there were no quality indicator. The high IF is what covers scientists’ backs with respect to their funding agencies, employers, and colleagues. Yes, it’s not Nature. Yes, it’s not an exclusive top journal in my field. But look, it has a good IF and it is growing. [And look, Dr. X invited me.] Must be good.
The fast turnaround time and the high acceptance rate are the icing on the cake: you have very high chances of getting a publication in what for academics is the blink of an eye.
As a firm, MDPI should be admired for pulling off this extremely effective strategy. MDPI created a handful of journals with high IF from scratch. They devised the SI-based model. They managed to cut all slack times to zero and deliver an efficient workflow — mean times from acceptance to publication are down to 5 days in 2020, from nearly 9 days in 2016.
Still, I think this model is not sustainable, and stands a high chance of collapsing. It’s simple, really: it will likely collapse because journal reputation is a common pool resource — and MDPI is overexploiting it.
An unsustainable, aggressive rent extraction model
MDPI is sitting on a large rent — it controls access to something that is in demand and for which it faces little competition. The rent is created by perverse incentives in the academic system, where publication in high-IF journals carries a very large premium, since careers and research financing depend on it. Thanks to their past efforts, MDPI sits on several dozen journals with moderate-to-high IF and good reputations.
They could have chosen to continue business as usual, slowly increasing the number of papers and the reputation. This is what a traditional publisher like Elsevier would have done. Elsevier makes money by selling subscriptions. The value of the subscription is given by the gems in your journal basket. So you take care of the gems, and never let them lose their shine. But MDPI sells no subscriptions. It makes money per paper, and so chose instead to exponentially increase the papers at its gem journals.
MDPI’s strategy boils down to exploiting its rent, as fast and as aggressively as it can, through the Special Issue model. The original journals’ reputations are used as bait to lure in hundreds of SIs, and thousands of papers, in a growth with no limit in sight.
Despite the best efforts and good faith of MDPI and of all the guest editors and authors involved, the quality of journals that are inflated so much, and in this uncontrolled way, is bound to go down. Thousands of SIs generate a collective action problem: each individual guest editor has an incentive to lower the bar a tiny bit for his or her own papers, or those of his or her group. As we know from Elinor Ostrom’s Nobel Prize-winning work, collective action and common resources need careful governance lest the system break down. MDPI has chosen to give control of the reputation of its journals to hundreds of guest editors, with loose or no control from the original editors, who were responsible for the journal’s reputation in the first place and who are also, in the process, drowned by a spiraling number of additional board members. Even if the vast majority of those aim for quality, the very fast turnaround times make it harder. MDPI made things worse by sending mass spam invitations to nearly every scientist on the planet to edit special issues, thus increasing the chance of getting on board less-than-conscientious guest editors who will exploit the system. When the system proved to work (that’s 2018 and 2019), MDPI doubled down, sending even more invites (that’s 2020), and recently floored the accelerator, mass-inviting Special Issues (500 per journal in 2021).
This is not sustainable, because there cannot be enough good papers to sustain the reputation of any journal at 10 SIs per day. And even if there were, the incentives are set for guest editors to lower the bar. Reputation is a common pool resource, and the owner of the pool has just invited everyone in sight to take a share.
If MDPI were a music major, this strategy would amount to first putting in the effort to sign a dozen very popular bands, and then, once the reputation is in place, asking their friends (or indeed anyone with a guitar) to produce more records under the name of the rock stars, for a low fee. Musicians of all levels jumped in. U2 released one album this year, but there are 400 invited bands releasing their albums under the U2 franchise. Buy the records, they are all as good as the original!
I don’t know why MDPI is so aggressive in exploiting its rent. They could have done it more slowly, and still reaped substantial profits. The former CEO Franck Vazquez said in 2018, when the Nutrients Editorial Board resigned, that he “would be stupid to kill his cash cow” (source). This rings true: why exploit their rent so aggressively, knowing that reputation can decrease fast if journals are inflated? Still, the data show beyond doubt that MDPI is milking its bigger journals at increasing rates.
Economics offers some cues as to why aggressive rent extraction might be the best strategy for a firm in MDPI’s situation. In general, when you have a rent but feel that it might soon vanish, your best move is to suck it dry, fast. It is the economic reason why movie sequels suck, or why second albums are usually worse than the first. It is not the only reason, of course. But developing a good, say, teenage dystopian movie increases the likelihood that other producers will jump in on the fashion and copy you; and each further movie exploiting your trope decreases your rent, giving you incentives to hasten the production of your sequel, and market it when it’s not ready. If it is easy to copy your model, you’d better be your own cheap imitation brand, rather than waiting for others to take that role (incidentally, this mechanism is at work also in the fashion industry, as masterfully described in Roberto Saviano’s Gomorrah and confirmed by this criminology paper).
Why would MDPI think that its rent will vanish? I don’t know. I suspect it might have to do with the recent move to Open Access of the main traditional publishers, which are opening OA subsidiaries of their main non-OA journals. Or with the increasing acceptance of repositories such as arXiv as legitimate “publications”, which would make the whole “publisher” concept history. But that’s just a hunch; I really have no clue.
Whatever the reason, I think that MDPI is aggressively exploiting its rent. It is not predatory in the sense of publishing everything for money. Still, it is a misleading and dangerous practice. The reason why academics are puzzled by MDPI, and why you can find people defending the quality of the papers and the special issues they read and/or were involved with alongside people who completely loathe MDPI and deem it rubbish and predatory, is that MDPI is both things at the same time.
The problem is that bad money always crowds out good. With MDPI pushing the SI model faster and faster, the balance will shift, sooner rather than later, towards deeming MDPI not worth working with.
What if I am wrong?
I might be wrong. There are many good sides to the OA model that MDPI adopts.
It is more inclusive, for one. It breaks the gate-keeping done by the small academic elites that control the traditional journals in many disciplines. It probably makes no sense in the 21st century to limit the number of articles in a journal to, say, 60 a year, a constraint inherited from the 20th century, when journals were printed and space was at a premium; breaking this (fictional and self-imposed) quantity constraint is a good thing.
All the above rings even truer for academics from the Global South, who face even higher hurdles to publishing in the rather closed and limited traditional journals. Mister Sew from Ethiopia has provided me with several references on how, and how much, traditional publishers exclude researchers from the Global South. Non-Euro/Western researchers account for as little as 2% of published papers, and have virtually no representation on editorial boards. Some traditional publishers deny access to IPs from several African countries. References here, here, here and in a special collection of articles on the role of OA for the Global South here. MDPI features a much higher share of editors and papers from the Global South, and is as such a liberating force. At the same time, MDPI publication fees are very high and unattainable for researchers from the Global South.
The special issue model is also very good in many respects. It has the potential to be a clever way of organizing science in the digital age. We still live within the empty shells of the 20th-century way of publishing science, which gave us another form of rent extraction: traditional publishers with their incredibly high margins on the backs of scientists. A leaner, open-access form of knowledge exchange, organized around topics rather than journals, can be promising.
Fast turnaround times across all journals could be a sign of real productivity and logistics advances at MDPI, without affecting the quality of reviews. That would be extremely welcome, and a breath of fresh air in academic publishing.
Volker Beckmann in the comments also made several arguments as to why, in his view, MDPI quality is not going down and will not decrease in the foreseeable future. Check his points out in the comments.
If you have more thoughts on this and think I’m wrong, I am happy to hear them. If you are from MDPI and want to reply, I am even happier. This (too) long piece is meant as an exploration of an intriguing topic, and as a way to scratch a personal itch I got last summer, when I was invited to guest edit for Nutrients and was surprised to see how MDPI could be considered both very good and very bad by colleagues.
Methods, data and code
The scripts to reproduce the analysis and the data are available in the dedicated github repo. The scrape was performed in March 2021 for the number of SIs, and in April 2021 for the turnaround times. All data are publicly available on MDPI’s website; I just collected and analyzed what is otherwise public. I would like to thank Dan Brockington for extensive discussion, encouragement, and comments on an early draft of this post. Thanks also to Joël Billieux, Ben Greiner, David Reinstein, and all the colleagues on the ESA-discuss forum for discussions on MDPI and publishing.
Additional Tables and Data
Journal growth, Special issue growth and share of SI articles on total articles for selected MDPI journals
Turnaround times at selected MDPI journals
165 thoughts on “Is MDPI a predatory publisher?”
Great and interesting article. I am an associate editor at an MDPI journal, and what is very professional is the organisation of the reviewer-finding process, which may be part of the success story in aligning turnaround times. The MDPI IT system suggests lots of reviewer names from a large database, from which the associate editor either selects or suggests others. Then — in theory — an MDPI fellow approaches the reviewers to ask whether they would be willing to do the review.
Recently, I was not at all convinced by the expertise offered by the database and suggested 8 or so scientists whom I consider peers to be approached, none of whom were listed in their database.
The reviews came back very quickly and were very short and of poor quality. I realized that none of the reviewers I suggested had been approached to do a review. I asked why, and learned that the MDPI team had already sent out the manuscript for review to their preselected reviewers before they asked me.
I was very annoyed about this, and they apologized for the misbehaviour, but I doubt this was a mistake rather than common practice when things may take too long according to their rules (I was not late with my response, but responded perhaps 2 days after I received the request). I have not resigned yet, but I will carefully watch future review processes and make sure that no pre-decision has been made.
One thing that I noted, and have anecdotal evidence of as well, is that turnaround times are “rigged”.
What would be a major revision at another journal (sometimes) becomes a “reject, but resubmit and get the same reviewers”. The submission time then becomes that of the revised version, not that of the original version… and of course this makes the acceptance time much shorter than otherwise possible.
It is a bad practice (let’s call it false advertisement), but at least it should not hurt the science.
Anyway, thanks for this article. I had been pondering the issue a lot, given that I have published in an SI and acted as a reviewer (with some review requests very far off the mark).
Thanks a lot for your interesting analysis. You mainly argue that it’s impossible to handle so many Special Issues and keep up the quality. I disagree.
MDPI has developed rules that work effectively and can secure quality of Special Issues even at large scale. You did not recognize these rules in your post.
You wrote: ”Thousands of SI generate a collective action problem: each individual guest editor has an incentive to lower a tiny bit the bar for his or her own papers or those of his or her group.”
This diagnosis is wrong.
Among others, Guest Editors are not allowed to make any decision on manuscripts that are written by themselves, by members of their group, or by close peers (see: https://www.mdpi.com/editors). Decisions on those manuscripts are made by the Editors-in-Chief or other Editorial Board Members (EBMs). As an EBM of Resources and Land, I regularly make these kinds of decisions and secure the quality, but also support the development of Special Issues. I’m happy with that.
You argue that Guest Editors (GEs) devalue the work of the “original” editors. You wrote: “MDPI has chosen to give control of the reputation of its journals to hundreds of guest editors, with just loose or no control from the original editors, who were responsible for the journal reputation in the first place and who are also in the process drowned by a spiraling number of additional board members.”
This diagnosis is wrong again.
Special Issues are important right from the start of a journal, and a central task of an EBM is to support the development of Special Issues. GEs and EBMs co-evolve at MDPI journals. To be sure, these are distinct positions, and GEs and EBMs have different roles. Thus GEs are not necessarily EBMs and EBMs are not necessarily GEs, but there is quite some overlap. As already said, as an EBM I support the development of Special Issues and I’m happy with that. One reason is that I will also be supported: as a GE I get support from other EBMs if I, members of my group, or close peers submit a paper to a Special Issue I’m editing. By support I mean that some other EBM is willing to make rigorous editorial decisions. By the way, in these cases GEs do not know the identity of the decision maker (EBM) during the process. That’s very important.
And there is another way to escape the tragedy of the commons. Maybe you recognized that the names of the Academic Editors (the decision makers) are now written on the manuscript. This creates clear responsibility and accountability.
You also claim that many Special Issues are recruited by “predatory” mass emails. Actually, I doubt this a bit. I was invited for one Special Issue and initiated three more without any further invitation. This seems to be a typical pattern: GEs have good experiences and from time to time initiate more Special Issues in their field of expertise on their own. You can see this pattern if you scroll through the list of EBMs. It tells a different story than yours.
As a GE for MDPI journals you get extraordinary support from the in-house Assistant Editors, but also from EBMs. From my experience I would say that’s the main reason why Special Issues at MDPI journals are so popular and developed into the growth engine of many MDPI journals.
Despite all the growth, please note that MDPI’s share of the total number of publications was only 3.74% in 2020. See: https://www.scilit.net/statistic-publishing-market-distribution
Thank you for your feedback.
Volker, since my tweet has attracted considerable attention I took the liberty to advertise your comment and your different view on twitter too, so it gets the same visibility as my post. I will think more about your points and you surely have much more insight since you see things from the inside. The tweet is here: https://twitter.com/PaoloCrosetto/status/1382049674111123465?s=20
Excellent analysis. Could we interpret this as seeing MDPI as some sort of publication pyramid scheme? At some point it is bound to collapse, and we could see four potential scenarios: MDPI goes full predatory; MDPI downsizes significantly by focusing only on a few flagship journals (most journals lose their IF); MDPI gets sold; or MDPI becomes a cheaper version of PLOS One, but as just one journal. MDPI’s strength was that it became a voice for hundreds and thousands of scientists from the Global South who, for various reasons, could not get into subscription-based journals; the sad thing is that they are now being exploited by the numerous APC charges…
Paolo, please note that your analysis also ignores developments in the broader publication market.
Ever since I was invited in 2013 to guest-edit a Special Issue in Sustainability, I have followed the development of the publisher. At that time MDPI published 9,852 articles. It was not listed among the 20 largest publishers, but it reached rank 11 for open-access articles. In contrast, Elsevier published 591,872 articles in 2013, of which 96,552 were open access. Elsevier was the leading publisher worldwide, and also the leading open-access publisher. In 2020, MDPI published 163,831 articles and became the fourth-largest publisher and the largest open-access publisher. Elsevier published 797,348 papers in 2020, of which 140,354 were open access. Elsevier is still the largest publisher, but now ranks only third in terms of open-access publishing.
Thus between 2013 and 2020 Elsevier increased its annually published papers by 205,476, whereas MDPI increased by only 153,979 papers. Sure, in terms of growth rate the MDPI story is impressive; however, in absolute terms the growth was less than that of the market-leading publisher.
This is to put things into relation.
The general market for publications is growing for multiple reasons. The number of papers published increased from 2,891,795 in 2013 to 4,376,690 in 2020. Thus MDPI managed to grow faster than the market, but the market is still very large and MDPI’s share of it was only 3.7% in 2020. There is thus still a lot of potential to grow in a competitive market.
Please note that all of the above data were retrieved from SciLit, a scholarly publication search platform maintained by MDPI. https://www.scilit.net/ This platform also contains under “Rankings” very useful information about the publication market and about publishers and individual journals.
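To put the comparison on a common footing, here is a quick back-of-the-envelope check using only the figures quoted in this thread (the compound annual growth rates are my own derivation, not numbers reported by SciLit):

```python
# Figures quoted in this thread (attributed to SciLit)
mdpi_2013, mdpi_2020 = 9_852, 163_831
elsevier_2013, elsevier_2020 = 591_872, 797_348
market_2020 = 4_376_690

def cagr(start, end, years):
    """Compound annual growth rate over the given number of years."""
    return (end / start) ** (1 / years) - 1

mdpi_rate = cagr(mdpi_2013, mdpi_2020, 7)          # roughly 49% per year
elsevier_rate = cagr(elsevier_2013, elsevier_2020, 7)  # roughly 4% per year

# Elsevier added more papers in absolute terms, but MDPI grew far faster relatively
print(f"MDPI CAGR 2013-2020:     {mdpi_rate:.1%}")
print(f"Elsevier CAGR 2013-2020: {elsevier_rate:.1%}")
print(f"MDPI market share 2020:  {mdpi_2020 / market_2020:.2%}")
```

This is just arithmetic on the cited totals; it reproduces the ~3.74% market-share figure and makes the "impressive growth rate, smaller absolute growth" point explicit.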
You will see that phases of exponential growth like Sustainability can also be found with the Journal of Cleaner Production or Science of the Total Environment.
To be sure, there is no exponential growth that can be maintained forever. See the developments of Plos One or Scientific Reports.
As I said, you need to put things into relation.
By the way, SciLit is totally free of charge and contains very useful information for scientific communities. It was invented and is maintained by MDPI. Now comes the question: why should a company that according to your analysis is “aggressive rent extracting” or will “shift towards more predatory over time” set up such a platform? Why should they set up Preprints? https://www.preprints.org/ Why should they set up an Encyclopedia? https://encyclopedia.pub/
I think your analysis reveals a deep misunderstanding of the company’s objectives.
Your article is very interesting. For some time I have been trying to understand this phenomenon, which concerns not only MDPI but also Frontiers and the journals spun off from important ones (Scientific Reports, for example).
I wanted to point out an important factor that you may not have considered: there is a positive feedback effect of these papers on the impact factor, similar to inflation in failing economies that print paper money.
More articles published = more citations placed on the market (1:50 as a minimum). These citations increase the impact factor of all journals, but in particular of the MDPI ones, which self-cite. But it can’t work! Sooner or later the system collapses! In the international academic world it is now clear that you cannot build a CV with publications in these journals; it only works in countries with poor career-advancement systems such as Italy or the emerging world.
As an editorial assistant for two of MDPI’s special issues in the humanities, I can only report on the part of the elephant with which I am familiar — and my experience is that the articles we published were earnest and carefully prepared contributions to their respective fields. There has been an implication that special issues are in some way underhanded — but why should we not take advantage of the unlimited capacity of the internet to explore our vast universe in a way that has never before been possible?
G. W. (“Glenn”) Smith
Dear Volker Beckmann,
Your point about product extensions (SciLit, Preprints, etc.) may contradict Paolo’s ‘feeling’ (and mine too) that MDPI is milking the cow (or even killing the goose that lays the golden eggs). Or not…
Since you appear to know more about MDPI’s strategy, how do these crazy numbers of Special Issues fit in a long run strategy?
Do you also have an explanation for the fact that, in the last four months or so, I keep receiving emails such as this:
“Given your impressive expertise, we would like to invite you, on behalf of the Editor-in-Chief, to join the Editorial Board of Digital as our Topic Editor”. I have no expertise either in the subjects that should be covered by Digital or in the other handful of journals for which they keep spamming me to become an editor.
My replies to all of you above. First, thanks for commenting, and providing insights and debate.
Thanks for the insider view. Your comments add to mounting anecdotal evidence that, in order to deliver on their promise of short turnaround times, MDPI staff are heavily involved in editorial handling and sometimes in decisions. This is worrisome. They also cut corners: different people on Twitter have likewise pointed out that if a revision takes too long, it is rebranded as a new submission. This is mostly harmless, but it also depends on how the rejection statistics are computed.
Thanks for your insight. I might be wrong with respect to the details of the process. For instance, on the level of control that Editorial Boards have over Guest Editors, you surely know more than me, and I take your point — there is probably more control than I thought. On the amount of spam, though, I am rather sure that you don’t see it because you are not a target of it. Several colleagues went public showing inboxes with up to 4 or 5 invitations per week, often in unrelated fields. MDPI also seems to send direct messages on Twitter. The spamming strategy is clearly there and is clearly increasing in scope and volume. As for returning guest editors, I have no reason not to believe you, so I’ll take your point: many SIs may be edited by returning guest editors who had a good experience. But that cannot explain the exponential growth. In 2021 there are 7 times as many SIs as in 2020. How many returning GEs can there be, and can all of them have returned 7 times over?
My point here is not that quality is currently bad: it is that, with the level of invitations and growth they are trying to achieve, it is most probably going to decrease. I can see we disagree on this, but I’ll stick to my point until I see evidence to the contrary.
Andy: I think it has elements of a pyramid scheme, but it is not one per se. They are using aggressive marketing strategies for their own journals; of this I am now sure. And aggressive marketing also means discounts and efforts to get into scientists’ networks… so it looks like a pyramid. Re. the Global South, I have now added a box up top with references. Indeed, MDPI is doing a lot more for the South than traditional publishers. That still doesn’t mean they should hijack their own journals for short-term profit.
Andrea: thanks for the analogy, spot on.
Regarding the SI issue, I think the analysis exaggerates. Many papers published in SIs of MDPI journals were not originally submitted to an SI.
Thanks for taking the time to analyse this data. And for the nice plots!
The homogeneity with which they (MDPI) reduce the submission-to-acceptance time across the board comforts me in my thought that there is deliberate manipulation here. From my experience as a one-time guest editor: when a paper comes back with major revisions, they explicitly ask you to label it as “Reject and resubmit” if the major revisions cannot be addressed in less than 10 days or so, thus resetting the time between submission and acceptance.
In my view, MDPI’s success came from being early in exploiting the lack of OA alternatives when these were rare. For instance, in my field of Remote Sensing, several colleagues and I looked positively at the arrival of a fully OA journal dedicated to the field (even before it had an IF). This effectively encouraged many to send decent papers there (me included). A couple of targeted special issues curated by high-profile scientists further increased confidence in this as a valid outlet. There was even one that provided early access to a very useful dataset in Earth System science, which stimulated many good papers based on that data. The IF arrived and encouraged more people to submit work there.
Yet, as the success of the journal (called “Remote Sensing”) rose, it also became evident that it was publishing studies of very variable quality. As a reviewer, I realized that even when rejection was recommended (with good arguments), the paper would still be published. What became clear was that if a paper was submitted there, there was a (very) high chance it would end up being published sooner or later (and if later, the “reject and resubmit” strategy mentioned above would ensure it was still done in a short amount of time). When early-career scientists and students are required to have a list of papers in their CVs, MDPI ends up being a pragmatic solution for them.
However, I think this is now backfiring. Not necessarily for MDPI, which is still cashing in, but for the researchers who have inflated their CVs with such publications. When I see a CV with many MDPI papers listed, I cannot help but think negatively of it, despite knowing that some of those papers might be good.
Another smart tactic of MDPI has been to exploit the Special Issue card, as you have shown. Awarding a badge of “guest editor” to early-career scientists is a very good incentive for them to do the work (for free), including inviting and convincing peers to submit good papers, and it also looks good on an academic CV. However, now it seems that literally anybody can be an MDPI “guest editor”, so the alleged prestige that went with it has been lost too. This further supports your suggestion of an aggressive rent-extraction model.
If you want to understand MDPI as a company, please check out this blog
Thank you very much for your response. Would you be so kind and disclose your full name? Unfortunately, I can’t find any academic named Beijo Kense.
As to your questions.
If you call the number of Special Issues “crazy”, you are following the mental model of traditional journals. Working with Special Issues is a building block of MDPI’s journal development. Take Digital, a newly established journal and the example you mentioned. https://www.mdpi.com/journal/digital They have published 8 papers in total to date. One paper is part of a Special Issue. They currently have 8 Special Issues open for submission. https://www.mdpi.com/journal/digital/special_issues
This journal and its Special Issues will co-evolve. As the journal grows, the number of Special Issues grows (or, you could argue, the other way round). That’s what I mean by co-evolving. Sustainability has published 4,069 papers in 2021 to date. How many Special Issues are open? According to Paolo’s research it’s 3,178. The principle does not change. Special Issues allow you to manage small projects as parts of a larger project. This works at a small scale, and it works at a large scale. Sure, you need rules that prevent abuse of the system. As I explained, these rules are in place at MDPI.
Why does MDPI invite researchers to edit Special Issues? You can also see it in this way: They want to support scientists all over the world. They want to accelerate open access to trusted knowledge. They want to change the game of publishing. They want to grow into a major publisher. You may want to read the interview with the MDPI’s CEO: https://www.mdpi.com/anniversary25/ceo
Is there any limit to growth? Sure, exponential growth can’t continue forever. However, the largest journal in 2020 was SSRN Electronic Journal with 39,238 papers, followed by Scientific Reports with 22,437 papers. See https://www.scilit.net/statistic-journal Sustainability, the biggest MDPI journal, published 10,669 papers in 2020. So there is still some room for growth. The same holds true at the level of publishers: in 2020, MDPI published 163,831 articles and became the fourth-largest publisher and the largest open-access publisher. Elsevier, the market leader, published 797,348 papers in 2020, of which 140,354 were open access. https://www.scilit.net/rankings So again, there is still room to grow. Nobody knows precisely where the limits are.
As to the emails. Yes, all researchers get many emails every day from different publishers inviting them to submit to journals, review for them, or become editors. I agree that many of these emails are spam: they are not targeted, and they are more or less clearly from “predatory” publishers. I delete most of them immediately. Sometimes I check whether the publisher is trustworthy. However, I also get many invitations to submit or review a paper from traditional journals and publishers, e.g. Elsevier, Springer, Wiley, Sage, etc. I consider them, but often I ignore (invitations) or decline (reviews) them due to time constraints or because it simply does not fit my profile. Some, however, I accept.
Serious invitations to become a (guest) editor I have received only from MDPI, Frontiers, and Sage journals so far. MDPI and Frontiers use similar but slightly different inclusive editorial models: MDPI works with Special Issues and Topics, Frontiers with Research Topics. When I receive these invitations, I consider them carefully, but often again it does not fit or I don’t have the time. Some, however, I have accepted. Sure, sometimes these invitations really don’t fit my profile or interests (there is always some fuzziness unless you know the people very well). In these cases, I just ignore or decline. However, I’m happy to know that these options exist; without their emails, I perhaps wouldn’t know. To be sure, without their invitation I wouldn’t have started my first Special Issue in Sustainability in 2013. For me this was the beginning of a process of learning a lot about academic publication from the perspective of an editor.
Thanks Volker Beckmann.
Emails as the one I cited, I only got from MDPI. As you can see from my profile, I have no ‘expertise’ in the fields of these journals:
And I’ve received emails from MDPI telling me that I am an expert on these matters and suggesting I become a Topic Editor. I get emails from Elsevier, T&F, etc. because I opt in (usually when submitting a paper) to receive news on the fields I’m interested in, so I don’t feel spammed by them.
I concur with Paolo’s view that these massive ‘invitations’ will certainly decrease the quality of published papers, because you’ll get lots of Topic Editors with little or no knowledge of the field and a poor publishing or review record. This, combined with a questionable review policy (on Publons you can find people reviewing an MDPI paper every other day), results in manuscripts being really poorly edited.
My profile: https://scholar.google.pt/citations?user=wuGfIBcAAAAJ
Yes, these specific kinds of emails you could probably only get from MDPI, and only in the last four to six months.
The reason is that the position of Topic Editor was only recently introduced, so MDPI journals obviously sent out lots of invitations to advertise this new position. Indeed, almost every MDPI journal now has not only an Editorial Board, but also a Topic Board and a Reviewer Board.
See the example of Sustainability:
Related to the Topic Board, MDPI journals now also offer the possibility to edit not only Special Issues and Topical Collections, but also Topics, which can span multiple journals. See:
Within the list of topic editors you can find researchers who are also potentially willing to collaborate on a specific topic. That’s a great idea. Let’s see how this will develop.
Thus, this new position, which was only introduced recently, probably explains the many emails that were sent.
Obviously, you did not accept any of the invitations you got. This is exactly what I would expect in your case.
As one colleague put it recently: “researchers have brains”! They can think, they can judge. Why should a researcher accept an invitation which is not in her/his field of expertise? Why should a completely inexperienced researcher accept such an invitation? I don’t think that this will happen often, if at all.
What if you had received an invitation from Tourism and Hospitality?
But yes, there are some risks in having many and diverse editors. You need good mechanisms in place to secure quality. I mentioned a few that are in place at MDPI journals. One is that the names of the editors are listed on the paper, which creates individual responsibility and accountability. This is also done by Frontiers journals and by PLOS One, among others (both work with very large editorial boards). From my experience, this mechanism alone works quite well to secure quality. And there are more mechanisms in place.
This is a much-needed analysis of MDPI journals. MDPI is especially damaging to early-career researchers: many, instead of going for top/reputed journals in the field, just publish in MDPI journals and then brag about their number of publications. I know many colleagues whose papers got rejected by reputed journals in my field; they sent the same articles, without any modification, to MDPI journals and got accepted in a matter of days. I won’t say that all MDPI journals are predatory, but their practices are highly questionable (one colleague’s paper got rejected by at least 5-6 quality journals but was accepted in a special issue of a so-called high-impact-factor MDPI journal for which his own professor was a guest editor). Also, impact factor alone can never be a measure of a journal’s reputation; open-access journals usually have a higher chance of a higher impact factor.
Thank you very much for the interesting analysis and (at least for me) novel view on the issue.
I work in environmental sciences; my WOS record currently counts 50 publications, of which about 30% are in MDPI journals (silver goes to Elsevier with 20% and bronze to Springer with 16%). Thus the reputation of MDPI matters to me.
Positive points for MDPI:
1. The open-access fees are reasonable in comparison to Springer and Elsevier. Some providers of research funding demand open-access publication, and grants are not unlimited.
Here I would propose a different view on what is “predatory”. Elsevier and Springer have systematically gathered journals previously published by universities or other research institutions, which used to be available online without any payment (in fact free open access; examples include IJEST, Folia Microbiologica, Chemical Papers…). Now the big players sell the literature back to us researchers within (non-free) subscriptions, and as a bonus (for them only) they offer highly expensive open access. Reviewers work for free, editors mostly for free, and the revenues of these publishers are high and are not returned to science in any great proportion (an exception is e.g. Oxford). I consider this more predatory than what MDPI does.
2. Speed of publishing.
MDPI really mastered the review process and largely eliminated the unnecessary “waiting for action”. I have served as guest editor of two special issues for MDPI, and I serve long-term as an associate editor of the Springer journal IJEST. There we ask reviewers to finish the review within two weeks (while MDPI demands 7-10 days); nevertheless our total review process takes many months, while MDPI’s takes weeks. The difference is obviously not in the reviewers but in the editorial process, where the majority of the time is wasted. Professional editors handle manuscripts immediately, while I, as a volunteer editor, do it in my spare time, of which I do not have much… The editor-in-chief usually distributes new submissions among associate editors in clusters of 4-8 submissions. I am not able to process them (i.e. pre-check, reject on average 1/3 directly, and find reviewers for the rest) in less than one week. MDPI also has a much better system for finding reviewers, which enables searching by much more precise parameters, while in Springer we have quite broad indicators of reviewers’ expertise. This leads to submissions being sent to inappropriate reviewers, a high decline rate, and consequently longer times. At MDPI the professional editor also does part of that job for you; the guest editor can add other reviewers, decline some, or change the order. This saves a lot of time and speeds up the process.
3. Rejection rate. The rejection rate of MDPI is obviously lower compared to established journals. Thus, if you have had your manuscript rejected a few times, MDPI presents a higher chance of acceptance and publication. In an academic career there are many occasions when you need a paper published (Ph.D. defence, the approaching end of a project, habilitation requiring a defined minimal WOS record, etc.), when the combination of rejection rate and speed of publishing is useful.
This, however, does not necessarily mean lower-quality research. Sometimes I have the feeling that the top journals act as an “elite club”, where the young, the southern, the eastern, etc. are preliminarily declined. There is also the mantra of super-novelty, which, for example, case studies do not satisfy (and it is useful if they are published). Let the readers decide.
Also consider that not all researchers have ideal instrumentation, a complete team, or the means to repeat experiments 7 times (ideal for statistical evaluation), etc. These have a low chance in competition with top teams holding large grants; however, that does not mean their data are not worth publishing.
Why am I a bit afraid?
In general, my concerns about the review process at MDPI are no greater than at other, non-MDPI journals.
Experience as an author: The majority of my MDPI submissions had three or four reviewers, and with one exception they were all clearly qualified in the field; their comments were useful and led to improvement (or deserved rejection). This is more than at other journals, where the standard is mostly 2; I have been accepted at Springer and Elsevier (Q1 and Q2 journals) even with a single review.
When I published in “my” special issue, the review process was handled by editorial board members, so there was no conflict of interest.
Experience as a reviewer: I have reviewed about 50 manuscripts for MDPI, and in the majority of cases “reject” meant reject (in exceptional cases I was outvoted by other reviewers and a major revision or two rounds of revision were carried out).
Experience as an editor: Here I really like the system for selecting reviewers. What I do not like is the fact that, when the reviewers agree, the professional editor sends the manuscript for revision (and, in the case of major revision, for a second review round) WITHOUT the involvement of the academic editor / guest editor. I can only finally confirm the acceptance (rejections are carried out by professional editors) or initiate a new round of reviews. Here more power is left to the reviewers than to the editors.
Compared to other journals, I can see a bit more room here for “review leaking”, i.e. an inappropriate guest editor, or a combination of several inappropriate reviewers incapable of finding the major drawbacks (mistakes, frauds…) of a manuscript, letting a poor or even fraudulent manuscript be published. However, overall I consider the review process correct and effective.
What is also a bit suspicious is the practice of reject + resubmission. This can lead to a higher reported rejection rate and shorter reported editorial times compared to reality.
I do not consider the mechanism of special issues problematic, as long as the publisher selects appropriate editors. Compiling compatible papers into one issue increases visibility. Also consider the fact that many papers go through “standard submission” and are assigned to a special issue later. And consider that, with immediate publishing, the issue, whether standard or special, does not have much meaning: you can always browse by time of publication or topic, or just search for what interests you.
For the conclusions I should also add that I am a big fan of open access, which for me dates back to the ’90s when, as a student, I had to visit many university libraries to reach all the published but generally unavailable literature. Paying 40 USD for a single paper is impossible for a student, and pirate services like Sci-Hub were not available. As a researcher and also part of university management, I consider open access a very good solution for the availability of literature, its overall cost, and also for disseminating results to a wider audience than just researchers (professionals, industry, media…). MDPI offers a good product for cheap money. As long as the editorial / review process is kept rigorous (i.e. until significant frauds are revealed) I will keep submitting to MDPI.
Please note that with the new data your story about the exceptional role of Special Issues for MDPI journals growth is collapsing (at least for the ex-post analysis).
According to your calculation, the number of papers published in Special Issues increased 7.5 times between 2016 and 2020. All other papers increased 7 times in the same period (if I calculated correctly from your diagram). This is almost the same growth rate.
Thus, it is rather as I already stated: papers in Special Issues and in other parts of the journals co-evolve.
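The "almost the same growth rate" claim can be made concrete by annualizing the two growth factors quoted above (my own arithmetic, assuming the 7.5x and 7x factors from the diagram are accurate):

```python
# Growth factors 2016-2020 quoted above: Special Issue papers x7.5, all other papers x7
si_factor, other_factor = 7.5, 7.0
years = 4  # 2016 -> 2020

# Annualized (compound) growth rate implied by each factor
si_rate = si_factor ** (1 / years) - 1
other_rate = other_factor ** (1 / years) - 1

print(f"SI papers:    {si_rate:.1%} per year")     # roughly 65% per year
print(f"Other papers: {other_rate:.1%} per year")  # roughly 63% per year
```

On these numbers the two streams differ by only a few percentage points per year, which is the point being made: Special Issue papers and other papers grew at nearly the same annual pace.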
Papers assigned to Sections and Collections must be treated as “normal” papers. Sections and Topical Collections are structures which are permanently open for submission, unlike Special Issues (and newly Topics).
Section Editorial Board Members (and most Collection Editors) are general Editorial Board Members. This is often not the case for Guest Editors.
So your analysis is partly based on a misperception of “normal” papers and “normal” issues.
What is normal?
You seem to refer to traditional (print) journals which can be quite misleading when looking at modern journals.
MDPI journals publish papers continuously as soon as they are ready. However, every paper is officially assigned to an issue. MDPI journals publish either 2, 4, 12, or 24 issues of a journal per year. The number of issues is related to the journal size. Also the number of Sections is related to the journal size. See
Challenges 2 Issues: https://www.mdpi.com/2078-1547/11
Oceans 4 Issues: https://www.mdpi.com/2673-1924/1
Land 12 Issues: https://www.mdpi.com/2073-445X/9
Sustainability 24 Issues: https://www.mdpi.com/2071-1050/12
The issue number appears on the paper, in addition to the volume number and the article number. That’s very traditional, except that article numbers are used instead of page numbers.
Now, one issue is composed of papers assigned to Sections, Collections, Topics, Special Issues or of papers that are not assigned to any of those. This creates structure within an issue. And in general it creates structure within the online journal.
You define papers that are not assigned to any of these categories as “normal”. In doing so you follow the logic of traditional (print) journals which simply does not fit to new types of electronic journals.
New electronic journals can easily create overlapping structures. A paper belongs to an issue, but also to a Special Issue, which belongs to a Section. A paper might belong just to an issue and a Section. Or it belongs to an issue and a Topical Collection, which belongs to a Section. A Topic can cover different journals. There are quite a few possible combinations. In any case, the article identifier for MDPI journals is related to the issue, not to a Section, Topic, or Special Issue.
Thus the whole idea that there are “normal” papers or “normal” issues at MDPI journals is quite odd.
How to distinguish the different structural elements?
Sections – give general structures to the journals, often have a Section Editor-in-Chief, a Section Editorial Board, a Section Topical Board, are constantly evolving, no deadline for submission, Topical Collections and Special Issues can belong to Sections.
Topical Collections – have Collection Editors (who are often Editorial Board Members), focus on a broader topic, no deadline for submission.
Special Issues – have Guest Editors (who might or might not be Topic Board or Editorial Board Members), are treated as projects with a defined submission deadline, papers are published immediately once they have successfully passed the review process; Special Issues evolve over time until they are closed; a Special Issue can be turned into a printed book.
Topics – have a Topic Editor-in-Chief and Topic Assistant Editors, cover different MDPI journals simultaneously, are treated as projects with a defined submission deadline, papers are published immediately once they have successfully passed the review process; Topics evolve over time until they are closed; Topics can be turned into a printed book.
Thus, the distinction between Special Issues and normal issues doesn't fit the reality of modern MDPI journals. It's more complex.
Or it's very simple: every paper is assigned to at least one Academic Editor (this can be an Editorial Board Member, Collection Editor, Guest Editor, or Topic Editor), who is responsible for making decisions. Her/his name appears on the published paper. Editors are not allowed to make decisions regarding their own submissions, submissions from members of the same institution, or submissions from close peers. Editors-in-Chief are decision makers of last resort (except for their own submissions, submissions from members of the same institution, or submissions from close peers).
P.S.: Please note that Frontiers has a similar but slightly different structure. Frontiers journals don't work with issues at all. A paper is assigned to a volume and has an article number. Every larger Frontiers journal has Sections; Sections, however, can also span different journals. Every journal has Research Topics (similar to MDPI Special Issues). Across all Frontiers journals there are probably tens of thousands of Research Topics currently open for submission. Sections and Research Topics also create a complex overlapping structure: a paper is part of a volume, but potentially also part of a Research Topic and/or a Section, and some papers are part of neither. As with MDPI journals, every paper is assigned to one Academic Editor, whose name appears on the paper. Unlike MDPI journals, the names of the reviewers also appear on the paper. In Frontiers journals, editors and reviewers are members of the Editorial Board, which is also different from MDPI. As a consequence, the Editorial Boards of Frontiers journals are usually very large, even compared to MDPI journals.
Take Frontiers in Psychology as an example: https://www.frontiersin.org/journals/psychology
572 Research Topics open for submission
11,290 Editorial Board Members (as of May 1st, 2021)
3803 papers published in 2020
1683 papers published in 2021 (as of April 26th, 2021)
Thus, Frontiers journals are also modern and inclusive journals that don't fit the traditional categories. They are also growing fast.
Thank you for an excellent review. Just some thoughts of an author who has not yet published with MDPI but is about to do so for the following reasons.
As you say: "Elsevier makes money by selling subscriptions". And consequently good manuscripts are rejected for the wrong reasons. Elsevier journals with lower rankings than MDPI journals pride themselves on acceptance rates of a few percent. Why not, the service has been paid for. Without subscriptions, Elsevier's article processing charges (APCs) outweigh those of MDPI (sometimes by a factor of 2) for open access publishing.
I understand why MDPI is so successful. As you mention: MDPI has reputable journals (rankings) and publishes good quality papers. Compared to other journals its APCs are reasonable. Turnaround time is fast. And most important, MDPI is inclusive, compared to journals that only want to serve a select circle of established research groups. It is time for a change.
Excellent analysis, bravo! Very useful for me. Today I received an invitation from the journal Universe to review an article. It was the first time I had heard of this journal. So I asked a person very close to me, who is an Editor of a major journal in the field, for an opinion. But this person did not know the journal either. Universe has an IF = 1.75, which is not bad for an almost unknown journal. If I accept, I have only 10 days to answer, which is far too few for me. It is not that I need 20 days to answer, but my agenda is too busy and I need more time to find the 8 hours it takes me, on average, to write a report. Finally, MDPI is in general too expensive for a third-world country. Traditional journals with even higher IFs, with the old scheme of subscriptions and page charges, are much less expensive. There are some where you don't pay at all (think of Monthly Notices of the RAS, with IF = 4). And, last but not least, if you come from a developing country and you have an accepted paper in a traditional journal, you can always ask not to pay because your science agency does not cover page charges. It normally works. There are people who consider this a dishonor, but others find it is how collaboration works. And I know very good scientists who made their professional careers without paying a penny for publishing.
Great article Paolo, so interesting and so well supported by data. I had one unpleasant anecdote with an MDPI journal. After an invitation to submit with full APC waiver, the company requested payment. I showed the original invitation signed by them, and got a simple response: sorry, our policies have changed. As you can guess, my manuscript went elsewhere (and got accepted).
I can only add a small piece supporting the hypothesis that the only interest of MDPI is to gain more and more fees.
I was contacted to serve as a co-guest editor. I accepted since the main guest editor was a friend of mine and a well-respected scientist in my field. From the very beginning, the managing editor at MDPI made clear that we MUST collect an X number of papers (I don't remember the exact number), otherwise the special issue would be erased/closed. We invited our colleagues to participate. When a submission arrived, guest editors were asked to make an initial decision based on 4-5 criteria, all very easy to fulfill (pertinence to the issue, decent English, and so on). Then, in case of a positive initial decision, a list of potential reviewers was suggested by the managing editor to "help" and guide our choice. The majority of the suggested experts were young investigators with few papers and a low to very low h-index. I tried to suggest more pertinent reviewers (in a dedicated space), but none of them was contacted (I asked some of them to check). I complained about this to the managing editor but received no answer. The quality of the review reports was, overall, very low. With these items in our hands, we were asked to make an acceptance/rejection decision after the first round of revision. No further revisions were allowed. Most of the time, our final decision was "respected" by the managing editor. However, the decisions on 2 papers, one acceptance and one rejection, were overturned by the managing editor, who changed our decisions without telling us before sending the communication to the authors. I complained again, and this time I received a response from the managing editor, who explained his/her motivation to me, clearly showing that she/he had read the reviewers' comments and basically made a decision on his/her own.
As one of the comments stated, one paper that included one of the guest editors as an author was not managed through this process, but followed the normal procedure for regular submissions, and it was accepted after minor changes were requested.
Overall, 9 papers were published while 3-4 were rejected. The rejected papers (including the one with the overturned decision) were almost all submitted close to the end date of the special issue (when the requested minimum of published papers had likely been reached).
Two days ago, less than one year after the closing of that special issue, I received an e-mail from MDPI asking us (it was forwarded to me and to the other guest editor) to set up another special issue as a follow-up to the previous one. The managing editor was different from the previous one, and, although my email address was right and the name of the other guest editor was correct, they referred to me as Dr. Phillips, which is not me.
In brief, the impression this experience left on me is that MDPI is always seeking guest editors to promote and spread its potential to receive submissions, but absolutely does not care about the scientific opinions of those guest editors. They use them (at least they used me) only as a tool to reach the widest possible audience, while running the review process in a completely automatic and questionable manner, contacting a number of researchers with low reputation just to keep the process fast.
I had a similar sensation before, as a reviewer and not as a guest editor, with Oncotarget. Everybody knows how the story ended.
In summary, while I am quite sure that MDPI publishes both high- and low-quality manuscripts, the editorial process and the approach of the journal are highly questionable. Thus, I have never submitted (and will never submit) to them the manuscripts on which I appear as corresponding author. I also discourage my colleagues from doing so.
There are plenty of journals with similar IFs that do not request money to publish, at least in my field (biology/medicine). Even though such journals (and publishers) earn money through subscriptions, I personally agree with the "old" idea that fewer papers mean higher average quality, more value for the journal, and so on. As a result, I personally treat with caution the findings of papers from this publisher (and others with a similar style).
Another point deserving attention might be the ratio between the number of published items and the number of corrections/retractions. I do not have statistics, but if a publisher publishes a lot, then, statistically, a high number of retractions/corrections should be expected. It would be interesting to know whether MDPI's publishing-to-retraction ratio is comparable to, higher than, or lower than that of other non-open-access publishers. If it is higher (lots of publications, few retractions), I would interpret this as non-serious follow-up by the publisher of issues raised by the scientific community. The opposite scenario would suggest that the review process is probably of low quality, but that the publisher takes the process of making science seriously (which is the most important point in my opinion). A ratio comparable to other publishers would mean that the overall process is OK and MDPI can go on making millions for years and years without, at least, doing serious damage to science.
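The metric this comment proposes is straightforward to compute once the counts are available; a minimal sketch, with made-up illustrative figures (the publisher names and counts below are hypothetical, not real data):

```python
# Hypothetical publication/retraction counts, for illustration only.
publishers = {
    "Publisher A": {"published": 60000, "retracted": 30},
    "Publisher B": {"published": 8000, "retracted": 12},
}

def retraction_rate(stats):
    """Retractions per 10,000 published items."""
    return 10000 * stats["retracted"] / stats["published"]

for name, stats in publishers.items():
    print(f"{name}: {retraction_rate(stats):.1f} retractions per 10,000 papers")
```

Normalizing per 10,000 papers (rather than comparing raw retraction counts) is what makes high-volume and low-volume publishers comparable, which is exactly the comparison the comment calls for.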
I came across MDPI papers when I was doing my PhD, and I was quite happy that they were open access. Being an "infant" in academia at that time, I was amazed that these MDPI papers had such high impact factors. But after learning of their APCs, it was quite impossible for us (my supervisor) to publish in such journals. Now that I am in my "teenage" years, I am quite convinced that MDPI has some trick behind what they are today (which has been partly answered by your analysis. GOOD JOB!).
Nevertheless, I am a bit puzzled: apart from being Open Access, with such high numbers of articles published annually, how do these MDPI journals maintain their impact factors? This is just a very innocent question, but do the editors have the "power" to insert some self-citations into the articles? I hope someone can enlighten me on this.
Self-citation rates can be checked via Web of Science. They vary a lot. For example, the top-rated MDPI journal Cancers has almost 50% self-citations, while Sensors has around 20% and Antioxidants only around 10%.
The 2019 impact factor of Cancers after correction for self-cites is still 5.492, so no, Cancers does not have a 50% self-citation rate. Check your sources correctly…
I am very sorry, I looked at the wrong line and took the wrong number 😦
Thank you for these analyses. Personally, I had two contradictory experiences with MDPI.
The first: I published a paper in IJERPH and the review process was clearly similar to that of other "traditional" journals. Except that the second round of review was much less demanding, and in the third round a decision was made, even if I still think the manuscript had some weaknesses and could have been improved further.
This led me to think that this fast decision at the end of the process (around 30 days) was in part influenced by the need to "reduce time to acceptance" and to fit the norms.
Anyway, the paper is still scientifically sound, and cited, within the community. So, what should we think about it?
My second experience was as a reviewer for Sustainability. I received an article apparently written by someone who had just earned a conference prize with it. I read it thoroughly and wrote an appropriate review, with plenty of questions and requests for changes/clarification. The authors' answers were off-topic, and they clearly did not address my comments or make appropriate changes. So I wrote another review of the manuscript, in which I pointed out this failure and also deepened my initial review with some statistical errors and incorrect sentences (a wrong assumption according to a reference). I clearly told the editor, in an aside, that the manuscript should be rejected in view of the authors' scant consideration of my comments, as well as their apparently low rigor concerning referencing and statistical analysis.
I was very surprised when the editors sent me the manuscript again with changes made by the authors. However, these changes were still clearly not sufficient for a quality scientific article. I then said clearly to the editor that this paper shouldn't be published at all, but that if they absolutely wanted to publish it, they could send it to another reviewer.
I did not hear again about this manuscript.
Anyway, it showed me a very aggressive way of getting their articles published. I have never experienced this with other journals for which I have reviewed (around 10 journals).
Since then, I have received plenty of requests to review MDPI manuscripts, and I have observed that the papers are quite often outside my field. I obviously decline the invitations, but this is clearly questionable.
Please note that this applies to many of the MDPI journals that contact me for reviews (Sustainability, Sensors, and others I forget).
Meanwhile, I still have not experienced anything bad with IJERPH, and I hope it will continue like this.
To conclude, I think MDPI clearly needs a thorough assessment of its editorial processes and decisions, for each of its journals.
It seems to me that a lot of the articles are of poor quality, and it is quite detrimental to the community when scientists, and most of all students, cite these articles.
It's a very interesting discussion. I have experience with MDPI as a reviewer, as a member of an editorial board, and later as an author. So I can share my experience and guess at/answer some of your questions.
1) MDPI has a very effective editorial management system (which also includes the professional editors and their assistants) that does most of the technical work for you.
2) MDPI changes all the time. Every year they collect feedback from, at the least, editorial board members and decide what to change.
3) Special Issues are not a problem. SIs are just open for some time. If there are enough papers, they make a collection. If not, nothing happens. I have one open SI; as a guest editor I rejected 3 papers as being out of scope, so it currently has 0 papers. SIs are also a way to advertise the journal and attract submissions.
As an editor, I get a paper to make a first decision on whether it should be rejected immediately (e.g., because of evidently bad quality or being out of scope). I also get pre-selected reviewers, whom I can recommend excluding from the list for some reason (e.g., they may be from a slightly different field), and I can propose other reviewers (however, it seems I have not yet received reports from any reviewer I proposed). The next time, I get an email with an invitation to make a decision on the manuscript once it has received at least 2 reviews. Depending on the review reports, I can recommend to the editor to accept, reject, request minor or major revisions, reject and encourage resubmission, or initiate an additional round of full review. I also get an email inviting me to make a decision on the manuscript after it has been revised; again, I can make any of the above-mentioned decisions. So the quality of the selected papers depends a lot on the particular academic editor dealing with a manuscript and on the reviewers who agree to review it. It looks like the pre-selection of the reviewers is done by an editor assistant following keyword criteria; therefore, some of the reviewers are not really specialists in the field of the manuscript. That is probably the reason why many of you get invitations from MDPI to review manuscripts that are out of your scope.
As an author, I was heavily pressed by the editor assistants to submit the manuscript and the revision in a very short time. It's hard; that is one thing I'd recommend they relax a bit.
As a reviewer, everything is like at other journals, except for the very high time pressure and the emails from editor assistants reminding you that the deadline is close.
In fact, my experience with the CBM journal (Springer?) is similar.
MDPI expands aggressively. Soon the other (business) publishers will adopt a similar or an even more successful model, and MDPI's growth rate will slow. Another danger for MDPI comes from the national academies, which sign open access publishing agreements with 'traditional' publishers (e.g., Elsevier, Springer) but not with MDPI; for researchers, publishing an open access paper with a 'traditional' publisher thus becomes free of charge.
MDPI and Sustainability. I know this journal quite well because they asked me to collaborate as an associate editor in 2018, after I published an article in Sustainability. I did not even know the journal existed at the time. I accepted because the topic of sustainability interests me, even if it is obviously so inflated that it has become a "catch-all concept".
It is a new academic business model, very different from traditional journals. The biggest difference is that the whole review process is managed by the staff and not by academics, which makes it much faster by eliminating any downtime; this is their big advantage. They record everything and therefore certainly have "big data" they can analyze to optimize the process. It also has undoubted advantages. For example, the reviewers are chosen by the staff and not by the editors, which completely eliminates any risk of favoritism, ideological bias, nepotism, and conflicts of interest. They always choose reviewers from the other side of the world. They have gigantic databases containing thousands of names, which they use to manage the review process. This type of procedure greatly increases the efficiency and transparency of the process, but the obvious downside is that the staff cannot know the topics covered in the articles as well as an editor would, and therefore the choice of reviewers can sometimes be questionable.
The aspect that can create problems is that they want to publish as much as possible because they make money from it, and everything is organized in that direction. I guess they send the articles to experts who don't reject much, but I don't know exactly; this is just speculation. There can be up to 5 reviewers per article, to make sure everything possible has been done to accept the article. They probably choose associate editors who are experts in the field but don't reject much (I never rejected much, even before collaborating with Sustainability), but even that is just speculation. If you reject a lot, they don't send you articles, I think; otherwise it would be impossible to explain why they publish so much. That said, all final scientific decisions on whether to accept or reject are exclusively entrusted to academics, and therefore no one can say that they follow scientifically incorrect procedures.
The articles they receive are of good quality, also because you have to format the paper into publishable form yourself, before submission. That can take several days; it wouldn't make sense to submit a bad paper that gets desk rejected after all that work. Of course, they do not receive top papers, but they get good papers that may have had some difficulty getting published and that people want to publish as soon as possible. If you want to publish very fast with good quality, that is the best way.
The problem of special issues can be real, and in fact they pester everyone for more special issues. But in my opinion this comes from the fact that the regular editors of the journal could never handle such a huge number of articles. Impossible, even working night and day, so they decentralize the process by recruiting guest editors wherever possible and reasonable, which allows them to expand the supply of scientific review services almost infinitely.
They don't actually contact people randomly like real predatory journals do (I constantly get invitations to publish in materials-engineering journals, and I don't know the first thing about the subject…). Instead, they always follow paths of contacts "between acquaintances": before publishing an article in Sustainability in 2018 I did not even know that Sustainability existed, and they had never contacted me. They don't contact people randomly; they build gigantic databases containing all the researchers in a field who can serve their purposes.
It looks a bit like bitcoin, it is the new trading currency of science ….
It would be an interesting experiment to submit this as a manuscript to Sustainability (since it’s a catch-all phrase now as noted in another comment), Micromachines, or Processes (both of which seem to catch the flavor of the publishing model) and see what happens.
This was a fantastic and very interesting read. The homogenization of acceptance times was especially striking. Thank you for scratching your itch!
I think a good informative metric for responding to rebuttals – something that can help sort the wheat from the chaff – is Scimago's Journal Rank (SJR), which looks not only at total citations, but at where those citations come from and whether they are high-impact/reputable journals within their field. This metric distinguishes MDPI's higher-impact journals (IF 5-6) from similar-IF journals like PLoS Genetics or PLoS Pathogens, which are cited in far more impactful research. MDPI's top-IF journals hover at only SJR ≈ 1.0-1.5, while similar-IF and community-respected PLoS journals hover around SJR ≈ 3.5-4.0.
I'd be especially curious about a similar analysis done on Frontiers Media journals. The two organizations share a lot of qualities, but my sense is that Frontiers' flagship journals are better respected. For instance, Frontiers in Immunology and Frontiers in Cell and Developmental Biology have an SJR hovering around 2.5, while Frontiers in Oncology hovers around 2.0 – lower than better-respected outfits like PLoS, but higher than MDPI. I'm wondering if Frontiers is simply lagging behind MDPI and is on the same trajectory towards explosive journal numbers and homogenization of time to acceptance, or if Frontiers is perhaps navigating this push to OA and special issues etc. in a more responsible (but still controversial) way. That said, article charges remain a barrier to Global South scientists… so no points on that front.
Thanks to all for a great analysis and good discussion. To add my experience: I turned to MDPI after very disappointing experiences with several Elsevier journals. I changed affiliation (great position but a less reputable university) and suddenly my manuscripts were desk rejected, or rejected after review. The motivations and review reports were substandard.
I did not perform as thorough an analysis as Paolo did; however, I extracted the affiliations of the past 5 years from the articles published in the Elsevier journals concerned. It was striking that all these journals seem to be biased towards the affiliations of their editors-in-chief. The best 'predictor' of acceptance seems to be a certain affiliation.
The quality of the articles published in these Elsevier journals did not impress me. Some articles were good, but there were also published articles that would not have passed my review (methodologically wrong).
Moreover, as was raised in the discussion above, one Elsevier editor desk rejected my manuscript because it was judged to be out of scope, and mentioned that I did not cite his journal or related journals. My manuscript was certainly within the scope of that journal.
Elsevier subscriptions are not cheap, to say the least. Proper manuscripts are rejected. The APCs of Elsevier journals are outrageous.
In comparison, my recent experience with MDPI: I received 3 good-quality, constructive, and helpful review reports within a month (so not two weeks) from a Scopus-indexed MDPI journal. The APC is reasonable and I pay for what I get. With Elsevier, I pay for what I do not get.
We as scientists are responsible for the quality. After horrible experiences with Elsevier (a real nightmare), I embrace the effectiveness, efficiency, quality, and inclusiveness of MDPI. In my view, the quality is good, otherwise I would not submit. In the future I will wholeheartedly support MDPI in maintaining this quality, as an author and reviewer. I will also support inclusive publishing. I will gladly pay my APC if it allows MDPI to waive the APCs of scientists in a less fortunate position. We all should have the right to publish our work.
In my view, the biased attitude of other publishers also greatly contributes to the success of MDPI.
Thanks for your analysis and conclusions. IMHO MDPI is not yet a predatory publisher, but it is dangerously heading in that direction – at least with some journals. I did some superficial analysis in Scopus for one of MDPI's highest-ranked journals – Energies. The results are as follows:
1. Increasingly high self-citation in the journal: out of a total of 79914 citations for the 2017-2020 period, 19849 are citations from Energies itself (25%). But almost half of these self-citations (8367) come from 2020 alone, a year in which Energies published 6567 papers. For 2019 there were 6459 self-citations with 4983 documents published. And in 2017 there were just 954 self-citations with 2176 documents published. At such a pace, the self-citation rate of Energies will soon surpass 50%, which might result in expulsion from Scopus and WoS.
2. A huge number of citations come from conference proceedings. For the 2017-2020 period, Energies received 23731 citations from this source (at least – I had difficulty identifying all conference proceedings in Excel), which is another 29.7% of citations for that period. I am quite skeptical about the quality of many conference proceedings, and I am not sure that counting them towards a journal's Impact Factor/CiteScore is a good idea.
3. A large number of citations come from other MDPI journals – 2300 from Sustainability, 1989 from Applied Sciences, 984 from Electronics, 754 from Sensors. Almost 6 thousand citations from just those 4 journals.
Summing up: yes, Energies has a high Impact Factor and CiteScore, but the quality of those citations is at least questionable.
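For what it's worth, the citation shares quoted in points 1 and 3 of this comment can be reproduced directly from the figures given there:

```python
# Figures for Energies, 2017-2020, as quoted in the comment above.
total_citations = 79914
self_citations = 19849  # citations from Energies itself

# Citations from Sustainability, Applied Sciences, Electronics, Sensors.
mdpi_sibling_citations = 2300 + 1989 + 984 + 754

self_share = self_citations / total_citations
sibling_share = mdpi_sibling_citations / total_citations

print(f"Journal self-citations: {self_share:.1%}")          # ~24.8%
print(f"Citations from 4 other MDPI journals: {sibling_share:.1%}")  # ~7.5%
```

So roughly a third of the citations counted here originate inside the MDPI portfolio, before even adding the 29.7% from conference proceedings mentioned in point 2.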
Additionally, the acceptance rate of Energies is very high – for 2020 it was 51% of 13487 papers submitted. Please note that we do not know how submissions were counted – if you were rejected once, you could submit again in a few weeks' time with a changed title and major/minor changes following the previous round of (negative) reviews. At a traditional journal you would be desk rejected for sure. How does Energies deal with this issue?
Finally, just look at the number of papers with a Polish affiliation in Energies: it jumped from 83 papers in 2018 to 861 papers in 2020, and in 2021 there are already 943 published papers with a Polish affiliation. So currently one out of four papers published in Energies has Polish author(s). This is not due to a lucky discovery of oil & gas reserves in Poland :) but rather to a very fortunate (for MDPI) new journal ranking, which attributed 5 stars to Energies (on a scale of 1-6). 6 stars (200 points) is for Nature/Science.
Obviously, in a short time the reputation of Energies will gradually decrease. But I guess the strategy of MDPI is risk diversification: they have a portfolio of 360 journals and can quickly improve the reputation of dozens of them.
I do not know how the academic community in the West will deal with this issue, but one unexpected outcome of the MDPI expansion is the erosion of the evaluation model based on a scholar's h-index and citation record. In a short time, publications in MDPI journals can boost your citation rank in WoS, Scopus, and ResearchGate.
About the quality of MDPI’s IFs:
I do not know if others have noticed this or not, so apologies in advance if I am repeating something.
One notable difference between MDPI and other OA publishers is the 'discount voucher' offered by MDPI. If people review for any MDPI journal, they get a discount coupon [interesting! 🙂 ] (50 or 100 CHF) which they can use if they would like to publish with MDPI in the future. That is why people are invested in reviewing for MDPI, irrespective of whether the manuscript matches their expertise or not. That is why MDPI can afford to push its reviewers for a quick turnaround. And clearly that is a conflict of interest and goes against transparent practice.
If one makes many instant reviews for such a benefit, the question is: what is the quality of such reviews? Should scientific reviews look like this? I think the answer is obvious to all serious scientists.
Thank you for such an interesting article. I am an Associate Editor of Biomolecules and today I resigned.
I was asked to evaluate an article to recommend whether it could proceed to review, and to provide reviewers. I recommended the article proceed, and after several weeks I was asked to assess the manuscript as revised by the authors against the reviewers' comments. This was the point at which I could see the review history. In it, I saw that another editor had been contacted as a Special Issue Editor who was a co-author of the article under review. In fairness, this individual declared his conflict, but still recommended the article proceed. Also, my selected reviewers (both international experts in the field of the article) were not contacted. While the chosen reviewers were reasonable experts, I am unsure how and by whom they were picked. I indicated my discomfort to the Assistant Editor, who gave me a wishy-washy apology that made no sense and asked me if I would continue reviewing the responses to make a final decision. I did not respond to that email the same day, and I received another email in which I was removed from any access to this manuscript and another editor was assigned.
I was appalled and resigned immediately, since I have questions about the transparency of the review process. I am an experienced researcher with over 200 career publications and have served, and still serve, on the editorial boards of various reputable journals. I have always been suspicious of MDPI journals and have only sporadically published in a few of them. I was also among the perplexed who could not understand how they managed to grow so quickly and become accepted by the scientific community as a mainstream publishing company. They cannot play both sides: they either build on the scientific excellence of their editors, or they derail that excellence with practices that amount to predation. I have chosen not to participate any longer as an Editor, nor to contribute any more to any MDPI journal, and I will exhort peers in my field to do the same.
The whole academic industry is a mess. Let’s say we take an average publishing platform: Taylor and Francis or ScienceDirect. You, as a scientist, do your research; then it takes years to go through peer review, and then another year for the production process. And then your work is not available to the public. Moreover, all rights belong to the publisher. Is this not predatory? Incompetent reviews can be found at ScienceDirect and Taylor and Francis too.
‘And then, your work is not available to the public.’ – This is a lie!
Have you ever had a problem getting the publication you needed? Really?
Firstly, scientists have access to scientific publications because their universities subscribe to the most important publishers. It has been working well for many years.
Secondly, if someone does not really have access to a publication, he can write an e-mail to the author asking him to send the paper.
And if someone can’t go to the library or write such an e-mail, then – in my opinion – he shouldn’t deal with science.
Thank you for your great post. Could I share your post by translating it into Korean?
This blog is great; however, it focuses on MDPI only. Some negative experiences by individuals are elevated to ‘common practice’. I have experience with various publishers, and in my view MDPI is one of the best.
As one of the previous responders wrote: “science is a mess”. MDPI challenges this mess by introducing an effective and professional publishing system. In my experience, MDPI is more transparent than other publishers and I have seen MDPI journals publishing the review reports as well.
Indeed, reviewers get a voucher, but this is less than 5% of the article processing charges. In my view it is fair compensation, more a gesture than a motivation. MDPI also assists reviewers in writing proper review reports by giving guidelines on their website. Very professional.
Moreover, article processing charges are reasonable in comparison with other publishers, and, more importantly, MDPI does real work on papers. After acceptance, my manuscript was reviewed by a professional editor (a native English speaker) before being published. With other publishers, I have to do this myself in some do-it-yourself online system.
Actually, I had one doubtful experience with MDPI which made me question the quality, professionalism and effort of a voluntary scientific editor but I consider this an incident. I have similar experiences with voluntary scientific editors from other publishers. Adherence to schedules and proper reading is just part of being professional. If you cannot adhere to schedules, as an editor or reviewer, just decline to review.
And editors bragging about their number of scientific journal contributions (200, and even 400 in one case) as a measure of their quality: well, I cannot follow it. Let’s take the 400 of an editor-in-chief (not with MDPI) over a 40-year scientific career. This comes to 10 publications each year, or nearly one each month. It is just impossible to ‘own’ each of these publications as an author or co-author (supervisor). Moreover, this editor-in-chief should also review manuscripts; the generally accepted rule of thumb is three reviewed for each one authored. Professors have become managers, and the number of publications (co-)authored is just a key performance indicator required to attract funding for more PhDs and more publications. It is numbers that count. Science has indeed become a mess. But let’s not blame MDPI.
Rachel, are you an employee of MDPI? 😀
SJK: feel free to translate it. Please link back to the original though.
I know this might not be precisely on topic, but there is a person in the company apparently sometimes doing two weeks’ worth of English editing in one day. Given that the APC includes English editing and this is clearly not possible, I’d say there is a case to be made that MDPI is committing fraud.
I read your article and was impressed.
We also had some issues that I think are worth reporting here. My problem was that my Associate Editor was Chinese and was not able to speak English. So all communication from the Chinese side went through Google Translate, which probably led to severe misunderstandings. (A phone call was not possible!)
The standard strategy, and the deal for the Special Issue, was to have three papers from leading scientists with no fee, to attract others to publish as well. In the end the three also got bills, and it was not possible to resolve the issue, which brought me to the point where I resigned from my position as guest editor. We have now shifted all papers to another sound journal.
I was also surprised that I got dozens of LinkedIn invitations from Chinese MDPI Associate Editors after agreeing to act as a guest editor. From that point I realized that Switzerland is not Switzerland. 🙂
Good analysis. All many-brand and/or wholesale retailers of anything distribute both high quality and low quality products. This is true of wine, fashion, … and science alas.
E.g., Springer Nature does just that with Nature Portfolio.
So perhaps the key question is: is there a minimum threshold of junk above which a publishing house can no longer be deemed serious, or is that threshold=0 ? My answer is “threshold=0” but that’s just personal inclination.
For sure, scientific publishing is in need of radical reforms.
MDPI games their impact factors by having a very high proportion of review articles relative to comparable journals. Someone mentioned PLoS Genetics–for the latest IF data there were 12 reviews out of 532 citable items (2%). MDPI journals of similar scope (Cells, IJMS) are 35-40% review articles. Median research article citations are lower in those journals than in PLoS Genetics. MDPI games IF and counts on that gamed number to lend an air of respectability. We should all know that IF is a flawed measure, and we don’t need to fall for those games.
Dear Paolo, dear all,
you might be interested in reading the most recent MDPI 25th Anniversary blog on the role of Special Issues in modern publishing. See:
It is worth reading. By the way, the blog contains more interesting and useful entries on open access publishing and the viewpoints of MDPI.
Kind regards, Volker
I do not think this is an appropriate place for the MDPI spam/propaganda.
PS. And referring to the post title ‘Is MDPI a predatory publisher’? – Jeffrey Beall has no doubts that the answer is YES, e.g., https://twitter.com/Jeffrey_Beall/status/1376534050656018435.
I agree. It seems that Volker has all the time in the world to defend a company driven by money with absolute disregard for science. I guess scientists are too busy in the lab to produce analyses of MDPI’s aggressive and predatory practices!!! There is absolutely no doubt that MDPI is a predatory publisher. Those of us who have been around for a while, as I have, serving as an editor for numerous journals (including several from MDPI) and reviewing for more than 60 journals, have seen and experienced first-hand the appalling practices of MDPI journals. First, there is not a single day without invitations from MDPI to serve as guest editor. Second, the quality of the reviewers, and sometimes of the editors, is ridiculously low. There are people who have never published in a specific field, yet you see them as guest editors for a topic that has nothing to do with their expertise. As I write this, I have received a decision on one of my articles with absolutely no rationale provided by the editor. The reviewer did not even read the paper and made unsubstantiated claims about things that were never in the paper. Likewise, the editor didn’t even bother to check the paper and made the same false claims. Some MDPI articles are even difficult to read, written in horrible English. A lot of decisions are made by staff (associate and assistant editors), just to make the deadlines and show a good turnaround time. As a reviewer you don’t receive the evaluation comments from the other reviewers once the decision on the manuscript has been made, which is the norm at good journals. You reject a paper and then you see that same paper published with absolutely no justification from the editor. It seems that when they need to meet the “quota” for articles in their endless number of Special Issues, they have their own people (with a “good” track record of accepting everything and anything) to accept the papers.
The IFs of these journals are all inflated and completely disconnected from the quality of the journals. Most papers published in MDPI would never make it into a lower-impact-factor but well-known journal that does not have MDPI’s machinery of people and predatory tactics. If you want a bad article to be published, then yes, MDPI is your place. I’ve come to the conclusion that they actually can’t handle good articles, since these are beyond their editors’ and reviewers’ competence. It is a shame that scientists (myself included) fall for these practices for the sake of rapid and certain publications for institutional evaluations or promotions. It is a disservice to science and a good business for MDPI, including Volker!!! But we can demand better and reject these practices.
Would you be so kind as to disclose your full name? Thanks!
And could you please verify your experiences?
Reading your statements, I fear that you might have never been an editor or reviewer for MDPI journals or published papers in one of their journals.
I am an Editorial Board Member of Resources and Land. I also served as a Guest Editor for Sustainability, Resources, and Land. Moreover, I have multiple experiences as a reviewer and author with MDPI journals. My experiences are all verifiable.
And my experiences are very different from your “stories”.
Academic editors and reviewers are overwhelmingly highly qualified and provide constructive comments for authors. In addition, in-house Assistant Editors are competent, helpful, and responsive. As a result, good papers are further improved and accepted while bad papers are rejected. This is real as everybody can verify from the open reviews (which are not continuously published since this is at the discretion of the authors) and rejection rate statistics (which are available for every developed journal). I can testify that the published rejection rates are accurate for the journals I work with.
You claim: “As a reviewer you don’t receive the evaluation comments from the other reviewers once the decision on the manuscript has been made, which is the norm for good journals.”
If you ever were a reviewer for MDPI journals, you should know that reviewers have full access to the comments from other reviewers in the SuSy system. This access is provided even before a decision is made (immediately after a submitted review). Very convenient. And it is stored in the SuSy System. I still have full access to all review reports for all papers I have reviewed since 2014!
You claim: “You reject the paper and then you see that same paper published with absolutely no justification from the editor.”
If you ever were an editor of MDPI journals, you would know that Academic Editors always need to justify the decision. In the case where an Academic Editor decides to accept a paper although a reviewer recommended rejection, a second Academic Editor must double-check the paper and can confirm or reverse the decision.
The decisions are not visible to the reviewer, yes. But there is a clear rule at MDPI journals: Reviewers provide recommendations; Academic Editors decide; Assistant Editors communicate. Academic Editors take the final responsibility, not reviewers. By the way, reviewers can always contact the Assistant Editors if they have any questions. According to my experiences, they will get quick and substantial responses.
You claim: “A lot of decisions are made by staff (associate and assistant editors), just to make the deadlines and show a good turnaround time.”
If you ever were an editor of MDPI journals, you should know that only Assistant Editors are staff members. Associate Editors are Academic Editors who are not on the payroll of MDPI. Assistant Editors are not allowed to make decisions on accepting any paper. Independent Academic Editors always make these decisions. Pre-check decisions are also taken by Academic Editors.
(If you are interested in the role of different In-house and Academic Editors in the editorial process, please read this webpage and watch the video: https://www.mdpi.com/editors )
You claim: “It seems that when they need to meet the “quota” for articles in their endless number of special issues, they have their own people (with a “good” track record of accepting everything and anything) to accept the papers.”
I am sorry, but this is complete nonsense.
There is no quota. The number of papers per Special Issue is very diverse, and there are quite a few Special Issues with only a small number of papers, or even none. It is also ridiculous to claim that they “have their own people (with a ‘good’ track record of accepting everything and anything) to accept papers”. MDPI journals have a Board of Reviewers, yes, but if you ever were an editor of an MDPI journal, you should know that reviewers are assessed based on the quality of their review only, independent of their recommendation.
I don’t want to comment further on every single claim, but all of your claims are either wrong or highly biased, at least.
As an editor, reviewer, and author, I have experienced MDPI as a responsible and innovative publisher who acts to benefit scientific communities.
Why should a “company driven by money with absolute disregard for science” organize annual events like “Basel Sustainable Publishing Forum”?
(I attended the BSPF 2021 and was impressed to hear, from different perspectives, what’s going on in academic publishing.)
Why should they establish a Sustainability Foundation, awarding outstanding contributions to sustainability research with the World Sustainability Award and the Emerging Sustainability Leader Award?
Why should they provide SciLit, a search engine for academic literature free of charge, offering unique data on the publishing market?
Why should they provide Encyclopedia, a scholarly community encyclopedia, completely free of charge for readers and authors?
Why should they provide Preprints, a preprint server, completely free of charge?
And I can continue this list.
No, MDPI is a company with a mission to support open science.
Just see the recent statement:
MDPI offers valuable services to scientific communities. If you want to understand the company, please read the interview with the CEO.
Now, what are my motives? I support open science too, and I reject disinformation, unsubstantiated allegations, and prejudgements.
I have had experiences much like Stefan Peiffer describes (comment of April 13, 2021). I feel strongly that the single most important problem with how MDPI operates is that they tend to take reviewer selection out of the hands of the editors. When called out on it by an editor, they apologize and act like it was some clerical error. If indeed it is, they need to have another look to see why it happens so pervasively. And they need to fix that. In effect it means that many MDPI papers aren’t peer reviewed.
My experience lately is that the vast majority of reviewers have profound difficulty with English. Since the manuscripts are in English, this presents obvious problems. Reviewer comments read as if they come from speakers of an East Asian language, and the staff structure would suggest that they are almost all from China. Nothing wrong with that, if the reviewers are qualified and are able to communicate well to the authors. But, I think those “ifs” aren’t realized in my recent experiences.
Why MDPI staff regularly ignore lists of potential reviewers from authors or editors is beyond me! I should also point out that in my experience as a guest editor in 2013, they nevertheless used appropriate reviewers. But they don’t seem to do so any more.
(As a SI editor I had another problem: papers were reviewed before I saw them. That was embarrassing because it meant that my colleagues were asked to read some manuscripts that were nearly unreadable. As an editor, I’m sensitive to the demands I put on reviewers.)
I agree with other points and comments that there is a lot to like about MDPI. But the dubious nature of the article “peer” review is profoundly disturbing!
That the Academic Editors don’t have full control over the selection of reviewers may raise concerns, yes, but this procedure has a considerable advantage. Why?
Collusion between Academic Editors (who make decisions) and reviewers (who make recommendations) is prevented! Be aware that this is in particular important for handling Special Issues.
The Assistant Editors of MDPI journals ensure that reviewers are not too close to the authors AND Academic Editors! They also often ensure diversity among reviewers.
Thus, the procedure actually secures quality.
I don’t share your negative experiences with reviewers and during the pre-check phase.
Since these experiences might be specific to a journal, I kindly ask you to disclose your full name and the MDPI journals you refer to. Thanks!
A nice article that deserves the attention of the general scientific community.
Since I was both author and reviewer at MDPI (and recently asked to be removed from this money-laundering machine), I would like to leave a couple of comments and a couple of questions reflecting only my personal opinion.
I have a suspicion that reviewers for MDPI journals do a pretty OK job. Probably not the same as if they were reviewing for Cell or Nature, but still. And if so, then the question of whether or not to accept manuscripts is handled differently. If MDPI has become omnivorous (which is what I’ve been seeing for the last couple of years), then the impact factor of many MDPI journals should drop. However, we see the opposite picture! We are seeing an increase in IF and an overall increase in the number of publications, while their quality is definitely not getting better. So how does this happen? Certainly, there are many who are willing to pay an extra couple of thousand dollars to get published quickly in a journal with an impact factor of 6, where the quality of peer review, shall we say, is not comparable with that of similar-IF journals from other publishing houses. But it can’t be that this is the rule. Or can it? Since there is no transparent data here, as in politics, this problem cannot be settled mathematically, that is, in terms of published audit reports.
So who else benefits from this, besides the publishing house? Looking even higher, let’s ask a simple question: if article rents were a secondary factor, and I wanted to increase the “publishability” of only a certain fraction of scholars in “my circle”, for example, what would I do? I would probably try to change the editorial board to a more loyal one. That is, one so controlled that by some simple combination you could always publish papers by sending them to a reviewer with a high acceptance rating. I guess it’s not very difficult, is it? And then the burden of deciding to accept manuscripts would no longer be directly tied to the quality of the publications themselves and the reviewers’ feedback. By the way, if you look at the editorial boards of many journals with an IF of 3–7 (no relation to MDPI), you see a dramatic change in the composition of the teams over the last 5–7 years. Any thoughts here?
And lastly, and not just MDPI related. Can anyone explain to me why after the 2000s, when all publications went online, many Publishing Houses not only did not lower, but even raised publication fees? After all, no one pays you (or us) as reviewers, it’s kind of our duty to organize criticism of our peers. Nor have the editors and their deputies made any more money, and journal subscription prices for academic institutions have not dropped much. This funny marketing move in the form of open access – is it really done so that those who can pay fees will let those who can’t pay to read their papers? There was a time when it was fashionable to say that converting journals to electronic format was an aid to reducing the deforestation of the Amazon forest. And now what – the price of electronic paper has multiplied?
Auragen group is a recent spammer and predatory company started in Hyderabad. If you are free, call +91 9989 661 232; they are ready to initiate their scam over the phone itself, through a person called Mr. Sai Kiran Olipilli, who is just a graduate and is spamming researchers.
#FakeAuragen #SpammerAuragen #PredatoryAuragen
I will add that the website for one of the listed predatory conference organizers, Coalesce Conferences, has been blocked by my university as a phishing website, which I discovered when I tried to go to the conference website page to see how many of the red flags they checked off.
I am sorry to partially contradict Paolo. But:
1 – at all well-known journals it is common knowledge that you can publish easily if you are a co-author or a friend of one of the editors, and a little less easily if you are a co-author or friend of a friend of one of the editors 😦 Otherwise you submit the paper and wait half a year, or maybe more, to get a reject; this is frustrating, especially for young PhD students who are at the end of their thesis and await validation from a high-ranked journal;
2 – most MDPI published papers are good; maybe some are not so good, but, goodness gracious, what journal can stand up and say that each and every one of its papers is a work of genius?
3 – it is true, they have high fees (maybe unjustified), but they offer open access to all readers, which means increased visibility, leading to possible collaborations, citations, and so on;
4 – there is something no other journal gives you: a small token for each review. It is not much (maybe 1% of the fees), but if you are willing to work hard, in one year you can publish a paper for free. For some this is important; at other journals you review for nothing, only for personal glory. Obviously there are some reviewers who want to trick the system, but does this not happen at high-ranked journals too? And, yes, they have a system to qualify the activity of a reviewer (a grade for the quality of their review), which is feedback that will probably converge on eliminating the ones who want to trick it.
So, as a bottom line, I think MDPI offers a good opportunity for those who want fast publishing, especially young PhD students who are at the end of their thesis and await validation from a high-ranked journal. All other high-ranked journals have good and bad aspects too… so do not speak badly of them.
Just a small note…
One of the reasons I asked to be excluded from MDPI reviewers, and do not advise anyone to publish on this platform, is that MDPI accepts articles even if the quality of the work does not deserve it. Criticism from reviewers is not taken into account, and one of the crowd of review editors simply turns a blind eye to the scientific quality and helps MDPI make money it doesn’t deserve. Another curious fact about MDPI is that the right hand doesn’t know what the left hand is doing. For example, my numerous requests to send me for review only those works that match the keywords I specified are simply ignored. Also, I periodically received invitations to lead yet another Special Issue, on the condition of recruiting a dozen people willing to place their work on this platform. Does this remind anyone of a pyramid scheme? From this I conclude that MDPI is just an article factory and nothing more.
PS And yes, whether satisfying the ambition of PhD students to publish their papers just to get published is acceptable is a question I would address to the ethics committees and their PIs.
I found this a while ago – an unintentionally useful compendium of predatory conference companies. According to the comments on the blog, from people who have been contacted by these events, everything seems to indicate that they are using scientists as useful idiots so that they can make money by organizing bad-quality events to defraud people. Why do researchers need to pay even a minimal fee if they organize only webinars? It’s unethical. How can a fresh graduate make fools of researchers? I am going to file a case on this. Does offering a refund save their organization from being predatory? A hilarious answer given by #spamauragen; this indicates their fake organizational standards. I am sure #scamauragen will soon face a lawsuit to stop its deceptive practices. Don’t make the costly mistake of being fooled by fake conference organizers.
Beware of the Auragen scam. Dubious, fake, or outright predatory conferences; it’s just another version of #OMICSpredatory, #Scientificfederationscam.
MDPI really is a predatory publisher.
I worked with MDPI as an officer.
We sent the same article to more than 10 reviewers.
Some of them passed it; we ignored those who rejected it.
Even if all of the reviewers rejected the article, we approved it, because we changed the recommendation from “Reject” to “Accept after Major Revisions”.
MDPI has reviewers, but they accept all the garbage articles. This is the truth.
After that, several universities in Europe came to consider MDPI predatory and put it on a blacklist.
I totally agree.
My recent case with this “money-laundering and promotion company for a certain proportion of publications from one well-known country” illustrates the general situation.
I recently rejected an article sent to me for review because it had no stated effect and the results were not validated. I also indicated that I was not going to review the article again.
What was my surprise when, 3 weeks later, I received this paper again for review: the text of the article, the authors, and the colors of the graphs had been changed, but nothing of substance. In fact, it is just a falsification of the results. After that I asked to be removed from all possible MDPI mailing lists and not to be contacted in any other way.
What was my surprise when, a couple of weeks later, I received a message from the MDPI editorial board that the article I had reviewed had just been published, with the editorial board thanking me for the review.
I have a question – is there any regulator other than WoS and Scopus that somehow keeps track of this garbage from MDPI?
Please, I decided to DELETE the previous comment, because I might expose the other co-organizers.
So, delete it please
Straight to the point: yes, MDPI, Frontiers, and similar publishers are in fact predatory and just want to make money. The ones who scream here against these conclusions are part of this business or have a conflict of interest. Simple as that. We don’t need robust methodologies to reach such conclusions. I am bombarded every single day with invitations from the MDPIs, the Frontiers, and their like. Keep your distance, so as not to be linked forever to immoral and unethical companies like these… yes, I am fed up with people trying to take advantage of others.
MDPI is an unacceptable predatory publishing house with fake review. I just received more spam inviting me to publish in their journal “Mathematics”, promising me a review within 7 days. Personally, I do not recognize any student publication in MDPI at my university. It is unfortunately a scam. I wonder how their journals are still indexed in so many indexes. MDPI is a scam.
Unfortunately, MDPI is a huge predatory publishing house with fake review.
I have just been invited to take part in a Special Issue in their journal Mathematics, where the organizer is a scientist without the necessary qualifications and without a PhD.
He called me by phone and promised to publish the articles that I wrote in 1980 about animals.
He told me that he had already collected 15 articles for his Special Issue and MDPI had accepted them all.
15 out of 15. They do not reject anything.
So, MDPI is a huge predatory publisher without serious peer review. They accept everything as long as you are willing to pay the fees.
MDPI is predatory!
Many of my friends publish their works in MDPI, simply because they cannot publish them in IEEE, ACM, ASME, or Elsevier venues.
Of course, MDPI is a very predatory publishing house
In the European Union, those who have many MDPI publications on their CV are rejected from EU research projects.
Thanks for sorting this out, Paolo. My limited experience as a reviewer for MDPI was anything but positive. But then there are some good articles by good people in such journals, so did I just have some bad luck? Reading through the posts here, it becomes very clear that MDPI is predatory. I understand the viewpoint of Volker and others who want to give the new model a fair chance (traditional publishers have done a rather mediocre job of promoting fair and transparent science so far).
Yet this is similar to hypothesis testing, where one case violating the hypothesis is in principle enough to discard it. Here, scientific rigour and quality are at stake. Reading about people’s experiences on this blog, which match my own, is sufficient to conclude that MDPI regularly disregards basic principles of peer review, in a manner I have not encountered with any of the traditional publishers and journals.
Here’s my own experience, just to be complete: My first review was for a paper invited by a colleague who served as guest editor. That worked fine for me, thanks to that colleague, who was, however, very upset by the process because MDPI had first chosen other reviewers instead of her own. They changed that after her intervention. As for Volker’s argument: Yes, collusion can be a problem. But many guest editors take their job seriously. The few reviewers from MDPI that she retained turned in completely useless reports.
After that, I kept getting invitations to review papers that had nothing to do with my field of expertise. Until, finally, one really fitted (in the field of economics). It was terrible, with implausible hypotheses, ignoring almost all relevant literature, badly written, missing basic information, and with incomprehensible estimation results. I rejected it, but got a revised version a few weeks later. I refused to look at it a second time. It was finally published, and I was able to look at the comments by the other reviewers. The first one had accepted the paper instantly, with a four-liner basically saying that it was a good paper with remarkable results. The second one asked for some clarifications, which were indeed necessary, but he/she did not make any remark on the missing literature, theory, or flawed estimation. I suspect that he was not very familiar with the field and topic. Anyway, I do not get why the guest editors would accept such a poor article. Why do they apparently play along with this kind of sham? If they want to overturn a reject decision, I would at least expect some explanation of the merits of the paper that guided their decision.
Everybody would agree that the people of MDPI are robbers. They approve articles that are of very minor or even zero importance. Just for money.
I know a lady who published in December a light study (more like a newspaper article) on wolves in Northern Europe, and MDPI accepted her work immediately without any review.
And what a joke! The work was simply a copy-paste of a 2004 popular magazine piece, full of pseudoscience, and was not based on any scientific study. MDPI is full of garbage articles and pseudoscience.
Web of Science’s SCIE indexes MDPI. So, at our university, the SCIE is no longer considered an important index.
I really think that MDPI is a predatory publisher. They have no real peer review. In our department, MDPI is like a garbage bin: we send them the articles rejected by all the other publishers.
MDPI IS A 100% PREDATORY PUBLISHER. I ORGANIZED AN IEEE CONFERENCE IN FRANCE IN 2019, AND THREE CLERKS OF MDPI INVITED ME TO ORGANIZE A SPECIAL ISSUE IN THEIR JOURNAL “ENERGIES”. I SENT THEM ALL THE REJECTED ARTICLES OF MY CONFERENCE AND THEY ACCEPTED EVERYTHING. I HAVE NEVER SEEN SUCH A PREDATOR. MDPI IS THE MOST PREDATORY PUBLISHING COMPANY.
I recently got a paper from MDPI to review. The article was handled by their Chinese office; they urged me on with the argument that they had two conflicting reports and needed a fast decision. The article had a lot of passages marked in red in response to previous reviews. The editor pressed hard to get it done within 10 days. The manuscript was hopeless; I had to reject it, and I do not reject many articles. After 2 days I had the article back: “You recently kindly reviewed the original version of the following manuscript, submitted to Cells. The authors have now provided a revised version along with a cover letter in which they address the referees’ comments”.
THIS IS PREDATORY PUBLISHING, THEY ACCEPT EVERYTHING FOR MONEY!
Yeeesss!!!! MDPI is a 100% predatory publisher. The MDPI journals are a scam.
Thank you so much for this contribution.
Since I am rather new to the academic publishing world, I want to ask the community on this page to tell me if this feedback I got from an MDPI journal sounds like the helpful and legit review one would expect from a serious journal. To me it sounded extremely vague, and it seemed that someone in an extreme rush had just written some generalities.
The reviewer chose the option MUST BE IMPROVED for all the review questions, then wrote this:
‘In terms of content, there appears to be a lack of introductory material (the other reviewer told me my introduction was too long!). I would anticipate that adding this more complete introduction would also require the addition of a number of additional references including previous research. In addition, there is an absence of appropriate data and evidence based on hypothesis and analysis. Therefore, I don’t understand the results of this research and recommendations’.
I will tell you why I consider MDPI a fake and bogus publisher! I was on Energies’ editorial board, but I quit because of their review model. Often one editor would pick reviewers and the paper would get a revise-and-resubmit, but then I was asked to decide whether the paper should be published. I would look at the paper, and often it was really bad and should never have gotten an R&R in my opinion. The reviewers were not competent for the topic or methods. I felt bad, though, about rejecting papers that authors had revised specifically for the journal. I don’t know why the journal does this. I told the journal that this didn’t make sense to me, but they kept doing it. Anyway, I can’t work like that, so I quit. I had a bad experience with this fake, bogus and predatory publisher. MDPI is the shame of our academic community. They are scammers and predatory.
I had a bad experience earlier as a special issue editor for the journal.
MDPI is probably the most predatory publisher. Journals like “Viruses” and “Animals” are bullshit. Their portfolio is a portfolio of predatory and awful journals. Do not count MDPI publications as publications, but as fake and junk academic articles. Reject candidates for a position in your faculty if they have published even one paper with MDPI. Please spread the information all over the world that MDPI is a catastrophe for the academic system. They are scam, sham, fake and predatory.
Since I regard “predatory” as a useless or dangerous term for journals, and have no personal experience with MDPI, I won’t comment on the gist of the article, but I will update one item: based on my counts for Gold Open Access 2016–2021, MDPI is no longer the second-largest OA publisher. Even taking all of Holtzbrinck (BMC, Frontiers, Nature, Springer) as a single publisher, for 2021 MDPI had over 234,000 articles and a potential $540 million in fee revenue, compared to over 195,000 articles and a potential $502 million for Holtzbrinck. (Elsevier is a distant third for gold OA, with over 89,000 articles and $144 million in revenue; by the way, its average fees for gold OA articles, not “hybrid”, are much lower than either MDPI’s or Holtzbrinck’s.)
MDPI is a publisher without peer review.
I consider the bogus MDPI biomedical journals completely dangerous to public health, because they publish fake statistics, fake experiments, fake studies. Their journal Cells has published 3 SCIgen fake papers. I am a cellular biologist, and MDPI journals are fake and bogus. MDPI is the leader of the predatory publishing industry.
They are a scam. Avoid the MDPI junk publisher.
The Academy of Management Ethics Education Committee in China suggests that you would be on dangerous ground attempting to publish in MDPI predatory journals. Avoid MDPI.
Note that the University of South Bohemia in České Budějovice officially declared MDPI a criminally predatory publisher.
In December 2021, the Faculty of Science of the University of South Bohemia in České Budějovice announced that it would stop financial support for publishing in MDPI journals, officially recommended against publishing in or reviewing for MDPI, and warned that publications in MDPI journals might not be taken into account in evaluations of employees and departments.
My experience with MDPI:
In 2018, I wanted to publish an article as a “private scientist” without any university address. It would have cost about 400 euros at the time. I was rejected by two journals within a few days. I then uploaded the article to Preprints. Then, in 2022, I received an email from another MDPI journal that had noticed how often this preprint had been downloaded, inviting me to publish. The cost was by then over 2000 euros. Not worth it to me.
MDPI is only about money, not about science.
I just read an article in Agronomy (MDPI) published a few days ago; 11 authors, 2 reviewers and the editor have put their names to a cut-and-paste exercise with poor English grammar, fantasy species distributions, images without attribution, references that don’t match the points raised, and more. The Chinese Academy of Sciences is currently flagging this journal, but the Chinese authors have chosen to publish anyway.
Thanks for this post,
Personally, I have never considered publishing in any MDPI journal. I don’t know whether it is actually a predatory publisher or not, but there is that question hanging over it, and I would rather avoid the possibility completely. I received around 30 invitations from MDPI affiliates in the first six months of 2022, either asking for manuscripts or asking me to be a guest editor; that is a little scary when ACS was threatening to remove my email from their contact list unless I responded to give them explicit permission. The promised 30-day turnaround from submission to publication also makes me very nervous: sometimes running additional experiments to justify a conclusion can take at least a few months. The publication fee of 2200 CHF (even with a 600 CHF discount) doesn’t sound very appealing to me either. So maybe it is an appropriate journal for some, but I personally would stay away from it no matter how high the IF is. There are plenty of other journals out there.
Volker, on your comment “Reviewers provide recommendations; Academic Editors decide; Assistant Editors communicate”: that may be a good policy for the journal, but I would refuse to be a reviewer unless I knew what they did with my review. If the Academic Editors’ decision is not transparent to the reviewers, I do not want to waste my time reviewing; I will let the academic editors review the paper themselves. Also, I would take your opinion with some reservation, as you seem to have several papers in MDPI journals (and you were open about it): Sustainability, Agriculture, Resources, Land, etc.
Volker’s posts are very informative, giving details of the inner workings of MDPI. But given his association with MDPI at multiple levels, I believe he has a conflict of interest. If justifications for MDPI come from researchers with a high h-index or some other unique reputation, then they are certainly worth paying attention to. Last time I checked, I was not very impressed with MDPI’s board, management team, etc.
The obvious advantage I see in publishing with MDPI is the turnaround time, but you pay for it. Is that a great advantage? The New England Journal of Medicine says: “NEJM generally replies to Rapid Review requests within three business days and an initial decision on publication will typically be reached within two weeks.” The Lancet says: “Our fast-track process means that The Lancet aims to peer review and publish papers within four weeks of submission.” In materials science, the journal Advanced Materials takes 36 days to acceptance, etc. So one can find journals with reasonably quick turnaround times (unless one thinks that two weeks is going to make a huge difference). I suspect the other, not so obvious, reason may be greater certainty that a manuscript will be accepted at MDPI. Which, if true, is the best reason to avoid them, in my opinion. For new academics, quick publication comes under a suspicion of low quality. More established academics can certainly wait longer to get their manuscripts published in more reputable journals.
MDPI may have helped to seriously question the old belief that a high IF and h-index mean quality?
I am biased when it comes to MDPI: I do not take MDPI publications seriously. The reasons are given in Wikipedia, as well as in several posts above. Also, only 98 of the 393 journals published by MDPI having an impact factor does not impress me as the mark of a credible publisher. I also looked at their main address, MDPI AG, Klybeckstrasse 64, 4057 Basel, Switzerland, on Google Maps, and it does not look like a publisher’s office from what I see.
The Titles of MDPI journals say: “We are predatory”
For example, the journal “Axioms” is ridiculous. MDPI paid ISI, and ISI accepted this predatory journal.
We announced a conference at our university in Frankfurt. Within 5 days of the website going live, two predatory MDPI journals asked us to organize Special Issues in their predatory journals. We did not reply to them. I added a filter to my email account: every incoming email from mdpi.com is automatically deleted.
I have a lot of experience with several high-ranking MDPI journals. It was mostly positive until last year, when I experienced, as editor, reviewer and author, papers recommended for acceptance by the reviewers being left pending for months, to be eventually rejected without any reasonable argument by “editors” who had at the first stage approved processing of the paper, meaning they had found it appropriate for the journal.
The editorial process can even result in the positive reviews themselves being rejected by the “managing editors”, who are not professional scientists but can state that a positive review was irrelevant. These “managing editors” are almost exclusively from Serbia and Romania, some still students (!), probably to save money for the millionaires of MDPI. I guess they practice such an immoral and perhaps even criminal editorial policy to show that they are rejecting papers, even ones written by top scientists, so that they do not look predatory.
On the other hand, this allows theft of the authors’ intellectual property, because such “academic editors” and/or their partners could perform and publish a similar study themselves after getting the idea from the authors of the criminally rejected papers.
Furthermore, requesting editors of Special Issues (SIs) to recruit submissions for the SI from their colleagues, or even from any other scientists, offering some “discount” so that they themselves become eligible for a 100% discount, is typical racketeering, mafia-style.
Finally, what kind of serious scientific journal has hundreds, even thousands, of editors/editorial board members, most of whom served as editors of some SI?
The managing editors and the majority of MDPI employees are Chinese, while the formal registration of MDPI in Switzerland serves its image and financial activities.
MDPI is a fake, bogus, predatory publisher .
Josep Soler and Andrew Cooper wrote the article:
“Unexpected Emails to Submit Your Work: Spam or Legitimate Offers? The Implications for Novice English L2 Writers”
in MDPI Journal: Publications
They blame Predatory Publishers in a Predatory Journal
Josep Soler and Andrew Cooper are both idiots, crooks and criminals.
Shame, Shame for Josep Soler and Andrew Cooper. They are bogus scholars and Jerks.
Dear Mr Crosetto, my manuscripts have been rejected 7 times by MDPI journals: 6 desk rejections and 1 after peer review (rejected with an invitation to resubmit). So, I do not agree that MDPI is a predatory publisher.
It is obvious that “Khairul Anwar” is a clerk of MDPI.
Whether Khairul Anwar is his real name or a fictitious one, this guy works for MDPI and pockets a good bonus from the MDPI predator each month.
Stay away from the Predators of MDPI!
See this https://twitter.com/plieningerlab/status/1428346892979634178 written by
About the comment “it is the publisher who sends out invitations for the Special Issue; it is unclear which role, if any, the editorial board of the normal issues has in the process”, I can only tell my own experience. Each time the MDPI journal (Plants, Basel) sent an invitation on my behalf (I was Guest Editor for a SI), the editorial board first asked me if I wished them to do so. I said yes when I was very busy. There was nothing awkward in doing so.
I was wondering whether it would not be fruitful to study the average number of references cited (nb) in year n by a journal, and the percentage (%) of those references that are from years n-1 and n-2, and therefore “count” for the IF. By plotting nb and % for a large number of journals, you might be able to statistically isolate those which, through citation rings and the like, maximize %. Mapping the journals that are cited would also give insight into journal coalitions that cross-reference each other (it is no longer only MDPI citing MDPI, although still at quite a high rate, since the business has expanded…). (This indicator would add to the self-citation indicator.)
If you find that % is around, say, 10% in non-predatory journals, but around 25% in MDPI et al., then you get a measure of the effort deployed to game the system…
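The indicator proposed above is simple to compute once reference-year data is in hand. Here is a minimal sketch in Python; the function name and input format are hypothetical, and in practice the reference lists would have to be assembled from scraped or API-provided article metadata (e.g. Crossref):

```python
def if_window_share(articles, year):
    """Compute the commenter's proposed indicator for one journal.

    articles: list of reference-year lists, one per article the journal
              published in `year` (hypothetical input format).
    Returns (nb, pct): the average number of references per article, and
    the share of all references dated year-1 or year-2, i.e. the window
    that "counts" for the impact factor.
    """
    total_refs = 0
    window_refs = 0
    for refs in articles:
        total_refs += len(refs)
        # references from years n-1 and n-2 are the ones eligible for the IF
        window_refs += sum(1 for y in refs if y in (year - 1, year - 2))
    nb = total_refs / len(articles) if articles else 0.0
    pct = window_refs / total_refs if total_refs else 0.0
    return nb, pct
```

Plotting `nb` against `pct` for many journals would then let you flag outliers that pack an unusually large share of their citations into the IF window.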
I am sick of reading that these are not predatory journals because you can find some good papers too. Yes you can, and this is precisely where they are very smart: they attract mid-career colleagues by inviting them to edit a Special Issue; the colleague is flattered, needs to get some editorial experience, and accepts. This colleague then seriously draws on his/her academic network to populate the Special Issue, including some good scholars (not yet fully informed about MDPI!). So then other (younger) scholars say: Oh! Smith and Sicrano have published there, so it must be a good journal, and in addition I see it has a good impact factor! And so the trick is done…
MDPI declines papers for which authors want to use vouchers !
Is it true ?
Paolo, thank you for your deep analysis.
Is MDPI predatory? In my humble experience as an author, a reviewer (about three dozen reviews, both rounds) and an editor (both GE and EBM), it strongly depends on the journal. IJMS, Molecules, Separations, and Marine Drugs are not. Foods, Antioxidants, Antibiotics, Pharmaceuticals… so-so (at least, fake papers were rejected). Polymers, Life and J. Appl. Sci. are not good journals, but not 100% predatory. Cosmetics is definitely a predatory journal, swallowing any paper-mill-fabricated fraud. Same for Fermentation. Again, this is my humble opinion.
In addition: World and Axioms published gobbledygook AI-generated papers signed by “highly cited prominent scholars” (sarcasm). Symmetry invites any minimally reputable chemist (i.e., not listed in Retraction Watch) to publish for free, irrespective of whether their molecules are symmetric or not ^) (to save its reputation, I guess…).
Hi, I love your website! I tend to agree with this as I am currently employed as a team leader for an outsourcing company.
Mscholar conferences are predatory conferences, and the outfit is a sister organization of Scientific Federation: https://www.mscholarconferences.com/
Mscholar conference websites have been blocked by my university as phishing websites. According to my research, the outfit was established recently by Deepesh Reddy, who just graduated and is spamming scientists; the addresses listed in Rhode Island and California are fake too. Mscholar conferences are the dark side of predatory conferences. It is actually located in Hyderabad, India. Don’t believe in and don’t waste time on these conferences; they are yet another way to suck people into paying for Mscholar events.
MDPI: the worst publisher, the fakest of the fake, the most predatory of the predatory.
MDPI is a criminal organization.
I rejected a paper by some friends, but MDPI did not send my comments to the authors.
If you are a reviewer for MDPI, you cannot reject a paper.
They will simply replace you.
MDPI is not only a predatory publisher! It is a MAFIA.
They are criminals
I did not know it was predatory. My university did not count my two MDPI articles. My university says that MDPI is a fucking predatory publisher; they publish every junk article if you pay them. They are bastards.
Indeed. Aiming to compensate for that, the editorial staff of MDPI journals reject good papers by eminent scientists to show they are not predatory, especially if the papers were invited and supposed to be APC-free “feature” papers. Such a paradox makes them even more predatory: rejecting good papers while publishing rubbish.
This is a bit off topic, but I would just like to mention that as of 01/01/2023, I will not work with MDPI (no submissions, no reviews, no replies to their spam). The same for Frontiers and Hindawi. This boycott is a personal decision, and I am not seeking support in any way. The grounds for the decision are that, within the plethora of (good) journals available for chemistry and crystallography reports, there are always suitable options, so I do not need to consider MDPI/Frontiers/Hindawi as an alternative. Moreover, I cannot spend two months’ salary on a single article’s APC. Finally, there are some indications that the main funding agency in the country where I currently work will soon delist these publishers for researcher and grant evaluation.
Good for you, I did the same already and hope eventually we will stop being abused by the predators.
I understand your comments, but I believe the WHOLE publishing system preys on researchers in many ways. To make it short: we pay to fund our research, we pay publishers for open access (in ALL journals), and our institutions pay AGAIN so that we have access to publications. Second, but no less important, the journals considered high-impact in each scientific discipline SEEK sensationalist results and refuse HIGHLY ACCURATE studies that do not produce “INCREDIBLE RESULTS”. This is very dangerous, because scientists are pushed to exaggerate their results and their experimental designs. In my opinion, all journals are predators because they profit from the way researchers are evaluated, but each journal has its own system.
@Elena. Your extreme position (the whole system is broken, something similar to “Tous pourris” in politics) is actually shared by many others. Not by me, however, or only in part. Many journals are not (yet) published under a Gold open access model. Most of the journals published by the American Chemical Society (ACS) remain close to the pre-internet, print-based journal system. However, I agree with you that even for these historical and well-trusted publishers, there are grey areas. In the case of the ACS, a desk rejection now comes systematically with a proposal to submit the very same manuscript to an ACS sister journal, which is obviously… a Gold OA journal charging an unaffordable APC. This is not fair. A manuscript rejection should be a rejection, whatever the reason, period. Without going into details, this industry is now so complex (Gold, Green, Hybrid, Bronze, Diamond, Platinum, Black publishers) that I am under the impression that, yes, ALL publishers are at some point profit-seeking abusers. The spectrum of abuses is, however, as wide as the spectrum of open-access colors. Moreover, third-party agents, like Sci-Hub, repositories and preprint servers, add more noise. For example, the pay-per-view scheme established by Elsevier 10 or 15 years ago has been killed by Sci-Hub (I do not debate here whether this is good or bad).
Returning to the specific case of MDPI, it is clear that they operate on a low-cost, high-speed, high-volume business model. So, if “predatory” means “we are in no way interested in your research, we just want your money”, then yes, definitely yes, MDPI is a predatory publisher. And probably the worst striker around. They are, however, honest in the sense that they are not concealing it.
The consequences of the “Publish or Perish” mood you mention, that is, a trend towards hyperbolized results, HARKing (hypothesizing after the results are known), etc., are probably more worrying, and on this point I agree with you. But again, not everything is black or white. It is not easy to fool the scholarly community. Someone reports a room-temperature superconductor in Nature (impact factor close to infinity)? Naaaah! It’s all fake!