Journalology #10: Speed and workloads


Hello fellow journalologists,

This is the tenth issue of the Journalology newsletter; I’m rather proud (and slightly surprised) that I managed to make it to double figures. If you’ve been with me from the beginning, thank you. If you're reading this newsletter for the first time, welcome.

Putting together these newsletters takes time, so if you think your colleagues may benefit from reading the newsletter please do spread the word. I am more likely to reach issue #100 if the audience grows steadily each week.

The sign up page is here:

https://journalology.ck.page

Speed and workloads

On Tuesday Christos Petrou assessed whether journal articles are being peer reviewed and published more quickly now than a decade ago. His analysis concluded that the time between submission and publication fell from 199 days (mean) in 2011/12 to 163 days in 2019/20 for the industry as a whole. However, much of that improvement was due to two publishers: MDPI and the American Chemical Society (ACS).

In 2020, it [MDPI] accepted papers in 36 days and it published them 5 days later. ACS, the second fastest publisher in this report, took twice as long to accept papers (74 days) and published them in 15 days.

With regards to peer review:

As MDPI became larger, it brought down the overall peer review speed across publishing. Including MDPI, the industry accelerated by 14 days in the period 2019/20 in comparison to the period 2011/12 (from 150 to 137 days). Barring MDPI, review performance did not show much change (from 151 to 150 days).

Christos’ article contains some valuable lessons for editors and publishers regarding the competitive nature of reducing manuscript turnaround times (TAT). He concludes:

TAT information can also be useful to authors, who may wish to take it into account when choosing where to submit their paper. While there are several impact metrics available, publishers are the only source of TAT information, which is provided with gaps and inconsistently. Setting up a platform that provides TAT metrics in the same way that Clarivate’s JCR provides citational metrics will allow authors to make better-informed decisions.

That sounds like a good idea in theory, but TAT data would only be helpful to authors if publishers measured their journals’ performance in a commonly agreed, consistent manner.

Publishers’ policies will affect these metrics, as Phil Davis points out in the comment section. For example, some journals may only do one round of peer review and make an accept or reject decision at that point. “Rejected” authors can resubmit once they’ve addressed the referees’ comments, but that would restart the TAT clock.

Another confounder is that some journals receive a large number of papers through a transfer cascade; papers that have already been peer reviewed by the ‘donor’ journal will help to improve publication times for the ‘receiver’ journal.

In short, TAT data need to be interpreted carefully, especially if an editor is directly comparing their journal’s performance with a competing publication’s.

We know from multiple surveys that authors want to be able to publish quickly. How can editors and publishers make that happen? One factor is manuscript loads, which are directly related to publication speeds.

Let’s explore this with a thought experiment, which is entirely fictional and overly simplified, but hopefully makes the point. I’ve created a spreadsheet which may help you to follow the logic.

Journal A publishes papers quickly: 30% of papers are either accepted or rejected (i.e. a final decision is made) by Day 30, 70% by Day 60, and 100% by Day 90.

A brand new editor joins Journal A and receives 10 papers to peer review on Day 0, another 10 on Day 30, another 10 on Day 60 and so on.

By Day 60 the editor’s manuscript load (the number of papers that they are peer reviewing at any given time) reaches a steady state of 20 papers. The editor makes quick decisions and so their manuscript load remains low.

Journal B publishes papers slowly: 30% of papers are either accepted or rejected (i.e. a final decision is made) by Day 210, 70% by Day 240, and 100% by Day 270.

A brand new editor joins Journal B and receives 10 papers to peer review on Day 0, another 10 on Day 30, another 10 on Day 60 and so on.

However, because the TATs are slower on Journal B the editor’s manuscript load reaches steady state at 80 manuscripts. This means that editors on Journal B are having to manage the peer review of 80 papers while the editors on Journal A only have to keep on top of 20 papers, even though both sets of editors are sending out 10 papers for peer review each month.
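
If you would rather not open the spreadsheet, here is a minimal Python sketch of the same thought experiment (my own illustration, not the spreadsheet itself), assuming batches of 10 papers arrive every 30 days and decisions follow the schedules above:

    # Minimal sketch of the thought experiment above: batches of 10 papers
    # arrive every 30 days, and a fixed share of each batch has received a
    # final decision by a given age (the "decision schedule").

    def manuscript_load(decision_schedule, checkpoints=12, batch_size=10):
        """Return the editor's open-manuscript count at each 30-day checkpoint."""
        loads, batch_ages = [], []
        for _ in range(checkpoints):
            batch_ages = [age + 30 for age in batch_ages]  # existing batches age by 30 days
            batch_ages.append(0)                           # a new batch arrives
            open_papers = 0
            for age in batch_ages:
                # cumulative share of this batch already decided at its current age
                decided = max((share for cutoff, share in decision_schedule.items()
                               if age >= cutoff), default=0.0)
                open_papers += batch_size * (1 - decided)
            loads.append(round(open_papers))
        return loads

    # Journal A: 30% decided by Day 30, 70% by Day 60, 100% by Day 90
    print(manuscript_load({30: 0.3, 60: 0.7, 90: 1.0}))     # settles at 20 open papers
    # Journal B: 30% decided by Day 210, 70% by Day 240, 100% by Day 270
    print(manuscript_load({210: 0.3, 240: 0.7, 270: 1.0}))  # settles at 80 open papers

The load settles exactly where queueing theory says it should (Little’s law): the submission rate (10 papers per 30 days) multiplied by the average time a paper spends with the editor, which is 60 days for Journal A and 240 days for Journal B.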

This has the potential to create a vicious circle. Editors with a high manuscript load are likely to struggle to keep on top of their pipeline because of the administrative burden, which in turn slows them down even further.

This is hardly surprising when you think about it, but it has important implications for editors and publishers when deciding how best to set a journal up for success.

Is it better to have one editor with a caseload of 90 manuscripts or three editors each with a caseload of 30 manuscripts? After all, authors are likely to see their papers published more quickly if they submit to a journal whose editors are not overwhelmed.

Bearing this in mind, here are two excerpts from the MDPI 2021 annual report:

In 2021, the number of employees increased by almost 50%, reaching more than 5700 by year end. This means that almost one in three current employees joined the company within the past twelve months.

and…

MDPI provides a high degree of assistance to Academic Editors so that they can only focus on editorial decisions. We employ in-house editors to invite reviewers, collect review reports, communicate with authors and reviewers, correspond with authors about revisions, etc.

The report notes that MDPI had 115,000 academic editors by the end of 2021, who worked across 383 journals.

It seems highly likely that one reason MDPI is able to publish papers so quickly is that it has a large number of in-house and academic editors, who each have low manuscript loads. Overwhelmed editors are slow editors. MDPI is fast in part because of the large team that it has created.

Rapid publication is helpful from a commercial perspective when operating under a Gold OA APC model: fast publication means improved cash flow. If a publisher is able to knock 30 days off their journals’ time to publication, then the journals would be able to publish an extra month’s worth of papers in a calendar year, increasing revenues by ~8%.
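
As a quick back-of-the-envelope check of that ~8% figure (my own illustration, using made-up numbers for a hypothetical journal):

    # Hypothetical numbers, purely for illustration: 100 papers a month at a USD 2,000 APC.
    papers_per_month = 100
    apc = 2_000
    annual_revenue = 12 * papers_per_month * apc  # 2,400,000
    extra_revenue = papers_per_month * apc        # one extra month's output: 200,000
    print(extra_revenue / annual_revenue)         # 0.0833... i.e. ~8%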

In summary: a low manuscript load is good for editors; fast publication is good for authors and publishers; however, unless quality is maintained, rapid publication may not be good for readers.

Editorial wisdom

The trade-off between speed and quality has been a theme in our industry for decades. Here's a quote from 1991, from two former editors of The New England Journal of Medicine, to make that point.

Briefly quoted

Science Has a Nasty Photoshopping Problem

Using my pattern-matching eyes and lots of caffeine, I have analyzed more than 100,000 papers since 2014 and found apparent image duplication in 4,800 and similar evidence of error, cheating or other ethical problems in an additional 1,700. I’ve reported 2,500 of these to their journals’ editors and — after learning the hard way that journals often do not respond to these cases — posted many of those papers along with 3,500 more to

Source: The New York Times


How many ducks do you need to line up to get a publication retracted?

When something looks like a duck, walks like a duck and quacks like a duck, it is very likely a duck. Here, the ducks of compromised publication integrity are all aligned, duck-shaped, feathery, waddling and quacking, but the publication integrity assessment and resolution processes are misaligned. The journal wants to retract but the publisher won’t permit that.

Source: Retraction Watch


Does the Peer Review Process Need Blockchain?

TLDR works a lot like Reddit: First researchers post their work publicly, either directly or to numerous so-called “pre-print” servers like bioRxiv or medRxiv. These have been around for several years but became much more influential during the COVID-19 pandemic because of the speed with which they could bring research to other scientists. Reviewers get paid by the TLDR site, which is funded through charitable

Source: NEO.LIFE


Climate Action: Are We Committed Enough?

A thorough stocktaking of the progress made by the signatories towards the SDG Publishers Compact’s 10 action points would help us to understand how we are doing in terms of climate action. A survey of the Compact signatories was recently conducted, and the initial results were presented by UN Publications at the 2022 Frankfurt Book Fair. Earlier, on October 12, the European Association of Science Editors (EASE) and HESI (Higher Education Sustainability Initiative) SDG Compact Fellows organized a workshop on a survey on the SDG Publishers Compact, which they had recently launched to find out how editors, publishers, and their organizations are contributing to achieving the SDGs. This survey may also help us to recognize publishers’ climate actions when the results are made public in Spring 2023.

Source: The Scholarly Kitchen


Time for change - global annual survey

In 2022, we’re seeing a growing interest in open research, with 46% of academics considering open access and sharing of open data sets, compared to 29% four years ago. Academics in 2022 are placing greater importance on the impact of research in society, with 81% saying it’s important to them personally, up by 6% on 2021.

Source: Emerald Publishing


Increased public scrutiny of scientific research, report finds

Being published in a peer-reviewed journal is the most important marker of reliability according to 74% of researchers surveyed. The study also reveals that more than half of researchers (52%) feel the pandemic increased the importance of publishing research early, prior to peer review, and many – particularly women, early career researchers and those in Global South countries – feel the pandemic has widened inequalities in, and access to, funding in their fields.

Source: Research Information

Full report: Confidence in research: researchers in the spotlight


PREreview and eLife welcome Chan Zuckerberg Initiative’s support to boost community engagement in public preprint review

With backing from CZI, the organisations are in a strong position to achieve two key goals over the next two years. The first is to develop PREreview’s software and engagement strategies to allow new communities to solicit and create expert feedback on preprints. The second is to help enable reviewing organisations and societies to implement their own flavours of the ‘publish, review, curate’ model that PREreview and Sciety are showcasing, by building systems that facilitate and display expert reviews and curated lists.

Source: eLife


The STM Mentorship Program

It is disappointing that, with 145 publisher members of STM, across 21 countries, only 20 companies are involved in the mentoring scheme. Furthermore, in the last five years, out of 274 mentors in total, only 17 have participated more than once. Another potential concern is that the majority of participants — both mentors and mentees — are from editorial departments. We aim to change this. We want to increase our pool of mentors and widen the program to include sales, marketing, production, technology, and HR; this would be of benefit to the whole profession and is something we strongly support. Given the feedback from this year’s mentors, where 70% of respondents said they would be interested in participating again, we are confident new mentors will find this program fulfilling.

Source: The Scholarly Kitchen


Why do journals engage with preprints? We talked to editors and this is what they told us

As part of our activities in the program, some of us are interested in discovering the various perspectives that journal editors have on preprints. Specifically, what they like about preprints, whether they pose any challenges for editors and how journal policies evolved over time to accommodate preprints. We interviewed several editors to ask about their experience with preprints at their journal. In this inaugural post, we summarize what we have learnt about journals’ motivations to engage with preprints, and share insights from our conversations with Drs. Alejandra Clark (Editor at PLOS ONE), Beth Osia (Postdoc at City of Hope, California, Preprint solicitation team, formerly at Proceedings B and now at Open Biology) and Mario Malički (Editor-in-Chief at Research Integrity and Peer Review).

Source: ASAPBio


If you managed to get this far, well done and thank you for reading until the end.

Until next week...

James

