Editors at Science Journal Resign En Masse Over Bad Use of AI, High Fees

Over the holiday weekend, all but one member of the editorial board of Elsevier’s Journal of Human Evolution (JHE) resigned “with heartfelt sadness and great regret,” according to Retraction Watch, which helpfully provided an online PDF of the editors’ full statement. It’s the 20th mass resignation from a science journal since 2023 over various points of contention, per Retraction Watch, many in response to controversial changes in the business models used by the scientific publishing industry.

“This has been an exceptionally painful decision for each of us,” the board members wrote in their statement. “The editors who have stewarded the journal over the past 38 years have invested immense time and energy in making JHE the leading journal in paleoanthropological research and have remained loyal and committed to the journal and our authors long after their terms ended. The [associate editors] have been equally loyal and committed. We all care deeply about the journal, our discipline, and our academic community; however, we find we can no longer work with Elsevier in good conscience.”

The editorial board cited several changes made over the last ten years that it believes are counter to the journal’s longstanding editorial principles. These included eliminating support for a copy editor and a special issues editor, leaving it to the editorial board to handle those duties. When the board expressed the need for a copy editor, Elsevier’s response, they said, was “to maintain that the editors should not be paying attention to language, grammar, readability, consistency, or accuracy of proper nomenclature or formatting.”

There is also a major restructuring of the editorial board underway that aims to reduce the number of associate editors by more than half, which “will result in fewer AEs handling far more papers, and on topics well outside their areas of expertise.”

Furthermore, there are plans to create a third-tier editorial board that functions largely in a figurehead capacity, after Elsevier “unilaterally took full control” of the board’s structure in 2023 by requiring all associate editors to renew their contracts annually—which the board believes undermines its editorial independence and integrity.

Worst Practices

In-house production has been reduced or outsourced, and in 2023 Elsevier began using AI during production without informing the board, resulting in many style and formatting errors, as well as papers being reverted from the versions the editors had already accepted and formatted. “This was highly embarrassing for the journal and resolution took six months and was achieved only through the persistent efforts of the editors,” the editors wrote. “AI processing continues to be used and regularly reformats submitted manuscripts to change meaning and formatting and require extensive author and editor oversight during proof stage.”

In addition, the author page charges for JHE are significantly higher than those of even Elsevier’s other for-profit journals, as well as those of broad-based open access journals like Scientific Reports. Not many of the journal’s authors can afford those fees, “which runs counter to the journal’s (and Elsevier’s) pledge of equality and inclusivity,” the editors wrote.

Elsevier has long had its share of vocal critics (including Ars Technica’s Chris Lee), and this latest development has added fuel to the fire. “Elsevier has, as usual, mismanaged the journal and done everything they could to maximize profit at the expense of quality,” biologist PZ Myers of the University of Minnesota Morris wrote on his blog Pharyngula. “In particular, they decided that human editors were too expensive, so they’re trying to do the job with AI. They also proposed cutting the pay for the editor-in-chief in half. Keep in mind that Elsevier charges authors a $3,990 processing fee for each submission. I guess they needed to improve the economics of their piratical mode of operation a little more.”

Elsevier has not yet responded to Ars’ request for comment.

Not All AI Uses Are Created Equal

John Hawks, an anthropologist at the University of Wisconsin–Madison, who has published 17 papers in JHE over his career, expressed his full support for the board members’ decision on his blog, along with shock at the (footnoted) revelation that Elsevier had introduced AI to its editorial process in 2023. “I’ve published four articles in the journal during the last two years, including one in press now, and if there was any notice to my coauthors or me about an AI production process, I don’t remember it,” he wrote, noting that the move violates the journal’s own AI policies. “Authors should be informed at the time of submission how AI will be used in their work. I would have submitted elsewhere if I was aware that AI would potentially be altering the meaning of the articles.”

There is certainly cause for concern when it comes to using AI in the pursuit of science. For instance, earlier this year, we witnessed the viral sensation of several egregiously bad AI-generated figures published in a peer-reviewed article in Frontiers, a reputable scientific journal. Scientists on social media expressed equal parts shock and ridicule at the images, one of which featured a rat with grotesquely large and bizarre genitals. The paper has since been retracted, but the incident reinforces a growing concern that AI will make published scientific research less trustworthy, even as it increases productivity.

That said, there are also some useful applications of AI in the scientific endeavor. For instance, in January 2024, the research publisher Science announced that all of its journals would begin using commercial software that automates the process of detecting improperly manipulated images. Perhaps that would have caught the rat genitalia figure, although as Ars science editor John Timmer pointed out at the time, the software has limitations. “While it will catch some of the most egregious cases of image manipulation, enterprising fraudsters can easily avoid being caught if they know how the software operates,” he wrote.

Hawks acknowledged on his blog that the use of AI by scientists and scientific journals is likely inevitable, and he recognizes the potential benefits. “I don’t think this is a dystopian future. But not all uses of machine learning are equal,” he wrote:

It’s bad for anyone to use AI to reduce or replace the scientific input and oversight of people in research—whether that input comes from researchers, editors, reviewers, or readers. It’s stupid for a company to use AI to divert experts’ effort into redundant rounds of proofreading, or to make disseminating scientific work more difficult.

In this case, Elsevier may have been aiming for good but instead hit the exacta of bad and stupid. It’s especially galling that they demand transparency from authors but do not provide transparency about their own processes … It would be a very good idea for authors of recent articles to make sure that they have posted a preprint somewhere, so that their original pre-AI version will be available for readers. As the editors lose access, corrections to published articles may become difficult or impossible.

The journal Nature published an article in March raising questions about the efficacy of mass resignations as an emerging form of protest after all the editors of the Wiley-published linguistics journal Syntax resigned in February. (Several of their concerns mirror those of the JHE editorial board.) Such moves certainly garner attention, but even former Syntax editor Klaus Abels of University College London told Nature that such mass resignations should aim to move beyond mere protest, focusing instead on establishing new independent nonprofit journals for the academic community that are open access and have high academic standards.

Abels and his former Syntax colleagues are in the process of doing just that, following the example of the former editors of Critical Public Health and another Elsevier journal, NeuroImage, last year.

This story originally appeared on Ars Technica.

Source: Wired