WSJ: Flood of Fake Science Forces Multiple Journal Closures

SNT Gatchaman

Senior Member (Voting Rights)
Staff member
Wall Street Journal | Archive

"Wiley to shutter 19 more journals, some tainted by fraud"

Fake studies have flooded the publishers of top scientific journals, leading to thousands of retractions and millions of dollars in lost revenue. The biggest hit has come to Wiley, a 217-year-old publisher based in Hoboken, N.J., which Tuesday will announce that it is closing 19 journals, some of which were infected by large-scale research fraud.

In the past two years, Wiley has retracted more than 11,300 papers that appeared compromised, according to a spokesperson, and closed four journals. It isn’t alone: At least two other publishers have retracted hundreds of suspect papers each. Several others have pulled smaller clusters of bad papers.

Although this large-scale fraud represents a small percentage of submissions to journals, it threatens the legitimacy of the nearly $30 billion academic publishing industry and the credibility of science as a whole.
 
For Wiley, which publishes more than 2,000 journals, the problem came to light two years ago, shortly after it paid nearly $300 million for Hindawi, a company founded in Egypt in 1997 that included about 250 journals. In 2022, a little more than a year after the purchase, scientists online noticed peculiarities in dozens of studies from journals in the Hindawi family.

Scientific papers typically include citations that acknowledge work that informed the research, but the suspect papers included lists of irrelevant references. Multiple papers included technical-sounding passages inserted midway through, what Bishop called an “AI gobbledygook sandwich.” Nearly identical contact emails in one cluster of studies were all registered to a university in China where few if any of the authors were based. It appeared that all came from the same source.
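The shared-contact-email signal described above is purely mechanical, which is why sleuths caught it: if many "independent" papers funnel through the same email domain, they likely came from one source. A minimal sketch of that check, with hypothetical sample data and an arbitrary flagging threshold:

```python
from collections import Counter

# Hypothetical contact emails scraped from a batch of submissions.
emails = [
    "a.chen@example-uni.edu.cn",
    "b.liu@example-uni.edu.cn",
    "c.wang@example-uni.edu.cn",
    "d.smith@other-uni.edu",
]

# Count how many submissions share each email domain.
domains = Counter(e.split("@")[1] for e in emails)

# Flag any domain accounting for a suspicious share of the batch.
flagged = {d: n for d, n in domains.items() if n / len(emails) > 0.5}
print(flagged)  # {'example-uni.edu.cn': 3}
```

Real screening tools combine many such signals; this shows only the single heuristic the article describes.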

“The problem was much worse and much larger than anyone had realized,” said David Bimler, a retired psychology researcher in Wellington, New Zealand, who started a spreadsheet of suspect Hindawi studies, which grew to thousands of entries.

Cabanac and his colleagues realized that researchers who wanted to avoid plagiarism detectors had swapped out key scientific terms for synonyms from automatic text generators, leading to comically misfit phrases. “Breast cancer” became “bosom peril”; “fluid dynamics” became “gooey stream”; “artificial intelligence” became “counterfeit consciousness.”
 
Also discussed by Joanne Nova: So much for “peer review” — Wiley shuts down 19 science journals and retracts 11,000 gobbledygook papers

And what do we make of the flag-to-clamor ratio? Well, old-fashioned scientists might call it "signal to noise". The nonsense never ends.

A ‘random forest’ is not always the same thing as an ‘irregular backwoods’ or an ‘arbitrary timberland’ — especially if you’re writing a paper on machine learning and decision trees.

The most shocking thing is that no human brain even ran a late-night Friday-eye over the words before they passed the hallowed peer review and entered the sacred halls of scientific literature. Even a wine-soaked third year undergrad on work experience would surely have raised an eyebrow when local average energy became “territorial normal vitality”. And when a random value became an ‘irregular esteem’. Let me just generate some irregular esteem for you in Python?
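To complete the joke: generating some "irregular esteem" (a random value) really is a one-liner in Python, which makes the thesaurus-mangled version all the sillier:

```python
import random

random.seed(42)  # seed for reproducible "irregularity"
irregular_esteem = random.random()  # a random value in [0.0, 1.0)
print(irregular_esteem)
```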
 
A ‘random forest’ is not always the same thing as an ‘irregular backwoods’ or an ‘arbitrary timberland’ — especially if you’re writing a paper on machine learning and decision trees.
That's hilarious.

This whole thing does explain some of the weirdness we have seen here. But quality assurance systems? What systems?
 
In hindsight, quality control for the main source of knowledge our civilization depends on being done as a voluntary process involving a few overworked peers who all share the same perverse incentives may not have been the best idea.

In foresight it was pretty obvious, but in hindsight it's also just as obvious.

Weird how this used to be a real argument: "is it peer-reviewed?" And then it turns out that it often means... nothing. It's not quite as arbitrary as being nominated for the Nobel Peace Prize, but it's basically no better.

It's a legitimate argument that a lot of this is AI gobbledygook sandwich, sometimes derided as "stochastic parrots". But things are even worse in health care, where you can find gobbledygook so complete that no AI trained after 2023 would ever come up with it, and it gets praised and elevated anyway, because human bias can be even worse than any AI gobbledygook sandwich.

In obviously fake papers like the ones discussed here, some of the terms are complete nonsense. In most psychosomatic ideology, by contrast, the words are legitimate, but no one involved understands them any better than the AI does. And they use statistics, ironically the very thing AIs are built on, but in an even worse way, committing many of the errors that wreck AI models, such as overfitting and training on bad data. So while AI "stochastic parrots" are a problem, there is a much deeper one: beloved nonsense that most AIs would flag as nonsense sails right past human reviewers.
 
I despair. It is so pervasive, and behind every example is a corrupted human being acting with intent to deceive others.

There was an interesting thread today from Donald Robertson, who writes about Stoic philosophy and philosophers. There is so much irony in other authors submitting manuscripts on this subject for his review while flagrantly violating Stoic virtues.

https://twitter.com/user/status/1795201843213033509
 
Sabine Hossenfelder recently made one of her shorts about her personal disillusionment with academia and why she quit for a career on YouTube. It is very telling, and I am afraid it points to an academic career mill that produces papers for the sake of papers, something apparent to me from reading the plethora of papers on ME, none of which get to the point. It reminds me of the lines from the poem "The Rime of the Ancient Mariner" by Samuel Taylor Coleridge.

Water, water, every where,
Nor any drop to drink.

"My dream died, and now I'm here": Sabine Hossenfelder 'fessing up.
 
It is very telling, and I am afraid it points to an academic career mill that produces papers for the sake of papers, something apparent to me from reading the plethora of papers on ME, none of which get to the point.


Which is why I’m not a big cheerleader for the “we need lots more publications” camp.

Having said that, I think even worse are the researchers who hype a lot of strong claims but rarely publish anything of note (looking at you, Jared Y*unger and Nancy Kl*mas).
 