Bad science is perpetuated through the literature

What Happens When a Research Article Is Retracted: Editors, Research Integrity and the Committee on Publication Ethics (COPE)

It’s important that flawed results don’t spread through the scientific literature, even if they have found their way into a published paper. Researchers don’t want to base their reasoning on false premises. In the same way that many people wouldn’t accept a medical treatment bolstered only by shaky clinical trials, the scientific community doesn’t want researchers, the public and, increasingly, artificial intelligence (AI) to rely on erroneous data or conclusions from retracted articles.

Since my colleagues and I reported that tortured phrases had marred the literature2, publishers, and not just those deemed predatory, have retracted hundreds of articles. Springer Nature alone has retracted more than 300 articles featuring nonsensical text (see go.nature.com/3ytezsw).

Overall, to facilitate all these steps, publishers must update their practices and devote more resources to both editorial and research-integrity teams.

What happens to the papers that cite retracted research is usually overlooked. A Nature paper on stem cells was retracted over concerns about the reliability of its data 22 years after publication. Of course, an article lists references for a variety of reasons, such as to provide context, to introduce related work or to explain the experimental protocol. But there is no guarantee that the papers that used the retracted article are still reliable. At a minimum, researchers should be aware of any retractions among the studies that they have cited. This would enable them to assess potential negative effects on their own work, and to mention the relevant caveats clearly should they continue to cite the retracted paper. But, as far as I know, no systematic process is in place for publishers to alert citing scholars when an article is retracted. There should be.
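Such an alerting workflow need not be complicated. Below is a minimal sketch, not any publisher’s actual system, that lists the works citing a retracted paper using the OpenCitations COCI API; the endpoint URL, the response fields and the DOI shown are assumptions for illustration, and coverage is limited to the citations that OpenCitations has indexed.

import requests

# Minimal sketch: list the DOIs of articles that cite a (retracted) paper,
# using the OpenCitations COCI API. Endpoint and field names are assumptions
# based on the public COCI documentation, not a publisher's pipeline.
COCI_CITATIONS = "https://opencitations.net/index/coci/api/v1/citations/"

def citing_dois(retracted_doi):
    """Return the DOIs of works whose reference lists include `retracted_doi`."""
    resp = requests.get(COCI_CITATIONS + retracted_doi, timeout=30)
    resp.raise_for_status()
    # Each record describes one citation link; 'citing' holds the citing work's DOI.
    return sorted({record["citing"] for record in resp.json()})

# Hypothetical DOI, purely for illustration:
# for doi in citing_dois("10.1234/example.5678"):
#     print(doi)

A publisher that knows which articles cite a newly retracted paper could then notify the corresponding authors or annotate the citing records.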

As well as updating its guidance, COPE could benefit readers by changing how it operates. When we requested assistance from COPE to resolve integrity concerns that we felt were inadequately addressed by journals and publishers, the responses were often slow, confusing and unhelpful. Some correspondence received no reply at all; in other cases, problems were acknowledged but not followed up without further prompting from us.

COPE’s guidelines state that journals should identify retracted articles “unmistakably”. Most publishers edit the article PDF to include a ‘Retracted’ watermark or banner. But this warning won’t appear in any copy downloaded before the retraction took place.

To help independent tools such as the Feet of Clay Detector harvest data on the current status of the articles in their journals, all publishers should publicly release the reference metadata of their entire catalogues.
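When such metadata are open, harvesting them is straightforward. The sketch below pulls the deposited reference list of a single article from the Crossref REST API; it assumes the publisher has deposited references openly, and the DOI shown is a placeholder.

import requests

# Minimal sketch: harvest the reference metadata that a publisher has deposited
# with Crossref for one article. References appear in the record only when the
# publisher has deposited them openly.
def reference_dois(doi):
    """Return the DOIs listed in the article's deposited reference list."""
    record = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30).json()["message"]
    references = record.get("reference", [])  # absent when references are not deposited
    return [ref["DOI"] for ref in references if "DOI" in ref]

# Placeholder DOI, for illustration only:
# print(reference_dois("10.1234/example.5678"))

Checking each of those reference DOIs against a list of retracted papers is, in spirit, what the Feet of Clay Detector automates at scale.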

Here we argue that updating and improving COPE guidance and flowcharts (see Supplementary information, Fig. S1, for example) could make the process of checking publication integrity more efficient and effective. We propose five ways to do so.

Unnatural phrases should put editors, referees and co-authors on alert. They can expose text that has been generated by AI, or run through translation or paraphrasing tools to disguise where it really came from.

The other is the Feet of Clay Detector, which quickly spots articles that cite retracted papers in their reference lists (see go.nature.com/3ysnj8f). I have added PubPeer comments to more than 1,700 such articles to prompt readers to assess the reliability of the references.

Authors should check for any post-publication criticism or retraction when using a study, and certainly before including a reference in a manuscript draft.
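One way to run that check programmatically is sketched below: it asks the Crossref REST API whether any retraction notice has been registered against a DOI. It assumes that Crossref’s ‘updates’ and ‘update-type’ filters behave as documented, and it catches only formally registered retractions, not PubPeer criticism or expressions of concern.

import requests

# Minimal sketch of a pre-citation check: does any registered retraction notice
# point at this DOI? Assumes the Crossref 'updates' and 'update-type' filters.
def retraction_notices(doi):
    """Return Crossref records of retraction notices that target `doi`."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"filter": f"updates:{doi},update-type:retraction"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["message"]["items"]

# An empty list is no guarantee of reliability: corrections, expressions of
# concern and post-publication comments still need to be checked separately.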

How to Report Concerns to the Editorial Team of a Journal: The Case of PubPeer and the Publishing Industry

Two PubPeer extensions are instrumental. One plug-in automatically flags any paper that has received comments on PubPeer, which can include notices of corrections and retractions, as readers browse journal websites. The other works within the reference manager Zotero to identify the same articles in a user’s digital library. For local copies of downloaded PDFs, the publishing industry offers Crossmark: readers can click the Crossmark button to check the status of the article on the landing page of the publisher’s website.

Articles with these tortured phrases are slow to be investigated or corrected. As of 20 August, the Problematic Paper Screener that I launched in 2021 had flagged more than 16,000 papers containing five or more such tortured phrases, only 18% of which have been retracted (see go.nature.com/3mbey8m).
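The core of such screening is fingerprint matching: scanning full text for known nonsensical paraphrases of standard terms. The toy sketch below illustrates the idea with two documented fingerprints (‘counterfeit consciousness’ for ‘artificial intelligence’ and ‘bosom peril’ for ‘breast cancer’); the real screener relies on a much larger, curated and continuously updated list.

import re

# Toy sketch of tortured-phrase screening: match known fingerprints against text.
# This two-entry list is an illustration, not the screener's actual list.
FINGERPRINTS = {
    "counterfeit consciousness": "artificial intelligence",
    "bosom peril": "breast cancer",
}

def tortured_phrases(text):
    """Return (tortured phrase, expected term) pairs found in the text."""
    lowered = text.lower()
    hits = []
    for phrase, expected in FINGERPRINTS.items():
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            hits.append((phrase, expected))
    return hits

print(tortured_phrases("We train a model for counterfeit consciousness."))
# [('counterfeit consciousness', 'artificial intelligence')]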

In February 2021, I launched the Problematic Paper Screener (PPS; see go.nature.com/473vsgb). At first, the software flagged only randomly generated text; it now tracks a variety of issues to alert the scientific community to potential errors.

There are other ways to raise questions about a study after it has been published, for example on the PubPeer platform. As of 20 August, 191,463 articles had received comments on PubPeer, nearly all of them critical (see https://pubpeer.com/recent). But publishers typically don’t monitor these comments, and the authors of a criticized paper aren’t obliged to respond. It is common for post-publication comments, including those from eminent researchers in the field, to raise potentially important issues that go unacknowledged by the authors and the publishing journal.

And journals are notoriously slow. The process requires journal staff to mediate a conversation between all parties, a discussion that the authors of the criticized paper are typically reluctant to engage in and that sometimes involves extra data and post-publication reviewers. Investigations can take months or years before the outcome is made public.

Scientists who find a paper suspicious can contact the editorial team of the journal where it was published to flag it. But it can be difficult to find out how to raise concerns, and with whom. Furthermore, this process is typically not anonymous and, depending on the power dynamics at play, some researchers might be unwilling or unable to enter these conversations.

Unscrupulous businesses known as paper mills have arisen that benefit from this system. They produce manuscripts based on made-up, manipulated or plagiarised data, sell those fake manuscripts as well as authorship and citations, and engineer the peer-review process.

A researcher with many published papers and citations can be invited to speak at conferences, review manuscripts, guest-edit special issues and join editorial boards. These activities can add weight to funding applications and attract further citations, all of which can lead to a high-profile career. Institutions generally seem happy to host scientists who publish a lot, are highly cited and attract funding.

Article retractions have risen steadily over the past few decades, soaring to a record of nearly 14,000 last year, compared with fewer than 1,000 per year before 2009 (see go.nature.com/3azcxan and go.nature.com/3x9uxfn).

False Data in Clinical-Trial Reports: How Flawed Papers Can Skew Clinical Guidelines and Recommendations

Alarm bells rang in 2020, when a systematic review of clinical-trial reports found that, of 153 submitted to a single journal, Anaesthesia, 44% contained false data1. That high proportion points to a wealth of dubious publications in the public domain. Such papers can harm clinical practice and skew the direction of future research, clinical guidelines and systematic reviews.

The recommended time frames for contacting institutions cause delays. Take plagiarism, for instance: if neither author nor institution responds to concerns, a COPE flowchart recommends that journals “keep contacting the institution every 3–6 months”, potentially creating an endless loop. But institutions often struggle to provide timely, objective and expert assessments6, and focus heavily on perceived misconduct by their employees. They might be concerned mainly with protecting their reputations.

In our opinion, this lack of clarity can again allow publishers to pass the buck. For example, when the editorial board of the Journal of Bone and Mineral Metabolism recommended retracting 11 publications in 2020, the publisher, Springer Nature, refused to do so (go.nature.com/4fg9xes), on the basis that the university of the author in question had not investigated the matter, as was required under COPE guidelines. (Nature is editorially independent from its publisher.)

The recommendations should require that key study information be collected and verified at the manuscript-submission stage. Staff members could check details of ethical oversight, as well as the location, research infrastructure, funding and more, before peer review. Such checks would reduce the number of papers that need assessment later on.

We believe that COPE should be careful to avoid accusatory language. Focusing on whether the study is reliable, rather than on who is to blame, would also lessen the need for journals and publishers to contact institutions before making decisions on editorial action.

Critics might argue that this change of focus would make it less likely that misconduct would be exposed. But that need not be the case — after identifying integrity issues and correcting the literature, the reasons for compromised integrity can be examined.

And although COPE can apply sanctions to its members by revoking membership, to our knowledge it has never done so. Using sanctions, and making that action public, would probably improve the performance of journals and publishers.

Publishers could argue that these recommendations would involve a lot of work. But just because something seems daunting doesn’t mean it’s not worth doing. The publishing industry is extremely profitable, and some of those profits should be invested in quality control.
