Putting Research Integrity Checks Where They Belong – The Scholarly Kitchen


Every research article submitted to a journal should come with a digital certificate validating that the authors’ institution(s) have completed a series of checks to ensure research integrity.

Journals should not hold primary responsibility for detecting, correcting, and punishing authors for inappropriate behavior.

In recent months, several “scandals” have rocked trust and confidence in journals. Thousands of Hindawi papers were retracted because they most likely came from papermills. Wiley announced that its new papermill detection tool flagged up to 1 in 7 papers submitted to the 270 former Hindawi titles.

Frontiers published ridiculous AI-generated figures. And a handful of journals from various publishers were caught publishing papers with obvious LLM chatbot text included.

These incidents led to mainstream press articles questioning the value of journals and SCIENCE. I’ve yet to see one that questions the value of institutions or funding of science.

There are consequences for journals that don’t seem to care about research integrity, no matter what their corporate mission statements claim. There are mechanisms for dealing with those journals — they lose indexing in important compendia, they lose their Impact Factor, they lose out reputationally and submissions drop dramatically, at least for a while. They end up on restricted lists maintained by institutions or national funders.

Journals have positioned themselves as being a trusted source with peer review and some level of validation of scholarship.

Journals have been increasingly expected to explain their value, and for the vast majority of serious journals, the answer has been rigorous peer review along with community curation. However, forensic analysis of data sets and gel stains was never an expected task of traditional peer review. And yet, today, journal staff may be performing any number of checks, including plagiarism scans, figure analysis, identity checks on claimed authors and reviewers, and at least following the supplied data links to see whether data was deposited as may have been required. Now we will have to add papermill checks: are all the named authors real people, are they at the institutions listed on the paper, do those institutions exist, are all of the authors working in the same field, have they collaborated before, are there “tortured phrases” littered throughout the paper?

And AI detection tools, none of which have proven accurate, scalable, or integrated into manuscript tracking systems, are a new frontier awaiting exploration by journal offices.

For every score or report from every automated integrity check, a person needs to review the result and decide what to do: reject the paper outright based on the score, or go back to the author for an explanation and, if it is acceptable, work with them to fix the problem?

For any journal (or maybe suite of journals at a society), dealing with ethics issues requires significant staff time. Any one of these issues could be an honest error by an inexperienced author. In those cases, a journal may want to work with them to get the issue fixed and continue the paper down its peer review path. Other times, it’s bad behavior that needs to be addressed.

If a paper that fails the integrity checks is rejected, there is a good chance it will show up at another journal, wasting the time of yet another journal staffer.

These additional checks are coming at a time when the review of papers submitted to journals is expected to be fast and inexpensive and yet none of the processes above are either fast or inexpensive. And the number of papers submitted to journals is mostly increasing — though that is not the case in every discipline.

Conducting integrity checks on papers is also a Sisyphean task with little reward. The vast, vast majority of papers submitted to the vast majority of journals are written by ethical and responsible researchers. Despite the sensational headlines decrying almost 10,000 retractions in 2023, about 8,000 of them were from Hindawi journals. Context is everything.

If we remove the journals or publishers that are not actually conducting peer review (or do conduct peer review but then ignore the reviewer comments and accept the papers anyway), the number of papers with serious ethical issues is low. And yet, every publishing conference this year, last year, and next year will spend significant amounts of time addressing research integrity issues. An equal amount of time will be spent attending demos of new tools built to detect research integrity issues.

Despite the relatively low number of incidents, not checking every accepted paper puts a journal at risk of missing something and winding up on the front pages of Retraction Watch or STAT News. This is not where any journal wants to be, and it opens you up to a firestorm of criticism: your peer review stinks, you don’t add any value, you are littering the scientific literature with garbage, you are taking too long to retract or correct, etc.

The bottom line is that journals, with their volunteer editors and reviewers and non-subject-matter-expert staff, are not equipped to police the world’s scientific enterprise.

Some have called for journals to simply retract or publish an expression of concern if questions are raised about published papers and force the institutions to conduct an investigation.

Every time a managing editor has to send a paper to an institution for investigation, a small piece of their soul goes dark. Maybe if all your papers come from US R1 research institutions you will at least be able to identify to whom the email should be sent. I have personally spent hours hitting translate in the web browser on institution web pages searching for anything that might look like an integrity officer. Usually the best you can find is a dean without an email publicly listed.

But large, well-funded institutions are not off the hook. Their reviews and investigations are slow and not at all transparent.

There are obvious reasons not to trust an institution with policing their own research outputs; however, the use of third-party tools would help mitigate those concerns.

A solution to the problem is for institutions to take responsibility for conducting integrity checks and providing validation to the journals. The many fine companies trying to sell publishers expensive technology solutions should instead be selling enterprise solutions to institutions.

Might there be a middle ground? The technology tools could be made available to individual authors, who would then have to obtain the validation and submit it with their paper. This would come with a fee.

I don’t see how it is sustainable for journals to employ more and more integrity checks, plus human review of the results. As “cheating the system” becomes exponentially easier with the AI tools already at our fingertips, the constant public shaming of journals for not catching issues will continue to erode trust, not only in journals but also in science.

And this is why moving integrity review earlier in the timeline is crucial. Trust in science is low, like really low. The US is one election away from potentially losing most science funding. In corners of the universe, what is true is no longer relevant and many lies are believed as fact.

Instead of hoping that strapped journal offices and volunteers find the bad papers before they are published and instead of blaming the journal when one slips through, maybe the institutions — the employers of the researchers — have a significant role to play in ensuring the scientific record is clean from the start.

I welcome continuing discussion on where efforts to ensure research integrity are most efficiently deployed.


