ChatGPT is no longer allowed to (co-)write scientific studies

After researchers listed the artificial intelligence ChatGPT as a co-author in at least four articles, scientific journals are now putting an end to the practice: the chatbot cannot take responsibility for the content of a study.
Author: Sabine Kuster / ch media

It would be very convenient to have the artificial intelligence ChatGPT write a scientific study, since it can compile large amounts of information quickly. But how reliable are the resulting texts?

As the journal “Nature” reports, four scientific papers had already listed ChatGPT as a co-author. But the editors of scientific journals agree that artificial intelligence does not meet the criteria for authorship. The reason: it cannot take responsibility for the content.

“We would not allow AI to be listed as an author on a paper we published,” Holden Thorp, editor-in-chief of the Science family of journals in Washington, told Nature. AI-generated text that does not properly cite its sources is also prohibited, as it could be considered plagiarism.

However, some editors believe that an AI’s involvement could be acknowledged elsewhere in a paper.

In any case, as researchers’ experiments have shown, ChatGPT is poorly suited to writing highly technical studies. The AI often makes statements that are not necessarily true, and asking the same question several times yields different answers, notes Alex Zhavoronkov, CEO of Insilico Medicine, a Hong Kong-based pharmaceutical company.

Source: Blick