ChatGPT, the artificial intelligence model developed by OpenAI, was named co-author of a scientific article alongside 11 researchers from the medical startup Ansible Health. The paper analyzes the AI's own performance on the United States Medical Licensing Examination (USMLE). The move failed to convince many professionals and users, and was even rejected by Springer Nature, one of the largest academic publishers.

This is not the first such case: OpenAI's artificial intelligence has repeatedly appeared as a co-author of articles published in the journal Nature. In one of them, an error was attributed to a person. Media outlets such as CNET have also used ChatGPT to write articles, many of them riddled with errors, which has called into question its use as a tool for automating work.

Springer Nature, as noted, does not agree that ChatGPT should be credited as an author of the article. The publisher has no problem, however, with the AI helping to write a study. Jack Poe, CEO of Ansible Health, told Futurism that "adding ChatGPT as a collaborator was definitely a deliberate move." It is also something the team had been weighing for some time, the researcher stresses. "The reason we listed him as the author was because we believe that he actually made an intellectual contribution to the content of the article, and not just as a subject for evaluation."

Poe specifies, however, that ChatGPT did not take part in "the prevailing scientific rigor and intellectual contribution." He states that it contributed in the same way as an "average author," and he expects ChatGPT and similar models to be used throughout this kind of work, including in the generation of knowledge.

The fact that ChatGPT appears as the author of a scientific article has not convinced many.

However, the Ansible Health CEO's decision to include ChatGPT as a contributor did not sit well with many users and experts, some of whom went so far as to call it a "profoundly stupid" measure. Mainly because a language model of this kind "cannot bear the moral obligations of an author." Others point out that "if a person creates or contributes results for an article," they can be called a co-author, but that this does not extend "to models or algorithms."

ChatGPT also has a flaw reflected in the dozens of articles CNET has published in recent weeks: it is far from perfect and makes many calculation errors due to incorrect processing of information.


Source: Hiper Textual

