Chatbots, Generative AI, and Scholarly Manuscripts: WAME Recommendations on Chatbots and Generative Artificial Intelligence in Relation to Scholarly Publications Revised May 31, 2023

Authors

DOI:

https://doi.org/10.32412/pjohns.v38i1.2135

Keywords:

Chatbots, ChatGPT, Artificial Intelligence, Generative AI, Scholarly Publications

Abstract

Introduction

This statement revises our earlier WAME Recommendations on ChatGPT and Chatbots in Relation to Scholarly Publications (January 20, 2023). The revision reflects the proliferation of chatbots and their expanding use in scholarly publishing over the last few months, as well as emerging concerns regarding lack of authenticity of content when using chatbots. These Recommendations are intended to inform editors and help them develop policies for the use of chatbots in papers published in their journals. They aim to help authors and reviewers understand how best to attribute the use of chatbots in their work, and to address the need for all journal editors to have access to manuscript screening tools. In this rapidly evolving field, we will continue to modify these recommendations as the software and its applications develop.

     A chatbot is a tool “[d]riven by [artificial intelligence], automated rules, natural-language processing (NLP), and machine learning (ML)…[to] process data to deliver responses to requests of all kinds.”1 Artificial intelligence (AI) is “the ability of a digital computer or computer-controlled robot to perform tasks commonly associated with intelligent beings.”2

     “Generative modeling is an artificial intelligence technique that generates synthetic artifacts by analyzing training examples; learning their patterns and distribution; and then creating realistic facsimiles. Generative AI (GAI) uses generative modeling and advances in deep learning (DL) to produce diverse content at scale by utilizing existing media such as text, graphics, audio, and video.”3, 4

     Chatbots are activated by a plain-language instruction, or “prompt,” provided by the user. They generate responses using statistical and probability-based language models.5 This output has some characteristic properties. It is usually linguistically accurate and fluent but, to date, it is often compromised in various ways. For example, chatbot output currently carries the risk of including biases, distortions, irrelevancies, misrepresentations, and plagiarism, many of which are caused by the algorithms governing its generation and are heavily dependent on the content of the materials used in its training. Consequently, there are concerns about the effects of chatbots on knowledge creation and dissemination – including their potential to spread and amplify mis- and disinformation6 – and their broader impact on jobs and the economy, as well as the health of individuals and populations. New legal issues have also arisen in connection with chatbots and generative AI.7

     Chatbots retain the information supplied to them, including content and prompts, and may use this information in future responses. Therefore, scholarly content that is generated or edited using AI would be retained and, as a result, could appear in future responses, further increasing the risk of inadvertent plagiarism on the part of the user and any future users of the technology. Anyone who needs to maintain confidentiality of a document, including authors, editors, and reviewers, should be aware of this issue before considering using chatbots to edit or generate work.9

     Chatbots and their applications illustrate the powerful possibilities of generative AI, as well as the risks. These Recommendations seek to suggest a workable approach to valid concerns about the use of chatbots in scholarly publishing.


Published

2023-06-04

How to Cite

1.
Zielinski C, Winker M, Aggarwal R, Ferris L, Heinemann M, Lapeña JF, Pai S, Ing E, Citrome L, Alam M, Voight M, Habibzadeh F. Chatbots, Generative AI, and Scholarly Manuscripts: WAME Recommendations on Chatbots and Generative Artificial Intelligence in Relation to Scholarly Publications Revised May 31, 2023. Philipp J Otolaryngol Head Neck Surg [Internet]. 2023 Jun. 4 [cited 2024 Apr. 27];38(1):7. Available from: https://pjohns.pso-hns.org/index.php/pjohns/article/view/2135
