The increase in medical publications reflects the constant progress in health research. This has expanded our knowledge, stimulated innovation, and improved patient care worldwide. In recent years, artificial intelligence (AI) has gone from a niche field to a rapidly expanding industry, revolutionizing everything from healthcare to finance and now even creative writing. Among these advances, OpenAI’s ChatGPT stands out as a pioneering AI prose generator, demonstrating the potential of machine-generated content to reshape how we view creativity, storytelling, and authorship. But what does it mean for an AI like ChatGPT to be considered an “author”? And what are the implications of this technology for writers, readers, and the publishing industry as a whole?
For many, the idea of an AI “author” may seem paradoxical. Authorship is traditionally associated with personal experience, imagination, and emotional depth. In Respiration, we define authorship as it is outlined by the ICMJE [1]. However, generative AI (GenAI) challenges this notion by producing content that, while lacking true consciousness, often resonates with readers. By training on rich and varied literary content, it learns structures, tropes, and stylistic nuances, which it then weaves into original compositions. GenAI can be given a prompt such as “Write a story about the journal Respiration,” and although it has no personal experience, it can draw on vast amounts of human-generated content to simulate experience, emotion, and style convincingly.
Where is GenAI used? In practice, it can create first drafts of medical articles or serve as a tool for authors. However, GenAI has no in-depth understanding of specific medical specialties, and its output should not be relied upon as definitive, accurate medical information. Medical articles require precise and accurate information, which must come from a medical professional.
Within those limits, GenAI can help authors brainstorm, formulate text, or structure articles. It is not capable of writing reliable medical articles on its own: it holds no specific qualifications, and while it can provide general information, the accuracy and correctness of that information cannot be guaranteed.
While the potential of GenAI as an author’s assistant is exciting, it also raises several ethical and philosophical concerns. One of the main issues is that of authorship and intellectual property. Who owns the copyright in AI-generated content? Legal systems around the world are currently grappling with this question, and there is no single answer. Some jurisdictions do not consider AI-generated works to be copyrightable, arguing that only humans can hold intellectual property rights.
Another key ethical issue is transparency. Should readers be informed if a piece of content is AI-generated? In journalism, transparency is particularly important, as readers may assume that an article reflects a human perspective, complete with intuition, moral considerations, and context that AI cannot provide. For Respiration, the policy is clear – use of all GenAI-based tools must be clearly declared in either the Methods or Acknowledgments.
There are also concerns about the potential for AI-generated misinformation. Given its lack of understanding of the real world, GenAI may inadvertently generate false or misleading information if its training data are biased or incomplete.
Like all AI, GenAI has limitations in context, knowledge, and originality. It can produce content that sounds logical but is factually incorrect, and it may unintentionally generate biased or inappropriate content reflecting its training data. Because its responses are derived from patterns in that data, it also struggles to produce highly original content; for creative work, it is best used to generate options rather than final pieces. GenAI can produce text quickly, but speed must not come at the expense of quality: all factual claims must be verified, and authors should remember that AI does not create original thought but recombines learned information. Double-checking sources is always necessary to avoid unintentional plagiarism.
Managing GenAI as an author’s assistant requires understanding both its strengths and limitations, while maintaining the unique voice and perspective of human authors. By defining clear roles for GenAI, setting ethical standards, and continually refining our approaches, it can serve as a creative tool that enhances an author’s work without overshadowing it.
In this new era of AI-assisted writing, the goal is not to replace human creativity, but to complement it – using AI to explore new ideas, overcome obstacles, and streamline workflows. Used carefully, GenAI can be a powerful partner in the creative process, helping writers and creators reach new heights in storytelling, content creation, and artistic expression.
How the use of GenAI should be presented in publications is an evolving issue. While Respiration’s policy is clear, broader rules of engagement have yet to be established.