ChatGPT took the world by storm when it was publicly released by OpenAI on 30 November 2022. More than a year later—and many generative AI (genAI) chatbots since—ChatGPT remains one of the most popular genAI chatbots among researchers, educators and students, even as debates surrounding its ethical, technical, and pedagogical limitations and controversies continue.
An artificial intelligence (AI) chatbot built on large language models (LLMs), ChatGPT is known for its ability to answer questions and assist with tasks ranging from research to academic writing and teaching. Indeed, ChatGPT has the potential to transform healthcare education and research, when used well and with a clear understanding of its limitations. To map the opportunities and constraints ChatGPT presents in these domains, Associate Professor Shefaly Shorey of the Alice Lee Centre for Nursing Studies (NUS Nursing), National University of Singapore Yong Loo Lin School of Medicine, and her team recently completed a scoping review of the relevant literature.
A/Prof Shorey said, “The motivation for doing this review is simple. Everyone has something to say about ChatGPT. But how is it relevant to us as educators and researchers—does it help us, or does it threaten our work? More importantly, how can we tap into ChatGPT’s potential effectively and ethically? By examining and consolidating literature, both supportive and opposing, we offer an objective perspective that allows readers to decide for themselves. On a personal level, I have also gained great insights into ChatGPT as a tool that helps with my research and teaching work.”
ChatGPT as a Researcher
ChatGPT is known for its ability to extract and analyse vast amounts of data quickly, efficiently and accurately. “In my research, I typically conduct many qualitative interviews, so it is common for me to end up with 30 to 40 transcripts. Even for experienced researchers like me, it can be quite time-consuming to review and analyse everything, so you can imagine how overwhelming it is for researchers who are new to qualitative analysis,” A/Prof Shorey said.
She added, “Although there is full-featured software to help with that, it is usually available only on a subscription basis and with a minimum lock-in period. In comparison, ChatGPT’s features are basic but free. The day is not far when researchers can use ChatGPT or its advanced features to analyse data, provided confidentiality and anonymity can be maintained. This will be a game changer, allowing researchers from under-resourced institutions and regions, who are currently limited by paywalls, to contribute equally to research and knowledge.”
ChatGPT as a Co-Author for Research Papers
It is one thing to use ChatGPT to help with data analysis, but quite another to list it as a co-author of a research paper. In January 2023, the Elsevier journal Nurse Education in Practice was caught in the crossfire between publishers, editors and researchers when ChatGPT was credited as a co-author. Following the controversy, Nurse Education in Practice published a corrigendum removing ChatGPT from the author list. Other prominent publishers and leading journals such as Nature and Science, as well as the International Committee of Medical Journal Editors, have since updated their guidelines to state that ChatGPT cannot be listed as an author.
“NUS Nursing is aligned in this regard with the rest of the research world. We make it very clear to our students that they can use ChatGPT to help them with knowledge acquisition or idea generation. However, they cannot submit information generated by ChatGPT as their own. Acknowledgment of, and details on, how ChatGPT was used in an assignment are required to avoid academic dishonesty. We are in the middle of revising our assessment guidelines to standardise this,” A/Prof Shorey said.
“ChatGPT and all the other LLMs are like a Pandora’s box, and they are not going away. Therefore, rather than trying to ignore the possible issues, or worse still, going against the growing trend, it is better to tackle them directly by establishing clear guidelines. Although there are now at least 20 software tools that can check whether a piece of work is plagiarised or generated by ChatGPT, nothing beats setting expectations upfront and taking this as an opportunity to educate our students on the importance of academic honesty and integrity, and on being responsible for their actions.”
ChatGPT as an Unconventional Teacher
ChatGPT’s capacity to generate personalised educational content according to each student’s individual needs, learning style and pace offers tremendous opportunities to a diverse group of learners. Sharing her experience in leveraging ChatGPT for her classes, A/Prof Shorey said, “At the moment, I am teaching the honours programme. One of the topics, how to write the methods section for primary research, is very factual and non-dynamic, so I got ChatGPT to summarise my slides into a simplified version and gave it to the students under my supervision to go through on their own. Then during consultations, we could focus on addressing the parts they were unclear about.”
She continued, “I am confident that this will have a positive effect on students’ learning. While ChatGPT can help students understand basic concepts around research, it is still very much up to the students to apply those concepts to their specific research topics and find solutions to their proposed research questions. ChatGPT may not be able to provide ready-made answers or help with specific questions pertaining to a chosen research topic. The application of learned knowledge must be mastered by the students themselves.”
Nonetheless, as ChatGPT becomes increasingly integrated into healthcare education, it calls for a re-evaluation of assessment methodologies, with a shift towards evaluating critical and problem-based thinking competencies to mitigate the potential for academic dishonesty and plagiarism. “Indeed, we have taken, and will continue to take, steps to change our assessments. For topics where we want to see students’ writing skills or their ability to solve problems, we may ask them to come to class and write their essays or complete certain tasks in person,” said A/Prof Shorey.
“For example, I used to teach a module on effective communication. As part of the assessment, students had to write a reflective essay sharing their experience of communicating with leaders. Instead of getting them to write an essay, I now have students role-play with a standardised patient to showcase their effective communication and problem-solving skills. Such redesigns of assessments will enable us to continue to adapt to, embrace, and evolve with emerging technologies.”
ChatGPT as a Personal Administrative Assistant
The usefulness of ChatGPT is not limited to research and education, as A/Prof Shorey found out recently. She said, “I often receive emails from students requesting help with referee reports. Usually, I try to oblige because I know it means a lot to them. However, crafting these letters can easily take an hour or two. After enlisting ChatGPT’s help, I have managed to reduce the time spent on each letter to no more than a few minutes.”
“Of course, ChatGPT doesn’t know my students personally, so I need to input relevant keywords about their achievements and accolades, which it then uses to draft the letter. I always validate the output against what I know and personalise it accordingly to ensure that it is relevant and meaningful for the student.”
Summing up, she said, “But as exciting as the potential benefits of ChatGPT are, I am always mindful that I need to use it responsibly. For example, I am careful with the information I input about my students, and with any raw research data. These are always de-identified to protect the privacy of my students and interviewees, and I give the people around me this same advice.”
“ChatGPT is a useful tool for helping us, as healthcare educators and researchers, improve our productivity and for freeing us from mundane and time-consuming tasks. But at the end of the day, we should always use it cautiously and validate its output against other reliable sources, and more importantly against our own human judgment, because AI hallucinations are real.”
– Dr Shefaly Shorey, Associate Professor, NUS Nursing