
Amongst healthcare professionals, nurses typically have the most direct patient contact in hospital wards1—putting them in a prime position to spot subtle signs of clinical deterioration and prevent adverse outcomes. Professor Liaw Sok Ying, Head of Alice Lee Centre for Nursing Studies (NUS Nursing), National University of Singapore Yong Loo Lin School of Medicine, is now using artificial intelligence (AI) to give students a learning boost.
For Prof Liaw, the importance of detecting signs of clinical deterioration early was learnt through personal experience. Having worked as a nurse in the general wards, she saw firsthand how critical early interventions were before conditions worsened.
She said, “For example, the majority of cardiopulmonary arrests are preceded by early warning signs. Unfortunately, these signs frequently go undetected. Not all cardiopulmonary resuscitations are successful. Even for those who survive, many suffer from chronic health problems for the rest of their lives.”
The Case for Learning to Tackle Clinical Deterioration
With experience, Prof Liaw started to spot early signs of deterioration and escalate cases to the doctors’ attention before conditions worsened—preventing patient collapse. Realising that this was a skill that could be taught and trained, and knowing that novice nurses are often required—and expected—to attend to patient deterioration with the same competency as experienced nurses2, Prof Liaw and her team set out to explore ways to hone student nurses’ ability to spot signs of clinical deterioration.
Prof Liaw’s research projects, such as Rescuing A Patient In Deteriorating Situations (RAPIDS)—a simulation programme for Nursing students to learn skills for recognising and managing clinical deterioration—further validated both the teachability of the skills required for detecting clinical deterioration and the value of simulation-based training.
However, Prof Liaw observed that it is not enough to know how to detect clinical deterioration early. “When I was working in the wards, I realised that what also mattered was how you communicated with the doctor-in-charge—if you can’t communicate clearly and effectively, the patient might still not receive timely intervention.”
She continued, “Not to mention, our students are going to work in interprofessional settings when they graduate. It’s crucial that our simulation training reflects this reality and trains our graduates to work alongside other healthcare professionals like doctors.”
In response, Prof Liaw and her team developed a multiuser virtual reality platform, Create Real-time Experience And Teamwork In Virtual Environment (CREATIVE), which was found to be as effective as live simulation in improving Nursing students’ performance in assessing and managing clinical deterioration as well as teamwork. With CREATIVE, the team could provide interprofessional simulation training for all medical and Nursing students at NUS.
Yet, scheduling could be a challenge, especially when the Nursing and Medicine curricula run on different timetables. Furthermore, when evaluating the scalability of interprofessional virtual reality simulation training3, the team found another problem. “There are simply not enough medical students to train with our Nursing students,” Prof Liaw explained. “That was how we came up with the idea of an ‘AI doctor’ to stand in during virtual simulations—allowing our Nursing students to practise at their own time and pace.”
The Case for Leveraging AI-powered Simulation
The AI-enabled virtual reality simulation (AI-enabled VRS) that Prof Liaw and her team developed featured an immersive virtual hospital environment, with an AI-powered doctor, a non-playable patient character and a nurse avatar controlled by the student. Conditions signalling impending deterioration, such as breathlessness and altered consciousness, were also included, enabling students to practise the ABCDE (Airway, Breathing, Circulation, Disability, Exposure) assessment framework.
“We designed the training in the AI-enabled VRS to be comprehensive. In addition to learning about spotting and managing clinical deterioration, students could also practise strategies for communicating the patient’s condition to doctors,” Prof Liaw shared. “They can interact with the AI-powered doctor character, which is not only represented by a realistic 3D avatar, but also able to generate voice responses and non-verbal gestures in response to the Nursing students’ input.”
For their AI-enabled VRS investigation, the team evaluated the effectiveness of an AI-powered doctor versus a human-controlled doctor in training Nursing students for sepsis care and interprofessional communication. While the AI-powered group obtained higher sepsis post-test knowledge scores, the human-controlled group reported a significantly higher level of self-efficacy in interprofessional communication. No significant differences were found in sepsis care performance between the groups, suggesting that AI-powered doctors are not inferior to human-controlled ones in virtual reality simulation.
The Case for Upgrading AI-enabled Simulation
To facilitate graduating Nursing students’ transition to clinical practice, AI-enabled VRS was integrated into a simulation-based education programme. More deterioration scenarios—including hypoglycaemia and hypovolaemic shock—were developed for students to practise their skills. Hearteningly, students’ evaluations revealed encouraging outcomes for the use and acceptance of AI-enabled VRS training.
Students recognised the benefits of AI-enabled VRS in preparing them for clinical practice, reflecting a willingness to use AI-enabled VRS. In particular, almost three quarters agreed with the statement, “I think AI-enabled VRS is useful to me”. They also appreciated its flexibility in exposing them to a wide range of clinical scenarios that realistically depict clinical deterioration.
At the same time, the team also identified areas for improvement—especially the technical aspects. Students reported facing technical difficulties such as bugs and lag while using the software, and expressed a desire for a more natural and authentic experience when interacting with the AI-powered doctor.
“We used Google Cloud’s Dialogflow engine for the study, which offered limited human-AI interactions,” Prof Liaw said. “But now that we’ve heard the students’ feedback, we plan to make our AI-powered doctors more intuitive and lifelike to enhance the learning experience and impact. Developments in generative AI (genAI) have been promising—and we hope to tap on that in future iterations for a more interactive experience.”
Also on the cards is leveraging multiple genAI agents (including an AI learning assistant, AI doctor and AI facilitator) in virtual reality simulation, which Prof Liaw and her PhD student are currently developing to enhance Nursing students’ clinical reasoning in recognising and responding to clinical deterioration.

