Medical students’ perception on the impact of mock Objective Structured Clinical Examination (OSCE) on final examination performance
Submitted: 30 October 2024
Accepted: 8 April 2025
Published online: 1 July, TAPS 2025, 10(3), 89-92
https://doi.org/10.29060/TAPS.2025-10-3/SC3562
Shanya Shanmugam1, Rajeswari Kathirvel1,2, Kayda Soh2 & Xinyi Li1,2
1Lee Kong Chian School of Medicine, Singapore; 2Division of Obstetrics and Gynaecology, KK Women’s and Children’s Hospital, Singapore
Abstract
Introduction: The Objective Structured Clinical Examination (OSCE) is a popular method for assessing medical students’ clinical proficiency. Mock OSCEs are often incorporated into medical curricula to help students familiarise themselves with the examination format. While the impact of mock OSCEs on academic performance has been studied, their perceived utility remains less explored. This study aimed to assess the effectiveness of a mock OSCE in preparing medical students for their final examinations.
Methods: A prospective study was conducted at a tertiary hospital in Singapore, involving medical students undergoing their Obstetrics and Gynaecology posting. The mock OSCE consisted of five stations and included immediate feedback from examiners. Students completed three questionnaires: pre-mock OSCE, post-mock OSCE and post final examinations, rating the utility of the session and their confidence levels.
Results: Of the cohort of 147 students, 121 responded to the pre-mock OSCE survey, 132 to the post-mock OSCE survey, and 105 to the survey after their final examinations. The percentage of students who found the mock OSCE useful/very useful was 97.5% before the session, 98.5% after it, and 96.2% after the final examinations; this difference across the three timepoints was statistically significant. Mean confidence scores rose significantly from 2.34/5 pre-mock to 3.89/5 post-mock, and to 4.67/5 post-examination. Qualitative feedback was positive, highlighting the benefit of familiarisation with examination mark schemes.
Conclusion: The mock OSCE was well-received by students and perceived as a valuable tool in preparation for examinations. Despite the limited sample size, these findings support the implementation of mock OSCEs to enhance students’ learning and exam preparedness.
Keywords: OSCE, Undergraduate, Medical Education, Medicine, Students’ Perception
I. INTRODUCTION
The Objective Structured Clinical Examination (OSCE) is a well-established, widely utilised method for assessing the clinical proficiency of medical students. This format comprises multiple stations where students perform clinical tasks, including history taking, physical examination, and discussing clinical management. Despite being a crucial assessment tool, OSCEs can be a daunting experience for medical students. To alleviate this stress, educational institutions often incorporate mock OSCEs into their curricula. These sessions aim to mimic the actual OSCE, allowing students to familiarise themselves with its format and requirements, and offering an opportunity to receive feedback.
The correlation between mock OSCE practice and performance in the actual OSCE has yielded mixed results in the literature. Studies of medical students (Townsend et al., 2001) and of residents in internal medicine (Pugh et al., 2016), paediatrics (Hilliard & Tallett, 1998), and emergency medicine (Lee et al., 2021) have established positive correlations between mock OSCE scores and final examination scores. Conversely, a study of second-year medical students concluded that while the mock OSCE led to improved performance in individual stations, it did not cause a significant change in the pass rate of the final examinations (Chisnall et al., 2015).
While the impact of mock OSCEs on students’ academic performance has been examined, the perception of their utility among participants has not been extensively analysed. As the OSCE is viewed as a stressful component of medical assessment, mock OSCE sessions are thought to mitigate this stress by fostering confidence and familiarity among students (Chisnall et al., 2015).
We performed a study to determine the perceived effectiveness of a mock OSCE in preparing its participants for their final examinations. More specifically, the study aimed to assess whether participation in the mock OSCEs improved students’ understanding of the domain-based exam scoring system employed by markers, enhanced their time management skills, delivered valuable content, helped them develop effective approaches to OSCE stations, and boosted their confidence levels.
II. METHODS
This prospective study evaluated students’ perceptions of the mock OSCE’s utility. It was conducted at KK Women’s and Children’s Hospital (KKH), a tertiary hospital in Singapore. KKH caters to students from all three medical schools in Singapore: Lee Kong Chian School of Medicine (LKC), Yong Loo Lin School of Medicine, and Duke-NUS Medical School.
A mock OSCE programme was organised by the Obstetrics and Gynaecology (O&G) department for the fourth-year medical students from LKC during their O&G posting in 2022–2023. The mock OSCE was conducted three times, as the students attended in three streams. This was the first such programme in the O&G curriculum; it was introduced because this cohort had reduced clinical exposure due to COVID-19 restrictions during their clinical years.
The mock OSCE included five 10-minute stations covering history taking, physical examination, and clinical management in O&G. Students were evaluated according to the domain-based scoring system used by LKC for their final examinations. The assessment was designed to be formative in nature and the students received immediate feedback at the conclusion of each station.
The students were asked to complete anonymous questionnaires before and after participating in the mock OSCE, and again after completing their final-year examinations. A 4-point Likert scale was used to gauge the usefulness of the mock OSCE session, with 1 being “not useful at all” and 4 being “very useful”, and a 5-point Likert scale was used to gauge confidence for the final examinations, where 1 represented “not confident” and 5 represented “very confident”. The responses collected before and after the mock OSCE session were compared. The study also examined, through Likert scale questions, the impact of the mock OSCE on insight into the exam scoring system, time management, and content relevance. Qualitative feedback was obtained through open-text responses. The data obtained through the Likert scales were collapsed into nominal categories, and statistical significance was determined by performing Chi-square tests, with p < 0.05 considered significant. Informed consent was implied through voluntary participation in the questionnaires.
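To illustrate the analysis described above, the sketch below collapses Likert responses into two nominal categories (“useful/very useful” vs “unsure/not useful”) across the three timepoints, forming a 3 × 2 contingency table, and applies a Pearson Chi-square test. The counts here are hypothetical placeholders for illustration, not the study’s data, and the exact grouping the authors used may differ; the closed-form p-value relies on the fact that the chi-square survival function for 2 degrees of freedom is exp(−x/2).

```python
import math

# Hypothetical counts for illustration only (NOT the study's data).
# Rows: timepoints (pre-mock, post-mock, post-examination).
# Columns: Likert responses collapsed into two nominal categories,
# "useful/very useful" and "unsure/not useful", as described in Methods.
observed = [
    [110, 10],   # pre-mock OSCE
    [125,  5],   # post-mock OSCE
    [ 95, 15],   # post final examinations
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square statistic: sum of (O - E)^2 / E over all cells,
# where E is the expected count under independence of row and column.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (obs - expected) ** 2 / expected

# Degrees of freedom for an r x c table: (r - 1) * (c - 1) = 2 here.
# For df = 2, the chi-square survival function simplifies to exp(-x/2).
p_value = math.exp(-chi2 / 2)

print(f"chi2 = {chi2:.3f}, p = {p_value:.4f}")
```

With these placeholder counts the test rejects the null hypothesis of no association between timepoint and perceived utility at the p < 0.05 threshold used in the study.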
The study was approved and given exempt status by the Nanyang Technological University Institutional Review Board for research (IRB 2023-677).
III. RESULTS
The cohort for the academic year 2022–2023 comprised 147 fourth-year medical students. Of these, 121 (82.3%) responded to the questionnaire prior to the mock OSCE, 122 (83.0%) after the mock OSCE, and 105 (71.4%) after their final examinations.
Before the mock OSCE, 97 (80.8%) students believed that it would be very useful for their exam preparation, 20 (16.7%) believed it would be useful, and 3 (2.5%) were unsure. After the mock OSCE, 120 (91.6%) students thought it was very useful for their exam preparation, with 9 (6.9%) believing it was useful, 1 (0.8%) being unsure, and 1 (0.8%) believing it was not useful. After the final examinations, 77 (74.8%) students found it to be very useful for their exam preparation, 22 (21.4%) found it useful, 3 (2.9%) were unsure, and 1 (1.0%) found it not useful (Figure 1). Overall, the percentage of students who found the mock OSCE useful/very useful increased from 97.5% before to 98.5% after the session and decreased to 96.2% after the examinations. The difference in perceived utility of the mock OSCE across the three timepoints, determined by combining “very useful” and “useful” into one category and “unsure” and “not useful” into another, was statistically significant (p = 0.0147).

Figure 1. Comparison of perceived utility of mock OSCE
When asked to rate their confidence regarding their final examinations, the mean score improved from 2.34/5 before the mock OSCE to 3.89/5 after it, and to 4.67/5 after the examinations. This rise in confidence levels was statistically significant (p < 0.00001).
We asked the students to select the area in which they found the mock OSCE most helpful: improved insight into the domain-based exam scoring system, improved time management, useful content, preparing an approach for OSCE stations, or improved confidence. Almost a third (n = 32, 30.5%) of the students found that the mock OSCE helped them most in preparing an approach for OSCE stations. This was followed by improved confidence (n = 28, 26.7%), improved insight into the exam scoring system (n = 21, 20.0%), useful content (n = 15, 14.3%), and lastly, time management (n = 9, 8.6%). Furthermore, a majority of students (n = 70, 66.7%) felt that the O&G mock OSCE was helpful not only for the O&G component, but for the entire OSCE examination.
Qualitative feedback obtained from the students was largely positive. The common theme that surfaced was how the mock OSCE allowed students to familiarise themselves with what to expect from the final examination. Examples include:
“Helped to give us a broader understanding of how 1) clinician thinks and how 2) an examiner grades.”
“It was a great opportunity for medical students to learn about history taking and physical examination in a controlled environment.”
IV. DISCUSSION
Overall, the mock OSCE was well received by students as a useful tool in preparation for final examinations. The difference in the perceived utility of the mock OSCE across the three timepoints, before the OSCE, after the OSCE, and after the final examinations, was statistically significant. This supports the value of the mock OSCE programme, given the students’ reduced exposure to clinical scenarios and examination structure as previously outlined. Interestingly, the proportion of students who found the mock OSCE programme useful/very useful decreased from 98.5% after the OSCE to 96.2% after the final examinations. This drop may be explained by the qualitative feedback, which cited the final examination’s increased complexity and variations in content.
Most students felt that the mock OSCE helped them prepare their approach to OSCE stations, consistent with previous studies showing that mock OSCEs help students familiarise themselves with the format (Lee et al., 2021; Chisnall et al., 2015). These sentiments are echoed in the qualitative feedback obtained. Furthermore, the difference in confidence levels before and after the mock OSCE was statistically significant.
This study is limited by its small sample size, with responses ranging from 105 to 132 students across the three surveys. As participation was voluntary and the questions were not compulsory, some students did not complete every survey, leading to discrepancies in response rates. Furthermore, due to the anonymous nature of the surveys, we were unable to track survey drop-offs. Using Likert scales to assess the utility of the mock OSCE may not fully capture participants’ opinions. Students may have interpreted the scales differently, as utility is subjective and depends on each individual’s own standards. This diversity in how participants understood the scales could have made the data less reliable.
V. CONCLUSION
The results of this study indicate that a mock OSCE is perceived to be an important part of examination preparation for medical students. Despite being a single-specialty mock OSCE, most students felt it was useful for preparing for their entire final exam.
These findings suggest that there is great potential in using mock OSCEs as a revision tool for medical students and support the implementation of such programmes to guide students in their learning and examination preparation.
Notes on Contributors
Shanya Shanmugam is a medical student at Lee Kong Chian School of Medicine, who is interested in medical education. She reviewed the literature, analysed data and wrote the manuscript.
Dr Rajeswari Kathirvel is a senior consultant at KK Women’s and Children’s Hospital and the principal lead for Obstetrics and Gynaecology at Lee Kong Chian School of Medicine. Together with Dr Li Xinyi, she designed the study, developed the questionnaire, and developed the manuscript.
Kayda Soh is an executive in KK Women’s and Children’s Hospital OBGYN Academic Clinical Programme. She was involved in administering the questionnaires and collating the data.
Dr Li Xinyi is a consultant at KK Women’s and Children’s Hospital and the posting lead for Obstetrics and Gynaecology at Lee Kong Chian School of Medicine. Together with Dr Rajeswari Kathirvel, she designed the study, developed the questionnaire, and developed the manuscript.
Ethical Approval
The study was approved and given exempt status by the Nanyang Technological University Institutional Review Board for research (IRB 2023-677).
Data Availability
The data that support the findings of this study are openly available in the Figshare repository, at https://doi.org/10.6084/m9.figshare.25903786.
Acknowledgement
We would like to thank the students at Lee Kong Chian School of Medicine who participated in this study.
Funding
The authors report that there is no funding associated with the work featured in this article.
Declaration of Interest
The authors report there are no competing interests to declare.
References
Chisnall, B., Vince, T., Hall, S., & Tribe, R. (2015). Evaluation of outcomes of a formative objective structured clinical examination for second-year UK medical students. International Journal of Medical Education, 6, 76–83. https://doi.org/10.5116/ijme.5572.a534
Hilliard, R. I., & Tallett, S. E. (1998). The use of an objective structured clinical examination with postgraduate residents in Pediatrics. Archives of Pediatrics & Adolescent Medicine, 152(1). https://doi.org/10.1001/archpedi.152.1.74
Lee, M. H., Phua, D. H., & Heng, K. W. (2021). The use of a formative OSCE to prepare emergency medicine residents for summative OSCE: A mixed-methods cohort study. Research Square. https://doi.org/10.21203/rs.3.rs-495003/v1
Pugh, D., Bhanji, F., Cole, G., Dupre, J., Hatala, R., Humphrey-Murto, S., Touchie, C., & Wood, T. J. (2016). Do OSCE progress test scores predict performance in a national high-stakes examination? Medical Education, 50(3), 351–358. https://doi.org/10.1111/medu.12942
Townsend, A. H., McIlvenny, S., Miller, C. J., & Dunn, E. V. (2001). The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Medical Education, 35(9), 841–846. https://doi.org/10.1046/j.1365-2923.2001.00957.x
*Shanya Shanmugam
Lee Kong Chian School of Medicine,
11 Mandalay Road,
Singapore
Email: shanya001@e.ntu.edu.sg