Beyond the response rate: Navigating challenges and crafting strategies in survey data collection

Submitted: 14 April 2025
Accepted: 19 August 2025
Published online: 7 April 2026; TAPS, 11(2), 134-136
https://doi.org/10.29060/TAPS.2026-11-2/II3751

Dexter Chai Yih Haur1, Lee Shuh Shing2, Yeo Su Ping2, Goh Zi Qing2 & Han Ting Jillian Yeo2

1School of Humanities and Interdisciplinary Studies, Ngee Ann Polytechnic, Singapore; 2Centre for Medical Education, Yong Loo Lin School of Medicine, National University of Singapore, Singapore

I. INTRODUCTION

Surveys and questionnaires are among the most commonly used tools for data collection, enabling researchers to identify patterns and trends and, ultimately, to contribute to the advancement of evidence-based educational practices (Wilson et al., 2023). However, conducting surveys presents several challenges that can compromise the quality and reliability of the data collected. This paper draws on our institutional experience in conducting educational surveys, sharing both successes and challenges. We aim to explore the factors influencing survey participation, the strategies adopted to address them, and emerging opportunities in survey design.

II. CHALLENGES AND FACTORS IMPACTING SURVEY RESPONSE RATES

Despite their utility, surveys face challenges that can be broadly categorised as participant-related and survey structure-related. Participant-related challenges include response bias, low motivation, and incomplete or inaccurate answers, all of which can compromise data validity, reliability, and representativeness (Phillips et al., 2017). Low response rates further increase the risk of non-response bias. Survey structure-related issues, such as poorly framed questions or excessively lengthy surveys, can reduce engagement and completion rates. Technical barriers and survey fatigue further contribute to low response quality.

These observations align with Saleh and Bista (2017), who elucidate the multifaceted factors influencing survey response rates. Drawing on social exchange theory, which posits that human behaviour is guided by the anticipation of reciprocation, Saleh and Bista (2017) explained that response rates hinge on several factors: (a) incentives (rewards to increase survey participation), (b) authority (the credibility of the agency conducting the survey), (c) survey design (the length and types of questions in the survey), and (d) ethical considerations (data privacy and anonymity).

III. INSTITUTIONAL STRATEGIES TO MITIGATE CHALLENGES

In the past year, the Centre for Medical Education (CenMED) conducted multiple educational research projects involving surveys. Guided by an understanding of response rate determinants, we focused on improving two key domains: design and administration. We prioritised brevity, targeting surveys that could be completed within 10–15 minutes. A mixture of open- and closed-ended questions – multiple choice, sliders, ranking, etc. – kept respondents engaged while minimising cognitive load. We limited each survey to fewer than 50 items to avoid fatigue, and questions were concise and clearly worded. Surveys were hosted on Qualtrics with no identifiable data collected, and ethical standards were upheld with Institutional Review Board approval before administration.
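The design constraints above (fewer than 50 items, a 10–15 minute completion target, a mix of question types) can be expressed as a simple automated check. The sketch below is purely illustrative – the item structure, field names, and the assumed seconds-per-item figure are hypothetical, not part of our actual workflow or any survey platform's API.

```python
# Illustrative pre-flight check for survey design constraints
# (hypothetical item structure; not tied to any survey platform).

MAX_ITEMS = 50          # keep surveys under 50 items to limit fatigue
MAX_MINUTES = 15        # target completion time of 10-15 minutes
SECONDS_PER_ITEM = 15   # rough assumption: ~15 seconds per item

def check_survey(items: list[dict]) -> list[str]:
    """Return a list of design warnings for a draft survey definition."""
    warnings = []
    if len(items) >= MAX_ITEMS:
        warnings.append(f"Too many items: {len(items)} (limit {MAX_ITEMS})")
    est_minutes = len(items) * SECONDS_PER_ITEM / 60
    if est_minutes > MAX_MINUTES:
        warnings.append(f"Estimated {est_minutes:.1f} min exceeds target")
    types = {item["type"] for item in items}
    if "open" not in types:
        warnings.append("No open-ended items: consider adding one")
    return warnings

# A 36-item draft mixing closed and open-ended questions passes cleanly.
survey = [{"type": "mcq"}] * 30 + [{"type": "slider"}] * 5 + [{"type": "open"}]
print(check_survey(survey))  # → []
```

A check like this could run before each administration cycle, flagging drafts that drift past the brevity targets.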

To improve reach and uptake, we enlisted key faculty members to disseminate the surveys and framed invitation emails to highlight the relevance and value of participation. Despite these measures, we continued to observe lower-than-expected response rates. Deeper examination revealed that survey fatigue – exacerbated by repeated requests and overlapping topics – reduced participant enthusiasm. Generic email blasts lacked the personal engagement necessary to motivate participation.

To circumvent survey fatigue and improve perceptions of surveys, the Centre collaborated with the Dean’s Office to streamline the administration process and prevent topic overlap. Recognising the influence of authority figures, key faculty members – such as medical educationalists, the Vice Dean (Education), and Phase Coordinators – were enlisted to encourage participation during face-to-face sessions. This approach provided a direct, personalised appeal that enhanced the perceived importance and value of the survey. Additionally, to further facilitate survey participation, the team provided protected time and space during face-to-face sessions for respondents to complete the survey.

With this, the team successfully boosted response rates to a satisfactory level. Beyond participant-related and survey structure-related challenges, organisation-related challenges can also contribute to low response rates, especially in a highly research-focused university. Streamlining processes to eliminate redundancy in survey administration helps improve perceptions of surveys and reduce survey fatigue.

IV. EMERGING OPPORTUNITIES IN SURVEY DESIGN

A. Artificial Intelligence (AI)

One significant area of innovation lies in the use of Artificial Intelligence (AI) to support various stages of the survey lifecycle. AI can facilitate the development of adaptive questionnaires that adjust in real time based on a respondent’s previous answers, thereby improving relevance and reducing cognitive fatigue. Natural language processing capabilities allow for more efficient analysis of open-ended responses. Predictive analytics can help researchers identify trends in responses and patterns of dropout or incomplete data. As described by Paduraru et al. (2024), this enables better allocation of resources (e.g., manpower), as AI agents can facilitate questionnaire development, data collection, and survey data analysis.
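The adaptive behaviour described above – routing each respondent only to relevant items – can be illustrated with a minimal branching sketch. The question texts and routing rules here are hypothetical; an AI-driven system such as the agents described by Paduraru et al. (2024) would learn or generate the routing rather than hard-code it.

```python
# Minimal sketch of an adaptive questionnaire: each answer decides the
# next question, so respondents skip irrelevant items (less fatigue).
# Question texts and routing rules are hypothetical examples.

QUESTIONS = {
    "q1": {"text": "Did you attend the workshop?",
           "route": {"yes": "q2", "no": "q3"}},
    "q2": {"text": "Rate the workshop (1-5).", "route": {}},          # terminal
    "q3": {"text": "What prevented you from attending?", "route": {}},  # terminal
}

def run_survey(answers: dict[str, str], start: str = "q1") -> list[str]:
    """Walk the routing graph, returning the sequence of question ids asked."""
    asked, current = [], start
    while current is not None:
        asked.append(current)
        route = QUESTIONS[current]["route"]
        current = route.get(answers.get(current)) if route else None
    return asked

# Attendees are routed to the rating item and never see q3.
print(run_survey({"q1": "yes", "q2": "4"}))  # → ['q1', 'q2']
```

The same graph structure underlies the branching or skip-logic features of common survey platforms; an adaptive AI layer effectively rewrites this graph per respondent.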

B. Open Data Sharing

Open data sharing refers to the practice of making research data freely accessible to other researchers and the public. It not only enhances collaboration and transparency but also plays a crucial role in improving survey responses. By making survey data openly accessible, researchers can build upon existing datasets, reducing the need for repetitive surveys and minimising respondent fatigue. This approach allows for more robust meta-analyses, enabling a deeper understanding of trends without overburdening participants with multiple survey requests. Additionally, shared data fosters greater trust and engagement among respondents, as they see their contributions being utilised effectively to drive meaningful research outcomes.

C. Social Media

Social media platforms offer a powerful avenue for survey administration, enabling researchers to reach diverse and geographically dispersed populations efficiently. Platforms like Facebook, Twitter, LinkedIn, and Instagram allow for targeted survey distribution through organic posts, paid advertisements, and community groups. Features such as polls, direct messaging, and embedded survey links enhance accessibility and engagement, encouraging higher response rates. Additionally, social media analytics provide real-time insights into respondent demographics and engagement patterns, allowing for adaptive survey strategies. By leveraging these platforms, researchers can improve outreach, increase participation, and gather timely data while minimising costs.

D. Gamification

Gamification can enhance survey administration by incorporating game-like elements to increase engagement and response rates. Features such as points, badges and progress bars make the survey experience more enjoyable, reducing respondent fatigue and encouraging completion. Personalised challenges, leaderboards, and instant feedback can further motivate participation, especially in longer surveys. By integrating storytelling and immersive design, researchers can create a more engaging environment that keeps respondents interested while maintaining data quality. Gamification not only enhances the user experience but also helps mitigate dropouts, making it a valuable strategy for improving survey administration.
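The game-like elements mentioned above – points, badges, and progress bars – reduce to small pieces of feedback logic. The sketch below is a toy illustration with hypothetical point values and badge names; real survey tools would render these cues visually rather than as text.

```python
# Illustrative gamification sketch: award points per answered item and
# render a text progress bar (a stand-in for the visual cues a survey
# platform might show). All values and badge names are hypothetical.

POINTS_PER_ANSWER = 10
BADGE_THRESHOLDS = {50: "Halfway Hero", 100: "Survey Champion"}

def progress_bar(done: int, total: int, width: int = 20) -> str:
    """Render completion progress, nudging respondents toward finishing."""
    filled = round(width * done / total)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {done}/{total}"

def award(answered: int) -> tuple[int, list[str]]:
    """Return (points earned, badges unlocked) for a respondent."""
    points = answered * POINTS_PER_ANSWER
    badges = [name for threshold, name in BADGE_THRESHOLDS.items()
              if points >= threshold]
    return points, badges

print(progress_bar(5, 10))  # → [##########----------] 5/10
print(award(10))            # → (100, ['Halfway Hero', 'Survey Champion'])
```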

V. CONCLUSION

Survey questionnaires remain vital tools in medical education research. Yet, the challenges surrounding participation demand thoughtful design and contextual strategies. Our institutional experience highlights the importance of reducing respondent burden, personalising outreach, and integrating surveys into existing workflows.

Looking ahead, innovations in AI, open data sharing, and social media offer promising avenues for improving data collection. Importantly, while our strategies have shown success within our context, we acknowledge they may not be directly transferable to other institutions. Adaptation to local contexts remains essential. Ultimately, survey success lies in balancing methodological rigour with human-centred design – facilitating meaningful research that benefits both educators and learners.

Notes on Contributors

Dexter Chai drafted the manuscript, which was subsequently edited by Lee Shuh Shing, Yeo Su Ping, Goh Zi Qing and Han Ting Jillian Yeo.

Ethical Approval

Ethical approval is not required for this article as no human participant data was collected or presented.

Funding

The authors did not receive any funding for this article.

Declaration of Interest

The authors would like to declare that they do not have any conflict of interest.

References

Paduraru, C. I., Cristea, R., & Stefanescu, A. (2024). Adaptive questionnaire design using AI agents for people profiling. International Conference on Agents and Artificial Intelligence, 3, 633-640. https://doi.org/10.5220/0012379600003636

Phillips, A. W., Friedman, B. T., Utrankar, A., Ta, A. Q., Reddy, S. T., & Durning, S. J. (2017). Surveys of health professions trainees: Prevalence, response rates, and predictive factors to guide researchers. Academic Medicine, 92(2), 222-228. https://doi.org/10.1097/acm.0000000000001334

Saleh, A., & Bista, K. (2017). Examining factors impacting online survey response rates in educational research: Perceptions of graduate students. Journal of Multi-Disciplinary Evaluation, 13(29), 63-74. https://doi.org/10.56645/jmde.v13i29.487

Wilson, A. B., Brooks, W. S., Edwards, D. N., Deaver, J., Surd, J. A., Pirlo, O. J., Byrd, W. A., Meyer, E. R., Beresheim, A., Cuskey, S. L., Tsintolas, J. G., Norrell, E. S., Fisher, H. C., Skaggs, C. W., Mysak, D., Levin, S. R., Escutia Rosas, C. E., Cale, A. S., Karim, M. N., … Lufler, R. S. (2023). Survey response rates in health sciences education research: A 10‐year meta‐analysis. Anatomical Sciences Education, 17(1), 11-23. https://doi.org/10.1002/ase.2345

*Dexter Chai Yih Haur
School of Humanities and Interdisciplinary Studies,
Ngee Ann Polytechnic, Singapore
Email: trexed89@gmail.com