Authentic assessment utilising innovative technology-enhanced learning

Published online: 7 January 2020; TAPS 2020, 5(1), 70-75
DOI: https://doi.org/10.29060/TAPS.2020-5-1/SC2065

Carmel Tepper, Jo Bishop & Kirsty Forrest

Faculty of Health Sciences and Medicine, Bond University, Australia

Abstract

The Bond University Medical Program recognises the importance of workplace-based assessment as an integrated, authentic form of assessment. In partnership with a software company, the Bond Medical Program has designed and implemented an online Student Clinical ePortfolio: a mobile-enabled, secure, digital platform, available on multiple devices from any location, that allows a range of clinically relevant assessments to be captured "at the patient bedside". The innovative dashboard meaningfully aggregates student assessment data to provide an accurate picture of student competency. Students are also able to upload evidence of compliance documentation and record attendance and training hours using their mobile phones.

Assessment within hospitals encourages learning within hospitals, and the Student Clinical ePortfolio provides evidence of multiple student-patient interactions and procedural skill competency. Students also have enhanced interprofessional learning opportunities where nurses and allied health staff, in conjunction with supervising clinicians, can assess and provide feedback on competencies essential to becoming a ‘work-ready’ doctor.

Keywords: Authentic Assessment, Interprofessional Learning, Technology-Enhanced Learning, Feedback, Workplace-Based Assessment

I. INTRODUCTION

The medical education community is rapidly embracing workplace-based assessment (WBA) as a more authentic form of assessment of medical students’ clinical competence. These clinical interactions are complex, with integrated competencies observed in real-life settings. For the safety of patients, however, it is essential that medical schools have evidence that their graduates have attained sufficient standards in the core skills and activities specified by their accrediting institutions’ graduating-doctor competency frameworks. This includes evidence not only of maintained compliance documentation and attendance at clinical settings and teaching sessions, but also of the student’s ability to interact competently with a variety of patients.

Student clinical placements within medical schools are often undertaken in multiple locations with a variety of clinical supervisors. At Bond University, Australia, this process involves over 150 locations with up to 800 clinical supervisors observing, assessing and providing feedback on student performance. Previous manual, paper-based processes were inefficient, time-consuming, prone to error, and limited the opportunity for real-time feedback to students. Difficulty aggregating this information complicated pass-fail decisions on rotation performance and delayed intervention for students requiring remediation in compliance, attendance or clinical performance.

Whilst clear and validated documentation of the proficiency required of a “work-ready” graduate is often challenging to obtain, the aggregation of multiple data points to build a more complete picture of student competence is central to the concept of programmatic assessment (van der Vleuten, 2016). A portfolio of evidence with timely feedback on performance is seen as essential for demonstrating the progressive development of students’ clinical skills.

An electronic portfolio, or ePortfolio, represents the technological evolution from paper-based to electronic clinical assessments (Garrett, MacPhee, & Jackson, 2013). Multiple ePortfolios and learning management systems are available that can be used in the workplace to electronically collect in-progress assessments and accomplishments (Kinash, Wood, & McLean, 2013). Some ePortfolios also allow students to manage continuing professional development. The Bond Medical Program, however, sought an ePortfolio designed specifically for undergraduate medical students that could aggregate not only attendance and compliance data but also competency assessment data in a meaningful way, building an accurate picture of student competency in the hospital setting.

The aim of this short communication is to describe why and how a new version of a bespoke electronic portfolio was designed and implemented.

II. METHODS

Bond University partnered with a software company with healthcare experience to develop a digital Student Clinical ePortfolio. The business requirement specification called for a fully mobile-enabled, secure, digital platform, available on any device from any location, that would allow a range of clinically relevant WBAs to be captured by clinicians “at the bedside”, with the ability to provide immediate feedback to students. In addition, the software was to contain a process for students to provide evidence of compliance documentation and of attendance at compulsory teaching sessions and on rostered placement shifts. The initial plan was to replicate all paper-based processes on an electronic platform. The software was developed iteratively against the university’s needs using a rapid-cycle improvement process. An app-based product was developed to house the clinical portfolio.
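To make the bedside-capture requirement concrete, the sketch below models a single WBA record as a TypeScript interface. This is a minimal illustration under our own assumptions; the field names, types and assessment categories are hypothetical and do not reflect the vendor’s actual schema.

```typescript
// Hypothetical model of a single workplace-based assessment (WBA) record
// captured at the bedside. All field names and types are illustrative
// assumptions, not the vendor's actual schema.

type AssessorRole = "clinician" | "nurse" | "allied-health";

interface WbaRecord {
  studentId: string;
  assessorId: string;
  assessorRole: AssessorRole;        // guest assessors are supported
  rotationId: string;                // the clinical placement
  assessmentType: "mini-CEX" | "procedural-skill" | "case-log";
  observedAt: Date;                  // moment of bedside observation
  feedback: {
    mode: "typed" | "voice";         // typed text or a voice recording
    content: string;                 // text body or audio file reference
    submittedAt: Date;               // released to the student on submission
  };
}
```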

| Feature | Benefit | Replacing |
|---|---|---|
| Tablet and mobile phone-enabled clinical assessment | Readily available, user-friendly, allows for opportunistic assessment. Guest assessors (allied health and nursing) can participate in medical student education | Paper assessments which had to be collected and collated |
| Compliance | Simple for students to scan and upload. Dashboard shows aggregate of compliance completion to ensure all required documentation has been provided | Time-consuming, laborious paper trail of compliance documentation |
| Attendance with GPS tracking | Students take responsibility for being on rotation when rostered. A specified number of absences can trigger early student support processes. Accurate record of which students attended compulsory classes | Paper sign-on forms |
| Dashboard – summary data | Students and clinical staff can view aggregated summary data showing attendance, compliance, student patient logs and WBAs | Multiple individual paper WBAs that could not be aggregated |
| Personal student learning | Students can log personal patient interactions as a record of their learning on rotation | Paper patient logs |
| Learning modules with associated procedural skills assessment | Students watch a ‘best-practice’ learning module, demonstrate their understanding by answering a short quiz and then generate an assessment for a clinical supervisor. The clinical assessor guides and observes the skill performance and then provides a ‘trust level’ competency rating. Students can repeat the assessment until competency is achieved | Skills performed in the hospital setting not formally captured |
| Feedback to student | Voice-recorded or typed feedback is provided to the student as soon as it is submitted by the assessor – timely and relevant to the performance | Verbal feedback or occasional comment on a form |
| CPD | Students can log personal continuing professional development to capture their learning journey more fully | |

Table 1. Bond ePortfolio features
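The paper does not detail how the GPS geolocating attendance check in Table 1 works. As a hedged illustration only, the sketch below shows one plausible approach: compare the device’s reported coordinates with the placement site using the haversine distance and accept sign-on within a tolerance radius. The 500 m radius and all function names are assumptions, not the platform’s actual logic.

```typescript
// A minimal sketch of a GPS-based attendance check: accept a sign-on only
// if the student's device is within a tolerance radius of the placement
// site. Purely illustrative; not the platform's actual geofencing logic.

interface LatLng { lat: number; lng: number; }

// Haversine great-circle distance in metres between two coordinates.
function distanceMetres(a: LatLng, b: LatLng): number {
  const R = 6_371_000; // mean Earth radius in metres
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Assumed 500 m tolerance around the rostered site.
function withinSite(device: LatLng, site: LatLng, radiusM = 500): boolean {
  return distanceMetres(device, site) <= radiusM;
}
```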

In August 2017, the compliance portion of the portfolio was piloted with a single clinical-year cohort of medical students and their supervisors. In 2018, attendance tracking was introduced and WBAs were conducted at the patient bedside across all clinical years.

Delivering the project across many sites required the support of all supervisors, along with timely stakeholder engagement and change management. The needs of busy clinicians were surveyed; the majority preferred a low-key launch by way of an online training video, with face-to-face on-site training available upon request. There are several barriers to timely feedback in the busy clinical environment, with ‘opportunistic assessment’, multiple demands on clinician time and multiple students and/or trainees under supervision at any one time (Algiraigri, 2014). Feedback using the ePortfolio can be provided in the moment, recorded as either typed text or a voice recording, and reviewed by students in their own time. Clinicians described it as “easy to complete on the go” and “easy to assess then and there (at the bedside)”. Table 1 describes the features and benefits of the Bond ePortfolio.

Examples of the compliance dashboard and the assessment portfolio front page are shown in the Appendix.

III. RESULTS

The new platform successfully delivered the required features through the Bond Student Clinical ePortfolio. The Portfolio is accessible to both students and supervising clinicians using mobile phones or office desktop computers. Students indicate their attendance using a GPS geolocating attendance application. Compliance documents, clerked cases, reflections and other assessment components, including the final in-training assessment, are uploaded for supervisor assessment, whilst Mini-Clinical Evaluation Exercises are now completed by supervisors using a mobile phone at the patient bedside. All assessments are housed in one cloud-based portal, accessible to the decision-making committees.

An added advantage of this system is that it improves student digital literacy and self-directed learning, familiarising students with the process of self-documenting evidence of competence and skills obtained, a valuable and highly sought-after skill for a graduating doctor.

A. Workplace Based Assessment

Evaluation of the 2018 pilot demonstrated significant efficiencies in the collection of WBA documentation. Previously, professional staff collated 2,350 components of high-stakes assessment per year to be reviewed and presented to the Board of Examiners (BoE). Faculty can now track student progress during clinical rotations, with a process in place to identify students who require additional support to succeed. Faculty receive automatic notifications to review submitted assessment items. During meetings of decision-making committees such as the BoE, student assessment items can be viewed by the committee to verify borderline students or those who receive commendations.

B. Attendance

Key members of the medical programme have a ‘dashboard’ on their homepage with ‘live’ attendance data. The Professional Staff Team can run reports when required, but the platform also monitors for students who reach the nominated ‘concern’ percentage of missed sessions and notifies the team that a support email may be required. In our experience, concerns around student well-being often present as patterns of non-attendance. Supervisors in the clinical setting can now electronically track the progress of students allocated to their teams during rotations. In addition, they can identify students who require additional support in a timelier manner, helping to provide the best education experience possible.
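As a hedged sketch of the ‘concern’ threshold just described, the following TypeScript shows how a nominated percentage of missed sessions might produce a notification list. The 20% threshold and all names are illustrative assumptions, not the platform’s actual rules.

```typescript
// Illustrative 'concern' threshold check: flag students whose missed-session
// percentage reaches a nominated level so the Professional Staff Team can
// follow up. Hypothetical logic; the platform's rules are not published.

interface AttendanceSummary {
  studentId: string;
  rosteredSessions: number;
  attendedSessions: number;
}

function flagStudentsOfConcern(
  summaries: AttendanceSummary[],
  concernPercent = 20 // assumed nominated threshold of missed sessions
): string[] {
  return summaries
    .filter((s) => {
      if (s.rosteredSessions === 0) return false;
      const missedPct =
        (100 * (s.rosteredSessions - s.attendedSessions)) / s.rosteredSessions;
      return missedPct >= concernPercent;
    })
    .map((s) => s.studentId);
}
```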

C. Feedback

Allowing clinicians to use their preferred method of feedback delivery provides flexibility and improves engagement in the process. For instance, the introduction of voice recording enables students to access assessor feedback immediately. This has resulted in increased communication between students and their assessors and a very positive response from the student body.

Feedback on students’ experience of the platform has been sought through ongoing discussion with the initial pilot group, regular updates on the learning management system, and year-specific feedback through staff-student liaison committees. Attendance monitoring has had mixed reviews from students, who “appreciate not having to sign in on paper” but have been affected by technical issues with non-syncing on certain mobile devices.

IV. DISCUSSION

Our belief is that assessment within hospitals will encourage learning within hospitals. Our intention is to remove Objective Structured Clinical Examinations (OSCEs) from the final-year assessment and replace them with authentic WBAs that are reliable and valid. OSCEs will continue to be used in the earlier years of the medical programme. There may be limitations, as the very nature of the hospital environment is opportunistic. Students will have multiple patient interactions (data points) to support their developing portfolio with evidence of competencies achieved. Students can personalise their studies and identify areas of focus for skill development during placement, ultimately building confidence in their work readiness as a day-one doctor. Ultimately, assessment information “should tell a story about the learner” (van der Vleuten, 2016, p. 888).

This platform offers many advantages over other platforms. The selection of the software partner was a competitive process; a full needs analysis and tender process was performed, which for brevity is not presented here. The advantages over other platforms identified at procurement were the opportunity to customise and the ability to have all processes (compliance, attendance, placement and assessment) on one platform. Further advantages, which became clear after implementation and were not offered by other platforms, included: the ability for students to take the portfolio into the workforce, a dashboard for attendance, and working with a healthcare-based partner who understood all stakeholder requirements, with an emphasis on safe patient care.

The next step will be to utilise the platform for training, progression and maintenance of competency in procedural skills before graduation. Specific procedural skills, required by accrediting bodies and relevant to the year of learning, will be assigned to the student for completion during a rotation. For a skill such as intravenous cannulation, the student will, in addition to routine clinical practice, observe an interactive learning module about the skill and complete a short assessment to test their understanding; the system will then generate an assessment assigned to a clinical supervisor. The student will then perform the skill on a patient and the supervisor will submit the completed assessment on a ‘trust level’ scale of competency (ten Cate, 2013). If the student is not yet able to perform the skill sufficiently independently, there are opportunities to practise and repeat the assessment until competency is obtained.
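The workflow just described (learning module, quiz, supervised attempt, ‘trust level’ rating, repeat until competent) can be pictured as a small state machine. The sketch below is illustrative only; the state names, the 1-5 entrustment scale and the target level are our assumptions, not the platform’s published design.

```typescript
// Illustrative state machine for the procedural-skill workflow: module,
// quiz, then supervisor-assessed attempts rated on a 'trust level' scale,
// repeating until an assumed entrustment target is met.

type SkillState = "module" | "quiz" | "awaiting-assessment" | "competent";

interface SkillProgress {
  state: SkillState;
  attempts: number;
  lastTrustLevel?: number; // supervisor's entrustment rating, e.g. 1-5
}

// Assumed target, e.g. 'can perform with distant supervision'.
const ENTRUSTMENT_TARGET = 4;

// A new skill assignment starts at the learning module.
const newAssignment: SkillProgress = { state: "module", attempts: 0 };

// Record a supervisor-rated attempt; below the target, the student may
// practise and repeat the assessment.
function recordAttempt(p: SkillProgress, trustLevel: number): SkillProgress {
  return {
    state: trustLevel >= ENTRUSTMENT_TARGET ? "competent" : "awaiting-assessment",
    attempts: p.attempts + 1,
    lastTrustLevel: trustLevel,
  };
}
```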

V. CONCLUSION

Digitising the processes for monitoring attendance, collating compliance documentation, conducting clinical assessment and delivering feedback at sites of clinical exposure has created significant efficiencies in the delivery of our programme. Preliminary feedback indicates that this leads to a vastly improved student experience, with real-time, enhanced feedback on assessment performance and timely remediation to assist students in becoming safe and competent ‘work-ready’ doctors. Live updates that notify of absenteeism allow for more timely support and personalised care. The aggregation of data into one personalised Student Clinical ePortfolio will allow decision-making bodies to make intelligent and safe pass-fail decisions based on evidence of student clinical performance.

Notes on Contributors

Carmel Tepper is the Academic Assessment Lead at Bond University. She has a special interest in exam blueprinting, item analysis and assessment technologies.

Jo Bishop is the Academic Curriculum Lead and Associate Dean, Student Affairs and Service Quality at Bond University. Jo is an expert on curriculum planning and development and has a passion for enhancing the student experience.

Kirsty Forrest is the Dean of Medicine at Bond University. She has been involved in medical educational research for 15 years and is co-author and editor of several best-selling medical textbooks including ‘Medical Education at a Glance’ and ‘Understanding Medical Education: Evidence Theory and Practice’.

Ethical Approval

Ethical approval was not required.

Acknowledgements

An e-poster presentation on some of this work was presented at the 15th anniversary APMEC and was awarded a merit.

Funding

No funding was involved for this paper.

Declaration of Interest

Other institutional uptake of the new designed Student Clinical Portfolio may financially benefit Bond University.

References

Algiraigri, A. H. (2014). Ten tips for receiving feedback effectively in clinical practice. Medical Education Online, 19(1), 25141. https://doi.org/10.3402/meo.v19.25141

Garrett, B. M., MacPhee, M., & Jackson, C. (2013). Evaluation of an eportfolio for the assessment of clinical competence in a baccalaureate nursing program. Nurse Education Today, 33(10), 1207-1213. https://doi.org/10.1016/j.nedt.2012.06.015

Kinash, S., Wood, K., & McLean, M. (2013, April 22). The whys and why nots of ePortfolios [Education technology publication]. Retrieved from https://educationtechnologysolutions.com/2013/04/the-whys-and-why-nots-of-eportfolios/

ten Cate, O. (2013). Nuts and bolts of entrustable professional activities. Journal of Graduate Medical Education, 5(1), 157-158. https://doi.org/10.4300/JGME-D-12-00380.1

van der Vleuten, C. P. M. (2016). Revisiting ‘Assessing professional competence: From methods to programmes’. Medical Education, 50(9), 885-888. https://doi.org/10.1111/medu.12632

*Carmel Tepper
Faculty of Health Sciences and Medicine,
Bond University,
14 University Drive, Robina QLD,
4226 Australia
E-mail: ctepper@bond.edu.au
