![](image/apmec.jpg)
Code: W1A1
Measurement 101 in Medical Education
Name of Facilitator: Kevin W. Eva
Institution: McMaster University, CANADA
Background on the topic
Central to many topics in medical education are issues of measurement (i.e., reliably and validly collecting data
from which one can draw appropriate inferences). Researchers hoping to study the success of educational interventions
must often devise/use questionnaires to assess the impact of their efforts. Similarly, educators routinely create/use
assessment instruments meant to evaluate student success with competency development. These and other goals can only be
achieved in a meaningful way if the instruments used for data collection are reliable (i.e., able to consistently
discriminate between individuals) and valid (i.e., able to measure what is intended). This workshop will highlight
some key issues and general principles inherent in the development of measurement instruments.
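Reliability in the sense defined above is often quantified with an internal-consistency coefficient such as Cronbach's alpha — one common choice among several; the workshop description does not prescribe a particular statistic. A minimal sketch, using invented questionnaire scores:

```python
# Cronbach's alpha: internal-consistency reliability of a k-item instrument.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
# Data below are invented for illustration only.

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of k item scores."""
    k = len(scores[0])

    def var(xs):  # sample variance (ddof = 1)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Five respondents answering a three-item questionnaire (1-5 scale).
data = [
    [4, 5, 4],
    [2, 2, 3],
    [5, 4, 5],
    [3, 3, 3],
    [1, 2, 1],
]
print(round(cronbach_alpha(data), 3))  # → 0.945
```

A value near 1 indicates that the items discriminate between respondents consistently; values below roughly 0.7 are usually taken as a signal to revisit the instrument.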
Objectives/intended outcomes
1. Obtain a working knowledge of general principles relevant to survey/exam development
2. Consider how these principles can/should be applied in participants' work lives
3. Understand psychometric principles of reliability and validity and how to apply them
4. Discover typical errors made in questionnaire design and implementation

---
Code: W1A2
Methods of Assessment in Medical Ethics
Name of Facilitator: Alastair V Campbell
Institution: Yong Loo Lin School of Medicine, National University of Singapore, SINGAPORE
This workshop will examine the literature on assessment in ethics, review the methods and include some practical exercises.

---
Code: W1P1
Generalisability Theory: Getting the Computer To Do the Work
Name of Facilitator: Geoff Norman
Institution: McMaster University, CANADA
Participants will get an overview of G theory methods, and specific instruction in how to compute G coefficients using the computer. Some prior knowledge of ANOVA is an asset.
The workshop will include free software (GENOVA and G String).
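As context for what GENOVA and G String automate, a one-facet (persons × raters, fully crossed) G study can be sketched from the standard ANOVA expected-mean-squares estimates. The ratings below are invented, and this sketch is an illustration of the arithmetic, not the workshop software:

```python
# One-facet G study (persons x raters, fully crossed), estimated from
# ANOVA mean squares. Ratings are invented for illustration only.

def g_study(scores):
    """scores[p][r] = rating of person p by rater r."""
    n_p, n_r = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n_p * n_r)
    p_means = [sum(row) / n_r for row in scores]
    r_means = [sum(scores[p][r] for p in range(n_p)) / n_p for r in range(n_r)]

    ss_p = n_r * sum((m - grand) ** 2 for m in p_means)
    ss_r = n_p * sum((m - grand) ** 2 for m in r_means)
    ss_tot = sum((scores[p][r] - grand) ** 2
                 for p in range(n_p) for r in range(n_r))
    ss_pr = ss_tot - ss_p - ss_r            # residual (p x r interaction + error)

    ms_p = ss_p / (n_p - 1)
    ms_r = ss_r / (n_r - 1)
    ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))

    var_pr = ms_pr                          # sigma^2(pr,e)
    var_p = max((ms_p - ms_pr) / n_r, 0)    # sigma^2(p): true person variance
    var_r = max((ms_r - ms_pr) / n_p, 0)    # sigma^2(r): rater stringency

    # Relative G coefficient for a score averaged over n_r raters.
    g_rel = var_p / (var_p + var_pr / n_r)
    return var_p, var_r, var_pr, g_rel

ratings = [  # 4 persons rated by 3 raters
    [7, 8, 7],
    [5, 5, 6],
    [9, 8, 9],
    [4, 5, 4],
]
var_p, var_r, var_pr, g_rel = g_study(ratings)
print(round(g_rel, 3))  # → 0.964
```

The same decomposition also supports D studies: substituting a different number of raters into the final formula predicts the reliability of alternative measurement designs.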

---
Code: W1P2
Problem-Based Learning: Process and Outcomes
Name of Facilitators: Matthew Gwee¹, Khoo Hoon Eng¹ and Gerald Koh²
Institution:
¹ Medical Education Unit, Dean's Office
² Department of Community, Occupational & Family Medicine
Yong Loo Lin School of Medicine, National University of Singapore, Singapore
Problem-Based Learning (PBL) is a strategic combination of educational elements to optimise student learning.
The key to PBL is to start with a problem. It is student-centred learning because students work in interactive
small groups, using a problem to initiate brainstorming, discussion and the identification of learning issues.
It is also integrated learning because students learn in the context of applying knowledge to resolve real problems.
This workshop will give you the theory, demonstrate how it is done and provide you with a hands-on session where you
can practise what you have learned, i.e. learning by doing.

---
Code: W1P3
Assessing the Outcomes
Name of Facilitator: Margery Davis
Institution: Centre for Medical Education, University of Dundee, Scotland, United Kingdom
This is an introductory level workshop for participants who have responsibility for student / trainee
assessment at either undergraduate or postgraduate level. The workshop will cover assessment instruments in
common use throughout the world and help familiarise participants with the strengths and weaknesses of each instrument.
At the end of this workshop participants will be able to:
- Identify the outcomes that need to be assessed in health professions education
- Identify appropriate assessment tools for each outcome
- Understand the need for an examiner's toolkit
- Select appropriate assessment tools for use in their own toolkit

Programme
- Welcome to the workshop: introductions and overview of the afternoon
- What outcomes are we talking about in the health professions? Plenary session looking at outcome frameworks from across the world
- Buzz groups identify tools for assessing individual outcomes
- Break
- Assembling the toolkit: an introduction to Miller's pyramid
- Buzz groups assemble assessment toolkits for real-life situations
- Groups report back with their toolkits for comment from the plenary group
- What have you learned from this workshop?
- End of workshop

---
Code: W1P4
Using Simulators for Outcomes-Based Assessment
Name of Facilitator: S. Barry Issenberg and Ross J. Scalese
Institution: Center for Research in Medical Education,
University of Miami Miller School of Medicine, USA
A common challenge for medical educators is to determine the most appropriate assessment tool for particular competencies
that their students should acquire. In the past, it has been customary to rely on real patients for assessment of many
important skills. More recently, however, ethical considerations and the growing concern for patient safety have appropriately
limited the use of real patients as assessment "instruments"; it is no longer acceptable as a matter of routine to assess
third- and fourth-year medical students' ability to perform critical (e.g., intubation) or sensitive (e.g., pelvic examination)
tasks on real (even standardized) patients. Use of patient substitutes, such as cadaveric or animal tissue models, has its
own challenges, not the least of which is maintaining an adequately realistic clinical context. In addition, availability,
cost and ethical concerns have limited the use of cadavers and animals for medical skills assessment. Simulators, on the other
hand, circumvent most of these obstacles and, thus, recently have come into widespread use for evaluation of learners across
the continuum of medical education.
In this workshop we will have the opportunity to learn from international experts on simulators from the very center that
has the longest continuous simulation program in medical education.
This hands-on workshop will focus on the following:
a. Best evidence aspects of simulator-based outcomes assessment
b. Review of available simulators for outcomes-based assessment
c. Development of assessment instruments to use with simulators

Participants will have the opportunity for hands-on use of simulators, including Harvey, the Cardiopulmonary Patient Simulator.

---
Code: W2A1
Assessment Methods: What Works, What Doesn't
Name of Facilitator: Geoff Norman
Institution: McMaster University, CANADA
In this workshop I will review the literature on assessment and its implications for the choice of particular assessment methods.
Background
There is an extensive literature on assessment in medical education, dating back over three decades. From this literature, it is possible
to systematically and critically examine our use of various approaches. Regrettably, much of this literature appears to be ignored by
educational practitioners.
Objectives
- To familiarize participants with the literature on assessment:
  - criteria for assessing an assessment method
  - general "axioms" regarding desirable and undesirable properties of an assessment method
- To review various methods currently in use, both old and new, from this perspective

Structure
I will present a framework for critical examination of various methods, then critically review existing methods, both old and new, with a view to examining the evidence of effectiveness. From this, I will make some general inferences about the usefulness of various methods. While there will be no "hands-on" exercises, there will be ample opportunity for discussion and sharing of experiences.
Who Should Attend
Individuals with responsibility for the implementation of student assessment methods.
Take-Home Message
Choice of an assessment method should be based on evidence of effectiveness. From this evidence, it is possible to identify specific essential characteristics necessary for credible assessment.

---
Code: W2A2
Critiquing Outcome-Based Assessment Plans
Name of Facilitator: Rukhsana Zuberi
Institution: The Aga Khan University, Pakistan
Prerequisite Knowledge/Skills for Participants:
No particular knowledge/skills required.
Preferably medical or nursing faculty members.
Program Objectives:
By the end of this workshop, participants will be able to:
- Recapitulate the principles of assessment: why, what, how, when, by whom
- Match objectives pertaining to different domains to appropriate methods of assessment
- Apply the principles of reliability, validity, objectivity, standardization and feasibility to critique assessment tools and plans
- Use the same principles to improve the plans to assess outcomes appropriately

Program Format:
The workshop lays emphasis on active participation and problem-solving through critical thinking, group discussion and
evaluative judgment. It follows a multi-modal teaching/learning strategy, opening with a brief interactive large-group
plenary session to activate prior knowledge and stimulate higher-level thinking.
In smaller groups, participants will review and critically analyze the given assessment plans (to assess outcomes),
identify their weaknesses and propose alternate plans in the light of the earlier discussion and presentation.
In the second plenary session, each group will report its findings and revised plans, and justify the revisions.
This will be followed by a summary of learning points and workshop evaluation.

No. of Participants:
20 (minimum) to 28 (maximum)

Duration of the workshop:
180 minutes (3 hours)

| Duration of Session | Activity | Setting |
| --- | --- | --- |
| 15 minutes | Introductions and Program Overview | Large group |
| 30 minutes | Interactive Presentation on Assessment: Why, What, How, When and Characteristics of assessment methods | Large group |
| 30 minutes | Divide participants into small groups; assign assessment plans and tasks (Group Work) | Small groups |
| 60 minutes | Presentation by each of the four groups and discussion in the large group: critique and revised assessment plans | Large group |
| 10 minutes | Debriefing & Workshop Evaluation | Large group |

---
Code: W2P1
Using the Multiple Mini-Interview to Select Applicants to Medical School
Name of Facilitator: Kevin W. Eva
Institution: McMaster University, CANADA
Background on the topic
While applicants to health professional schools can often be differentiated reliably on the basis of cognitive ability,
accurately selecting candidates on the basis of personal qualities has proven much more challenging. Several research studies
conducted over the past five years have demonstrated that an OSCE-style Multiple Mini-Interview (MMI) process can fulfill the
selection goals of reliability (i.e., reproducibly differentiating between applicants), validity (i.e., accurately predicting
medical school performance), feasibility and acceptability. Recent implementation of the MMI at other institutions provides
evidence that an MMI-based admissions system can be designed to meet the unique values of particular training programs.
The system developed at McMaster University involves institution-specific identification of the relative importance of personal
qualities using a paired comparison analysis followed by the development, implementation, and validation of stations directed at those values.
Objectives/intended outcomes
1. Obtain updated knowledge regarding research conducted on the MMI
2. Attain familiarity with paired comparison methodology, to determine the values held dear at your institution and construct an appropriate MMI blueprint to guide station writing and use
3. Practice writing MMI stations in small groups
4. Conduct a shortened version of the MMI on fellow workshop attendees
5. Discuss the appropriateness of various outcome measures for judging the validity of admissions decisions, and the challenges inherent in performing this type of research
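Paired comparison analysis can take several forms; one simple variant (not necessarily the exact McMaster procedure) tallies, for each personal quality, how often judges preferred it in head-to-head pairings, and ranks qualities by their win counts. The qualities and judgements below are entirely hypothetical:

```python
# Paired-comparison ranking by win counts: each judgement names the quality
# preferred in one head-to-head pair; totals give a priority ordering.
# Qualities and judgements are hypothetical, for illustration only.
from collections import Counter

qualities = ["empathy", "integrity", "communication", "teamwork"]

# Each tuple: (pair shown to a judge, the quality the judge preferred).
judgements = [
    (("empathy", "integrity"), "integrity"),
    (("empathy", "communication"), "communication"),
    (("empathy", "teamwork"), "empathy"),
    (("integrity", "communication"), "integrity"),
    (("integrity", "teamwork"), "integrity"),
    (("communication", "teamwork"), "communication"),
]

wins = Counter({q: 0 for q in qualities})
for _pair, preferred in judgements:
    wins[preferred] += 1

ranking = sorted(qualities, key=lambda q: wins[q], reverse=True)
print(ranking)  # → ['integrity', 'communication', 'empathy', 'teamwork']
```

In practice many judges each see many pairs, and the aggregated win proportions (or a model fitted to them, such as Bradley-Terry) yield the relative weights used to blueprint MMI stations.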

---
Code: W2P2
Designing Oral Examinations: Challenges and Opportunities
Name of Facilitator: Ara Tekian
Institution: University of Illinois at Chicago, Department of Medical Education, USA
Rationale:
Oral examinations (viva voce) are used to assess the critical reasoning, problem solving, judgment and/or communication skills of candidates. Controversy over the advantages and limitations of oral examinations has dominated discussion of this assessment technique for decades. Despite serious reservations about reliability, validity, practicality, and usefulness, orals are frequently used in undergraduate and graduate medical education worldwide and by 15 out of 24 U.S. ABMS Boards.
This program will prepare participants to plan and implement effective oral examinations. The workshop instructors will provide a brief synthesis of the literature published during the past four decades and discuss issues related to reliability/reproducibility and validity. Participants will have the opportunity to design a blueprint and a scoring instrument for a standardized oral examination, role-play an oral examination scenario, and critique a videotape of an examiner in action.
Intended Audience:
This workshop is intended for health professions faculty involved in organizing and/or conducting oral examinations for courses, clerkships, and residency programs.
Specific Objectives:
By completion of this workshop, participants will improve their ability to:
- Identify the uses and limitations of oral examinations
- Recognize and avoid sources of error associated with oral examinations
- Design a blueprint and scoring instruments for a standardized oral examination
- Select and train examiners for an oral examination

---
Code: W2P3
Evaluating Students in an Outcomes-Based Curriculum
Name of Facilitator: Stephen R. Smith
Institution: Brown Medical School, Providence, Rhode Island, USA
This workshop is designed for medical educators who wish to gain skills in defining educational outcomes (competencies) for
their undergraduate medical education programmes and in designing teaching, learning, and assessment methods based on those outcomes.
At the end of the workshop participants will be able to:
- Specify a set of educational outcomes for their medical education programme
- Define and elaborate criteria to assess student mastery of the outcomes
- Create performance-based assessment instruments
- Assess student performance using those instruments

Program
Welcome and introductions
Why the outcomes-based curriculum model is appropriate
Conceptual model
- Flexnerian Model
- Outcomes (Competency)-based Model
Examples of How It's Done
- Anatomy
- Medicine Clerkship
- Psychiatry Clerkship
- Surgery Clerkship
Videotape segment
Defining and Elaborating Criteria exercises
- Defining Outcomes (using their own institution and models from other institutions)
- Defining Criteria (define three criteria which the competency would include)
Refreshment break
Defining and Elaborating Criteria exercises (continued)
- Elaborating Criteria
- Leveling
General Discussion
Performance-based Assessment in Action
- Moral Reasoning and Clinical Ethics videotape and rating sheet
- "Talking Out Loud" Method of Defining Criteria for Noncognitive Areas
- Social and Community Contexts of Health Care journal review and rating sheet
Conclusion and Plans for Future Action

---
Code: W2P4
Assessment of Professionalism in Medical Students
Name of Facilitator: Maxine Papadakis
Institution: University of California San Francisco, USA
Workshop objectives:
1. Discuss standards for professional behavior
2. Discuss evaluation strategies and methods for detecting problems with professionalism
3. Discuss administrative consequences for students who have not attained satisfactory professionalism skills

Description of the workshop:
How do educators evaluate professionalism in their medical students, and what happens to students who have
professionalism problems while in medical school? This workshop will use case studies to focus on the assessment
of professionalism and discuss policies that medical schools can implement to address unprofessional behavior.
There will be time to share information and for questions and answers.

---
Code: W3P1
Using SPSS for Data Analysis in Medical Education Research
Name of Facilitator: Chan Yiong Huak
Institution: Biostatistics Unit, Yong Loo Lin School of Medicine, National University of Singapore, Singapore
This workshop equips participants to set up a database in SPSS, perform the appropriate statistical
analyses on their research data, and interpret the findings.
At the end of this workshop participants will be able to:
- Set up a database using SPSS
- Import files from Excel & Access into SPSS
- Perform simple recodings & computations using SPSS
- Analyse quantitative & qualitative data using univariate & multivariate techniques
- Perform reliability & factor analyses on their instruments
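SPSS syntax itself is not reproduced here, but the "recodings & computations" step the workshop covers can be sketched language-neutrally in Python. The variable names, cut-points and data below are invented for illustration:

```python
# Recoding a continuous variable into groups and computing group descriptives,
# mirroring an SPSS RECODE + MEANS step. Data and variable names are invented.
from statistics import mean, stdev

records = [  # one dict per respondent, as rows in a data file
    {"id": 1, "age": 23, "score": 71},
    {"id": 2, "age": 35, "score": 64},
    {"id": 3, "age": 41, "score": 58},
    {"id": 4, "age": 29, "score": 77},
    {"id": 5, "age": 52, "score": 61},
    {"id": 6, "age": 26, "score": 69},
]

def age_group(age):
    """Recode age into three bands (like RECODE age (lo thru 29) (30 thru 49) (50 thru hi))."""
    if age < 30:
        return "under 30"
    if age < 50:
        return "30-49"
    return "50 and over"

# Collect scores per recoded group.
groups = {}
for r in records:
    groups.setdefault(age_group(r["age"]), []).append(r["score"])

# Mean and SD per group (SD undefined for a single observation).
for name, scores in sorted(groups.items()):
    sd = round(stdev(scores), 1) if len(scores) > 1 else "n/a"
    print(name, round(mean(scores), 1), sd)
```

The same grouping then feeds the later analyses in the programme, e.g. comparing group means (quantitative) or cross-tabulating recoded categories (qualitative).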

Programme

| Time | Activity |
| --- | --- |
| 8.30am | Registration |
| 9.00am | Welcome to the workshop. Introduction to SPSS: Setting Up A New Database; Exporting from Excel & Access Databases to SPSS |
| 10.00am | Break |
| 10.30am | Analysis of Quantitative Data |
| 12 noon | Lunch |
| 1.00pm | Analysis of Qualitative Data |
| 3.00pm | Break |
| 3.30pm | Reliability & Factor Analysis |
| 5.00pm | End of Workshop |