MedEdWorld and AMEE 2013 Conference Connect
Osaree Akaraborwon (Prince of Songkla University, Department of Surgery, Hatyai, Thailand) Gloyjai Kumkong (Prince of Songkla University, Department of Surgery, Hatyai, Thailand) Piyaporn Kongnuan (Prince of Songkla University, Department of Surgery, Hatyai, Thailand) Jittima Intarapan (Prince of Songkla University, Department of Surgery, Hatyai, Thailand)
Background: The Objective Structured Clinical Examination (OSCE) is an important tool in assessing clinical skills of medical students, particularly in the surgical department. Several factors, however, affect the reliability of this tool, including the raters. We wish to evaluate whether a video-assisted OSCE assessment is as effective as a real-time rater.
ABSTRACT BOOK: SESSION 7 TUESDAY 27 AUGUST: 1045-1230
Summary of work: A 5-minute OSCE station on history taking for a breast mass was administered to 43 fourth-year medical students rotating through the Department of Surgery and scored by a real-time rater. The examination was video-recorded, and all recordings were later scored by the same rater. Mean total scores from the two methods were compared. Summary of results: There was no difference in mean total scores between the real-time rater and the video-assisted assessment (76.67 vs. 76.98, p = 0.25). A strong correlation (r = 0.975) between the two methods was demonstrated.
Conclusions: The video-assisted OSCE assessment is effective and reliable, and is helpful when human resources are limited. Take-home messages: The video-assisted OSCE assessment is as effective as a real-time rater.
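The two statistics this abstract reports (a comparison of paired mean scores and a correlation between the two rating methods) can be sketched as follows. This is an illustrative example with simulated scores, not the study's data; the sample size of 43 and the rough score level are taken from the abstract, everything else is assumed.

```python
# Illustrative sketch: comparing real-time vs video-assisted OSCE ratings
# with a paired t-test (mean difference) and Pearson's r (agreement).
# Scores are simulated, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical paired scores for 43 students rated by both methods;
# the video-assisted rating differs only by small rating noise.
real_time = rng.normal(77, 8, size=43)
video = real_time + rng.normal(0, 2, size=43)

t_stat, p_value = stats.ttest_rel(real_time, video)  # paired comparison of means
r, _ = stats.pearsonr(real_time, video)              # agreement between methods

print(f"mean real-time: {real_time.mean():.2f}")
print(f"mean video:     {video.mean():.2f}")
print(f"paired t-test p = {p_value:.3f}, Pearson r = {r:.3f}")
```

A high r with a non-significant paired t-test, as in the abstract, indicates that the two methods rank students the same way and produce no systematic score shift.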
Saudi Internal Medicine Residents' Perceptions of the Objective Structured Clinical Examination as a Formative Assessment Tool
Salwa Aidarous (KAMC, Medicine, Jeddah, Saudi Arabia) Mihael Seefeldt (KSAU-HS, Medical Education, Riyadh, Saudi Arabia)
Tariq Awad (KAMC, Medicine, Po Box 35038, Jeddah 21488, Saudi Arabia)
Background: The Saudi Commission for Health Specialties first implemented the Objective Structured Clinical Examination (OSCE) as part of the final-year internal medicine clerkship exam during the 2007-2008 academic year. This study evaluated Internal Medicine residents' overall perceptions of the OSCE as a formative assessment tool. It focused on residents' perceptions of the OSCE stations' attributes, determined the acceptability of the process, and provided feedback to enhance further development of the assessment tool. The main objective was to assess Internal Medicine resident test-takers' perceptions and acceptance of the OSCE, and to identify its strengths and weaknesses through their feedback.
Summary of work: A cross-sectional survey of a group of Internal Medicine residents who participated in the OSCE course on November 8th 2012 was conducted using a self-administered questionnaire with various domains, modified from a study by Pierre et al in 2004 and administered immediately after all residents completed the OSCE stations. Summary of results: Overall, residents' evaluation of the OSCE was favorable with respect to comprehensiveness (79-86%), transparency (83%), fairness (95%), and authenticity of the required tasks (89%). The majority (87%) also appreciated the supportive attitude and constructive feedback given by the examiners. However, a minority (22-36%) felt that the time given for feedback was inadequate, and some expressed concerns about the ambiguity of certain station instructions and the high fee of the course. Conclusions: Overall, residents' evaluation of the OSCE was favorable and encouraging.
Take-home messages: We recommend that formative assessment opportunities using the OSCE to provide feedback to students be included in the curriculum, and that course directors and assessment personnel pursue continuing refinement and localized adaptation of the OSCEs in use.
Changing clinical teaching improves performance of interns in pediatric OSCE
Ren-Huei Fu (Chang Gung Memorial Hospital and Chang Gung University, Department of Medical Education Science, Department of Pediatrics, 12F, No. 5, Fu-Shing St., Kwei-Shan, Taoyuan County 333, Taiwan) Liang-Shiou Ou (Chang Gung Memorial Hospital and Chang Gung University, Department of Medical Education Science, Department of Pediatrics, Taoyuan County, Taiwan)
Peng-Wei Hsu (Chang Gung Memorial Hospital and Chang Gung University, Department of Medical Education Science, Taoyuan County, Taiwan) Jing-Long Haung (Chang Gung Memorial Hospital and Chang Gung University, Department of Medical Education Science, Department of Pediatrics, Taoyuan County, Taiwan)
San-Jou Yeh (Chang Gung Memorial Hospital and Chang Gung University, Department of Medical Education Science, Department of Internal Medicine, Taoyuan County, Taiwan)
Background: In our medical center, all medical interns receive a 6-week pediatric training course, which includes a program of core lectures and 3 different subspecialties of clinical practice. A pediatric OSCE test of 2 clinical cases is given to all interns at the end of their course. Our previous report showed that OSCE performance is not affected by sub-specialist training. However, the scores of neonatal physical examination are lower in interns without newborn center training experience. Therefore, we designed a brief training course of neonatal physical examination for all interns. The purpose of this study is to evaluate the results of this brief training course.
Summary of work: The brief training course began in April 2012. OSCE data were collected between November 2011 and January 2013, and we evaluated the scores of the neonatal jaundice test in the pediatric OSCE. A total of 135 interns participated in the OSCE test. They were divided into 4 groups according to whether or not they had received the sub-specialty training course and the brief training course. A t-test was used to compare the scores of these interns. Summary of results: After the brief physical examination training course was introduced, there was no statistically significant difference between the scores of the groups with or without newborn center training experience. The scores of interns without sub-specialty training improved after taking the brief training course (60.57±10.72 vs 69.21±16.35). Conclusions: We show that by designing a brief training course we are able to improve the clinical skills of
medical interns, as evidenced by their performance in the pediatric OSCE.
Take-home messages: Our pediatric OSCE for medical interns is designed for the purpose of teaching, and it is therefore useful for discovering problems in our training course. This study shows that tailoring our teaching courses according to OSCE results can improve the clinical skills of interns.
The Pediatric OSCE Collaboration of Canada
Moyez Ladhani (McMaster University, Pediatrics, Hamilton, Canada)
A Atkinson (University of Toronto, Pediatrics, Toronto, Canada)
H Writer (University of Ottawa, Pediatrics, Ottawa, Canada)
S Lawrence (University of Ottawa, Pediatrics, Ottawa, Canada)
A Jeffries (University of Toronto, Pediatrics, Toronto, Canada)
CPPD Canadian Pediatric Program Directors of Canada (CPPD) (McMaster University, Pediatrics, Hamilton, Canada)
(Presenter: Sarah Manos, Dalhousie University, Pediatrics, Halifax, Canada)
Background: The Royal College of Physicians and Surgeons of Canada requires residents to pass a comprehensive examination, which includes an OSCE, at the end of their training. All 17 pediatric training programs in Canada have implemented a practice OSCE, commonly run biannually, which provides learners with formative feedback. Implementation is resource-intensive, requiring question bank development and maintenance, faculty development, and extensive human and physical resources. Many programs lack the infrastructure and manpower to run a comprehensive and meaningful OSCE for their learners. Summary of work: With the objectives of distributing resources and improving standardization, three programs in Ontario began collaborating on OSCE administration in 2009. Station development responsibilities were spread across programs, reducing the resource strain on individual centres and allowing a more standardized exam to be administered at each centre. A national OSCE was the logical expansion, with the creation in 2012 of the Pediatric OSCE Collaboration of Canada (POCC). All 17 residency programs now participate in this nationwide standardized process. Responsibilities for OSCE blueprint development, station development and review, language translation, and data collection and distribution are spread nationwide; collaboration is key to this success. Data on individual stations and resident peer-group performance are distributed nationally, allowing each program to benchmark resident performance against a national cohort and to identify strengths and weaknesses in specific content areas.
Summary of results: POCC has successfully implemented a national standardized formative OSCE and highlights an effective collaboration of the Canadian Pediatric Program Directors. Residents can now receive feedback compared with their peer group nationwide. Task distribution across programs has decreased individual program resource challenges, and organization and standardization have improved.
Faculty Development on OSCE for Internal Medicine Clerkship
Marcelo Cruzeiro (Federal University of Juiz de Fora, Internal Medicine, R. Oscavo Gonzaga Prata, 350/501, Juiz de Fora 36033-220, Brazil)
Valdes Bollela (Sao Paulo University, Infectious Disease, Ribeirao Preto, Brazil)
Background: At our institution, the undergraduate internship uses only a structured global assessment as its method of clinical competence evaluation.
Summary of work: Weekly meetings were held over seven months in which participants discussed the OSCE, the development of a blueprint, and the construction of test stations. Using SurveyMonkey, two surveys were conducted prior to implementation of the OSCE in our institution. They aimed to establish how students and teachers view clinical competence evaluation and the current institutional assessment model.
Summary of results: Students had been subjected to poorly constructed tests which they viewed as non-formative. Teachers believe that assessment improves student preparation and that clinical competence evaluation is essential, viewing it as an assessment of skills and performance. An OSCE performance assessment with student feedback was first implemented with five stations, then re-implemented with ten stations, involving 16 teachers (11 evaluators, two observers, two actors and a coordinator) and 19 students (ten actors and nine evaluators). Each station lasted ten minutes, eight devoted to evaluation and two to student feedback. A new SurveyMonkey survey was conducted after the OSCE. Although the OSCE with feedback was previously familiar to only 22.2% of the students, it met their expectations, and all agreed that this new method of clinical competence evaluation should be implemented in their medical course. Conclusions: The OSCE with feedback is an important tool in clinical competence evaluation and can motivate teachers and students alike. Faculty felt motivated about working with performance assessment and planned to include it in the faculty development program.
Take-home messages: All steps of development and implementation of the OSCE provided both to faculty and students the opportunity to reflect about the learning process, the evaluation and the overall structure of their medical course.
ABSTRACT BOOK: SESSION 7 TUESDAY 27 AUGUST: 1045-1230
Students' Perceptions About Objective Structured Clinical Examination In Dentistry, University Of Concepcion, Chile
Claudio del Canto (Universidad de Concepcion, Prosthodontics, Roosevelt 1550 Concepcion, Chile) Liliana Ortiz (University of Concepcion, Medical Education, Concepcion, Chile)
Background: The Objective Structured Clinical Examination (OSCE) has widely demonstrated its advantages in measuring the skills of health professions students. This paper presents students' perceptions of the first OSCE experience at the clinical level at the Dental School, University of Concepcion, Chile. Summary of work: Objective: To characterize undergraduate students' perceptions of the implementation of an OSCE in the Clinic of Prosthodontics in the final year of the curriculum. Methods: A descriptive study; 66 students participated in this OSCE. Responses were collected electronically using a Likert-scale survey.
Summary of results: Students appreciated the inclusion of the OSCE in the subject's assessment process for its objectivity, its support of self-evaluation and self-directed learning, and its coherence with the learning objectives of the course. The time allocated to some stations needs to be modified. The perceived difficulty of the OSCE and the anxiety produced by participating in it were similar to any other assessment. Students think it should also be used to assess clinical performance skills in other disciplines.
Conclusions: The OSCE was successfully implemented in a clinical subject for the first time. Given its advantages, this exam should be used more widely at the clinical level.
Can 3rd year Medical Students write a 4th year OSCE? Making a summative exam formative
Richard Lee (University of Alberta, Undergraduate Medical Education, Edmonton, Canada) Patrick San Agustin (University of Alberta, Emergency Medicine, Edmonton, Canada)
Mohit Bhutani (University of Alberta, Internal Medicine, Edmonton, Canada)
Tracey Hillier (University of Alberta, Undergraduate Medical Education, 1-002 Katz Group Centre, Edmonton T6G 2E1, Canada)
Background: University of Alberta medical students sit a summative OSCE at the end of their 4-year program to assess competency to advance. This leaves little opportunity to remediate learners who fail. We hypothesized that administering this exam at the end of 3rd year would allow failed students (FS) to remediate areas of deficiency during their final year of medical school.
Summary of work: The entire academic record of each FS (including clinical, written exam and OSCE performance) was reviewed by 1 of 3 senior Faculty
education experts, who then prepared an individualized remediation plan in the form of outcome objectives. FS re-challenged the exam after 7 months of remediation informed by this plan.
Summary of results: 25/151 (16.6%) students failed the OSCE when administered at the end of 3rd year, compared to 12/143 (8.4%) students from the previous year when the OSCE was administered at the end of 4th year. 22/25 (88%) FS passed the retake after 7 months with remediation, compared to 10/12 (83.3%) of the previous year's FS after 4 weeks without remediation. 20/22 (90.9%) FS agreed they had enough time for remediation, 13/22 (59.0%) preferred the OSCE at the end of 3rd year, and 12/22 (54.5%) agreed the OSCE helped identify areas of weakness. Conclusions: It is feasible for 3rd year students to write a 4th year OSCE. Although initial fail rates were higher, retake fail rates were not. Administering this exam earlier allowed more opportunity to remediate the learner in difficulty.
Take-home messages: Remediation should be offered to the learner in difficulty.
An empirical method of setting OSCE pass-scores with small numbers of candidates
Dwight Harley (University of Alberta, Faculty of Medicine, 1-001 Katz Centre, Edmonton, Alberta T6G 1E7, Canada)
Margaret Dennett (Vancouver Community College, Certified Dental Assisting, Vancouver, Canada)
Background: The OSCE is a commonly used objective measure of clinical competency. When OSCEs are part of an evaluation process, determining valid pass-scores is critical.
Summary of work: Several methods of standard setting have been applied to OSCEs. Although the borderline regression method is becoming the method of choice, results are inconclusive. This method relies on several key assumptions which, if not met, lead to spurious results. OSCE candidate groups are often small, raising concern about whether the underlying assumptions are satisfied. Fitting a resistant line is a non-parametric method of curve fitting that can be used to set pass-scores with minimal regard for model assumptions. This study compares pass-scores based on smoothed resistant lines to those set by the borderline regression method. An eight-station OSCE was administered to the 28 fourth-year medical students at the University of Alberta, and pass-scores were determined for each station using both methods. Summary of results: Different approaches to standard setting result in differing pass-scores. When the regression assumptions are satisfied, pass-scores are similar. When the data have outliers or long tails, pass-scores are less similar, suggesting that those determined by the resistant-line method may be more valid. Conclusions: The use of resistant lines to determine OSCE pass-scores shows promise but requires further investigation. Subsequent research involving this
method needs to examine the effect of smoothing the resistant line, the effect of extrapolation, and the development of computer-based applications to perform the necessary calculations efficiently. Take-home messages: Further study is required to determine more robust methods of setting pass-scores with small groups.
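The two standard-setting approaches compared above can be sketched briefly. In borderline regression, each station's checklist score is regressed on the examiner's global rating, and the pass-score is the predicted checklist score at the borderline rating; a resistant line instead takes its slope from the medians of the outer thirds of the data, so outliers and long tails have little influence. The data, the 1-5 rating scale, and the choice of 2 as the borderline rating are illustrative assumptions, not the study's values.

```python
# Hedged sketch of two OSCE standard-setting methods: borderline
# regression (OLS) vs a Tukey-style resistant line. All data are
# simulated; only the cohort size (n = 28) comes from the abstract.
import numpy as np

def borderline_regression_cutoff(ratings, scores, borderline=2.0):
    """OLS line of checklist score on global rating; the pass-score
    is the predicted score at the borderline rating."""
    slope, intercept = np.polyfit(ratings, scores, 1)
    return slope * borderline + intercept

def resistant_line_cutoff(ratings, scores, borderline=2.0):
    """Resistant line: slope from the medians of the outer thirds of
    the sorted data, so single outliers barely move the cut-score."""
    order = np.argsort(ratings)
    x, y = np.asarray(ratings)[order], np.asarray(scores)[order]
    k = len(x) // 3
    xl, yl = np.median(x[:k]), np.median(y[:k])    # left third
    xr, yr = np.median(x[-k:]), np.median(y[-k:])  # right third
    slope = (yr - yl) / (xr - xl)
    intercept = np.median(y) - slope * np.median(x)
    return slope * borderline + intercept

# Hypothetical station data for a small cohort of 28 candidates.
rng = np.random.default_rng(1)
ratings = rng.integers(1, 6, size=28).astype(float)   # global ratings 1-5
scores = 10 * ratings + 20 + rng.normal(0, 5, size=28)  # checklist scores

print(f"borderline regression cut: {borderline_regression_cutoff(ratings, scores):.1f}")
print(f"resistant line cut:        {resistant_line_cutoff(ratings, scores):.1f}")
```

With well-behaved data like this the two cut-scores are close; injecting a few extreme checklist scores pulls the OLS line (and its cut-score) far more than the resistant line, which is the behavior the abstract's results describe.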
Peer Organised OSCE - Useful Revision Opportunity for Undergraduates?
Sarah Staight (University College London, Medical School, London, United Kingdom) Vruti Dattani (University College London, Medical School, London, United Kingdom)
Rakhee Nathwani (National Hospital for Neurology and Neurosurgery, Anaesthesia, London, United Kingdom)
Background: Due to changes in the curriculum, undergraduate students in their penultimate year will sit one summative Objective Structured Clinical Examination (OSCE) at the end of the academic year (instead of sitting 3 modular OSCEs). A peer-organised mock OSCE for the Child and Family Health with Dermatology (CFHD) Module was devised, and participants' feedback evaluated. Summary of work: Sixteen stations were designed by medical students and approved by a senior clinician. Candidates sat the mock exam either individually or in pairs. Doctors or medical students assessed the candidates based on a marking scheme. Each station was 5 minutes long and was followed by 2 minutes of constructive verbal feedback. All candidates completed feedback sheets rating the quality of individual station feedback and the overall components of the mock exam on a scale of 1-5 (1-Poor to 5-Excellent). Summary of results: 127 students attended and rated 88.3% of individual station feedback as either Excellent (45.7%) or Good (42.6%). Overall, the mock OSCE was rated Excellent or Good by 95.3% of participants for content, 92.9% for variety of stations, 93.7% for organisation, 87.4% for similarity to the formal exam setting, and 95.3% for usefulness. Qualitative feedback from participants highlighted a perceived demand for more OSCE practice. Conclusions: The mock CFHD OSCE was found to be an extremely valuable learning experience for the participating students.
Take-home messages: Peer organised OSCEs can be useful for revision purposes.
An Evaluation of Objective Structured Clinical Examination (OSCE) scores in 6th year medical students
Satit Klangsin (Prince of Songkla University, Obstetrics and Gynecology, Faculty of Medicine, Hat Yai 90110, Thailand)
Chitkasaem Suwanrath (Prince of Songkla University, Obstetrics and Gynecology, Hat Yai, Thailand)
Siwatchaya Khanuengkitkong (Prince of Songkla University, Obstetrics and Gynecology, Hat Yai, Thailand) Nungruthai Saeaib (Prince of Songkla University, Obstetrics and Gynecology, Hat Yai, Thailand) Saovakon Boonkumnerd (Prince of Songkla University, Obstetrics and Gynecology, Hat Yai, Thailand)
Background: Our Objective Structured Clinical Examination (OSCE) consists of 5 sections (history taking, physical examination, laboratory interpretation, procedure competency, and communication skills). We aimed to evaluate OSCE scores and pass rates in these 5 sections among 6th year medical students. Summary of work: OSCE scores of 433 sixth-year medical students at the end of training in the Obstetrics and Gynecology department between 2010 and 2012 were analysed. The minimum passing scores were determined for each station by experienced Obstetrics and Gynecology staff.
Summary of results: The mean OSCE scores in history taking, physical examination, laboratory interpretation, procedure competency, and communication skills were 66.8%, 71.7%, 57.7%, 71.7% and 56.7%, respectively. The passing rates were 71.5%, 78.4%, 53.2%, 74.1% and 46.2%, respectively.
Conclusions: Communication skills and laboratory interpretation had low passing rates.
Take-home messages: We recommend that 6th year medical students be provided with more learning experiences in communication skills and laboratory interpretation.
The Association between the Objective Structured Clinical Examination (OSCE) scores on Amniotomy and the Experiences from Clinical Practice
Siwatchaya Khanuengkitkong (Prince of Songkla University, Department of Obstetrics and Gynecology, Faculty of Medicine, Hatyai, Songkhla 90110, Thailand) Sathana Boonyapipat (Prince of Songkla University, Obstetrics and Gynecology, Hatyai, Songkhla, Thailand) Nungruthai Saeaib (Prince of Songkla University, Obstetrics and Gynecology, Hatyai, Songkhla, Thailand) Satit Klangsin (Prince of Songkla University, Obstetrics and Gynecology, Hatyai, Songkhla, Thailand) Sirirat Thamrongwat (Prince of Songkla University, Obstetrics and Gynecology, Hatyai, Songkhla, Thailand)
Background: Amniotomy is an essential skill for medical students. Clinical competency in amniotomy is evaluated using the Objective Structured Clinical Examination (OSCE); this test is administered after medical students have had an opportunity to practice on models and on patients under supervision. We aim to assess the correlation between the scores on the OSCE and experiences from clinical practice. Summary of work: The study was conducted on 5th year medical students during training in the Obstetrics and Gynecology Department, 2010. The demographic data and grades of Obstetrics and Gynecology students were
obtained. The OSCE scores on amniotomy were evaluated at the end of training. Clinical experience was evaluated from log books recording the number of procedures performed successfully. The OSCE scores were classified using the median of 73 as the cut-point. Chi-square and Fisher's exact tests were used for analysis, and P < 0.05 was considered significant. Summary of results: A total of 135 medical students were evaluated. Ninety-eight students (72.6%) performed 1-2 procedures, 8 students (6%) performed 3-4 procedures, and 29 students (21.4%) did not perform a procedure. Student grades on the paper-based test and overall clinical competency were the factors associated with high OSCE scores (P = 0.014 and 0.036, respectively). However, the number of procedures performed did not correlate with the OSCE score (P = 0.089). Conclusions: OSCE scores on amniotomy correlate with student grades in Obstetrics and Gynecology; however, these scores are not related to the number of procedures performed.
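The analysis described above (dichotomizing OSCE scores at the median and testing association against a categorical factor with Fisher's exact test) can be sketched as follows. The cell counts here are invented for illustration and are not the study's data.

```python
# Illustrative sketch of the abstract's analysis: a 2x2 table of
# procedure experience vs OSCE score above/below the median cut-point
# (73), tested with Fisher's exact test. Counts are hypothetical.
from scipy.stats import fisher_exact

#                      high OSCE (>= 73)   low OSCE (< 73)
table = [[40, 58],   # performed at least one amniotomy (assumed counts)
         [12, 25]]   # performed none (assumed counts)

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
# A p-value at or above 0.05 would mirror the abstract's finding that
# the number of procedures performed was not associated with the OSCE
# score (P = 0.089 in the study).
```

Fisher's exact test is the appropriate choice here when expected cell counts are small; with larger expected counts a chi-square test, which the study also used, gives a similar result.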