THE DEFINITIVE GUIDE TO THE OSCE

For Elsevier
Content Strategist: Laurence Hunter
Content Development Specialist: Carole McMurray
Project Manager: Anne Collett
Designer/Design Direction: Miles Hitchen
Illustration Manager: Amy Faith Naylor
Illustrator: Suzanne Ghuzzi

THE DEFINITIVE GUIDE TO THE OSCE
The Objective Structured Clinical Examination as a performance assessment

Ronald M. Harden OBE MD FRCP (Glas) FRCPC FRCS (Ed)
Professor Emeritus of Medical Education, University of Dundee, UK; General Secretary, Association for Medical Education in Europe (AMEE)

Pat Lilley BA (Hons)
Operations Director, Association for Medical Education in Europe (AMEE)

Madalena Patrício PhD
Professor of Education, Faculty of Medicine, University of Lisbon, Portugal

Foreword by
Geoff Norman PhD
Professor Emeritus, Department of Clinical Epidemiology and Biostatistics, McMaster University, Hamilton, Ontario, Canada

Edinburgh London New York Oxford Philadelphia St Louis Sydney Toronto 2016

© 2016 Elsevier Ltd. All rights reserved.

No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage and retrieval system, without permission in writing from the publisher. Details on how to seek permission, further information about the Publisher's permissions policies and our arrangements with organizations such as the Copyright Clearance Center and the Copyright Licensing Agency can be found at our website: www.elsevier.com/permissions.

This book and the individual contributions contained in it are protected under copyright by the Publisher (other than as may be noted herein).

ISBN 978-0-7020-5550-8

Notices

Knowledge and best practice in this field are constantly changing. As new research and experience broaden our understanding, changes in research methods, professional practices, or medical treatment may become necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using any information, methods, compounds, or experiments described herein. In using such information or methods they should be mindful of their own safety and the safety of others, including parties for whom they have a professional responsibility. With respect to any drug or pharmaceutical products identified, readers are advised to check the most current information provided (i) on procedures featured or (ii) by the manufacturer of each product to be administered, to verify the recommended dose or formula, the method and duration of administration, and contraindications. It is the responsibility of practitioners, relying on their own experience and knowledge of their patients, to make diagnoses, to determine dosages and the best treatment for each individual patient, and to take all appropriate safety precautions. To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors assume any liability for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

The publisher's policy is to use paper manufactured from sustainable forests

Printed in China

Contents

Foreword vii
Preface xi
About the Authors xiv
Contributors to Case Studies xvi
Acknowledgements xix

SECTION A An introduction to the OSCE 1

1 What is an OSCE? 1
An introduction to the OSCE for readers unfamiliar with the concept and, for those already familiar with the OSCE, a more in-depth insight into the characteristics that define the OSCE as an assessment tool.

2 The inside story of the development of the OSCE 13
An account of how the OSCE was conceived and developed in the 1970s in response to the assessment challenges facing educators in the healthcare professions.
3 The OSCE as the gold standard for performance assessment 23
The OSCE with its multiple samples of performance has come to dominate performance assessment and merits a place in every assessor's toolkit.

4 How the OSCE can contribute to the education programme 35
The OSCE can be adopted as an assessment tool in any situation or phase of education where an assessment of the learner's clinical or practical skills is important.

5 What is assessed in an OSCE? 49
The OSCE can be used to assess a range of learning outcomes, including communication skills, physical examination, practical procedures, problem solving, clinical reasoning, decision making, attitudes and ethics, and other competencies or abilities.

SECTION B Implementation of an OSCE 65

6 Choosing a format for an OSCE 65
Flexibility is a major advantage of the OSCE. Many factors influence the choice of format. These include the number of examinees, the purpose of the examination, the learning outcomes to be assessed, the resources available and the context of the local situation.

7 The setting for an OSCE 83
The OSCE can be located in a range of settings. The selection of a venue will depend on the nature of the examination and the local circumstances.

8 The patient 91
Patients in an OSCE may be represented by real or simulated patients, computer representations, video recordings, medical records and investigation results, or a combination of all these. Each has a specific role to play.

9 The examiner 105
Health professionals, simulated patients and students can serve as examiners in an OSCE. Their roles and responsibilities should be defined and training provided.

10 Implementing an OSCE 115
There are 'good' and 'bad' OSCEs. Advance planning and effective organisation on the day are necessary to deliver a 'good' OSCE.

11 Evaluating the examinee's performance 127
Different approaches can be adopted for assessing performance in an OSCE, making pass/fail decisions and setting standards.
12 Providing feedback to the learner 149
The OSCE can be a powerful learning experience, and a variety of approaches can be adopted for the provision of feedback to the learner.

13 The examinee's perspective 161
Communicating with learners about the OSCE is important. Examinees can prepare for and maximise their performance during an OSCE.

14 Evaluating an OSCE 169
Evaluation and quality control of an OSCE are important, and constant monitoring and improvement are necessary.

15 Costs and implementing an OSCE with limited resources 181
The resources required and the costs incurred can be tailored to the local situation. The OSCE can, but need not, be expensive to administer. Many OSCEs are run at little or no additional cost.

SECTION C Some final thoughts 193

16 Limitations of the OSCE 193
The OSCE has an important role to play in the examiner's toolkit alongside other assessment approaches. If recognised, the limitations of the OSCE can be addressed.

17 Conclusions and looking to the future 203
The OSCE will continue to evolve and have a major role to play in response to changes in medical education.

SECTION D Case studies 213
SECTION E References 323
SECTION F Bibliography 345
Index 353

Foreword

When Ron Harden approached me to write the foreword, I viewed it as a distinct honour. It was also a bit of a watershed. There was a time, now two decades ago, when I would have been the last person Ron would have asked (and, yes, it's Ron to me, not Professor Harden – entirely as a result of the incident I am about to relate). And if he had asked me to write the foreword, to paraphrase Lyndon Johnson, "If asked, I would not write". But something happened twenty years ago that has a bearing on both the book itself and my authoring of the foreword.

Prior to 1995, Ron and I were at opposite poles. With my PhD in physics, I was a purist ivory-tower researcher whose goal was to advance the science of education. Consequences of my actions were of no consequence.
Ron's goals were almost diametrically opposed. He genuinely wanted to improve the education of medical students and physicians, and the more people he could influence, the more impact he could have. I was the elitist; Ron the populist. To me, no standards could be rigorous enough; to Ron, engagement was the issue, and so he would bring in the novices and nurture them to higher standards.

Then, in 1995, we met at a small conference in Islamabad, and ended up in my hotel room – just me, Ronald and a third participant named Johnnie Walker. And we have been good friends and confidants ever since. In hindsight, I began to understand better where he was coming from, and moreover, I began to realize that the inclusiveness of meetings like AMEE and the Ottawa Conference, both of which had a large Harden imprimatur (along with Ian Hart, rest in peace, one of the loveliest men ever to grace this planet), served an ulterior motive. By making a conference presentation accessible to almost all, he created a venue where even novices could come and fall under the influence of some of the masters. Moreover, the large number of participants enabled the conference to "buy" top-class people as plenary speakers. So my arrogance was misplaced, and arguably Ron, with his inclusiveness, has done more to improve the quality of medical education than all of us academics.

At another level, perhaps we were both right. Both of us went on to be awarded the Karolinska Prize, the highest award in medical education research. We have both been widely recognized for our contributions – far more than I (and my wife) could ever have dreamed possible. And despite the fact that our world views remain distinct, each has come to appreciate the contribution of the other.

Back to Ron, and the underlying rationale for this book. Nowhere is his genuine concern more evident than in the development and success of the OSCE. Ron describes the concerns he had about assessment in Chapter 2.
Recognizing the failings of the traditional clinical examination, which bore more resemblance to the Spanish Inquisition than anything in education, he devised the OSCE strategy to provide a more equitable test of clinical skills.

However, left unstated in his narrative is just why the OSCE became so popular. (And it is popular. I do a workshop on assessment around the world. I used to ask people if they knew what an OSCE is. I don't anymore. Everyone, in every land, knows what an OSCE is.) To understand its popularity requires an expanded history lesson.

Back in the early 1970s, when I was first hired into medical education, we were all preoccupied with "skills" – problem-solving skills, critical thinking skills, communication skills, physical examination skills, evidence-based medicine skills, etc. I was hired (Why me, Lord? Goodness knows.) to investigate clinical problem-solving skills. We put doctors and students into rooms with simulated patients, videoed them, reviewed their tapes, and pored over the transcripts seeking the mysterious elixir of problem-solving skill. We never found it. Instead what we found, looming large, was "content specificity", as identified by the group at Michigan State (Elstein et al. 1978). In brief, successful problem solving was dictated as much by the specific knowledge required to solve the problem as by any general intellectual skill. And when we looked at other measures of "problem solving", we found the same issue. Patient Management Problems, or PMPs (McGuire and Babbott 1967), were a written objective case-based test requiring about 45 minutes per case. For them as well, the correlation of performance measures across problems was 0.1 to 0.3. Since each PMP took about 45 minutes, it was not long before PMPs were dropped from licensing and specialty examinations.

The solution to the psychometric problem was simply one of sampling.
To get good measurement required multiple samples rated by multiple observers (and, incidentally, sampling across cases was more important than sampling across raters). The larger issue, as noted by Ron in Chapter 2, was that removing the PMP meant that most examinations were now multiple choice only. While that may be acceptable for a specialty examination, where the goal is just precise and valid measurement, it is not acceptable for educational programs because of the potential steering effect (Newble and Jaeger 1983). What was required was something that, on the one hand, efficiently sampled over cases and raters and, on the other, measured actual performance. Enter the OSCE! And, as they say, the rest is history.

Not surprisingly, though, as an innovation gets disseminated, it also gets diluted and mutated. Strategies like problem-based learning, small-group learning, multiple-choice tests – just about everything we do in education – eventually get reborn in so many variations as to be almost unrecognizable. It's not like a drug – there is no equivalent of 300 mg t.i.d. As a result, it is critical to develop standards of best practice, based on evidence. To some degree this is achieved by approaches like Best Evidence Medical Education (another Harden innovation), although the constraints of systematic review methodology limit the usefulness of these reviews as guidelines for educational practice. And that is where this book is an invaluable addition. It pulls together in one place pretty well everything that is known about the OSCE: what works and what doesn't. It is a welcome addition to the bookshelf of any educational leader. Please enjoy!

Geoff Norman
Professor Emeritus, Department of Clinical Epidemiology and Biostatistics, McMaster University, Hamilton, Ontario, Canada

References

Elstein, A.S., Shulman, L.S., Sprafka, S.A., 1978. Medical Problem Solving: An Analysis of Clinical Reasoning. Harvard University Press, Cambridge, MA.
McGuire, C.H., Babbott, D., 1967. Simulation technique in the measurement of clinical problem-solving skills. J. Educ. Meas. 4, 1–10.

Newble, D.I., Jaeger, K., 1983. The effect of assessments and examinations on the learning of medical students. Med. Educ. 17, 165–171.