
Evaluation Practice for Collaborative Growth: A Guide to Program Evaluation with Stakeholders and Communities

257 Pages · 2018 · 5.658 MB · English

Evaluation Practice for Collaborative Growth
A Guide to Program Evaluation with Stakeholders and Communities
Lori L. Bakken

Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and certain other countries.

Published in the United States of America by Oxford University Press
198 Madison Avenue, New York, NY 10016, United States of America

© Oxford University Press 2018

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by license, or under terms agreed with the appropriate reproduction rights organization. Inquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above. You must not circulate this work in any other form and you must impose this same condition on any acquirer.

Library of Congress Cataloging-in-Publication Data
Names: Bakken, Lori L., author.
Title: Evaluation practice for collaborative growth : a guide to program evaluation with stakeholders and communities / by Lori L. Bakken.
Description: New York, NY : Oxford University Press, [2018] | Includes bibliographical references and index.
Identifiers: LCCN 2017057063 (print) | LCCN 2017059395 (ebook) | ISBN 978–0–19–088538–0 (updf) | ISBN 978–0–19–088539–7 (epub) | ISBN 978–0–19–088537–3 (pbk. : alk. paper)
Subjects: LCSH: Evaluation research (Social action programs)
Classification: LCC H62 (ebook) | LCC H62.B2865 2018 (print) | DDC 658.4/013—dc23
LC record available at https://lccn.loc.gov/2017057063

9 8 7 6 5 4 3 2 1
Printed by WebCom, Inc., Canada

CONTENTS

Preface ix
Acknowledgments xiii
About the Authors xv

PART ONE: Prepare

1. Thinking Like an Evaluator 3
   Purchasing a Car: An Example of Evaluation in Everyday Life 4
   Major Components of an Evaluation Process 6
   Program Planning and Evaluation 11
   Summary 12
   References 12
2. Acquiring Requisite Knowledge and Skills 14
   Roles of Evaluators 14
   Evaluation Standards 16
   Ethics and Human Subjects Protections 18
   Evaluation Competencies 20
   Communication and Negotiation 22
   Building Partnerships and Capacity for Evaluation 24
   Effective Partnerships and Collaborations 26
   Inclusive Practice in Evaluation 26
   Summary 30
   References 30
3. Choosing an Evaluation Approach 33
   Philosophical Perspectives that Influence Evaluation 33
   Expertise-oriented Approaches 34
   Consumer-oriented Approaches 35
   Program-oriented Approaches 35
   Decision-oriented Approaches 42
   Participant-oriented Approaches 43
   Systems Approaches 46
   Approaches and the Evaluation Tree 46
   Matching Approaches with Evaluation Questions 47
   Examples of Integrated Approaches to Evaluation 49
   Summary 50
   References 50
4. Planning a Program Evaluation 53
   Understanding the Evaluation’s Context 54
   Identifying and Engaging Stakeholders 56
   A Program’s Purpose, Goals, and Activities 59
   Using Theory to Focus an Evaluation 61
   Evaluability Assessment 66
   An Evaluation’s Purpose and Use 67
   Evaluation Questions 69
   Evaluation Proposals and Contracts 71
   Summary 74
   References 75

PART TWO: Design

5. Designing a Program Evaluation 79
   Qualitative Designs 80
   Case Study Designs 82
   Quantitative Designs 83
   Study Designs for Evaluating Contribution 90
   Threats to Internal and External Validity in Quantitative Studies 91
   Mixed Methods Designs 92
   Complexity in Study Designs 93
   Summary 94
   References 95
6. Choosing Samples, Sampling Methods, and Data Sources 96
   Data Sources and Units of Analysis 97
   Sample Size, Selection, and Strategies 98
   Defining Inclusion and Exclusion Criteria for Your Sample 102
   Sampling Bias 103
   Sample Size 104
   Sampling Considerations in Relation to Study Designs 105
   Strategies for Participant Recruitment and Retention 106
   Ethical Practice 106
   Sampling and Data Collection Plans and Protocols 108
   Summary 109
   References 109

PART THREE: Conduct

7. Collecting, Storing, and Tracking Information 113
   What and Who Defines Credible Evidence? 114
   Survey Design and Development 116
   Psychometric Tests 120
   Interviews and Focus Groups 123
   Observations and Video-recordings 125
   Checklists and Rubrics 126
   Maps and Drawings 126
   Photographs 129
   Existing Sources 131
   Capturing Accurate and Reliable Information 132
   Designing Databases for Electronic Storage 133
   Tracking Information 136
   Summary 137
   References 137
8. Analyzing and Interpreting Quantitative Data 139
   Cleaning Data and Handling Missing Data 139
   Matching Statistics to Quantitative Study Designs 142
   Variables, Constants, and Levels of Measurement 143
   Descriptive Statistics 145
   Simple Plots 146
   Statistical Assumptions 152
   Evaluation Hypotheses 154
   Sample Size, Power, and Effect Size 156
   Aligning Statistical Analyses with Analytical Designs 159
   Multivariate Analyses 161
   Statistical Tests for Multiple Dependent Variables 163
   Nonparametric Statistics 164
   Reporting Statistics and Statistical Analysis 165
   Working with Statisticians and Building Your Own Capacity 166
   Summary 166
   References 167
9. Analyzing and Interpreting Qualitative Data 169
   Qualitative Thinking 169
   Methodological Approaches and Analytical Strategies 170
   Qualitative Approaches and How They Influence an Analysis 172
   Ethical Issues in Qualitative Analysis 174
   Preparing for the Analytical Process 174
   Qualitative Analysis 175
   Managing Data in Qualitative Analysis 182
   Summarizing, Organizing, and Describing Qualitative Information 183
   Summary 185
   References 185

PART FOUR: Report and Prepare Again

10. Reporting and Disseminating Findings 189
    Written Evaluation Reports 189
    Brief Written Reports 197
    Oral Presentations 198
    Storytelling 199
    Electronic and Social Media 199
    Acting 200
    Publications in Professional Journals 201
    Summary 202
    References 202
11. Preparing for Complexity in Evaluation 204
    From Program Theory to Systems Theory 205
    Simple, Complicated, and Complex Situations and Problems 206
    Static, Dynamic, and Dynamical Change 207
    Realistic Evaluation for Complicated Situations and Dynamic Change 208
    Systems Thinking and Complexity 209
    Developmental Evaluation 213
    Social Justice and Inclusive Evaluation Practice 213
    Preparing for the Future: Evaluation Skills for a Changing Field 220
    Summary and Implications for Evaluation Practice 221
    References 222

Index 225

PREFACE

Practitioners across professions are continually faced with funders’ growing requirements for information that demonstrates a program’s worthiness of financial support and value to those it serves. Low programming budgets often prohibit small organizations, especially nonprofits, from hiring professional evaluators to address these requirements, so practicing professionals must have some level of understanding and ability to design and conduct a program evaluation. This book provides a resource for readers who want to build their capacity for program evaluation and be guided through its seemingly daunting and elusive process. Therefore, this book is for those who develop or coordinate programs and work with people, partners, and communities in disciplines such as public health, social work, education, environmental sciences, and community development. It provides a fundamental understanding of program evaluation concepts, strategies, and practices while maintaining a focus on those that have been most useful to me and my collaborators. It, therefore, fills a unique gap among other books on program evaluation through its focus on basic concepts, simple writing style, familiar examples, and practical tools.

Throughout the book, I encourage readers to collaborate and partner with a program’s key stakeholders during the evaluation process so that the final product is both useful to and used by them. Collaborations and partnerships in evaluation can trigger disagreements and controversy among stakeholders with competing interests. So, this book prepares readers for some of the ethical and political challenges that may be encountered when conducting a program evaluation and provides strategies for how to handle them in today’s complex sociopolitical environment. At times, the book’s contents may seem a bit advanced for those who are not specialists in evaluation. Some advanced concepts are intentionally incorporated to build a reader’s evaluation capacity and avoid misapplied concepts, oversimplified approaches, or easy strategies that reduce the accuracy of information and potential power of evaluation. Although this book is designed and written as a resource for practitioners, it can be used to support courses, workshops, and other capacity-building efforts
