MEASURING FIDELITY OF IMPLEMENTATION: METHODOLOGICAL AND CONCEPTUAL ISSUES AND CHALLENGES

CRESST Report 811

Ellen Osmundson, Joan Herman, and Yunyun Dai
CRESST/University of California, Los Angeles

Cathy Ringstaff and Mike Timms
WestEd

January 2012

National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Center for the Study of Evaluation (CSE)
Graduate School of Education & Information Studies
University of California, Los Angeles
300 Charles E. Young Drive North
GSE&IS Bldg., Box 951522
Los Angeles, CA 90095-1522
(310) 206-1532

Copyright © 2012 The Regents of the University of California

The work reported herein was supported by prime sponsor number R305B070354 from the US Department of Education to WestEd, grant number 5387S07-091, from the American Institutes for Research, with funding to the National Center for Research on Evaluation, Standards, and Student Testing (CRESST). The findings and opinions expressed in this report are those of the author(s) and do not necessarily reflect the positions or policies of WestEd, ASK, FOSS, or the US Department of Education.

To cite from this report, please use the following as your APA reference: Osmundson, E., Herman, J., Ringstaff, C., Dai, Y., & Timms, M. (2012). Measuring fidelity of implementation: Methodological and conceptual issues and challenges (CRESST Report 811). Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).

TABLE OF CONTENTS

Abstract
Introduction
  Perspective
  Overview of Current Study
  Fidelity of Implementation Defined
Research Questions, Methods, and Findings
  Research Question 1: How Do We Effectively Measure Fidelity of Implementation?
  Data Sources and Analyses
  Research Question 2: How Can Fidelity of Implementation Measures Be Validated?
  Relationship between Indicators
  Research Question 3: What Practical Challenges Are Associated with Developing Effective Measures of Fidelity?
  Importance of a "Practice" Year
  Log Development and Refinement
Conclusions
References
Appendix A: Magnetism and Electricity Module: Pre/Post Assessment
Appendix B: Teacher Content Survey
Appendix C: ASK/FOSS Classroom Assessment Observation Protocol
Appendix D: ASK Study Research Phone Interview
Appendix E: FOSS Study Research Phone Interview
Appendix F: Weekly Teacher Log: FOSS/ASK Water Module
Appendix G: Weekly Teacher Log: FOSS Water Module

MEASURING FIDELITY OF IMPLEMENTATION: METHODOLOGICAL AND CONCEPTUAL ISSUES AND CHALLENGES

Ellen Osmundson and Joan Herman
CRESST/University of California, Los Angeles

Cathy Ringstaff
WestEd

Yunyun Dai
CRESST/University of California, Los Angeles

Mike Timms
WestEd

Abstract

A central challenge in efficacy studies centers on the issue of "fidelity of implementation," that is, the extent to which participants use the curriculum as specified by the curriculum developers. In this study, we describe and discuss a "fidelity of implementation" model that uses multiple methods and instruments to compare two versions of a science curriculum and embedded assessment system. We present results from our validation study and discuss the challenges in determining the extent to which teachers use a curriculum as designed. We focus as well on practical curriculum implementation issues amidst the needs and perspectives of different stakeholders.

Introduction

The No Child Left Behind Act of 2001 produced an explosion of interest in the use of assessment to measure and improve student learning. The evidence from classroom-level assessment is clear: teachers’ ongoing use of assessment to guide and inform instruction (formative assessment) can lead to statistically significant gains in student learning (Black & Wiliam, 1998, 2001, 2004). Taking the evidence on formative assessment to heart, a well-known, well-established hands-on science curriculum was recently revised to include an embedded assessment system. This system is designed to provide teachers with evidence of student learning to effectively guide instruction and support student achievement.

In 2007, researchers received funding from the Institute of Education Sciences (IES) to conduct a Goal 3 Efficacy Study, using a randomized controlled trial to determine which hands-on science curriculum (i.e., the traditional version or one with an embedded assessment system) leads to higher student achievement. Both groups of teachers used the standard curriculum and were expected to teach all investigations in the modules, as well as follow the intended instructional sequence.
Teachers in the control group used the original curriculum assessments: general tasks that follow the instructional sequence. Treatment teachers used an enhanced assessment system composed of a series of curriculum-aligned embedded tasks for each lesson and for each series of investigations. The embedded assessment system was designed to support teachers’ knowledge and understanding of students’ conceptual development as they engage in hands-on science experiences.

A central challenge in efficacy studies centers on the issue of "fidelity of implementation," that is, the extent to which participants use the curriculum in the manner the curriculum developers specify. Implementation of the curriculum is assumed, but a realistic picture of "full implementation" is often incomplete due to measurement and logistical challenges. For example, how do we know whether teachers follow the instructional sequence outlined by the curriculum or deviate from the intended sequence? Are lessons or additional materials added to the curriculum, or portions of lessons omitted? Critical as well to understanding the efficacy of a curriculum is information on the quality of the implementation. How well, for example, do teachers understand the concepts they are teaching? Does their understanding support accurate interpretation of student responses? Can teachers respond to students’ needs with appropriate instructional next steps? Answers to these questions are critical to determining which curriculum is "better," that is, more effective, as well as the specific implementation factors that contribute to student learning.

Perspective

To make valid comparisons between the outcomes associated with the two versions of the science curriculum under investigation, we needed to determine the extent to which teachers implemented the curriculum with fidelity. Documentation and description of implementation levels are essential to determine whether student performance can be associated with the use of the curriculum being studied. These data can also provide information about which aspects of the curriculum appear to contribute to improved student achievement. This information is particularly important when effects are inconclusive or negative (e.g., no difference between treatment and control conditions), and for determining whether the outcomes can be attributed to the curriculum itself or to curriculum implementation factors (e.g., partial or incomplete implementation).

This paper focuses specifically on how researchers conceptualized fidelity of implementation, designed instruments to measure it, and validated these instruments, based
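As a purely illustrative aside on the measurement questions raised in the Introduction (was the intended sequence followed? were investigations omitted or added?), the sketch below shows how simple log-based fidelity indicators might be scored. It is a minimal hypothetical example, not the report's method: the investigation names, the intended sequence, and both scoring functions are invented for illustration, loosely in the spirit of the weekly teacher logs in Appendices F and G.

```python
# Hypothetical sketch, not from the report: two simple indicators that
# could be scored from a weekly teacher log: coverage (were all intended
# investigations taught?) and sequence adherence (were they taught in
# the intended order?). All names and data below are invented.
from itertools import combinations

INTENDED_SEQUENCE = ["inv_1", "inv_2", "inv_3", "inv_4"]  # assumed module order

def coverage(taught: list[str]) -> float:
    """Fraction of intended investigations the teacher reported teaching."""
    return len(set(taught) & set(INTENDED_SEQUENCE)) / len(INTENDED_SEQUENCE)

def sequence_adherence(taught: list[str]) -> float:
    """Fraction of intended pairwise orderings preserved in the log,
    computed only over investigations that were actually taught."""
    position = {inv: i for i, inv in enumerate(taught)}
    pairs = [(a, b) for a, b in combinations(INTENDED_SEQUENCE, 2)
             if a in position and b in position]
    if not pairs:
        return 0.0
    kept = sum(position[a] < position[b] for a, b in pairs)
    return kept / len(pairs)

# Example log: inv_4 omitted; inv_2 and inv_3 taught out of order.
log = ["inv_1", "inv_3", "inv_2"]
print(coverage(log))            # 0.75
print(sequence_adherence(log))  # ~0.67 (2 of 3 intended orderings preserved)
```

Indicators like these would capture only adherence to the intended design. As the report emphasizes, quality of implementation (e.g., teachers' understanding of the concepts they teach) requires complementary instruments such as the classroom observations, phone interviews, and teacher content survey listed in the appendices.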
