ERIC EJ915879: Measuring the Achievement of Professional Development Schools (2010)

Measuring the Achievement of Professional Development Schools

Deb Theiss and Carl Grigsby
University of Central Missouri
School–University Partnerships, Vol. 4, No. 1

ABSTRACT: Universities systematically use assessments to evaluate programs of teacher education. The Professional Development Schools Partnership was a well-established collaboration with more than 11 years of work with area schools. However, two questions became the center of a discussion for evaluating, monitoring, and identifying the accomplishments of the professional development school network: first, "Was there a measure of how far we had really come?"; second, "What exactly has our Professional Development Schools Partnership accomplished?" In an effort to answer these questions and learn about the growth of the organization, the board created a task force to develop assessment tools and put into place a systematic process for evaluation that would over time inform the partnership of growth, strengths, and areas of needed improvement. Two evaluative tools and scoring guides provided valuable insight toward how effectively the partnership had fully implemented standards for professional development schools. Data were collected and analyzed to provide a vehicle for decision making.

    Here is Edward Bear, coming downstairs now, bump, bump, bump, on the back of his head, behind Christopher Robin. It is, as far as he knows, the only way of coming downstairs, but sometimes he feels that there really is another way, if only he could stop bumping for a moment and think of it.
    —Milne (1926, p. 3)

The impact of professional development schools (PDSs) on preservice teachers is well documented and supports the position that interns at PDS schools achieve higher than do interns assigned to non-PDS schools (Castle, Fox, & Souder, 2006; Darling-Hammond, 2007; Levine, 2002; Snyder, 1999). Teacher leadership developed within PDSs also contributes to the overall success of the PDS program (Darling-Hammond, Bullmaster, & Cobb, 1995). Building individual and organizational capacity for assessing the impact of PDSs is important toward monitoring the growth of the partnership (Reed, Kochan, Ross, & Kunkel, 2001). An analysis of the partnership can contribute to identifying strengths and areas of needed improvement to effect a simultaneous renewal of energies and resources (Bullough, Kauchak, Crow, Hobbs, & Stokes, 1997).

The National Council for Accreditation of Teacher Education initiated a process for assessing and evaluating PDSs, resulting in the publication of the Standards for Professional Development Schools and the Handbook for the Assessment of Professional Development Schools (Trachtman, 2007). A self-study process helps PDS partners to evaluate and determine the program's quality of experiences. Evaluating and analyzing PDSs in terms of the council's stages of development and successful practices contributes to understanding the effectiveness of the PDS program (Reed et al., 2001). Phase 1 focuses on structural considerations; Phase 2 examines the climate; and Phase 3 analyzes the degree to which the five standards have been implemented and reviews intended and unintended consequences. Using multiple sources to document the growth and change in PDSs can add to the ability of the organization to make informed decisions for improvement (Teitel, 2001).

Background

The Professional Development Schools Partnership in this Midwest region was well established, with more than 11 years of collaborating with rural area schools. Just like Edward Bear in Winnie the Pooh, the PDS partnership seemed to be bumping along just fine . . . or was it? Everyone was working hard; the PDS faculty (university and school) and interns seemed to be excited and satisfied with the experience. Yet was there a measure of how far they had really come? This question became the center of a discussion for evaluating, monitoring, and identifying the accomplishments of the PDS network. Two questions provided the stimulus to "stop bumping for a moment and think": first, "What exactly has our Professional Development Schools Partnership accomplished?"; second, "How are we doing?"

As a result of these conversations, the board created a task force to create assessment tools and to put into place a systematic process for evaluation in 2006–2007 that could be used over time to inform the Professional Development Schools Partnership of growth, strengths, and areas of needed improvement. These initial documents were planned and implemented on the basis of the PDS learning model agreed on by the board (see Figure 1).

Figure 1. Professional development schools learning model.

Data Collection

The tools designed to evaluate the partnership were based on the standards for the National Association for Professional Development Schools' five levels of implementation (Trachtman, 2007) and the standards for teacher preparation in the state of Missouri (Missouri Department of Elementary and Secondary Education, 2006). Data collection began in the spring of 2007 with evaluations completed and analyzed over a 2-year period ending in the spring of 2008. Four school district sites involved in the Professional Development Schools Partnership completed the Professional Development Schools Program Evaluation and the Intern Evaluation as part of the College of Education's annual data collection, which evaluates the effectiveness of field-based programs and their impact on student learning (preK–12) and the developing skills of preservice teachers.

Information was gathered from classroom teachers, administrators, and university faculty. Individual school districts and the average for the four school districts in the partnership were assessed on their progress along the continuum of fully and effectively implementing the national PDS standards. The self-reporting instrument uses a ranking system (does not meet, progressing, meets, exceeds), which is then converted to a 4-point scale for data analysis. In addition to ranking the five standards, participants were asked to discuss the strengths, note the areas of needed improvement, and add any comments that would provide a deeper understanding of the collaboration.

The PDS internship program is part of the senior-block experience before student teaching. Students are placed in a classroom within an elementary or middle school and assigned to a district supervisor. University supervisors work in cooperation with the district supervisor at the building site. During the summer institute, university and district faculty set goals, plan activities for the school year, and create a calendar for the school year.

The Professional Development Schools Program Evaluation

The Professional Development Schools Program Evaluation was used to evaluate the progress of the Professional Development Schools Partnership in fully implementing the partnership. A scoring guide was created to reflect these standards (see Appendix A). At the four PDS district sites, university and school faculty completed the scoring guides to provide insight into how the five standards were evidenced in the collaborative work of the partnerships and to identify the PDS district site's stage of development.

Instrumentation

As mentioned, the instrument is based on a Likert-type scale (0 = does not meet, 3 = exceeds). Participants could also respond with not observed. Four PDS school sites provided feedback using the assessment documents. One school site did not provide data in the spring of 2007.

Participants

A total of 136 evaluations were completed over the 2-year period: 63 in the spring of 2007 and 73 in the spring of 2008. District teachers completed all but 3 evaluations: District administrators completed 2 (spring 2007), and a university faculty member completed 1 (spring 2007). There were no district data for one school site in the spring of 2007.

Procedures

The director of the Professional Development Schools Partnership distributed the surveys to the university faculty for each building site in the spring of 2007 and 2008. The university faculty distributed the surveys to the district faculty and administrators. The director then collected the surveys from the sites, and quantitative and qualitative analysis was conducted.

Findings

Quantitative results. The average score for each standard based on the compiled school district surveys (spring 2007 and spring 2008) was examined to assess the progress of the Professional Development Schools Partnership's efforts to fully implement the national standards for PDSs. Table 1 shows the average per standard and the level at which each standard was met. The range was from 2.23 for Standard 2 (accountability and quality assurance) to 2.33 for Standard 3 (collaboration). All the standards were at the level of meets.

The individual averages per school district were calculated, offering a unique look into how each district was progressing in its development on the continuum of fully implemented standards. This information was shared with the districts and thus served as a basis for goal setting and decision making for sustenance and improvement of current programs. Although the differences between the 2007 and 2008 results were not significant (see Table 2), there was a trend toward increasing levels of implementation among the standards in the combined average scores per standard for all school districts (see Figure 2).

Table 1. Average Score for Compiled School District Surveys: Spring 2007 and 2008 Combined

Standard                                   Average Score
1: Learning community                      2.30
2: Accountability and quality assurance    2.23
3: Collaboration                           2.33
4: Diversity and equity                    2.29
5: Structures, resources, and roles        2.30

Note. Level of implementation for each standard: meets.

Table 2. Professional Development School Program Evaluation: Spring 2007 and 2008

District  Spring  Std 1  Std 2  Std 3  Std 4  Std 5
1         2007    1.99   1.98   1.50   1.99   2.08
          2008    2.47   2.29   2.47   2.20   2.47
2         2008    2.45   2.20   2.36   2.45   2.49
3         2007    2.33   2.33   2.49   2.41   2.25
          2008    2.29   2.23   2.36   2.23   2.29
4         2007    2.44   2.03   2.29   2.22   2.15
          2008    2.11   1.94   2.11   2.35   2.18

Note. See Table 1 for standards.

Figure 2. Professional development school program evaluation, spring 2007 and 2008.

Qualitative results. Participants answered three open-ended questions regarding the strengths that the Professional Development Schools Partnership offered, the areas of needed improvement that were warranted, and any additional comments. For the combined 2007–2008 PDS program evaluation, there were 88 responses recorded as program strengths and 33 for areas of needed improvement. The additional comments were categorized as strengths or areas of needed improvement and were included in the number of responses recorded. An analysis of these comments offered information about the Professional Development Schools Partnership in terms of strengths and areas of needed improvement.

Regarding strengths, the 88 responses were sorted according to the national standards (Standards 1–5). Of the responses sorted, 72 comments related to Standard 1 (learning community); 7 were identified with Standard 3 (collaboration); and 9 were connected to Standard 5 (structures, resources, and roles). No comments were linked to Standard 2 (accountability and quality assurance) or Standard 4 (diversity and equity; see Table 3).

Table 3. Narrative Comments Regarding Strengths: 2007–2008 Combined

Standard                                   n
1: Learning community                      72
2: Accountability and quality assurance    0
3: Collaboration                           7
4: Diversity and equity                    0
5: Structures, resources, and roles        9

The open-ended questions provided insight into the effectiveness of the program based on participant comments. For Standard 1, statements characteristic of the participants were as follows: "an excellent opportunity for prospective educators to work collaboratively with veteran teachers in a real world setting prior to student teaching" and "Book Study Groups provided professional development opportunities for the supervising teachers, the university faculty, and the interns." For Standard 3, comments included "I love the collaboration and the sense of development we experience together. The program has improved for us every year with more teachers volunteering to participate—very positive" and "The program benefits the students in the public school setting, the classroom teacher and the university students." Participants' remarks typical of Standard 5 were characteristic of the following: "Most communication between [university] faculty and school district faculty is clear, swift, and complete. Problems are dealt with effectively, and I am always satisfied with the quality of PDS students joining my classroom."

Regarding areas of needed improvement, 33 responses were coded as such (see Table 4): Standard 1 (learning community) had 7 responses; Standard 2 (accountability and quality assurance) included 10 comments; Standard 3 (collaboration) had 1 remark; and Standard 5 (structures, resources, and roles) had 15 comments. There were no responses coded for Standard 4 (diversity and equity).

Table 4. Narrative Comments Regarding Areas of Needed Improvement: 2007–2008 Combined

Standard                                   n
1: Learning community                      7
2: Accountability and quality assurance    10
3: Collaboration                           1
4: Diversity and equity                    0
5: Structures, resources, and roles        15

A response characteristic of Standard 1 was "Because of our school calendar this year it was hard to get to know and work with the students [from the university]. Would like to have more prof. development for the teachers at the schools from the university." For Standard 2, many respondents commented on the Intern Evaluation:

    The evaluation criteria don't fit what the student had time to do. Most of his time was spent observing. He taught one lesson and I don't feel that was enough to be able to properly evaluate him using the current evaluation form; and the PDS student evaluation forms should be given to the supervising district teacher before the PDS students are gone!

There was only one comment for Standard 3: "opportunities to present to college classes." Standard 5 reflected frustration with scheduling and expectations:

    As stated earlier, it would be helpful if PDS teachers have assignment guidelines and specifics in order to best assist students in selecting instructional materials; and scheduling was a problem this year. I am quite certain that two out of my three candidates (PDS) did not get the observation hours needed.

Discussion

The quantitative data from the program evaluation indicated that all national standards were being met, with a trend of improvement for each standard for the combined average score of the four school districts in the partnership. The qualitative data provide an in-depth look at what the strengths and areas of needed improvement are for the Professional Development Schools Partnership and so offer insights into how to further the partnership's goal of attaining the highest levels of implementation. The following conclusions are based on the qualitative analysis.

Standard 1: Learning Community

The rural Professional Development Schools Partnership is a learning-centered community with interns and district/university faculty focused on increasing the learning capacity of preK–12 students, interns, and faculty. Some sites have strong inquiry-based practices that include study groups and ongoing professional development. There is also an identified concern that strong inquiry-based practices may not consistently be institutionalized across all PDS sites. Additional investigation into this need is warranted. For the PDS to continue to build the capacity of the organization, a consistent approach needs to be strengthened toward encouraging and facilitating ongoing inquiry-based investigations among interns and district/university faculty, which will help the PDS partnership continue to move on the continuum to full implementation at the exceeds level.

Standard 2: Accountability and Quality Assurance

The PDS partners have developed assessments, collected information, and are in the process of identifying how best to use the survey results to inform the program. There is feedback reported that indicates that the current Intern Evaluation for the PDS experience before the student-teaching block may need revision to better mirror the experiences of the interns at this level. There were no comments (strengths or areas of needed improvement) regarding the assessments as a vehicle for informing and guiding future work of the Professional Development Schools Partnership. Because continuous assessment and evaluation of goal achievement are a vital link to the impact on preK–12 student learning, it may be helpful to examine the systematic process for examination of how much the PDS partnership increases learning for all.

Standard 3: Collaboration

There is a sense that PDS partners collaborate through shared ideas and through working together to improve outcomes for preK–12 students. Narrative comments focused on the opportunity to work with others, share ideas, and support the learning of preK–12 students in the classroom. To build the capacity of collaboration for the PDS, consideration might be given to helping PDS partners engage in joint work with reward structures that support collaboration. A systematic recognition and celebration of the joint work and contributions that each partner has made will enhance the culture for collaboration.

Standard 4: Diversity and Equity

There were no comments (strengths or areas of needed improvement) regarding the policies and practices that support equitable learning outcomes for diverse learning communities for the Professional Development Schools Partnership. The mere absence of comments may warrant a close examination of the systems in place for analyzing data to address the gaps in achievement among ethnic, racial, gender, and socioeconomic groups, which would include the assessment of interventions and an identification of supports in place to provide equitable learning opportunities and outcomes for students. Currently, the Professional Development Schools Partnership is exploring the addition of new partnerships with diverse community populations.

Standard 5: Structures, Resources, and Roles

The Professional Development Schools Partnership has established structures that support the learning and development of preK–12 students, candidates, faculty, and other professionals. The PDS roles are well defined, and resources are provided to support the PDS work. Ongoing communication will strengthen the coordination of the programs and thereby alleviate some of the struggle, with clarification of expectations for intern scheduling and assignments.

A recurring theme that emerged from the comments was an affirmation of the work of this partnership. Words used over and over in relation to the PDS program included valuable, wonderful, strength, excellent, enjoy, and asset. Participants believed that districts benefited by seeing strategies taught by the PDS interns and that the university faculty and interns got a chance to experience a real classroom. Classroom teachers were provided opportunities to showcase their expertise and build the emerging skills of a future educator.

The Intern Evaluation

The purpose of the Intern Evaluation was to determine the PDS interns' competencies based on the knowledge, skills, and dispositions required of teacher education candidates in the state of Missouri. A scoring guide was created to reflect these standards (see Appendix B). In addition, the data afforded an opportunity to monitor the developing skill level of preservice teachers as they continued their work in partnering schools, beginning with their senior-block PDS experience and culminating with their student-teaching experience. The creation of this assessment tool established a baseline for future comparison. Eleven standards were assessed with the ranking scale (does not meet, progressing, meets, and exceeds) and then converted to a scale to ascertain to what degree the intern was meeting expectations.

Participants

A total of 240 evaluations were collected over the 2-year period: 139 in the spring of 2007 and 101 in the spring of 2008. All evaluations were completed by teachers employed in four school districts. Eight PDS building sites within the four districts reported data for this review. No district data were collected for one school site in the spring of 2008.

Procedures

The director of the Professional Development Schools Partnership distributed the Intern Evaluation at the same time that the Professional Development Schools Program Evaluation was distributed to the university faculty (in the spring of 2007 and 2008), who then distributed the surveys to the district faculty. The PDS director facilitated collection of both surveys. Quantitative and qualitative analysis was conducted after data collection.

Results

The average scores for 2006–2007 across the 11 standards on the Intern Evaluation ranged from 1.79 (classroom management) to 1.97 (reflective practitioner; see Table 5). The average scores for 2007–2008 ranged from 1.60 (communication) to 1.80 (diversity; see Table 5). Tables 6 and 7 show the average scores for each certification standard established in the state of Missouri. These data assessed the developing skills of preservice teachers. A review of lowest scores for each standard by site for 2006–2007 shows one standard reporting two sites at or above meets: classroom management. In 2007–2008, the certification standard with the least number of sites at meets was, again, classroom management.

Table 5. Intern Evaluation: Comparison of Average Scores, 2006–2007 to 2007–2008

Standard  Teacher Competencies*        2006–2007  2007–2008
1         Content knowledge            1.73       2.03
2         Learners and learning        1.72       2.05
3         Curriculum                   1.66       2.01
4         Planning/instruction         1.75       2.02
5         Classroom management         1.72       2.12
6         Communication                1.60       1.92
7         Assessment                   1.75       2.08
8         Technology                   1.62       1.94
9         Diversity                    1.80       2.02
10        Reflective practitioner      1.76       2.22
11        Professional relationships   1.69       2.02

*Missouri standards for teacher education programs, established by the Department of Elementary and Secondary Education.

Table 6. Missouri Certification Standards, 2006–2007

Site   1     2     3     4     5     6     7     8     9     10    11    M
1      1.79  2.09  1.79  1.93  1.64  1.85  1.90  1.70  2.07  1.79  2.07  1.87
2      2.25  2.13  2.25  2.25  2.25  2.25  2.13  2.13  2.07  1.79  2.07  1.87
3      1.83  1.83  1.80  1.70  1.70  1.83  1.70  1.75  1.60  2.16  1.83  1.79
4      2.00  2.08  2.08  2.00  1.83  2.17  1.88  2.25  1.90  2.29  1.86  2.03
5      2.27  2.36  2.50  2.18  2.36  2.45  2.33  2.45  2.36  2.09  2.30  2.33
6      1.71  1.71  1.86  2.14  1.71  1.43  2.00  2.00  1.71  2.00  2.00  1.84
7      1.49  1.41  1.43  1.35  1.35  1.45  1.32  1.34  1.29  1.67  1.48  1.42
8      1.67  1.61  1.59  1.81  1.50  1.75  1.50  1.47  1.64  1.61  1.75  1.63
M      1.88  1.90  1.91  1.92  1.79  1.90  1.85  1.89  1.85  1.97  1.93  1.89

Note. See Table 5 for standards.

Table 7. Missouri Certification Standards, 2007–2008

Site   1     2     3     4     5     6     7     8     9     10    11    M
1      2.00  2.00  1.60  1.42  1.85  1.71  1.33  2.00  1.71  1.66  2.00  1.75
2      2.60  2.30  2.00  3.00  2.00  2.60  2.00  2.00  3.00  3.00  2.30  2.44
3      2.20  2.20  2.25  2.20  1.40  2.50  2.20  2.40  2.30  2.20  2.20  2.19
5      2.44  2.44  2.40  2.55  2.22  2.33  2.40  2.33  2.55  2.57  2.50  2.43
6      1.63  1.63  1.80  1.73  1.63  1.78  2.00  1.63  1.63  1.78  2.20  1.77
7      2.25  2.25  2.00  2.25  2.25  2.33  1.66  2.00  2.33  2.25  2.50  2.19
8      2.00  2.05  1.93  2.23  1.82  2.06  1.71  2.00  1.85  2.13  2.14  1.99
M      2.16  2.12  2.00  2.20  1.88  2.19  1.90  2.05  2.20  2.23  2.26  2.11

Note. See Table 5 for standards. No data for Site 4.

Strengths and Areas of Needed Improvement

The following information reflects the analysis of the data for the 11 standards used to assess the interns, based on state requirements for teacher competencies. Upon review of the data collected by the Intern Evaluation for all sites in 2006–2007, the lowest score recorded was 1.29 (progressing), for diversity. In 2007–2008, 1.33 (also progressing) was the lowest score recorded, for assessment. In review of the average scores for the 11 standards, classroom management (1.79, progressing) was the lowest average for a certification standard in 2006–2007. In 2007–2008, the lowest average for a certification standard was also classroom management (1.88, progressing). The highest average score out of the 11 certification standards reported in the 2006–2007 data was reflective practitioner, at 1.97 (progressing). The highest average score for 2007–2008 was professional relationships, at 2.26 (meets).

In comparison of the 2 years of data collected (2006–2007 and 2007–2008), all certification standards were rated higher in 2007–2008. Table 8 shows the percentage change in scores. The data show that the greatest increase of score was for the diversity standard, with an increase of 22.4%, followed by communication, with a change of 20.4%. The least percentage change in standard scores included assessment (5.2%), curriculum (8.2%), and classroom management (8.8%). However, eight of the Missouri certification standards have a 15% increase in scores from 2006–2007 to 2007–2008.

Table 8. Missouri Certification Standards: Percentage Change in Scores, 2-Year Comparison

Site   1     2     3      4      5      6     7      8      9      10     11
1      11.7  –3.4  –10.6  –26.4  12.8   –7.6  –30.0  17.6   –17.4  –7.3   –3.4
2      15.6  8.0   –11.1  33.3   –11.1  15.6  –6.1   –6.1   33.3   40.8   8.0
3      2.0   20.2  25.0   29.4   –17.6  36.6  29.4   37.1   43.8   1.9    20.2
5      7.5   3.4   –4.0   17.0   –5.9   –4.9  3.0    –4.9   8.1    23.0   8.7
6      –4.7  –4.7  –3.2   –19.2  –4.7   24.5  0.0    –18.5  –4.7   –11.0  10.0
7      51.0  59.6  39.9   66.7   66.7   60.7  25.8   49.3   80.6   34.7   68.9
8      19.8  27.3  21.4   23.2   21.3   17.7  14.0   36.1   12.8   32.3   22.3
M      17.3  15.8  8.2    17.7   8.8    20.4  5.2    15.8   22.4   16.3   19.2

Note. See Table 5 for standards. No data for Site 4.

Discussion

The quantitative data from the Intern Evaluation indicate that all Missouri certification standards were being met at the progressing level or higher. The 2007–2008 data show a significant increase in scores, with all but two Missouri certification standards being met at the meets level: classroom management and assessment. The percentage change in score documentation demonstrates a significant percentage of increase in the overall averages for Missouri certification standards. The following analysis of data will help us improve our interns' overall experience and achievement during their field experience.

Interns' scores for Missouri certification standards, 2006–2007. The PDS interns scored the highest in reflective practice and the lowest in classroom management. The overall average of 1.89 demonstrates a rating of progressing toward building teaching expertise based on the Missouri standards for teacher education. The current assessment for the education unit at the university is the development of a student portfolio that supports the growth of preservice teachers as reflective practitioners. The results of this study support previous findings from an analysis of portfolio entries, which found the lowest rating for interns in the area of management of a classroom, in both instruction and behavior. These data also align with current feedback from all levels of field experience, supporting the need for faculty to provide additional classroom management strategies and experiences for interns in this area. The baseline data provide an opportunity to compare subsequent data collections as a continuous review of the developing skills of interns.

Interns' scores for Missouri certification standards, 2007–2008. Table 7 shows a significant improvement of intern ratings for 2007–2008. Intern scores increased in all but two standards: classroom management and assessment. Table 8 documents a significant percentage increase in 8 of 11 standards, demonstrating a 15% or better increase. Two standards had a 20% or better increase: communication and diversity. Exploring the underlying factors that contributed to the increase could help to identify ways to address identified areas of needed improvement. In both sets of data (2006–2007 and 2007–2008), classroom management is the lowest rating for interns. Future research by university faculty in the targeted areas of classroom management will be beneficial for course development and implementation of best practices (see Tables 9 and 10).

Table 9. Average Lowest Score for Intern Evaluation per Professional Development School Site, 2006–2007

Site   Category                  Score
1      Classroom management      1.64
2      Assessment                2.13
3      Diversity                 1.60
4      Classroom management      1.83
5      Reflective practitioner   2.09
6      Communication             1.43
7      Diversity                 1.29
8      Technology                1.47

Table 10. Average Lowest Score for Intern Evaluation per Professional Development School Site, 2007–2008

Site   Category                  Score
1      Assessment                1.33
2      Classroom management      2.00
3      Classroom management      1.40
5      Classroom management      2.22
6      Content knowledge         1.63
       Learners and learning     1.63
       Classroom management      1.63
       Diversity                 1.63
       Technology                1.63
7      Assessment                1.66
8      Assessment                1.71

Note. No data for Site 4.

Assessment is also an area identified as one for growth in this research. The PDS faculty is currently reviewing their unit's assessment program. The education faculty have decided to implement the Teacher Work Sample as the primary assessment, which will focus faculty and interns on assessment and the use of assessment to make appropriate instructional decisions. Future course development will focus on instruction based on current research and the knowledge of effective assessment strategies. Developing the interns' understanding and application of effective assessment practices during field experience and in their own classrooms will support emerging skill levels.

Continued analysis of the Intern Evaluation results will include an analysis of the qualitative data, the narrative comments made by district and university faculty and administrators. These data will help to validate our quantitative findings and lead us to more informed decisions when making changes in our PDS program.

Future Considerations

Data for this first series of evaluations were completed in the spring of 2007 with additional data collection through the spring of 2008. The information gained from these assessments has provided concrete evidence for answering the questions "What exactly has our Professional Development Schools Partnership accomplished?" and "How are we doing?" There is now a measuring stick that can gauge progress over time and determine if the Professional Development Schools Partnership is steadily moving forward in its efforts to fully implement an effective collaboration between the university and public schools. Data are being collected for interns in their initial experience in PDSs, followed by data collected for the same interns in their student-teaching experience. The goal in this collection is to determine the growth of individual interns in their experience in the PDS setting, as well as to attain the overall growth of interns involved in the Professional Development Schools Partnership. The surveys have been refined to better communicate direc-
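The scoring arithmetic that runs through both instruments (converting the four-level verbal ranking to a 0–3 scale, skipping "not observed" responses, averaging per standard, and computing the year-over-year percentage change reported in Table 8) can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual analysis code; the function names and sample responses are invented, and the cutoffs in `level` are inferred from the labels the article attaches to specific averages (e.g., 1.97 labeled progressing, 2.26 labeled meets).

```python
# Illustrative sketch of the scoring arithmetic described in the article.
# Sample data and names are hypothetical; the study's raw data are not public.

RANKS = {"does not meet": 0, "progressing": 1, "meets": 2, "exceeds": 3}

def to_scale(responses):
    """Convert verbal rankings to the 0-3 scale, skipping 'not observed'."""
    return [RANKS[r] for r in responses if r != "not observed"]

def average(scores):
    """Mean score, rounded to two decimals as in Tables 1-7."""
    return round(sum(scores) / len(scores), 2)

def level(avg):
    """Label an average with the implementation level.

    Cutoffs are an assumption inferred from the article's reported labels
    (averages below 2.0 are called progressing, 2.0 and above meets).
    """
    if avg < 1.0:
        return "does not meet"
    if avg < 2.0:
        return "progressing"
    if avg < 3.0:
        return "meets"
    return "exceeds"

def pct_change(old, new):
    """Year-over-year percentage change, as reported in Table 8."""
    return round((new - old) / old * 100, 1)

# Example: one standard rated by a handful of (hypothetical) respondents
responses_2007 = ["meets", "progressing", "meets", "not observed", "exceeds"]
responses_2008 = ["meets", "meets", "exceeds", "meets"]

avg_2007 = average(to_scale(responses_2007))  # (2+1+2+3)/4 = 2.0
avg_2008 = average(to_scale(responses_2008))  # (2+2+3+2)/4 = 2.25
change = pct_change(avg_2007, avg_2008)       # 12.5 (% increase)
```

Applied to the published averages, the same arithmetic reproduces the article's labels: `level(1.97)` gives progressing and `level(2.26)` gives meets, matching the discussion of the 2006–2007 and 2007–2008 highs.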
