
Holistic Assessment of Students in Undergraduate Industrial Technology Programs

Dennis W. Field, Steven A. Freeman, and Michael J. Dyrenfurth
The Journal of Technology Studies

"External constituents are demanding not only that departments say they are doing good things and not only that they measure how hard they are trying, but also that they measure outcomes" (Walvoord et al., 2000). "[Colleges] should be focused on the 'value added' of the student experience. In today's society, the need to educate for understanding—not just grades—has never been more important" (Merrow, 2004). "What counts most is what students DO in college, not who they are, or where they go to college, or what their grades are" (Edgerton as cited in Merrow, 2004).

As evidenced by these quotations, the nature and assessment of education is changing significantly, and the assessment trajectory is away from sole reliance on the traditional perspective of student grades. Technology faculty must respond to the changing requirements of student assessment and ensure that graduates of the program meet both the expectations and standards of the institution as well as those of other stakeholders, particularly the private sector that typically employs those graduates.

We are reevaluating the ways in which students in our departments are assessed. The three-step "backward design" process recommended by Wiggins and McTighe (1998) has served as a conceptual framework and useful design perspective. Broadly, these steps condense to (a) identify desired results, (b) determine acceptable evidence, and (c) plan learning experiences and instruction. Wiggins and McTighe offered a number of practical guidelines to the design and development of curricula, including the assessment process. They stated that "the backward design approach encourages us to think about a unit or course in terms of the collected assessment evidence needed to document and validate that the desired learning has been achieved, so that the course is not just content to be covered or a series of learning activities" (p. 12).

We have identified a number of elements driving assessments at the departmental level as well as those that address the degree candidates' needs. We have explored non-classroom-centered assessment methods and have collected and analyzed preliminary data towards reaching the goal of more holistic assessment of student progression. The planning, work, and outcomes to date that support the learning goals and expected outcomes set forth by stakeholders follow. However, a point of reference must be set prior to any attempt to address the dual issues of student and program assessment by industrial technology faculty in higher education. For example, consistent with both our experiences and the recommendations outlined in step one of the three-step "backward design" process, one must have a fairly specific vision of the knowledge, skills, and attitudes a technology student should develop prior to embarking on his or her career before formulating an assessment plan. In other words, what is to be assessed? This might include such things as knowledge of technology and its associated processes; practical skills associated with materials, tools, processes, and systems related to technology; ethics related to technology development and application; attitudes toward technology; and issues impacting student learning.

A clear understanding of the reasons for assessing technology students is also critical. These reasons may originate in basic requirements to uncover information regarding students' knowledge, skills, or attitudes. One may wish to verify that students can demonstrate practical technology skills and related professional skills, or one may desire to motivate and enhance learning. Ultimately, it is a goal of the faculty to have more than just course grades to reflect student performance. As noted by Wiggins and McTighe (1998), "Too often as teachers, we rely on only one or two types of assessment, then compound that error by concentrating on those aspects of the curriculum that are most easily tested by multiple-choice or short answer items" (p. 65). A well-structured program should include assessment by a variety of methods and from a more holistic perspective than is often currently employed. An ancillary benefit of a more holistic assessment may be a more positive student attitude toward the discipline. From the departmental perspective, improved student assessment allows faculty to do a better job evaluating and adjusting the technology curricula and program options (e.g., manufacturing and occupational safety) to meet educational objectives. Improved assessment also increases faculty credibility with other stakeholders in the educational process, such as administration, parents, and employers. Given a clear understanding of the overall educational objectives and forces driving the need for assessment, one may turn to questions of available assessment strategies and the benefits they provide for the learner and the instructor.

We three authors formed an action research collaborative with the central goal being to develop, implement, and evaluate innovative forms of assessment that document student growth in a holistic manner. The alternative forms of assessment that we advance are not envisioned as wholesale replacements for existing knowledge-based tests; rather, we suggest they supplement existing grade data with assessments targeted toward technological capability and problem solving. Outcomes of an improved assessment process allow faculty to make adjustments to both curricula and extracurricular activities with greater clarity regarding the impact of such changes on the industrial technology program and students. Ultimately, the faculty hope to accelerate the students' learning more effectively and efficiently and "jumpstart" them into their profession.

Technology Learning Community

Changes to assessment processes are complicated by simultaneous changes to the program. For example, the context for our action research is a Technology "Learning Community" (TLC) that we established within the industrial technology curriculum at Iowa State University. Learning communities are relatively recent additions to the landscape of American universities. Research by Lenning and Ebbers (1999) has indicated improved student retention and satisfaction with the academic experience through the use of learning communities. The TLC is an induction and support activity for freshmen and transfer students in the industrial technology program. The purpose of the TLC is to help entering students (regardless of their academic stage) maximize their educational experience and begin their professional acculturation within the discipline of industrial technology. TLC participants are organized into small groups of students. Each student group works with a peer mentor, an industrial mentor (an industrial technologist practicing in industry), a graduate assistant, the academic advisor, and industrial technology faculty members. In addition to more formal assessments, TLC students evaluate their experience each week in reflective summaries. Below are some comments selected from TLC students' summaries:

• This was a very helpful class since it provided to me a strong idea of what I can expect, and in turn what is expected of me. Peer mentor groups met in class and we exchanged phone #'s and E-mail addresses.

• Last class we had speakers come in from SME [Society of Manufacturing Engineers], ASSE [American Society of Safety Engineers], and SPE [Society of Plastics Engineers] clubs who talked about what their organizations do and how to get involved in these clubs. I think this was a very good class day because most people are unaware of how they can get involved in something like this and what they actually do in these clubs. I am very interested in SME and hope to join next fall. I would have liked to join this semester but with my current schedule and obligations this is impossible.

• Last week our group met with our peer mentor for half an hour. We talked about our schedules and how classes are going. Our peer mentor suggested that we do our resumes and turn them in to him the next time we meet. We discussed how important it is to go to Career Day on Feb 19th. He wants our resumes so he can go over them and make possible corrections so we can have them with us for career day. We have assigned a day and time for our meetings, which is Wed. at 5 p.m. in the TLC room.

• Last week's class helped me to more fully understand the full potential that an Industrial Technology major could provide for me in the future. In our group meeting we talked about how to improve our resumes and how to sign up for ECS [Engineering Career Services].

At the end of each semester, TLC students were asked to evaluate the ability of the TLC experience to meet the goals of the TLC initiative using excellent, good, fair, or poor ratings. When questioned about their experiences with respect to the goals of "Orientation to the industrial technology discipline and profession" and the "Process of developing realistic self-assessments, career goals," 93% of the students reported good/excellent. The responses were even more positive regarding the goals of "Connections with faculty, other students, and industry professionals" and the "Process of introducing the variety of professional roles available through an industrial technology degree," with 96% of the students reporting good/excellent experiences.

The experiences within the introductory course were crafted to establish both the small group cohesiveness and the interaction with peers that is so essential to the effective team-based technological problem solving that employers demand of industrial technology (ITEC) graduates. Another key to the success of the TLC is the use of industrial mentors. ITEC students are very pragmatic—most have their sights set firmly on a career in industry. Hence, inputs from industrial mentors are highly valued and persuasive. They convey high expectations while demonstrating realistic practice and applications in industry.

Student learning has been enhanced through cooperative interaction with TLC group members and the mentoring team. The industrial and peer mentors have increased the TLC students' understanding of the discipline and the importance of curriculum components beyond what individual faculty members can accomplish in the classroom. The TLC students have a better understanding of their personal learning styles and how that impacts their studying habits and classroom interactions. By the end of the semester, TLC students have generated or updated their resumes, started professional portfolios, and set the foundation of team building and awareness of technology that they will continue to enhance and build upon throughout their academic career and beyond. Additional information regarding the Iowa State University TLC may be found in Freeman, Field, and Dyrenfurth (2001).

Student Outcomes Assessment

Desired student outcomes include enhanced capability with technology, increased student satisfaction, higher academic performance, refined career goals, and a greater awareness of one's learning style and how to most effectively utilize that information. When students conclude their studies within technology programs, it would be desirable for them to have indicators of capability to demonstrate academic proficiency, beyond just course grades.

In order to assess student outcomes in a more holistic way, appropriate instruments must be available. Efforts are underway to identify and evaluate such instruments for use in the undergraduate industrial technology curricula. In addition to the qualitative and quantitative analyses related to the TLC, activities include an evaluation of the use of Dyrenfurth's (1991) technological literacy instrument based on the work reported in Dyrenfurth and Kozak (1991), the ACT Work Keys employability skills exams, and the National Association of Industrial Technology (NAIT) certification exam; reviews of each student's portfolio by faculty and an industrial team; and comparison of participant satisfaction and academic performance with other students not participating in TLC activities.

A number of other key targets have been included as initial assessment areas. These targets include student demographics, academic performance, learning styles, technological understanding and capability, and ethical dimensions of technology.

With these assessments, faculty hope to benchmark both the initial and exit competence of students, document students' progression over the course of their academic experiences, document differences among groups and types of students, and investigate implications arising from these differences for program design and development. The assessments are also designed to focus attention on the various components of competence (e.g., technical, managerial, foundational, personal) and to increase attention to the assessment process in order to strengthen its validity and reliability.

The faculty has synthesized a set of targeted industrial technology competencies for students at Iowa State University through the efforts of individual instructors responsible for specific courses, the departmental curriculum committee, and other stakeholders (e.g., Industrial Advisory Council and NAIT). It would be advantageous for faculty to be able to collect a wide variety of demographic data and longitudinal data for tracking student performance, as well as data indicating performance against desired outcomes; however, they recognize that constraints exist on the amount of data that can be collected and analyzed.

The NAIT accredits industrial technology programs. Department missions are expected to be compatible with the approved definition of industrial technology. The mission, as listed in the 2002 NAIT Self Study Report (Department of Industrial Education and Technology, 2002), states: "The Department of Industrial Education and Technology at Iowa State University prepares technically oriented professionals to provide leadership in manufacturing technology and occupational safety through an undergraduate industrial technology program" (p. 6.2-2).

A second necessary, but not sufficient, condition for accreditation listed in the 2002 NAIT Self Study Report requires that competencies shall be identified that are relevant to employment opportunities available to graduates. While the accreditation process serves as one driving force for curriculum review and outcomes assessment, there are others. Apart from the expected continuous improvement efforts of the faculty, the department retains an Industrial Advisory Council to assure that the industrial technology curriculum addresses the current and future needs of business and industry. The Council recommends and reviews curriculum and program changes that will enable the department to be responsive to business and industry (Department of Industrial Education and Technology, 1998). The department faculty, NAIT, and the Industrial Advisory Council all play an important role in defining and refining the competencies expected of industrial technology graduates.

Ultimately, a reevaluation and alignment of course content offered in the programs was needed to ensure that the competencies expected of industrial technology graduates were realized. This involved a comparison of curricular content with required NAIT objectives. Gaps and/or superfluous material were identified and addressed. Faculty members led this effort, but they were not without guidance with respect to the process. The aforementioned three-step design process by Wiggins and McTighe (1998) and work by Kenealy and Skaar (1997) offered useful frameworks. Kenealy and Skaar suggested an interesting outcomes-defined curriculum renewal process that has continued to influence this effort. They described a multi-step "action planning" process that has been adapted so that a large number of diverse faculty and students could contribute cooperatively and with a sense of ownership. The needs of clientele groups led to the definition of major educational outcomes for the program, which in turn formed the foundation for learning experiences that would serve to meet student needs. Kenealy and Skaar stated that by grouping learning experiences, course titles and objectives are defined for a renewed curriculum. Subsequently, courses are defined by specific learner competencies, which are edited for proper sequencing. They also examined cognitive learning skills to ensure that upper level skills were represented throughout the curriculum.

Assessment Instruments

Assessment instruments exist that are appropriate and readily available and for which validity and reliability studies are already well documented. We entered into discussions with ACT, Inc., regarding the use of the Work Keys® system at the undergraduate level. This is a system designed to quantitatively measure certain employability skills. It includes job profiling and work-related assessments and serves a variety of needs in the industrial and educational arenas. For example, the test component of the Work Keys system is designed to assess personal skill levels in important areas of employability skills (ACT, 1997). There are currently eight tests: (a) applied mathematics, (b) applied technology, (c) listening, (d) locating information, (e) observation, (f) reading for information, (g) teamwork, and (h) writing. ACT (1997) stated that educators can use the Work Keys information to develop appropriate curricula and instruction that target skills needed in the workplace. The Work Keys instruments have extensive reliability and validity studies completed (ACT, 1997) at the secondary level, but little information was available demonstrating that their use could be extended to the baccalaureate level. Our preliminary research results with undergraduates would seem to warrant additional investigation of the Work Keys system. Five of the eight tests were administered to undergraduates, including (a) applied mathematics, (b) applied technology, (c) locating information, (d) teamwork, and (e) writing. Our primary concern was that a majority of the students would score at the highest scale level of the exams, thus diminishing the usefulness of the exams for assessment purposes at the baccalaureate level. This did not prove to be the case, as only 9.1% (n = 33) achieved this level in the Applied Technology exam, 14.8% (n = 27) achieved this level in the Locating Information exam, and no students achieved the highest level in the Teamwork and Writing exams (n = 26 and n = 22, respectively). Applied Mathematics yielded the only exam where significant numbers of students (45.2%, n = 31) scored at the highest level.

The current NAIT certification test is a cognitive, norm-based, multiple-choice test. It contains four 40-question subsections: production planning and control, quality control, safety, and management/supervision. Summary statistics, including classical difficulty and discrimination coefficients (point biserial correlation) based on a sample size of approximately 1,200 students, are available from the authors. Efforts are also underway to analyze these exam items using item response theory (IRT) methods. There appears to be a significant level of interest by NAIT-accredited programs for use of the certification exam in both program and student evaluations.

Description and sequencing of the test construction process (preparing test specifications, item construction and review, detecting item bias, estimating reliability, etc.) are readily available in, for example, Crocker and Algina (1986). All tests used for assessment are expected to pass a review against generally accepted test construction guidelines.

Other instruments currently under consideration include Dyrenfurth's (1991) technology literacy test, a survey of attitudes toward technology (DeVries, Dugger, & Bame, 1993; Raat & DeVries, 1986), multiple technological problem-solving appraisals developed in the manner suggested by Kimbell and Stables (1999), and the assessment of technology projects and activities using group process and student individualized performance rubrics suggested by Custer, Valesey, and Burke (2001).

A number of other planned assessment instruments require some additional development prior to their wholesale use. Some are lacking sufficient reliability and validity data, whereas some offer only partial coverage of desired content areas. For example, the NAIT certification test currently covers only four technology categories: (a) production planning and control, (b) safety, (c) quality control, and (d) management and supervision. There are other topics that fall under the technology umbrella, such as materials and processing, industrial training and development, energy, instrumentation and control, and information technology. Rowe (2001) suggested an updated test blueprint for the NAIT certification exam. Rowe used a modified Delphi technique to identify core content, subject areas, and competencies. Thirteen core competency areas were identified, including (a) leadership skills for supervisors, (b) teamwork, (c) fundamentals of management, (d) safety management, (e) technical graphics/CADD, (f) quality, (g) electronics, (h) human resource management, (i) technical writing, (j) written communication, (k) verbal communication, (l) computer integrated manufacturing, and (m) manufacturing automation. Rowe's findings indicated a need for expanding the use of written and verbal information, particularly with respect to communicating technical information.

The ITEC program enrolls numerous transfer students from engineering. Many of these students have not had a great deal of academic success in the engineering program, but most seem to thrive in the industrial technology program. It has been posited that a mismatch between instructional style and learning style has been a leading cause of at least some of these students' previous academic problems and that a change to a more "hands-on" curriculum has allowed them to flourish. While there may indeed be systemic differences in the instructional approaches taken by engineering and industrial technology faculty, a recent study undertaken to investigate this question found that there were no statistically significant differences in learning styles between groups of engineering students and industrial technology students at either Iowa State University or North Carolina A & T State University (Fazarro, 2001). Fazarro used the Productivity Environmental Preference Survey to evaluate learning style differences (Dunn, Dunn, & Price, 1996).

We envisioned the deployment of the aforementioned assessment instruments across the department and longitudinally over the students' four-year period of study. The plan would have the department's faculty establish a database of specific course goals and objectives and then crosswalk each to the department outcomes. Existing tests and other assessments would be cross-indexed to these goals and objectives. Tables of specification would be used to identify weighting and/or coverage gaps, inappropriate proportions, etc.

Subsequently, innovative assessment mechanisms, such as authentic assessments, portfolios, rubrics, and adaptive testing, would be expanded to complement and/or modify the current assessment strategies. Some authentic assessment activities are currently in place in the industrial technology program at Iowa State University. Freeman and Field (1999) discussed assignments given to groups of safety students that involve job safety analyses, along with equipment and process reviews in labs run for manufacturing students. These assignments are identical to the tasks safety students might find when working in an industrial setting following graduation. Students have also been asked, for example, to develop and construct tooling for specific tasks in metallic materials and processes labs. Many of these laboratory-based activities offer opportunities for authentic assessment.

Multiple-choice assessments will be subject to item analysis and will be recorded in the item database. These assessments have their place as tests of technological knowledge; however, Kimbell and Stables (1999) do not consider them to be valid tests of technological problem solving. Kimbell and Stables offer a great deal of insight into the development of performance assessment instruments for technological problem solving and the development of assessment rubrics to translate performance qualities into numeric data for statistical analysis.

A summary listing of the anticipated system development activities is shown below:

• Conceptualize outcomes based on inputs from faculty, NAIT, and the Advisory Council.
• Analyze objectives for each course in the program.
• Analyze the NAIT certification examination table of specifications.
• Crosswalk course outcomes to objectives.
• Crosswalk objectives to the NAIT certification examination table of specifications and exams for each course.
• Select instruments:
  - Conduct additional literature review.
  - Review American Educational Research Association presentations.
  - Explore available tests or identify and develop appropriate new instruments.
• Define benchmarking process.
• Finalize assessment timing.
• Develop an assessment-sampling matrix that secures data, which may then be aggregated without imposing all tests on all students.
• Bring test administration and analysis online through the use of off-the-shelf software.
• Develop analysis plan.

Student performance will also be recorded in a database and, within legal guidelines and pending student approval, certain elements of the information will be available to assist peer and industrial mentors in conducting discussions with students. Peer and industrial mentors will be expected to offer each student constructive perspectives as to how his or her performance is progressing against personal and professional standards. Assessment validation will involve the program's industrial advisory council members so that real-world standards are not only employed but are also made clear to students. A key goal is to demonstrate student awareness of growth over time. Potentially confidential topics from a student perspective will be handled through discussions with faculty and faculty mentors.

Summary

There are clear and persistent indicators that changes are needed and expected in our system of education. In 1999, the National Research Council reported, "The goals and expectations for schooling have changed quite dramatically during the past century, and new goals suggest the need to rethink such questions as what is taught, how it is taught, and how students are assessed" (pp. 152-153). In 2001, the Council reviewed and expanded on trends, which "are changing expectations for student learning and the assessment of that learning" (National Research Council, 2001, p. 22). Efforts to design and implement a more holistic assessment of students in industrial technology programs are certainly timely and in keeping with the spirit of the recommendations by the National Research Council and others. Our work represents an initial step towards capitalizing on some of the innovation that, while important, nevertheless reflects singular enhancements. Our goal was also to integrate several of these innovations into a more systematic approach. To this end, we have conceptualized a framework for moving forward, we have established the feasibility of using the Work Keys® employability skills instrumentation, and we have described some necessary implementation steps. We invite members of the profession to join in the challenge of implementation of such enhanced systems of assessment.

Dr. Dennis W. Field is an associate professor in the Department of Technology at Eastern Kentucky University. He is a member of the Alpha Xi Chapter of Epsilon Pi Tau.

Dr. Steven A. Freeman is an associate professor in the Department of Agricultural and Biosystems Engineering at Iowa State University. He is a member of the Alpha Xi Chapter of Epsilon Pi Tau.

Dr. Michael J. Dyrenfurth is assistant dean for graduate studies in the School of Technology at Purdue University. He is a member of Mu Chapter of Epsilon Pi Tau and received his Distinguished Citation in 2001.

References

ACT. (1997). Work Keys preliminary technical handbook. Iowa City, IA: Author.

Crocker, L. M., & Algina, J. (1986). Introduction to classical & modern test theory. Orlando, FL: Harcourt Brace Jovanovich.

Custer, R. L., Valesey, B. G., & Burke, B. N. (2001). An assessment model for a design approach to technological problem solving. Journal of Technology Education, 12(2). Retrieved June 9, 2003, from http://scholar.lib.vt.edu/ejournals/JTE/v12n2/custer.html

Department of Industrial Education and Technology. (1998). Outcomes assessment report. Ames, IA: Iowa State University, Department of Industrial Education and Technology.

Department of Industrial Education and Technology. (2002). National Association of Industrial Technology self study report. Ames, IA: Iowa State University, Department of Industrial Education and Technology.

DeVries, M., Dugger, W., & Bame, A. (1993). PATT USA research report. Unpublished research report, Virginia Polytechnic Institute and State University, Blacksburg.

Dunn, R., Dunn, K., & Price, K. E. (1996). Productivity environmental preference survey. Lawrence, KS: Price Systems.

Dyrenfurth, M. J. (1991). Technological literacy instrument. Columbia, MO: American Evaluation Association.

Dyrenfurth, M., & Kozak, M. (Eds.). (1991). Technology literacy. In 40th yearbook of the Council for Technology Teacher Education (pp. 138-183). Peoria, IL: Glencoe.

Fazarro, D. E. (2001). A factor analysis of the preferred learning styles of industrial technology and engineering undergraduate students at North Carolina Agricultural and Technical State University and at Iowa State University (Doctoral dissertation, Iowa State University, 2001). Dissertation Abstracts International, DAI-A 62(06), 2056.

Freeman, S. A., & Field, D. W. (1999). Benefits of a cross-functional safety curriculum. Journal of Industrial Technology, 15(4). Retrieved June 9, 2003, from http://www.nait.org/jit/Articles/free0899.pdf

Freeman, S. A., Field, D. W., & Dyrenfurth, M. J. (2001). Enriching the undergraduate experience through a technology learning community. Journal of Technology Studies, 27(1), 53-58.

Kenealy, M. D., & Skaar, B. R. (1997). Outcomes defined curriculum renewal: Process and product. Journal of Animal Science, 75(Suppl. 1).

Kimbell, R. A., & Stables, K. (1999). South Africa: Northwest province technology education project. London: Goldsmiths University of London, Technology Education Research Unit.

Lenning, O. I., & Ebbers, L. H. (1999). The powerful potential of learning communities: Improving education for the future (Report 26-6, #SPO266). Washington, DC: George Washington University, American Society for Higher Education.

Merrow, J. (2004, June). Grade inflation: It's not just an issue for the Ivy League. Retrieved September 24, 2004, from http://www.carnegiefoundation.org/perspectives/perspectives2004.June.html

National Research Council. (1999). How people learn: Brain, mind, experience, and school (J. D. Bransford, A. L. Brown, & R. R. Cocking, Eds.). Washington, DC: National Academy Press.

National Research Council. (2001). Knowing what students know: The science and design of educational assessment (J. Pellegrino, N. Chudowsky, & R. Glaser, Eds.). Washington, DC: National Academy Press.

Raat, J. H., & DeVries, M. (1986). What do boys and girls think of technology? Eindhoven, The Netherlands: Eindhoven University of Technology.

Rowe, S. E. (2001). Development of a test blueprint for the National Association of Industrial Technology certification exam (Doctoral dissertation, Iowa State University, 2001). Dissertation Abstracts International, DAI-A 62(11), 3754.

Walvoord, B. E., Carey, A. K., Smith, H. L., Soled, S. W., Way, P. K., & Zorn, D. (2000). Assessing the pressures for departmental change. Retrieved September 24, 2004, from http://ctl.stanford.edu/Tomprof/postings/547.html

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.
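The classical item statistics mentioned above, difficulty and point-biserial discrimination, can be computed directly from a scored response matrix. The following is a minimal sketch using hypothetical data, not the authors' roughly 1,200-student NAIT sample; difficulty is the proportion of examinees answering correctly, and discrimination is the point-biserial correlation between an item score and the corrected total (total score excluding that item).

```python
# Classical item analysis: difficulty (proportion correct) and
# point-biserial discrimination from a 0/1 scored response matrix.
# The response data below are hypothetical, for illustration only.

def mean(xs):
    return sum(xs) / len(xs)

def pearson(xs, ys):
    # Pearson correlation; with a dichotomous x this is the point-biserial.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def item_analysis(responses):
    """responses: one row per examinee, each a list of 0/1 item scores.
    Returns (difficulty, discrimination) per item; discrimination is the
    corrected point-biserial (item vs. total of the remaining items)."""
    n_items = len(responses[0])
    stats = []
    for j in range(n_items):
        item = [row[j] for row in responses]
        rest = [sum(row) - row[j] for row in responses]  # corrected total
        stats.append((mean(item), pearson(item, rest)))
    return stats

scores = [  # 6 examinees x 4 items (hypothetical)
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
for j, (p, rpb) in enumerate(item_analysis(scores), 1):
    print(f"item {j}: difficulty={p:.2f}  point-biserial={rpb:.2f}")
```

Using the corrected total avoids inflating discrimination by correlating an item with a total that already contains it, a standard precaution for short tests.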
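The tables-of-specification step described above reduces to a matrix check: count the assessment items covering each objective, then flag coverage gaps and disproportionate weighting. A minimal sketch follows, seeded with the four 40-question NAIT subsections named in the text plus two of the uncovered topics; the structure is illustrative and is not the department's actual goals-and-objectives database.

```python
# Table-of-specifications check: map each content area to the number of
# assessment items covering it, then flag coverage gaps and compute each
# area's share (weight) of the total test. Counts follow the four NAIT
# subsections described in the text; the zero-count rows are topics the
# text identifies as currently uncovered.

spec = {
    "production planning and control": 40,
    "quality control": 40,
    "safety": 40,
    "management/supervision": 40,
    "materials and processing": 0,   # under the technology umbrella, untested
    "information technology": 0,     # under the technology umbrella, untested
}

total = sum(spec.values())
gaps = [area for area, n in spec.items() if n == 0]

for area, n in spec.items():
    weight = n / total if total else 0.0
    print(f"{area:35s} items={n:3d}  weight={weight:5.1%}")
print("coverage gaps:", ", ".join(gaps) if gaps else "none")
```

The same check generalizes to the planned crosswalk: replace the flat counts with an objectives-by-instruments matrix and flag any objective whose row sums to zero.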
