ERIC ED403164: Science Framework for the 1996 National Assessment of Educational Progress. NAEP Science Consensus Project.

DOCUMENT RESUME

ED 403 164    SE 059 728

TITLE:          Science Framework for the 1996 National Assessment of Educational Progress. NAEP Science Consensus Project.
INSTITUTION:    American Institutes for Research in the Behavioral Sciences, Washington, DC; Council of Chief State School Officers, Washington, DC; National Center for Improving Science Education, Washington, DC.
SPONS AGENCY:   National Assessment Governing Board, Washington, DC.
REPORT NO:      ISBN-0-16-048903-2
PUB DATE:       96
CONTRACT:       RS90081001
NOTE:           96p.
AVAILABLE FROM: U.S. Government Printing Office, Superintendent of Documents, Mail Stop: SSOP, Washington, DC 20402-9328.
PUB TYPE:       Reports - Descriptive (141)
EDRS PRICE:     MF01/PC04 Plus Postage.
DESCRIPTORS:    Academic Achievement; Elementary Secondary Education; *Evaluation Criteria; *National Competency Tests; *Science Education; *Science Tests
IDENTIFIERS:    National Assessment of Educational Progress

ABSTRACT

This document contains the Framework and rationale for assessing science achievement of students throughout the United States in 1996. It provides a general overview of the National Assessment of Educational Progress (NAEP), describes the 1996 NAEP Science Framework adopted by the National Assessment Governing Board (NAGB), and reviews the process by which the Framework was developed. Chapters include: (1) "The Nature of Science and the Science Curriculum"; (2) "The Framework for the 1996 NAEP Science Assessment"; (3) "Desired Attributes of the Assessment"; and (4) "Characteristics of Assessment Exercises." Appendices include Fields of Science, Examples of Themes by Grade Level, Science Content Outlines (Excerpts), and Consensus Committees and Project Staff. Contains 14 references. (JRH)

What Is NAEP?

The National Assessment of Educational Progress (NAEP) is a congressionally mandated project of the U.S. Department of Education's National Center for Education Statistics. It assesses what U.S. students should know and be able to do in geography, reading, writing, mathematics, science, U.S. history, the arts, civics, and other academic subjects. Since 1969, NAEP has surveyed the achievement of students at ages 9, 13, and 17 and, since the 1980s, in grades 4, 8, and 12. Measuring educational achievement trends over time is critical to measuring progress toward the National Education Goals.
The National Assessment Governing Board

The National Assessment Governing Board (NAGB) was created by Congress to formulate policy for the National Assessment of Educational Progress (NAEP). Among the Board's responsibilities are developing objectives and test specifications and designing the assessment methodology for NAEP.

Members

Mark D. Musick, Chairman, President, Southern Regional Education Board, Atlanta, Georgia
Honorable William T. Randall, Vice-Chair, Commissioner of Education, State Department of Education, Denver, Colorado
Parris C. Battle, Education Specialist, Dade County Public Schools, Miami, Florida
Honorable Evan Bayh, Governor of Indiana, Indianapolis, Indiana
Mary R. Blanton, Attorney, Blanton & Blanton, Salisbury, North Carolina
Boyd W. Boehlje, Attorney and School Board Member, Pella, Iowa
Linda R. Bryant, Dean of Students, Florence Reizenstein Middle School, Pittsburgh, Pennsylvania
Honorable Naomi K. Cohen, Office of Policy and Management, State of Connecticut, Hartford, Connecticut
Charlotte Crabtree, Professor, University of California, Los Angeles, California
Chester E. Finn, Jr., Founding Partner and Senior Scholar, The Edison Project, Washington, D.C.
Michael S. Glode, Wyoming State Board of Education, Saratoga, Wyoming
William J. Hume, Chairman of the Board, Basic American, Inc., San Francisco, California
Christine Johnson, Director of Urban Initiatives, Education Commission of the States, Denver, Colorado
John S. Lindley, Principal, Galloway Elementary School, Henderson, Nevada
Honorable Stephen E. Merrill, Governor of New Hampshire, Concord, New Hampshire
Jason Millman, Professor, Cornell University, Ithaca, New York
Honorable Richard P. Mills, Commissioner of Education, State Department of Education, Montpelier, Vermont
Carl J. Moser, Director of Schools, The Lutheran Church-Missouri Synod, St. Louis, Missouri
John A. Murphy, Superintendent of Schools, Charlotte-Mecklenburg Schools, Charlotte, North Carolina
Michael Nettles, Professor, University of Michigan, Ann Arbor, Michigan
Honorable Carolyn Pollan, Arkansas House of Representatives, Fort Smith, Arkansas
Thomas Topuzes, Senior Vice President, Valley Independent Bank, El Centro, California
Marilyn Whirry, English Teacher, Mira Costa High School, Manhattan Beach, California
Sharon P. Robinson (Ex-Officio), Assistant Secretary, U.S. Department of Education, Washington, D.C.
Roy Truby, Executive Director, NAGB, Washington, D.C.

Science Framework for the 1996 National Assessment of Educational Progress

NAEP Science Consensus Project

Developed under contract number RS90081001 by the Council of Chief State School Officers, with the National Center for Improving Science Education and the American Institutes for Research, for the National Assessment Governing Board.

National Assessment Governing Board
Mark D. Musick, Chair
William T. Randall, Vice Chair
Roy Truby, Executive Director
Mary Crovo, Project Officer

Science Consensus Project
Ramsay W. Selden, Council of Chief State School Officers
Richard C. Clark, Council of Chief State School Officers
Senta A. Raizen, National Center for Improving Science Education
Science Assessment Framework for the 1996 National Assessment of Educational Progress

Developed for the National Assessment Governing Board under contract number RS90081001 by the Council of Chief State School Officers with the National Center for Improving Science Education and the American Institutes for Research.

For further information, contact the National Assessment Governing Board:
800 North Capitol Street, NW, Suite 825, Washington, DC 20002

For sale by the U.S. Government Printing Office, Superintendent of Documents, Mail Stop: SSOP, Washington, DC 20402-9328
ISBN 0-16-048903-2

Table of Contents

Introduction
  Background
  Development of the 1996 NAEP Science Assessment Framework
  Steering Committee Guidelines
Chapter One: The Nature of Science and the Science Curriculum
  The Nature of Science
  The Science Curriculum
Chapter Two: The Framework for the 1996 NAEP Science Assessment
  Introduction
  Framework Elements
  The Matrix
  The Fields of Science
    Earth Science
    Physical Science
    Life Science
  Knowing and Doing Science
    Conceptual Understanding
    Scientific Investigation
    Practical Reasoning
  The Nature of Science
    Science
    Technology
  Themes
Chapter Three: Desired Attributes of the Assessment
  Some Assessment Issues
    The Nature of Testing and the Nature of Knowledge and Learning
    Inferring Understanding From Student Responses
  Developments in Assessment
  Criteria for Assessing Learning and Achievement in Science
  Preliminary Achievement Levels in Science
Chapter Four: Characteristics of Assessment Exercises
  Types of Exercises
    Performance Exercises
    Open-Ended Pencil-and-Paper Items
    Multiple-Choice Items
  Additional Considerations
  Pilot Testing
  Procedures for Development of the Assessment
  Special Study in Science
References
Appendix A: Fields of Science
  Earth Science
  Physical Science
  Life Science
Appendix B: Examples of Themes by Grade Level
  Systems
  Patterns of Change
  Models
Appendix C: Science Content Outlines (Excerpts)
  Grade 4: Earth Science
  Grade 8: Life Science
  Grade 12: Physical Science
Appendix D: Consensus Committees and Project Staff

Introduction

This document contains the Framework and rationale for assessing science achievement of students throughout the United States in 1996. It provides a general overview of the National Assessment of Educational Progress (NAEP), describes the 1996 NAEP Science Framework adopted by the National Assessment Governing Board (NAGB), and reviews the process by which the Framework was developed.

Background

The National Assessment of Educational Progress is authorized by Congress and funded by the federal government; it is the only nationally representative and continuing assessment of what America's students know and can do. For more than 20 years, NAEP has been charged with collecting and reporting information on student achievement in mathematics, reading, science, U.S. history, writing, and other subjects. NAEP assessments were conducted on an annual basis until 1981, when they became biennial. Originally, assessments were of students at ages 9, 13, and 17, but beginning in 1983, they have included students at grades 4, 8, and 12 as well.
NAEP reports provide descriptive information about student performance in various subjects, including basic and higher order skills, and comparisons of performance by race/ethnicity, gender, type of community, and geographic region. They also show relationships between achievement and certain background variables, such as time spent on homework or educational level of parents.

In the past, only results from a national sample of students were reported for each NAEP subject assessed; but in 1987, a national study group chaired by Lamar Alexander, then Governor of Tennessee, and H. Thomas James, President Emeritus of the Spencer Foundation, recommended to the Secretary of Education that NAEP collect representative data on student achievement at the state level. For the first time, then, in 1990 a trial state assessment was conducted for 8th-grade mathematics. The trial continued in 1992 with state-level assessments in mathematics at grades 4 and 8 and reading at grade 4. In 1994, reading was assessed at the state level in grade 4. State-level assessments beyond 1994 are being planned.

Assessment is not without critics. The Alexander-James Study Group expressed concerns that "national assessments of science were sporadic and almost exclusively devoted to assessing factual information," and that they did not attempt to assess abilities to organize and transform a body of information into a coherent scientific account (The Nation's Report Card: Improving the Assessment of Student Achievement, 1987).

To address this criticism, the study group recommended that NAEP broaden its scope to include the collection of information on whether students are able to design, perform, and analyze experiments; on whether they have acquired complex thinking abilities essential to various fields of science; on whether they are able to integrate basic concepts of the various scientific fields; and on whether they can perceive fundamental relationships. To measure these kinds of knowledge and abilities, NAEP was urged to include open-ended items and performance tasks in its assessment techniques.

NAEP has great prestige, considerable influence, and well-publicized results. Therefore, the weight of its findings may have a substantial impact on science education, possibly influencing state curriculum frameworks and, ultimately, even what teachers teach. The assessment should therefore reflect consensus on priorities, best practices, and conclusions from research on science education.

A specific example of how NAEP can influence state frameworks is illustrated by the selection of special assessment techniques, such as hands-on science. Performance assessment, for instance, places a value on the "doing" of science. That priority may not actually be reflected in classroom practice, although it has long been considered a standard teaching technique in science. Thus, as the NAEP science reports are circulated, states and curriculum policymakers become sensitized and may incorporate more hands-on activities into classroom lessons.

Science educators, by and large, do not quarrel with the essential concept of assessment, but there has been no formal agreement on a common framework, outcomes, or goals and objectives to assess. The 1996 NAEP Science Assessment attempts to reflect a comprehensive, contemporary view of science so that those affected by the National Assessment are satisfied that it addresses the complex issues in science education without oversimplification.
The framers of this document have attempted to tread a fine line between clear communication and technical accuracy, in the hope that their efforts represent a step forward in building national consensus on the key outcomes of science education.

Development of the 1996 NAEP Science Assessment Framework

The following factors guided the process for developing consensus on the Science Assessment Framework:

The first factor is the general process of consensus development, both as it is set forth in law and as it evolves over time. The process calls for active participation and broad involvement of curriculum specialists, science teachers, local science supervisors, state supervisors, administrators, parents, and representatives of scientific associations, business and industry, government officials, unions, cognitive psychologists, and science educators, as well as participation from the public and private education sectors.

The second factor is emphasis on the important outcomes of science education. As much as possible, this Science Framework represents what is considered essential learning in science, setting the stage for the 1996 and subsequent assessments and recommending innovative assessment techniques to probe the critical abilities and content areas.

The third factor is recognition that the various "players" in education and in industry often hold diverse and sometimes conflicting views. Further, research and general agreement in the field are lacking. This lack of agreement on a common scope and sequence of instruction, the components of scientific literacy, the important outcomes of learning, and the nature of overarching themes in science hinders clear communication between science educators and the public.

The process of developing the Framework document and accompanying reports occurred between October 1990 and August 1991. Original plans called for the new NAEP Science Assessment to be given in 1994. Due to a budget shortfall, however, both the new science and mathematics assessments were rescheduled for 1996.
