PISA 2009 Technical Report

The PISA 2009 Technical Report describes the methodology underlying the PISA 2009 survey. It examines additional features related to the implementation of the project at a level of detail that allows researchers to understand and replicate its analyses. The reader will find a wealth of information on the test and sample design, methodologies used to analyse the data, technical features of the project and quality control mechanisms.

Contents
Chapter 1. Programme for International Student Assessment: An overview
Chapter 2. Test design and test development
Chapter 3. The development of the PISA context questionnaires
Chapter 4. Sample design
Chapter 5. Translation and verification of the test and survey material
Chapter 6. Field operations
Chapter 7. Quality assurance
Chapter 8. Survey weighting and the calculation of sampling variance
Chapter 9. Scaling PISA cognitive data
Chapter 10. Data management procedures
Chapter 11. Sampling outcomes
Chapter 12. Scaling outcomes
Chapter 13. Coding reliability studies
Chapter 14. Data adjudication
Chapter 15. Proficiency scale construction
Chapter 16. Scaling procedures and construct validation of context questionnaire data
Chapter 17. Digital reading assessment
Chapter 18. International database

THE OECD PROGRAMME FOR INTERNATIONAL STUDENT ASSESSMENT (PISA)

PISA focuses on young people’s ability to use their knowledge and skills to meet real-life challenges. This orientation reflects a change in the goals and objectives of curricula themselves, which are increasingly concerned with what students can do with what they learn at school and not merely with whether they have mastered specific curricular content. PISA’s unique features include its:
– Policy orientation, which highlights differences in performance patterns and identifies features common to high-performing students, schools and education systems by linking data on learning outcomes with data on student characteristics and other key factors that shape learning in and outside of school.
– Innovative concept of “literacy”, which refers both to students’ capacity to apply knowledge and skills in key subject areas and to their ability to analyse, reason and communicate effectively as they pose, interpret and solve problems in a variety of situations.
– Relevance to lifelong learning, which goes beyond assessing students’ competencies in school subjects by asking them to report on their motivation to learn, their beliefs about themselves and their learning strategies.
– Regularity, which enables countries to monitor their progress in meeting key learning objectives.
– Breadth of geographical coverage and collaborative nature, which, in PISA 2009, encompasses the 34 OECD member countries and 41 partner countries and economies.

Please cite this publication as:
OECD (2012), PISA 2009 Technical Report, PISA, OECD Publishing.
http://dx.doi.org/10.1787/9789264167872-en

This work is published on the OECD iLibrary, which gathers all OECD books, periodicals and statistical databases. Visit www.oecd-ilibrary.org, and do not hesitate to contact us for more information.

Programme for International Student Assessment
ISBN 978-92-64-04018-2

PISA 2009 Technical Report

This work is published on the responsibility of the Secretary-General of the OECD.
The opinions expressed and arguments employed herein do not necessarily reflect the official views of the Organisation or of the governments of its member countries. This document and any map included herein are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area.

Please cite this publication as:
OECD (2012), PISA 2009 Technical Report, PISA, OECD Publishing.
http://dx.doi.org/10.1787/9789264167872-en

ISBN 978-92-64-04018-2 (print)
ISBN 978-92-64-16787-2 (PDF)

Series: PISA
ISSN 1990-8539 (print)
ISSN 1996-3777 (online)

The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.

Photo credits: Cover © .

Corrigenda to OECD publications may be found on line at: www.oecd.org/publishing/corrigenda.

© OECD 2012

You can copy, download or print OECD content for your own use, and you can include excerpts from OECD publications, databases and multimedia products in your own documents, presentations, blogs, websites and teaching materials, provided that suitable acknowledgement of OECD as source and copyright owner is given. All requests for public or commercial use and translation rights should be submitted to [email protected]. Requests for permission to photocopy portions of this material for public or commercial use shall be addressed directly to the Copyright Clearance Center (CCC) at [email protected] or the Centre français d’exploitation du droit de copie (CFC) at [email protected].

Foreword

The OECD’s Programme for International Student Assessment (PISA) surveys, which take place every three years, have been designed to collect information about 15-year-old students in participating countries. PISA examines how well students are prepared to meet the challenges of the future, rather than how well they master particular curricula. The data collected during each PISA cycle are an extremely valuable source of information for researchers, policy makers, educators, parents and students. It is now recognised that the future economic and social well-being of countries is closely linked to the knowledge and skills of their populations. The internationally comparable information provided by PISA allows countries to assess how well their 15-year-old students are prepared for life in a larger context and to compare their relative strengths and weaknesses.

PISA is methodologically highly complex, requiring intensive collaboration among many stakeholders. The successful implementation of PISA depends on the use, and sometimes further development, of state-of-the-art methodologies and technologies. The PISA 2009 Technical Report describes those methodologies, along with other features that have enabled PISA to provide high quality data to support policy formation and review. The descriptions are provided at a level that will enable review and, potentially, replication of the implemented procedures and technical solutions to problems.
This report contains a description of the theoretical underpinning of the complex techniques used to create the PISA 2009 Database, which includes information on 470 000 students in 65 countries.1 The database includes not only information on student performance in the three main areas of assessment – reading, mathematics and science – but also their responses to the Student Questionnaire that they completed as part of the assessment. Data from the principals of participating schools are also included. The PISA 2009 Database was used to generate information and served as the basis for the analyses in the PISA 2009 initial report. The information in this report complements the PISA Data Analysis Manuals (OECD, 2009), which give detailed accounts of how to carry out the analyses of the information in the database.

The PISA surveys are guided by the governments of the participating countries on the basis of shared policy-driven interests. The PISA Governing Board, which decides on the assessment and reporting of results, is composed of representatives from each participating country.

The OECD recognises the creative work of Raymond Adams, of the Australian Council for Educational Research (ACER), who is project director of the PISA Consortium, and of John Cresswell, who acted as editor for this report. The team supporting them comprised Alla Berezner, Wei Buttress, Steve Dept, Andrea Ferrari, Cees Glas, Béatrice Halleux, Khurrem Jehangir, Nora Kovarcikova, Sheila Krawchuk, Greg Macaskill, Barry McCrae, Juliette Mendelovits, Alla Routitsky, Keith Rust, Ross Turner and Maurice Walker. A full list of the contributors to the PISA project is included in Annex H of this report. The editorial work at the OECD Secretariat was carried out by Marika Boiron, Elizabeth Del Bourgo, Miyako Ikeda, Maciej Jakubowski, Sophie Vayssettes and Elisabeth Villoutreix.

Lorna Bertrand, Chair of the PISA Governing Board
Barbara Ischinger, Director for Education, OECD

Note
1. The implementation and data for the PISA 2009 plus countries are not discussed in this report; however, the procedures, technical standards and statistical methods used in the PISA 2009 plus study were identical to those discussed here.
Table of Contents

Foreword ..... 3

CHAPTER 1 Programme for International Student Assessment: An overview ..... 21
Participation ..... 23
Features of PISA ..... 24
Managing and implementing PISA ..... 24
Organisation of this report ..... 26

CHAPTER 2 Test design and test development ..... 27
Test scope and format ..... 28
• Paper and pencil assessment ..... 28
• Digital Reading Assessment (DRA) ..... 28
Test design ..... 29
• Paper-based assessment ..... 29
• Digital Reading Assessment ..... 31
Test development centres ..... 31
Development timeline ..... 31
The PISA 2009 reading literacy framework ..... 32
Item development process ..... 33
• First phase of development ..... 33
• Second phase of development ..... 34
• National item submissions ..... 34
• National review of items ..... 35
• International item review ..... 36
• Reading for School questionnaire ..... 36
• Preparation of dual (English and French) source versions ..... 36
Field trial ..... 37
• Field trial selection ..... 37
• Field trial design ..... 38
• Despatch of field trial instruments ..... 39
• Field trial coder training ..... 39
• Field trial coder queries ..... 39
• Field trial outcomes ..... 40
• National review of field trial items ..... 40
Main study ..... 40
• Main survey reading item selection ..... 40
• Main survey mathematics items ..... 43
• Main survey science items ..... 44
• Released items ..... 44
• Despatch of main survey instruments ..... 44
• Main survey coder training ..... 45
• Main survey coder query service ..... 45
• Review of main survey item analyses ..... 45

CHAPTER 3 The development of the PISA context questionnaires ..... 47
Introduction ..... 48
The development of the PISA 2009 Questionnaire Framework ..... 48
Research areas in PISA 2009 ..... 49
The development of the PISA 2009 context questionnaires ..... 52
The field trial of the PISA 2009 context questionnaires ..... 52
The coverage of the questionnaire material ..... 53
• Student and School Questionnaires ..... 53
• Educational Career Questionnaire ..... 54
• ICT Familiarity Questionnaire ..... 54
• Parent Questionnaire ..... 54
The implementation of the context questionnaires ..... 54

CHAPTER 4 Sample design ..... 57
Target population and overview of the sampling design ..... 58
Population coverage, and school and student participation rate standards ..... 58
• Coverage of the PISA international target population ..... 59
• Accuracy and precision ..... 60
• School response rates ..... 60
• Student response rates ..... 61
Main study school sample ..... 62
• Definition of the national target population ..... 62
• The sampling frame ..... 62
• Stratification ..... 63
• Assigning a measure of size to each school ..... 66
• School sample selection ..... 66
• Special school sampling situations ..... 68
• Monitoring school sampling ..... 71
• Student samples ..... 75
• Definition of school ..... 76

CHAPTER 5 Translation and verification of the test and survey material ..... 81
Introduction ..... 82
Development of source versions ..... 82
Double translation from two source languages ..... 83
PISA Translation and Adaptation Guidelines ..... 84
Translation Training Session ..... 84
Testing languages and translation/adaptation procedures ..... 84
International verification of the national versions ..... 86
• Verification of test units ..... 87
• Main survey verification ..... 88
• Verification of the booklet shell ..... 91
• Verification of link units ..... 91
• Verification of questionnaires ..... 91
• Final optical check of test booklets, questionnaire booklets and coding guides ..... 93
• Verification of operational manuals ..... 95
• Verification of Digital Reading Assessment (DRA) units ..... 95
• Quantitative analyses of verification outcomes ..... 96
Summary of items deleted at the national level, due to translation, printing or layout errors ..... 96

CHAPTER 6 Field operations ..... 97
Overview of roles and responsibilities ..... 98
• National Project Managers ..... 98
• School Co-ordinators ..... 98
• Test Administrators ..... 99
• School Associates ..... 99
The selection of the school sample ..... 99
Preparation of test booklets, questionnaires and manuals ..... 100
Selection of the student sample ..... 101
Packaging and shipping materials ..... 101
Receipt of materials at the national centre after testing ..... 102
Coding of the tests and questionnaires ..... 102
• Preparing for coding ..... 102
• Logistics prior to coding ..... 104
• Single coding design ..... 106
• Multiple coding ..... 109
• Managing the coding process ..... 111
• Cross-national coding ..... 112
• Questionnaire coding ..... 112
Data entry, data checking and file submission ..... 113
• Data entry ..... 113
• Data checking ..... 113
• Data submission ..... 113
• After data were submitted ..... 113
The main survey review ..... 113

CHAPTER 7 Quality assurance ..... 115
PISA quality control ..... 116
• Comprehensive operational manuals ..... 116
• National level implementation planning document ..... 116
PISA quality monitoring ..... 116
• Field trial and main survey review ..... 116
• Final optical check ..... 117
• National Centre Quality Monitor (NCQM) visits ..... 117
• PISA Quality Monitor (PQM) visits ..... 118
• Test administration ..... 118
• Delivery ..... 118
• Post final optical check ..... 118

CHAPTER 8 Survey weighting and the calculation of sampling variance ..... 119
Survey weighting ..... 120
• The school base weight ..... 121
• The school base weight trimming factor ..... 121
• The school non-response adjustment ..... 122
• The within-school base weight ..... 122
• The grade non-response adjustment ..... 125
• The within school non-response adjustment ..... 125
• Trimming the student weights ..... 126
• Weighting for Digital Reading Assessment ..... 126
Calculating sampling variance ..... 126
• The balanced repeated replication variance estimator ..... 126
• Reflecting weighting adjustments ..... 128
• Formation of variance strata ..... 128
• Countries and economies where all students were selected for PISA ..... 128

CHAPTER 9 Scaling PISA cognitive data ..... 129
The mixed coefficients multinomial logit model ..... 130
• The population model ..... 131
• Combined model ..... 131
Application to PISA ..... 132
• National calibrations ..... 132
• National reports ..... 133
• International calibration ..... 139
• Student score generation ..... 140
Booklet effects ..... 141
Analysis of data with plausible values ..... 142
Developing common scales for the purposes of trends ..... 143
• Linking PISA 2009 for science and mathematics ..... 144
• Linking PISA 2009 for reading ..... 144
• Uncertainty in the link ..... 144

CHAPTER 10 Data management procedures ..... 147
Introduction ..... 148
Data management at the national centre ..... 150
• National modifications to the database ..... 150
• Student sampling with KeyQuest ..... 150
• Data entry quality control ..... 150
Data cleaning at ACER ..... 152
• Recoding of national adaptations ..... 152
• Data cleaning organisation ..... 152