DOCUMENT RESUME

ED 475 318    EC 309 498

AUTHOR          Thompson, Sandra; Thurlow, Martha; Moore, Michael
TITLE           Using Computer-Based Tests with Students with Disabilities. NCEO Policy Directions.
INSTITUTION     National Center on Educational Outcomes, Minneapolis, MN.
SPONS AGENCY    Special Education Programs (ED/OSERS), Washington, DC.
REPORT NO       NCEO-15
PUB DATE        2003-01-00
NOTE            8p.
CONTRACT        H326G000001
AVAILABLE FROM  National Center on Educational Outcomes, University of Minnesota, 350 Elliott Hall, 75 East River Rd., Minneapolis, MN 55455 ($3.50). Tel: 612-624-8561; Fax: 612-624-0879; e-mail: [email protected]. For full text: http://education.umn.edu/NCEO.
PUB TYPE        Guides - Non-Classroom (055)
EDRS PRICE      MF01/PC01 Plus Postage.
DESCRIPTORS     *Computer Assisted Testing; *Disabilities; Educational Testing; Elementary Secondary Education; Language Minorities; *Test Construction; *Test Format; *Testing Accommodations; Testing Problems

ABSTRACT
This report presents factors to consider in the design of computer-based testing for all students, including students with disabilities and students with limited English proficiency. It also provides a process for the initial transformation of paper/pencil assessments to inclusive computer-based testing. Steps include: (1) assemble a group of experts to guide the transformation, including experts on assessment design, accessible Web design, universal design, and assistive technology; (2) decide how each accommodation will be incorporated into the computer-based test; (3) consider each accommodation or assessment feature in light of the constructs being tested; (4) consider the feasibility of incorporating the accommodation into computer-based tests; and (5) consider training implications for staff and students, with special attention given to the computer literacy of students and their experience using features like screen readers. The report then provides the following examples of specific accommodations for students with disabilities and students with limited English proficiency: large print and magnification, audio presentation of instructions and test items, simplified instructions, writing in the test booklet, using a calculator, breaks and multiple test sessions, and individual or small group administration. The report closes with a list of considerations in the transformation of accommodations from paper/pencil to computer-based tests. (Contains 10 references.) (CR)

Using Computer-Based Tests with Students with Disabilities
NCEO Policy Directions
By Sandra Thompson, Martha Thurlow, and Michael Moore
January 2003 / Number 15
Background

Computer-based testing has been called the "next frontier in testing" as educators, testing companies, and state departments quickly work to transform paper/pencil tests into technology-based formats. These efforts have occurred in a variety of ways and for a variety of tests. Some educators have transferred all of their classroom quizzes and tests into a computer-based format.

With the dramatic increase in the use of the Internet over the past few years, and the considerable potential of online learning, assessment will need to undergo a complete transformation to keep pace. Experts suggest that the Internet will be used to develop tests and present items through dynamic and interactive stimuli such as audio, video, and animation. Given this momentum, it is not surprising that there is a trend toward investigating and incorporating the Internet as the testing medium for statewide assessments.

Computer-based testing is viewed by many policymakers as a way to meet the requirements of the No Child Left Behind Act of 2001 (NCLB). The need to produce itemized score analyses, disaggregated within each school and district by gender, racial and ethnic group, migrant status, English proficiency, disability, and income, challenges states to create new and more efficient ways to administer, score, and report assessment results.

There clearly are many opportunities created when computer-based tests are used. These include more efficient test administrations, the availability of immediate results, and student preferences for this form of testing over paper and pencil tests. In addition, computer-based testing opens up the possibility for built-in accommodations, student selection of testing options, and increased authenticity in the items that are included. Other benefits have been identified as well, so there is considerable pressure to move toward computer-based testing.

While computer-based testing may address the challenges of NCLB and has many other positive characteristics as well, it potentially creates other problems unless a thoughtful and systematic process is used to transfer existing paper/pencil assessments to computer-based assessments. Not only will poor design elements on the paper test transfer to the screen, but additional challenges may result that reduce the validity of the assessment results and possibly exclude some groups of students from assessment participation.

This Policy Directions presents factors to consider in the design of computer-based testing for all students, including students with disabilities and students with limited English proficiency. It also provides a process for the initial transformation of paper/pencil assessments to inclusive computer-based testing.
A report to the National Governors' Association sums up what we need to remember as computer-based testing grows across the United States and throughout the world:

    Do not forget why electronic assessment is desired. Electronic assessment will enable states to get test results to schools faster and, eventually, cheaper. It will help ensure assessment keeps pace with the tools that students are using for learning and with the ones that adults are increasingly using at work. The technology will also help schools improve and better prepare students for the next grade, for postsecondary learning, and for the workforce. (Using Electronic Assessment to Measure Student Performance, 2002, p. 9)

Challenges

Despite the potential advantages offered by computer-based testing, there remain several challenges, especially in the transition from paper/pencil assessments. First of all, the use of technology cannot take the place of content mastery. No matter how well a test is designed, or what media are used for administration, students who have not had an opportunity to learn the material tested will perform poorly. Students need access to the information tested in order to have a fair chance at performing well. Researchers strongly caution that the use of a computer, in and of itself, does not improve the overall quality of student writing. We continue to find significantly lower mean test scores for students with disabilities than for their peers without disabilities. The following are some challenges that must be overcome in order for computer-based testing to be effective.

>Issues of Equity and Skill in Computer Use
Concerns continue to exist in the area of equity, where questions are asked about whether the required use of computers for important tests puts some students at a disadvantage because of lack of access, use, or familiarity. Concerns include unfamiliarity with answering standardized test questions on a computer screen, using buttons to search for specific items, and indecision about whether to use traditional tools (e.g., a handheld calculator) vs. computer-based tools.

>Added Challenges for Some Students
Some research questions whether the medium of test presentation affects the comparability of the tasks students are being asked to complete. For example, (1) computer-based testing places more demands on certain skills such as typing, using multiple screens to recall a passage, mouse navigation, and the use of key combinations; (2) some people become more fatigued when reading text on a computer screen than on paper; (3) long passages may be more difficult to read on a computer screen; and (4) the inability to see an entire problem on screen at one time is challenging.

>Lack of Ability to Design Accessible Web Pages
According to WebAIM (Web Accessibility in Mind, an initiative of the Center for Persons with Disabilities at Utah State University), there are 27.3 million people with disabilities who are limited in the ways they can use the Internet: "The saddest aspect of this fact is that the know-how and the technology to overcome these limitations already exist, but they are greatly under-utilized, mostly because Web developers simply do not know enough about the issue to design pages that are accessible to people with disabilities."
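The WebAIM concern above ultimately comes down to how individual test items are marked up. As a purely illustrative sketch (the report does not prescribe any markup or delivery platform), the hypothetical helper below assumes a browser-delivered test and shows the kinds of details that decide whether a screen reader can present an item at all: a text alternative for any graphic, and answer choices that are real form controls with programmatically associated labels.

```typescript
// Illustrative only: a hypothetical helper that renders one multiple-choice
// item with the basic accessibility hooks screen readers depend on.
interface TestItem {
  id: string;
  prompt: string;
  imageUrl?: string;
  imageAltText?: string; // text alternative for the graphic ("alt tag")
  choices: string[];
}

function renderAccessibleItem(item: TestItem, container: HTMLElement): void {
  const fieldset = document.createElement("fieldset");

  // The prompt doubles as the group label, so assistive technology
  // announces it before reading the answer choices.
  const legend = document.createElement("legend");
  legend.textContent = item.prompt;
  fieldset.appendChild(legend);

  // Every image gets alternative text; an empty alt is used only when the
  // graphic is purely decorative, otherwise screen-reader users get no
  // information about it.
  if (item.imageUrl) {
    const img = document.createElement("img");
    img.src = item.imageUrl;
    img.alt = item.imageAltText ?? "";
    fieldset.appendChild(img);
  }

  // Each choice is a real radio input with an explicitly associated label,
  // rather than bare clickable text.
  item.choices.forEach((choice, index) => {
    const inputId = `${item.id}-choice-${index}`;
    const input = document.createElement("input");
    input.type = "radio";
    input.name = item.id;
    input.id = inputId;

    const label = document.createElement("label");
    label.htmlFor = inputId;
    label.textContent = choice;

    fieldset.append(input, label);
  });

  container.appendChild(fieldset);
}
```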
Developing Inclusive Computer-based Testing

The transformation of traditional paper/pencil tests to inclusive computer-based tests takes careful and thorough work that includes the collaborative expertise of many people. Five steps should be used to address these transformation issues (see Table 1).

Table 1. Steps for Developing Inclusive Computer-based Testing
Step 1. Assemble a group of experts to guide the transformation.
Step 2. Decide how each accommodation will be incorporated into the computer-based test.
Step 3. Consider each accommodation or assessment feature in light of the constructs being tested.
Step 4. Consider the feasibility of incorporating the accommodation into computer-based tests.
Step 5. Consider training implications for staff and students.

Step 1. Assemble a group of experts to guide the transformation. Include experts on assessment design, accessible Web design, universal design, and assistive technology, along with state and local assessment and special education personnel and parents.

Step 2. Decide how each accommodation will be incorporated into the computer-based test. Examine each possible accommodation in light of computer-based administration. Some traditional paper/pencil accommodations will no longer be needed, while others will become built-in features that are available to every test-taker.

Step 3. Consider each accommodation or assessment feature in light of the constructs being tested. For example, what are the implications of the use of a screen reader when the construct being measured is reading, or the use of a spell check when achievement in spelling is being measured as part of the writing process? As the use of speech recognition technology permeates the corporate world, constructs that focus on writing on paper without the use of a dictionary or spell checker may need to be reconsidered.

Step 4. Consider the feasibility of incorporating the accommodation into computer-based tests. The feasibility of some accommodations may require review by technical advisors or members of a policy/budget committee, or may require short-term solutions along with long-term planning. Construct a specific plan for building in features that are not immediately available, and conduct extensive pilot tests with a variety of equipment scenarios and accessibility features.

Step 5. Consider training implications for staff and students. The best technology will be useless if students or staff do not know how to use it. Special consideration needs to be given to the computer literacy of students and their experience using features like screen readers. Information about the features available on computer-based tests needs to be available to IEP teams to use in planning a student's instruction and assessments. Practice tests that include these features need to be available.

Skipping any of these steps may result in the design of assessments that exclude large numbers of students.

Considerations and Examples

Most states have a list of possible or common accommodations for students with disabilities within the categories of presentation, response, timing/scheduling, and setting. Some states also list accommodations specifically designed for students with limited English proficiency. The list of considerations in Table 2 was generated to address the needs of students with a variety of accommodation needs, including students with disabilities, students with limited English proficiency, students with both disabilities and limited English proficiency, and students who do not receive special services but have a variety of unique learning and response styles and needs. Here are some considerations for a few examples of specific accommodations.

>Large print and magnification (presentation)
When type is enlarged on a screen, students may need to scroll back and forth, or up and down, to read an entire test item. Graphics, when enlarged, may become very pixelated and difficult to view. Students who use handheld magnifiers or monocular devices when working on paper may not be able to use these devices on a screen because of the distortion of computer images. If a graphical user interface is used (versus text-based), students will not have the option of altering print size on the screen.

>Audio presentation of instructions and test items (presentation)
Screen readers can present text as synthesized speech. The use of text-to-speech for test items may not be a viable option if the construct tested is the ability to read print.
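Both presentation accommodations above can be offered as options the student selects, rather than as separate test forms. The sketch below is an illustration only, assuming a present-day browser-delivered test (an assumption that goes beyond the report): ordinary style-sheet font scaling stands in for self-selected magnification, and the browser's built-in speech synthesis stands in for audio presentation, gated by a flag that would be turned off when the ability to read print is itself the construct being measured.

```typescript
// Illustrative sketch of two self-selected presentation options.

// Self-selected print size: scaling the root font size lets relative (em/rem)
// layouts reflow instead of forcing horizontal scrolling, though enlarged
// graphics still need separate treatment (see the magnification caveats above).
function setPrintScale(scalePercent: number): void {
  document.documentElement.style.fontSize = `${scalePercent}%`;
}

// Optional read-aloud using the browser's built-in speech synthesis.
// The allowAudio flag stands in for a policy decision: it would be false
// when the construct being measured is the ability to read print.
function readItemAloud(itemText: string, allowAudio: boolean): void {
  if (!allowAudio || !("speechSynthesis" in window)) {
    return; // accommodation not permitted for this construct, or unsupported
  }
  const utterance = new SpeechSynthesisUtterance(itemText);
  utterance.rate = 0.9; // slightly slower than default; could also be student-selected
  window.speechSynthesis.cancel(); // stop any earlier playback before starting
  window.speechSynthesis.speak(utterance);
}

// Example: a student enlarges print to 150% and replays the current item.
setPrintScale(150);
readItemAloud("Which of the following is a prime number?", true);
```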
Table 2. Considerations in the Transformation of Accommodations from Paper/pencil to Computer-based Tests

Presentation Accommodations
- Capacity for any student to self-select print size or magnification
- Graphics and text-based user interfaces have different challenges
- Scrolling issues
- Variations in screen size
- Effects of magnification on graphics and tables
- Capacity for any student to self-select audio (screen reader), alternate language, or signed versions of instructions and test items (all students wear ear/headphones)
- Capacity to have instructions repeated as often as the student chooses
- Variable audio speed and quality of audio presentation
- Capacity for pop-up translation
- Use of a screen reader that converts text into synthesized speech or Braille
- Alternative text or "alt tags" for images
- Avoidance of complex backgrounds that interfere with readability of overlying text
- Tactile graphics or three-dimensional models may be needed for some images
- Capacity for multiple screen and text colors

Response Accommodations
- Capacity for multiple options for selecting a response: mouse click, keyboard, touch screen, speech recognition, assistive devices to access the keyboard (e.g., mouth stick or head wand)
- Option for paper/pencil in addition to computer (e.g., scratch paper for solving problems, drafting ideas)
- Option for paper/pencil in place of computer (e.g., extended response items)
- Capacity for any student to self-select a spell check option
- Capacity to disable the spell check option when spelling achievement is being measured
- Spelling implications when using speech recognition software
- Capacity for any student to select a calculator or dictionary option

Timing/Scheduling Accommodations
- Availability/location of computers and peripherals
- Flexible, individualized timing
- Capacity of the network system
- Maintaining place and saving completed responses during breaks
- Capacity to turn off the monitor/blank the screen temporarily
- Test security
- Capacity for self-selection of subtest order

Setting Accommodations
- Grouping arrangements
- Use of earphones or headphones
- Use of an individual setting if the response method distracts other students
- Availability/comparability/location of computers and peripherals
- Glare from windows or overhead lights
- Adaptive furniture
- Test security
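Several of the response considerations in Table 2, such as letting any student self-select a spell check option while disabling it when spelling achievement is being measured, amount to per-item switches in the delivery software. The following hypothetical sketch (none of these names come from the report) shows one way a browser-based test could key that switch to the construct being tested.

```typescript
// Hypothetical sketch: turning the spell check consideration from Table 2
// into a per-item setting in a browser-delivered test.

interface ConstructedResponseItem {
  id: string;
  /** True when spelling achievement is part of what the item measures. */
  spellingIsMeasured: boolean;
}

function createResponseBox(item: ConstructedResponseItem): HTMLTextAreaElement {
  const box = document.createElement("textarea");
  box.id = `${item.id}-response`;

  // Browsers expose spell checking as a per-element attribute, so the
  // accommodation can be granted or withheld item by item rather than
  // for the whole test.
  box.spellcheck = !item.spellingIsMeasured;

  return box;
}

// Example: spell check stays on for an essay about a science passage,
// but is disabled for an item that scores spelling directly.
const essay = createResponseBox({ id: "sci-essay-1", spellingIsMeasured: false });
const spellingItem = createResponseBox({ id: "spell-item-4", spellingIsMeasured: true });
document.body.append(essay, spellingItem);
```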
>Instructions simplified/clarified (presentation)
Instructions for all students need clearly worded text that can be followed simply and intuitively, with a consistent navigational scheme between pages/items. Students may need an option to self-select alternate forms of instructions in written or audio format.

>Write in test booklet (response)
There are many options for marking responses on computer-based tests that are not available on paper. It would still be possible for a student to dictate responses to a teacher, who would then mark them on the computer. The option of speech recognition software is also becoming more available. Speech recognition technology enables computers to translate human speech into a written format. Currently, speech recognition only works for some people, while others, especially those who are not native English speakers or those with speech impairments, can be frustrated by the software's inability to differentiate many of the sounds that they make.

>Calculator (response)
Calculator use is often allowed on paper/pencil tests when arithmetic is not the construct being measured. However, standardization of the type of calculator used has been very difficult and would be much easier if all students had the same online calculator to use. Use of an online calculator is challenging for some students, especially if they have not had practice with this tool in their daily work.

>Breaks and multiple test sessions (timing/scheduling)
Technology is required for multiple test sessions that would allow individual students to submit their completed responses, log out, and log back on again at another time, starting at the place where they previously left off. Careful scheduling is needed for multiple test sessions to make sure that computers are available. Test security becomes an issue if students who have responded to the same test items have opportunities to interact with each other between test sessions.

>Individual or small group administration (setting)
Computer-based tests create increased individualization for every student. Each student can be seated at a separate computer station wearing ear/headphones for audio instructions or items. Students using speech recognition systems or other distracting response methods need to be tested in individual settings.
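The breaks and multiple test sessions example above hinges on one capability: saving a student's place and completed responses at logout and restoring them at the next session. The report does not specify an implementation; the sketch below is one simple possibility, using the browser's local storage only as a stand-in for the secure, server-side storage a real assessment program would require.

```typescript
// Illustrative only: saving and restoring a student's place between sessions.
// A real system would persist this on a secure server, not in the browser.

interface SessionState {
  studentId: string;
  currentItemIndex: number;           // where the student left off
  responses: Record<string, string>;  // itemId -> recorded response
  savedAt: string;                    // timestamp kept for audit purposes
}

function saveSession(state: SessionState): void {
  localStorage.setItem(`test-session-${state.studentId}`, JSON.stringify(state));
}

function resumeSession(studentId: string): SessionState | null {
  const stored = localStorage.getItem(`test-session-${studentId}`);
  return stored ? (JSON.parse(stored) as SessionState) : null;
}

// Example: the student logs out after item 12 and picks up there next time.
saveSession({
  studentId: "s-042",
  currentItemIndex: 12,
  responses: { "item-11": "B", "item-12": "C" },
  savedAt: new Date().toISOString(),
});

const resumed = resumeSession("s-042");
const startAt = resumed ? resumed.currentItemIndex : 0;
console.log(`Resuming at item ${startAt}`);
```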
Summary

With the enactment of NCLB, nearly all states are in the process of designing new assessments. As part of this process, several states are considering the use of computer-based testing, since this is the mode in which many students are already learning. Several states have already begun designing and implementing computer-based testing.

Because many accessibility features can be built into computer-based tests, the validity of test results can be increased for many students, including students with disabilities and English language learners, without the addition of special accommodations. However, even though items on universally designed assessments are accessible for most students, there will still be some specialized accommodations, and computer-based testing must be amenable to these accommodations.

Students with disabilities will be at a great disadvantage if paper/pencil tests are simply copied on screen without any flexibility. Until the implications of the use of graphics versus text-based user interfaces are considered and resolved, a large number of students will need to continue to use paper/pencil tests, with a possible reduction in the comparability of results, and an increase in administrative time and potential errors when paper/pencil responses are transferred by a test administrator to a computer for scoring.

Resources

Access to Computer-based Testing for Students with Disabilities (Synthesis Report 45). Thompson, S.J., Thurlow, M.L., Quenemoen, R.F., & Lehr, C.A. (2002). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. http://education.umn.edu/nceo/OnlinePubs/Synthesis45.html

Assistive Technology Competencies for Special Educators. Lahm, E.A., & Nickels, B.L. (1999). Teaching Exceptional Children, 32(1), 56-63.

Computerized Test Accommodations: A New Approach for Inclusion and Success for Students with Disabilities. Burk, M. (1999). Washington, DC: A.U. Software.

Effects of Computer-based Test Accommodations on Mathematics Performance Assessments for Secondary Students with Learning Disabilities. Calhoon, M.B., Fuchs, L.S., & Hamlett, C.L. (2000). Learning Disability Quarterly, 23, 271-282.

Introduction to Web Accessibility. WebAIM (2001). Retrieved March 2002, from the World Wide Web: http://www.webaim.org/intro/

Reinventing Assessment: Speculations on the Future of Large-scale Educational Testing. Bennett, R.E. (1998). Princeton, NJ: Policy Information Center, Educational Testing Service. Retrieved March 2002, from the World Wide Web: www.ets.org/research/pic/bennett.html

Technology and Assessment: Thinking Ahead: Proceedings of a Workshop. National Research Council. (2002). Washington, DC: Board on Testing and Assessment, Center for Education, Division of Behavioral and Social Sciences and Education, National Academy Press. Retrieved March 2002, from the World Wide Web: http://www.nap.edu/books/0309083206/html

Universal Design Applied to Large-scale Assessments (Synthesis Report 44). Thompson, S.J., Johnstone, C.J., & Thurlow, M.L. (2002). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes. http://education.umn.edu/nceo/OnlinePubs/Synthesis44.html

Using Electronic Assessment to Measure Student Performance. National Governors' Association. (2002). Education Policy Studies Division: National Governors Association. Retrieved March 2002, from the World Wide Web: http://www.nga.org/cda/files/ELECTRONICASSESSMENT.pdf

Web Accessibility Initiative, World Wide Web Consortium. Retrieved March 2002, from the World Wide Web: http://www.w3.org/WAI/

The National Center on Educational Outcomes (NCEO) was established in 1990 to provide national leadership in the identification of outcomes and indicators to monitor educational results for all students, including students with disabilities. NCEO addresses the participation of students with disabilities in national and state assessments, standards-setting efforts, and graduation requirements.

The Center represents a collaborative effort of the University of Minnesota, the Council of Chief State School Officers (CCSSO), and the National Association of State Directors of Special Education (NASDSE).

The Center is supported primarily through a Cooperative Agreement (#H326G000001) with the Research to Practice Division, Office of Special Education Programs, U.S. Department of Education. Additional support for targeted projects, including those on LEP students, is provided by other federal and state agencies. The Center is affiliated with the Institute on Community Integration in the College of Education and Human Development, University of Minnesota. Opinions expressed herein do not necessarily reflect those of the U.S. Department of Education or Offices within it.

NCEO staff: Deb A. Albus, Jane L. Krentz, Kristi K. Liu, Jane E. Minnema, Michael L. Moore, Rachel F. Quenemoen, Dorene L. Scott, Sandra J. Thompson; Martha L. Thurlow, director.

The University of Minnesota is committed to the policy that all persons shall have equal access to its programs, facilities, and employment without regard to race, color, creed, religion, national origin, sex, age, marital status, disability, public assistance status, veteran status, or sexual orientation.