
Consortium for Modified Alternate Assessment Development & Implementation (C-MAADI Project)


Abstract


Final regulations released by the U.S. Department of Education (April 2007) under NCLB and IDEIA allow states to develop modified academic achievement standards for students whose disability prevents them from achieving grade-level proficiency and who are unlikely to reach grade-level achievement in the same timeframe as their peers. These modified achievement standards are intended to provide a challenging measure of grade-level content mastery for a small group of students, while being less difficult than grade-level achievement standards.

The primary purpose of the Consortium for Modified Alternate Assessment Development and Implementation (CMAADI) is to provide state partners, Arizona and Indiana, with expert leadership and technical support in the development and implementation of alternate assessments based on modified academic achievement standards (AA-MAS). In addition, the team of Project Directors will facilitate the evaluation of these new testing practices and collaborate with the states to document their tests for the professional assessment and disabilities communities. The CMAADI project is organized around seven functional and measurable goals: (1) Develop and implement criteria for participation in an AA-MAS; (2) Develop reading and mathematics test items that are highly accessible, aligned with grade-level content standards, and less complex than those on existing general achievement tests; (3) Implement a field test of the AA-MASs at multiple grade levels; (4) Evaluate the effectiveness of professional development and the technical aspects of the field-test items; (5) Implement AA-MASs statewide; (6) Set achievement standards for the AA-MASs; and (7) Document and disseminate the uses and technical qualities of the new assessments.

The project directly addresses Absolute Priority A identified in the IDEA General Supervision Enhancement Grant application. The project uses a consortium approach, led by assessment and special education experts who have worked together previously to provide leadership for the Consortium for Alternate Assessment Validity and Experimental Studies (CAAVES). The CMAADI project extends the CAAVES project and is directly influenced by technical standards for high-quality assessments (Standards and Assessments Peer Review Guidance, USDOE, revised July 2007), guided by the Modified Academic Achievement Standards: Non-Regulatory Guidance draft document (USDOE, April 2007) and principles of universal design (NCEO, September 2006), and based on theory and previous research on item development and testing accommodations for students with disabilities. Collectively, these professional documents on high-quality assessments and the published research on test development and testing of students with disabilities provide strong guidelines for designing high-quality tests and successful testing programs based on modified achievement standards. Over the 3-year period, the CMAADI Project will contribute to the development of new alternate assessments in Arizona and Indiana, build the capacity of participating states to conduct future validity studies of their assessments, and expand understanding of the academic achievement of thousands of students with disabilities who historically have performed poorly on achievement tests and in their classrooms.

Progress

Item Modification Sessions

In July 2008, item writing teams consisting of teachers, assessment developers, and assessment researchers convened for one week in Phoenix to evaluate and modify a large pool of reading and mathematics items across multiple grade levels. To prepare the item modification teams, a panel of researchers from Vanderbilt University and the University of Minnesota delivered a half-day training session providing a comprehensive overview of accessibility theory for assessment, including universal design principles, cognitive load theory, and item and test development research. Additionally, teams were trained in the use of the Test Accessibility and Modification Inventory (TAMI; Beddow, Kettler, & Elliott), a decision-making tool that facilitates a comprehensive analysis of test items with the purpose of enhancing their accessibility for all test-takers.
Following the item modification session, personnel at the Arizona Department of Education's Office of Assessment and Accountability revised items according to the recommendations of the modification teams and compiled pairs of brief test forms, each containing a combination of original and modified items, with the counterpart (sibling) version of each item appearing on the alternate form.

Cognitive Interview Study

In the fall of 2008, researchers from Vanderbilt University conducted a cognitive interview ("think-aloud") study with a sample of 42 students in grades 4-8 and high school. Students were randomly assigned to one of the two test booklets for their grade level and asked to complete the test items in the booklet while reporting their thoughts about the items. Following each section of the test, students rated the relative difficulty of each item for them on a scale of 1 to 7. These self-reported ratings were transformed into Cognitive Ease z-scores and plotted against each student's performance on the items to determine the relative cognitive efficiency of the items before and after modification for accessibility. Additionally, each student's reading and mathematics teacher completed a Curricular Experiences Survey to indicate the extent to which the content measured by the test items had been covered in class during the school year, a proxy for the student's opportunity to learn the tested content. The results of this extensive cognitive interview study were compiled in a report that was provided to the Arizona team in January 2009.
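
To make the Cognitive Ease transformation concrete, the sketch below shows one plausible way such z-scores could be computed and related to item performance in Python. The file name, column names, and reverse-scoring rule are assumptions for illustration; this is not the project's actual code or data layout.

    # Hypothetical sketch of a Cognitive Ease z-score analysis (assumed data layout).
    import pandas as pd
    from scipy import stats
    import matplotlib.pyplot as plt

    # One row per student-item response: a 1-7 self-reported difficulty rating
    # and a 0/1 correctness score (file and column names are illustrative only).
    responses = pd.read_csv("cognitive_interview_responses.csv")

    # Reverse-score difficulty so higher values mean greater ease, then standardize.
    responses["ease"] = 8 - responses["difficulty_rating"]
    responses["ease_z"] = stats.zscore(responses["ease"])

    # Summarize each item (original vs. modified version): mean ease vs. proportion correct.
    items = responses.groupby(["item_id", "item_version"]).agg(
        mean_ease_z=("ease_z", "mean"),
        prop_correct=("correct", "mean"),
    ).reset_index()

    # Items rated easy but answered incorrectly (or vice versa) signal low cognitive efficiency.
    plt.scatter(items["mean_ease_z"], items["prop_correct"])
    plt.xlabel("Mean Cognitive Ease z-score")
    plt.ylabel("Proportion correct")
    plt.show()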

Pilot Test / Focus Group Study

In early 2009, a subset of the modified items in grades 7 and 10 from the July 2008 session underwent further revisions based on the team's data-based conclusions from the cognitive interview study. These items were assembled into test booklets. In April 2009, students across the state of Arizona participated in the statewide AIMS assessment, which included the original versions of these items. In May 2009, 300 students in grades 7 and 10 with and without IEPs completed the aforementioned brief test booklets, which contained modified versions of a subset of the items they had received a few weeks earlier on the general AIMS test. Additionally, a subsample of these students completed MAZE reading fluency probes.

Following these test sessions, researchers conducted focus groups with small groups of students with and without IEPs (i.e., those identified with disabilities, those not identified with disabilities) to collect their observations, perceptions, and thoughts about tests, test items, and about the particular modifications that were made to the test items they received during the study. These focus groups were audio-recorded for later analysis.
Data analyses are currently underway, using student data from the modified items along with recent student AIMS data, to determine the effectiveness of the item modification procedures in enhancing the accessibility of test items.
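
As a rough illustration of one way such an analysis could proceed (an assumed approach and data layout, not the project's actual analysis), the sketch below compares the proportion of students answering each item correctly in its original AIMS form versus its modified form, separately for students with and without IEPs.

    # Illustrative comparison of item difficulty before and after modification
    # (file name, column names, and coding are assumptions for this sketch).
    import pandas as pd

    scores = pd.read_csv("pilot_and_aims_item_scores.csv")
    # Assumed columns: student_id, iep_status ("IEP"/"non-IEP"),
    # item_family, version ("original"/"modified"), correct (0/1)

    difficulty = (
        scores.groupby(["item_family", "iep_status", "version"])["correct"]
        .mean()
        .unstack("version")
    )

    # A positive gain suggests the modification made the item more accessible.
    difficulty["gain"] = difficulty["modified"] - difficulty["original"]
    print(difficulty.sort_values("gain", ascending=False))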

Item Accessibility Review

In the spring of 2009, a team of CMAADI investigators used an additional component of the TAMI, the Accessibility Rating Matrix (ARM; Beddow, Elliott, & Kettler, 2009), to examine the accessibility of a large pool of multiple-choice and constructed-response items from the state of Indiana across grades 3-8 in Language Arts, Mathematics, Science, and Social Studies. Item accessibility was rated on a 4-point scale (4 = maximally accessible for nearly all test-takers; 1 = inaccessible for many test-takers). Ratings were disaggregated by item elements, which consisted of the passage or item stimulus, the item stem, visuals, answer choices, and page/item layout. Each item also was assigned an overall accessibility rating. The evaluation team identified several positive attributes of the reviewed set of items and recommended a number of modifications to improve the accessibility of the items. Detailed feedback was provided for each item reviewed. Reliability indices indicated that the item accessibility review procedures were applied consistently across raters.
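
The specific reliability indices used are not described above; the sketch below shows two familiar ways such rater consistency might be quantified (exact agreement and a weighted kappa), using toy ratings on the 1-4 ARM scale purely for illustration.

    # Toy example of checking rater consistency on a 1-4 accessibility scale.
    # The indices shown (exact agreement, quadratic-weighted kappa) are common
    # choices, not necessarily the ones the CMAADI team used.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [4, 3, 3, 2, 4, 1, 3, 2]  # overall ratings from rater A (toy data)
    rater_b = [4, 3, 2, 2, 4, 1, 3, 3]  # the same items rated by rater B (toy data)

    exact_agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)
    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")  # penalizes larger disagreements more

    print(f"Exact agreement: {exact_agreement:.2f}; weighted kappa: {kappa:.2f}")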

Dissemination of Findings

The CMAADI project was highlighted at the CCSSO 2009 National Conference on Student Assessment, with presentations by four CMAADI investigators (slides are available for download at the bottom of this page). The presentations included discussions of lessons learned from the first year of the project, including several audio clips from the aforementioned focus groups, as well as early findings from the cognitive interview study. Additionally, each participant received a copy of the tool used to conduct the Indiana Item Accessibility Review. For additional information about the Test Accessibility and Modification Inventory and the TAMI Accessibility Rating Matrix, and to download copies of each instrument, please visit the TAMI webpage.

Work completed as part of the CMAADI project has influenced several articles in the fall 2010 edition of the Peabody Journal of Education, titled Alternate Assessments Based on Modified Academic Achievement Standards: New Policy, New Practices, and Persistent Challenges. The special issue is edited by Kettler and Elliott, and five of the articles include members of the CMAADI team as authors.
Additionally, Elliott, Kettler, Beddow, and Kurz are working on an edited volume called Accessible Tests of Student Achievement: Issues, Innovations, and Applications to be published by Springer Publishing Company in 2010. The book will include several chapters based in part on lessons learned from the current project.

PRESENTATIONS AT CCSSO 2009 NATIONAL CONFERENCE

Modifying Achievement Test Items: A Theory-Guided and Data-Based Approach, Stephen N. Elliott

Designing More Accessible Achievement Tests for All Students, Stephen N. Elliott

Quantifying and Improving the Accessibility of Tests and Test Items, Peter Beddow

Students' Perceptions of Item Modifications: Using Cognitive Labs and Questionnaires, Andrew Roach

Examining Distractor Effectiveness in Modified Items for Students with Disabilities, Michael C. Rodriguez

 

October 2009 Special Issue of the Peabody Journal of Education
Alternate Assessments Based on Modified Academic Achievement Standards: New Policy, New Practices, and Persistent Challenges

Alternate Assessment based on Modified Academic Achievement Standards: Introduction to the Federal Policy and Related Implementation Issues - Kettler and Elliott

The "Two Percent Students:" Considerations and Consequences of Eligibility Decisions - Zigmond and Kloo

The Changing Landscape of Alternate Assessments Based on Modified Academic Achievement Standards (AA-MAS): An Analysis of Early Adopters of AA-MAS - Lazarus and Thurlow

Modifying Achievement Test Items: A Theory-Guided and Data-Based Approach for Better Measurement of What Students with Disabilities Know - Kettler, Elliott and Beddow 

Writing Performance Level Descriptors and Setting Performance Standards for Assessments of Modified Achievement Standards: The Role of Innovation and Importance of Following Conventional Practice - Egan, Ferrara, Schneider and Barton 

Opportunities and Options for Facilitating and Evaluating Access to the General Curriculum for Students with Disabilities  - Roach, Chilungu, LaSalle, Talapatra, Vignieri, and Kurz

State Perspectives on Implementing, or Choosing Not to Implement, an Alternate Assessment based on Modified Academic Achievement Standards - Palmer

Psychometric Considerations for Alternate Assessments Based on Modified Academic Achievement Standards - Rodriguez

Commentary on Peabody Special Issue Articles on Alternate Assessments on Modified Academic Achievement Standards: The Road Ahead - Weigert

Presentations at 2010 CEC Conference

1% + 2% = __: Adding Up What We Know and Don't Know About Alternate Assessments - Elliott

Alternate Assessments' Contributions to Better Classroom Instruction & Testing - Elliott/Kettler


 