National Profile on Alternate Assessments Based on Alternate Achievement Standards

NCSER 2009-3014
August 2009

A. Overview

The Overview section presents selected key features of alternate assessments in 2006–07, including the purposes the state reported for the alternate assessment, the general approaches and procedures used, and the coverage of academic content standards. NCLB required states, beginning in 2005–06, to administer assessments in reading/language arts and in mathematics in each of grades 3 through 8 and at least once in grades 10 through 12. Although states were required to develop achievement standards in science by 2005–06, assessments in science were not required to be administered until 2007–08.

Alternate assessment1 title (A1)

This item asked for the name of the alternate assessment being used during the 2006–07 school year. One state (Michigan) used two alternate assessments to assess students with significant cognitive disabilities.

The titles of the alternate assessments are not reported here but are reported by state in the NSAA State Profiles and in table A1 in appendix B, NSAA Data Tables.

Purposes of alternate assessment (A2)

This item asked for the purposes and goals the state identified for the alternate assessment beyond meeting the accountability requirements set by federal law. This was an open-ended item, and the following response categories emerged during coding. Multiple responses were possible and are presented graphically in figure A2 and for individual states in table A2 in appendix B, NSAA Data Tables (see the computational sketch following this list).

  • Evaluate programs – This category was coded when the state specifically mentioned program evaluation as a purpose of the alternate assessment. Thirty-one percent of states (16 states) reported this purpose.
  • Guide classroom instruction – This category was coded when the state reported that results of the assessment were intended to inform and refine classroom instruction for students with significant cognitive disabilities. Fifty-nine percent of states (30 states) reported this purpose, reflecting a majority of the states.
  • Measure student progress/performance toward state standards – This category was coded when the state reported that measurement of individual student learning outcomes within the context of state accountability and state standards was a purpose of the alternate assessment. Eighty-six percent of states (44 states) reported this purpose, reflecting a majority of the states and the highest frequency.
  • Assess student access to content standards/general curriculum – This category was coded when the state reported that evaluating access to the general education academic content standards for students with significant cognitive disabilities was a purpose of the alternate assessment. Fifty-seven percent of states (29 states) reported this purpose, reflecting a majority of the states.
  • Assess individual students' strengths/weaknesses – This category was coded when the state reported that the gathering of information to measure the performance of individual students was a purpose of the alternate assessment. Fifty-one percent of states (26 states) reported this purpose, reflecting a majority of the states.
  • Document academic achievement – This category was coded when the state reported that documenting academic achievement and/or providing reports of student academic achievement to parents was a purpose of the alternate assessment. Fifty-nine percent of states (30 states) reported this purpose, reflecting a majority of the states.
  • Measure student progress toward IEP goals – This category was coded when the state reported that a purpose of the alternate assessment was to inform IEP development or document whether IEP goals were or were not met. Eighteen percent of states (9 states) reported this purpose.
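
The percentages throughout this section are rounded shares of the 51 states covered by the study (the denominator implied by the 100 percent figures of 51 states reported under items A4 and A5). The following snippet is a minimal, purely illustrative sketch, not the NSAA analysis procedure, showing how the state counts reported above for item A2 convert to the published percentages; the abbreviated category labels and the dictionary structure are assumptions made here for readability.

    # Illustrative sketch only: reproduces the rounded percentages reported for
    # item A2 from the state counts given above. The denominator of 51 is the
    # number of states covered by the study.
    STATES = 51

    # Counts from item A2; multiple responses were possible, so counts need not
    # sum to 51. Labels are abbreviated for readability.
    purpose_counts = {
        "Evaluate programs": 16,
        "Guide classroom instruction": 30,
        "Measure progress toward state standards": 44,
        "Assess access to content standards/general curriculum": 29,
        "Assess individual strengths/weaknesses": 26,
        "Document academic achievement": 30,
        "Measure progress toward IEP goals": 9,
    }

    for purpose, count in purpose_counts.items():
        print(f"{purpose}: {count} states ({round(100 * count / STATES)}%)")

Running the sketch yields, for example, 16 of 51 states as 31 percent and 44 of 51 states as 86 percent, matching the figures reported above.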

Alternate assessment approaches (structures/types of items used) (A3)

This item characterized the approaches states reported using for their 2006–07 alternate assessments. This was a multiple-choice item; multiple responses were possible for states that used a combined approach (e.g., a series of performance tasks/events in combination with submitted portfolios). Responses are presented graphically in figure A3 and for individual states in table A3 in appendix B, NSAA Data Tables.

  • Rating scale/checklist – Twenty-five percent of states (13 states) reported using this approach.
  • Portfolio/body of evidence – Fifty-nine percent of states (30 states) reported using this approach, reflecting a majority of the states and the highest frequency reported among the general types of assessment approaches.
  • Performance task/events – Forty-one percent of states (21 states) reported using this approach.
  • Multiple choice/constructed response – Twelve percent of states (6 states) reported using this approach.

What content areas were included in the alternate assessment? (A4)

This multiple-choice item asked for the specific content areas that were addressed by the state's alternate assessments. Multiple responses were possible and are presented graphically in figure A4 below and for individual states in table A4 in appendix B, NSAA Data Tables.

  • Reading/language arts – One hundred percent of states (51 states) reported that they assessed students in reading/language arts, reflecting the highest frequency reported, along with the assessment of students in mathematics.
  • Mathematics – One hundred percent of states (51 states) reported that they assessed students in mathematics, reflecting the highest frequency reported, along with the assessment of students in reading/language arts.
  • Science – Fifty-seven percent of states (29 states) reported that they assessed students in science, reflecting a majority of the states.
  • Social studies – Twenty-five percent of states (13 states) assessed students in social studies.
  • Functional skills – Four percent of states (2 states) assessed students on functional skills.

Grades assessed (A5)

This multiple-choice item asked for the specific grades (3 to 12) in which the state assessed students using the alternate assessment for measuring adequate yearly progress (AYP). Multiple responses were possible and are presented graphically in figure A5 and for individual states in table A5 in appendix B, NSAA Data Tables.

  • Grades 3 to 7 – One hundred percent of states (51 states) reported that they assessed students in the third, fourth, fifth, sixth, and seventh grades using the alternate assessment, reflecting the highest frequencies reported.
  • Grade 8 – Ninety-eight percent of states (50 states) reported that they assessed students in the eighth grade using the alternate assessment, reflecting a majority of the states.
  • Grades 9 to 12 – One hundred percent of states (51 states) reported that they assessed students at least once in ninth through twelfth grade.
  • Grade 9 – Twenty-nine percent of states (15 states) reported that they assessed students in the ninth grade using the alternate assessment.
  • Grade 10 – Sixty-seven percent of states (34 states) reported that they assessed students in the 10th grade using the alternate assessment, reflecting a majority of the states.
  • Grade 11 – Fifty-nine percent of states (30 states) reported that they assessed students in the 11th grade using the alternate assessment, reflecting a majority of the states.
  • Grade 12 – Twelve percent of states (6 states) reported that they assessed students in the 12th grade using the alternate assessment.

What was the time frame within which the alternate assessment occurred? (A6)

This multiple-choice item asked about the time frame of the administration of the alternate assessment by providing four mutually exclusive response options. The responses are presented graphically in figure A6 below and for individual states in table A6 in appendix B, NSAA Data Tables.

  • One day to 2 weeks – Two percent of states (1 state) reported that the alternate assessment occurred within 1 day to 2 weeks during the school year.
  • More than 2 weeks to 1 month – Four percent of states (2 states) reported that the alternate assessment occurred within more than 2 weeks to 1 month during the school year.
  • More than 1 month to 2 months – Thirty-three percent of states (17 states) reported that the alternate assessment occurred within more than 1 month to 2 months during the school year.
  • More than 2 months – Sixty-one percent of states (31 states) reported that the alternate assessment occurred within more than 2 months to the full school year, reflecting a majority of states and the highest frequency reported.
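
Because the four response options for this item were mutually exclusive, the counts above account for every state in the study: 1 + 2 + 17 + 31 = 51 states, and the corresponding percentages total 100 (2 + 4 + 33 + 61).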

How many state content standards were there for reading/language arts? On how many content standards in reading/language arts were students with significant cognitive disabilities assessed using the alternate assessment? (A7)

Two related items were investigated together: the number of general education content standards the state had in place for reading/language arts and on how many of those standards students with significant cognitive disabilities were assessed using the alternate assessment.

States used different terms to refer to various levels of their system of general education content standards. For this item, the term "content standard" was used to refer to the highest level in a hierarchy of skills and knowledge, of which there were only a limited number (typically 10 or fewer) for each content area. Although states often articulated additional subdomains of skills and knowledge, often down to deeper levels of specificity that described actual student performance, tasks, and/or activities, those levels are not reported here.

The second part of this item asked for the number of general education content standards on which students with significant cognitive disabilities were assessed by the state using an alternate assessment. In some states, each general education content standard was addressed in the alternate assessment in a way thought to be appropriate for students with significant cognitive disabilities. In other states, only a portion of the general education content standards were addressed in the alternate assessment. This information is presented graphically in figure A7 and for individual states in table A7 in appendix B, NSAA Data Tables.

  • The number of general education content standards in place in a state in reading/language arts ranged from 1 to 13 or varied by grade level.
  • The number of content standards on which students with significant cognitive disabilities were assessed on the alternate assessment ranged from 1 to 8 or varied by grade level or teacher discretion.
  • Two percent of states (1 state) reported assessing students with significant cognitive disabilities on standards other than the state content standards.
  • Thirty-one percent of states (16 states) reported that there was a one-to-one correspondence between each general education content standard and the standards assessed on the alternate assessment.
  • Forty-five percent of states (23 states) reported that the alternate assessment assessed fewer general education content standards than were in place for the general education student population, reflecting the highest frequency reported.
  • Twenty percent of states (10 states) reported that there was variation in the number of content standards assessed based on the grade level of the student.
  • Two percent of states (1 state) reported that there was variation in the number of content standards assessed based on the discretion of the student's teacher.
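
Taken together, the five response categories above account for the full count of 51 states (1 + 16 + 23 + 10 + 1 = 51), consistent with each state having been coded into a single category for this item.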

How many state content standards were there for mathematics? On how many content standards in mathematics were students with significant cognitive disabilities assessed using the alternate assessment? (A8)

Two related items were investigated together: the number of general education content standards the state had in place for mathematics and on how many of those standards students with significant cognitive disabilities were assessed using the alternate assessment.

States used different terms to refer to various levels of their system of general education content standards. For this item, the term "content standard" was used to refer to the highest level in a hierarchy of skills and knowledge, of which there were only a limited number (typically 10 or fewer) for each content area. Although states often articulated additional subdomains of skills and knowledge, often down to deeper levels of specificity that described actual student performance, tasks, and/or activities, those levels are not reported here.

The second part of this item asked for the number of general education content standards on which students with significant cognitive disabilities were assessed by the state using an alternate assessment. In some states, each general education content standard was addressed in the alternate assessment in a way thought to be appropriate for students with significant cognitive disabilities. In other states, only a portion of the general education content standards were addressed in the alternate assessment. This information is presented graphically in figure A8 below and for individual states in table A8 in appendix B, NSAA Data Tables.

  • The number of general education content standards in place in a state in mathematics ranged from 3 to 11 or varied by grade level.
  • The number of content standards on which students with significant cognitive disabilities were assessed on the alternate assessment ranged from 1 to 7 or varied by grade level or teacher discretion.
  • Two percent of states (1 state) reported assessing students with significant cognitive disabilities on standards other than the state content standards.
  • Thirty-seven percent of states (19 states) reported that there was a one-to-one correspondence between each general education content standard and the standards assessed on the alternate assessment.
  • Forty-one percent of states (21 states) reported that the alternate assessment assessed fewer general education content standards than were in place for the general education student population, reflecting the highest frequency reported.
  • Eighteen percent of states (9 states) reported that there was variation in the number of content standards assessed based on the grade level of the student.
  • Two percent of states (1 state) reported that there was variation in the number of content standards assessed based on the discretion of the student's teacher.

Alternate assessment developer (A9)

This item asked who was involved in the development of the alternate assessment. This was an open-ended item, and the following response categories emerged during coding. Multiple responses were possible and are presented graphically in figure A9 and for individual states in table A9 in appendix B, NSAA Data Tables.

  • Assessment company – Forty-nine percent of states (25 states) reported that an assessment company was involved in the development of the alternate assessment.
  • Research company/university/independent researcher – Sixty-seven percent of states (34 states) reported that a research company, a university, or an independent researcher was involved in the development of the alternate assessment, reflecting a majority of the states.
  • Technical assistance provider (e.g., regional resource centers) – Thirty-three percent of states (17 states) reported that a technical assistance provider was involved in the development of the alternate assessment.
  • State personnel – Ninety-six percent of states (49 states) reported that state personnel were involved in the development of the alternate assessment, reflecting a majority of the states and the highest frequency reported.
  • Parents – Forty-nine percent of states (25 states) reported that parents of students with significant cognitive disabilities were involved in the development of the alternate assessment.
  • Stakeholders – Seventy-eight percent of states (40 states) reported that a group of stakeholders was involved in the development of the alternate assessment, reflecting a majority of the states.

Who administered/assembled the alternate assessment? (A10)

This item asked who was involved in administering/assembling the alternate assessment. Multiple responses were possible and are presented graphically in figure A10 and for individual states in table A10 in appendix B, NSAA Data Tables.

  • The student's special education teacher – One hundred percent of states (51 states) reported that the student's special education teacher administered or assembled the alternate assessment, reflecting a majority of the states and the highest frequency reported.
  • A certified educator who was not the student's teacher – This response category was coded when members of an assessment team, other classroom teachers, the student's IEP team, or other support staff at the school or district level were allowed to administer or assemble the assessment but the student's teacher was not involved in the assessment administration. Thirty-seven percent of states (19 states) reported that a certified educator who was not the student's teacher administered or assembled the alternate assessment.
  • Paraprofessional – This response category was coded when aides or nonlicensed assistants were allowed to administer or assemble the alternate assessment. Eight percent of states (4 states) reported that a paraprofessional administered or assembled the alternate assessment.

Who scored the alternate assessment? (A11)

This item asked who was allowed to score the alternate assessment. Multiple responses were possible and are presented graphically in figure A11 and for individual states in table A11 in appendix B, NSAA Data Tables.

  • Student's classroom teacher – Fifty-three percent of states (27 states) reported that the student's classroom teacher was allowed to score the alternate assessment, reflecting a majority of the states and the highest frequency reported.
  • School- or district-based educator – This response category was coded when the scorer was not the student's teacher but someone designated by the school or district administration, such as another teacher, IEP team member, counselor, or related services personnel. Twenty-nine percent of states (15 states) reported that a school- or district-based educator who was not the student's regular teacher was allowed to score the alternate assessment.
  • State or state-contracted scorer – This response category was coded when the scorer was someone who did not work at the student's school and served as a state agent in scoring the assessment, such as a test vendor staff member or an individual who served at a scoring "camp." Fifty-one percent of states (26 states) reported that a state or state-contracted scorer who did not work at the school was allowed to score the alternate assessment, reflecting a majority of the states.
  • Machine scored – This response category was coded when student performance was evaluated electronically and not by the teacher or any other individual. This differed from instances in which a machine did the final tabulation of results or applied formulas to the results of individual scoring interpretations. Six percent of states (3 states) reported that the alternate assessment was machine scored.
  • Paraprofessional – This response category was coded when aides or nonlicensed assistants were allowed to score the alternate assessment. Six percent of states (3 states) reported that a paraprofessional or aide was allowed to score the alternate assessment.

1 Throughout the text, alternate assessment based on alternate academic achievement standards is referred to as "alternate assessment."