Grant Closed

An Alternative Statewide Assessment Strategy that Uses Test Results to Support Learning and Includes Measures of Problem Solving

NCER
Program: Education Research Grants
Program topic(s): Cognition and Student Learning
Award amount: $2,097,419
Principal investigator: Faranak Rohani
Awardee: Florida State University
Year: 2011
Award period: 4 years (03/01/2011 - 02/28/2015)
Project type: Measurement
Award number: R305A110121

Purpose

Conventional statewide assessments, such as those used to meet No Child Left Behind mandates, can measure only a subset of competencies essential to the K-12 curriculum. These skills are primarily those that can be assessed through conventional paper-and-pencil tests. This limitation encourages schools and teachers to emphasize that subset, often at the expense of other skills that are harder to assess, including complex problem solving. An additional problem is that statewide assessments focus almost exclusively on generating summative data (i.e., what students already know) rather than formative information (i.e., what to focus on to improve teaching and learning). This focus may not support actual student growth. Research has established that learning most often occurs when assessments play a formative role instead of a summative one. The purpose of this research is to evaluate an alternate approach to assessing state-level standards that measures complex cognitive competencies, such as problem solving, and facilitates formative use of these assessments at the classroom level. The research, in part, may help to determine if separate complex assessments developed by teachers and an agency external to the classroom cross-validate each other.

Project Activities

The research team will work with middle school teachers, administrators, and other stakeholders first to develop standards for the assessments and then to develop materials that teachers can use to generate their own assessments aligned with those standards. The development process will be iterative. To determine whether teacher-generated assessments are as valid as those generated by external agents, the researchers will compare the teachers' assessments with their own and evaluate both for validity and consistency.

Structured Abstract

Setting

The research will take place in schools in the northern region of Florida that serve urban and rural populations.

Sample

Seventh-grade science teachers and their students will be the research subjects.

Intervention

The research team will develop an assessment strategy that includes three components: (a) a series of performance assessments of problem solving and other cognitively complex competencies that measure selected state-level benchmarks; (b) performance assessment "specifications" that define comparable measures for teachers to develop, linking teachers' classroom assessments to those administered statewide; and (c) information about using these performance assessments to generate both summative and formative data. The assessment strategy also includes training that helps teachers create cognitively complex assessments and use them to guide learning through effective formative feedback.

Research design and methods

Year one is devoted to developing and pilot-testing every element of the proposed assessment strategy with middle school students and teachers from two of the participating schools. Benchmarks from Florida's Next Generation Sunshine State Standards that address higher cognitive skills will be identified, and performance specifications and corresponding assessments will be developed for a subset of those benchmarks. Teacher training materials covering the use of the specifications to develop assessments and to provide effective formative feedback will also be developed and pilot-tested. The proposed research takes place in the context of middle school science instruction; however, the intent is to establish procedures that are useful at other grade levels and in other subject areas. Teachers, administrators, and other stakeholders will be involved throughout the project. Teachers will be trained to generate questions and administer tests, and their feedback will aid in the revision of the teacher training materials and the assessments. Other stakeholders will help to identify benchmarks and assessment specifications, and their feedback will also inform revisions.

Years two and three are designated for full development, administration, and analysis, during which the systems and procedures developed in year one will be implemented and the collected data analyzed. State review and external advisory teams will be involved at every significant juncture and their feedback incorporated.

Control condition

Results from performance assessments developed and administered by teachers will be compared with results from assessments developed and administered by the researchers.

Key measures

An evidence-centered design approach and other analyses of scores will be used to establish the comparability of assessments that were developed independently by teachers and the research team. Classroom observations and interviews of teachers and other educators will establish whether best-practice approaches to formative feedback can be employed and are perceived to be practical.
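
As a purely illustrative sketch (not the project's analysis plan), the snippet below shows one simple way score comparability could be examined: descriptive statistics and the correlation between scores the same students earn on a teacher-developed and a researcher-developed assessment. The variable names and values are hypothetical.

```python
import numpy as np

# Hypothetical paired scores for the same students on two independently
# developed assessments (values are illustrative, not project data).
teacher_form = np.array([12, 15, 9, 14, 11, 16, 10, 13], dtype=float)
researcher_form = np.array([13, 14, 10, 15, 10, 17, 9, 12], dtype=float)

# Descriptive statistics for each form.
for name, scores in [("teacher", teacher_form), ("researcher", researcher_form)]:
    print(f"{name} form: mean={scores.mean():.2f}, sd={scores.std(ddof=1):.2f}")

# Pearson correlation as one index of agreement between the two forms.
r = np.corrcoef(teacher_form, researcher_form)[0, 1]
print(f"correlation between forms: r={r:.2f}")

# Mean difference as a rough check on comparability of difficulty.
diff = np.mean(teacher_form - researcher_form)
print(f"mean difference (teacher - researcher): {diff:.2f}")
```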

Data analytic strategy

Analyses will establish whether performance on the assessments generalizes and whether the procedures are scalable to the state level. An evidence-centered design approach will be used to establish the competencies being measured, and analysis of variance techniques will be used to assess generalizability. Descriptive statistics and the Bookmark method will be used to compare score patterns on performance assessments administered to samples of students.
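
To make the ANOVA-based generalizability analysis concrete, the sketch below estimates variance components for a fully crossed students-by-tasks design from mean squares and forms a generalizability coefficient. It is a generic G-theory example with made-up scores, not the project's actual procedure or data.

```python
import numpy as np

# Hypothetical score matrix: rows = students (p), columns = tasks (t).
scores = np.array([
    [4.0, 3.0, 5.0, 4.0],
    [2.0, 2.0, 3.0, 3.0],
    [5.0, 4.0, 5.0, 5.0],
    [3.0, 3.0, 4.0, 2.0],
    [4.0, 5.0, 4.0, 4.0],
])
n_p, n_t = scores.shape

grand = scores.mean()
person_means = scores.mean(axis=1)
task_means = scores.mean(axis=0)

# Sums of squares for a fully crossed p x t design with one score per cell.
ss_p = n_t * np.sum((person_means - grand) ** 2)
ss_t = n_p * np.sum((task_means - grand) ** 2)
ss_total = np.sum((scores - grand) ** 2)
ss_pt = ss_total - ss_p - ss_t  # person-by-task interaction confounded with error

# Mean squares.
ms_p = ss_p / (n_p - 1)
ms_t = ss_t / (n_t - 1)
ms_pt = ss_pt / ((n_p - 1) * (n_t - 1))

# Variance component estimates from random-effects expected mean squares.
var_pt = ms_pt
var_p = max((ms_p - ms_pt) / n_t, 0.0)
var_t = max((ms_t - ms_pt) / n_p, 0.0)

# Relative generalizability coefficient for a mean over n_t tasks.
g_coef = var_p / (var_p + var_pt / n_t)
print(f"var_person={var_p:.3f}  var_task={var_t:.3f}  var_pt,e={var_pt:.3f}")
print(f"G coefficient ({n_t} tasks): {g_coef:.3f}")
```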

People and institutions involved

IES program contact(s)

Erin Higgins

Education Research Analyst
NCER

Project contributors

Janet Sanfilippo

Co-principal investigator

Products and publications

Products include an online data management and training system containing teaching resources and teacher training materials that help teachers design and administer their own assessments. In addition, the team will develop a scalability framework that addresses implementation of the performance-based assessment program at the state level. The researchers will also produce scholarly reports of findings.

Publications:

Journal article, monograph, or newsletter

Oosterhof, A. (2011). Upgrading high-stakes assessment. Better Evidence-based Education, 3(3), 20-21.

Sherdan, D., Anderson, A., Rouby, A., LaMee, A., Gilmer, P. J., & Oosterhof, A. (2014). Including often-missed knowledge and skills in science assessments. Science Scope, 38(1), 56-62.

Yang, Y., Oosterhof, A., & Xia, Y. (2015). Reliability of scores on the summative performance assessments. Journal of Educational Research, 108(6), 465-479.

Questions about this project?

For additional questions about this project or to provide feedback, please contact the program officer.

 

Tags

Cognition, Data and Assessments, Science, Students, Teaching
