Grant Closed

Center for Research on Evaluation, Standards, and Student Testing (CRESST)

NCER
Program: Education Research and Development Centers
Program topic(s): Assessment, Standards, and Accountability Research
Award amount: $9,968,718
Principal investigator: Eva Baker
Awardee:
University of California, Los Angeles
Year: 2005
Award period: 6 years (07/01/2005 - 06/30/2011)
Project type:
Development and Innovation, Efficacy, Measurement
Award number: R305A050004

Purpose

The National Center for Research on Evaluation, Standards, and Student Testing (CRESST) has been at the forefront of efforts to improve the quality of education and learning in America. Located within UCLA's Graduate School of Education & Information Studies, CRESST contributes to the development of scientifically based evaluation and testing techniques; encourages the development, validation, and use of sound data for improved accountability and decision making; and explores technological applications to improve assessment and evaluation practice, specifically for English learners. To achieve its goals, CRESST developed and evaluated POWERSOURCE, a formative assessment intervention for middle school (6th, 7th, and 8th grade) mathematics.

 

Project Activities

Focused program of research

Assessment of POWERSOURCE

Building on its own prior research, the Center developed POWERSOURCE, an assessment intervention emphasizing fundamental mathematical principles essential to student success in first year algebra. The assessments were designed to help students organize their mathematical knowledge into big ideas or schema. The intervention also included professional development to enable teachers to understand the POWERSOURCE assessments and their underlying mathematical principles, and to use them to improve teaching and learning.

In a randomized study involving seven school districts, more than 90 teachers, and over 4,000 students, Center researchers examined the impact of POWERSOURCE on student learning. Within participating schools, all sixth, seventh, and eighth grade mathematics teachers took part. A cohort of sixth grade students was followed from the beginning of sixth grade through the end of their eighth grade year. Three of the districts used a within-school design, in which random assignment was carried out within each school. The impact of POWERSOURCE on student learning was compared with that of schools' standard instruction and test preparation conditions. Participating schools were in Pennsylvania and Hawaii.
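The within-school design mentioned above can be illustrated with a short sketch: teachers are grouped by school and randomized to conditions inside each school, so every school contributes to both the treatment and the comparison condition. This is a generic illustration of within-school random assignment, not CRESST's actual procedure; the teacher and school identifiers are hypothetical.

```python
import random

def assign_within_school(teachers, seed=None):
    """Randomly assign teachers to conditions within each school.

    teachers: list of (teacher_id, school_id) pairs.
    Returns a dict mapping teacher_id to "treatment" or "control".
    """
    rng = random.Random(seed)

    # Group teachers by school so randomization happens per school.
    by_school = {}
    for teacher, school in teachers:
        by_school.setdefault(school, []).append(teacher)

    assignment = {}
    for school, roster in by_school.items():
        rng.shuffle(roster)           # random order within this school
        half = len(roster) // 2       # split the school's roster in two
        for t in roster[:half]:
            assignment[t] = "treatment"
        for t in roster[half:]:
            assignment[t] = "control"
    return assignment

# Hypothetical roster: four teachers in school A, two in school B.
roster = [("T1", "A"), ("T2", "A"), ("T3", "A"), ("T4", "A"),
          ("T5", "B"), ("T6", "B")]
groups = assign_within_school(roster, seed=0)
```

Because assignment is blocked by school, school-level differences (demographics, leadership, curriculum) are balanced across conditions by construction, which is the main appeal of this design over assigning whole schools.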

Assessment of English Language Learning Students: Review of Literature
The No Child Left Behind Act of 2001 had a significant impact on states' English language learner assessment policies. In particular, the legislation mandated that English language learning students be: (a) subject to annual assessments of English language development, and (b) included in annual state assessments and yearly progress performance targets. Many states, rushing to meet these federally mandated requirements, found that they lacked the expertise, time, or resources to systematically document or address many fundamental issues related to the assessment of English language learning students.

To provide guidelines for the appropriate development and use of English language learning assessments, the Center completed a review of the relevant research literature. This literature review: (a) examined issues related to the overall validity and reliability of the instruments; (b) reported the latest research findings, focusing on the construct validity of the assessments; and (c) reviewed issues in the assessment of content knowledge.

Providing Support to States to Improve the Assessment of English Language Learners: Test Validity, Designation, and Test Accommodation Use
Although the validity of assessments designed for English language learning students has been the subject of expert recommendations, relatively little rigorous research has examined the validity of existing instruments. This project aimed to: (a) help educators understand and improve the performance of English language learners by investigating the validity of their assessments, and (b) provide states with much-needed guidance to improve the validity of their assessments for English language learning students.

In partnership with three to four collaborating states, the Center analyzed available data and existing documents in order to: (a) understand the nature and validity of states' current English language learner assessment practices, (b) prioritize needs, and (c) provide preliminary recommendations for improving the validity of their assessments and decision making. In addition, the Center conducted targeted, experimental studies in two states in order to provide scientifically based evidence on which to base decisions about English language learner assessment practices and policies. By the end of the project, the Center developed, revised, and disseminated specific guidelines that states may use to guide their English language learner assessment policy and practice decisions. In addition, the Center made available a usable, generalized methodology and tools for examining critical assessment validity issues as they emerge.

National leadership and outreach activities

The CRESST website includes several resources for school districts, teachers, and parents. 

Key outcomes

  • There was no main treatment effect, although performance correlated substantially and positively with the pretest of the same construct.
  • POWERSOURCE required twelve class periods of classroom implementation and nine hours of professional development to show a statistically significant impact on student performance on the transfer measure.
  • Results indicated that students with higher pretest scores benefited more from the treatment than students with lower pretest scores. Further analysis showed that the intervention had a substantial positive effect for students already familiar with the algebraic operations, organizing and consolidating those students' understanding. The final-year results showed a main effect for 6th grade students.

People and institutions involved

IES program contact(s)

Allen Ruby

Project contributors

Joan Herman

Co-principal investigator

Robert L. Linn

Key Personnel

SubAwardee(s)

Harvard University

RAND Corporation

University of Colorado, Boulder

Products and publications

Project website:

https://seis.ucla.edu/research_centers/center-for-research-on-evaluation-and-student-testing-cresst/

Publications:

Book chapter

Baker, E.L. (2009). The Influence of Learning Research on the Design and Use of Assessment. In K.A. Ericsson (Ed.), Development of Professional Expertise: Toward Measurement of Expert Performance and Design of Optimal Learning Environments (pp. 333-355). New York: Cambridge University Press.

Baker, E.L., Chung, G.K.W.K., and Delacruz, G.C. (2008). Design and Validation of Technology-Based Performance Assessments. In J.M. Spector, M.D. Merrill, J.J.G. van Merriënboer, and M.P. Driscoll (Eds.), Handbook of Research on Educational Communications and Technology (3rd ed., pp. 595-604). Mahwah, NJ: Erlbaum.

Baker, E.L., Niemi, D., and Chung, G.K.W.K. (2008). Simulations and the Transfer of Problem Solving Knowledge and Skills. In E.L. Baker, J. Dickieson, W. Wulfeck, and H.F. O'Neil (Eds.), Assessment of Problem Solving Using Simulations (pp. 1-17). Mahwah, NJ: Erlbaum.

Chung, G.K.W.K., Baker, E.L., Delacruz, G.C., Bewley, W.L., Elmore, J., and Seely, B. (2008). A Computational Approach to Authoring Problem-Solving Assessments. In E.L. Baker, J. Dickieson, W. Wulfeck, and H.F. O'Neil (Eds.), Assessment of Problem Solving Using Simulations (pp. 289-307). Mahwah, NJ: Erlbaum.

Chung, G.K.W.K., O'Neil, H.F., Bewley, W.L., and Baker, E.L. (2008). Computer-Based Assessments to Support Distance Learning. In E. Klieme, J. Hartig, and A. Jurecka (Eds.), Assessment of Competencies in Educational Contexts (pp. 253-276). Göttingen, Germany: Hogrefe and Huber.

Journal article, monograph, or newsletter

Heritage, M., Kim, J., Vendlinski, T., and Herman, J. (2009). From Evidence to Action: A Seamless Process in Formative Assessment? Educational Measurement: Issues and Practice, 28(3): 24-31.

Wolf, M.K., Farnsworth, T., and Herman, J.L. (2008). Validity Issues in Assessing English Language Learners' Language Proficiency. Educational Assessment, 13(2): 80-107.

Nongovernment report, issue brief, or practice guide

Herman, J.L. (2007). Accountability and Assessment: Is Public Interest in K-12 Education Being Served? Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing.

Ruiz-Primo, M.A., Li, M., Tsai, S., and Schneider, J. (2008). Testing One Premise of Scientific Inquiry in Science Classrooms: A Study That Examines Students' Scientific Explanations. Los Angeles: University of California.

Vendlinski, T.P., Baker, E.L., and Niemi, D. (2008). Templates and Objects in Authoring Problem-Solving Assessments. Los Angeles: University of California.

Vendlinski, T.P., Howard, K.E., Hemberg, B.C., Vinyard, L., Martel, A., Kyriacou, E., Casper, J., Chai, Y., Phelan, J.C., and Baker, E.L. (2008). Using Data and Big Ideas: Teaching Distribution as an Instance of Repeated Addition. Los Angeles: University of California.

Wolf, M.K., Herman, J.L., Kim, J., Abedi, J., Leon, S., Griffin, N., Bachman, P.L., Chang, S.M., Farnsworth, T., Jung, H., Nollner, J., and Shin, H.W. (2008). Providing Validity Evidence to Improve the Assessment of English Language Learners. Los Angeles: University of California, CRESST.

Wolf, M.K., Kao, J., Griffin, N., Herman, J.L., Bachman, P.L., Chang, S.M., and Farnsworth, T. (2008). Issues in Assessing English Language Learners: English Language Proficiency Measures and Accommodation Uses—Practice Review. Los Angeles: University of California, CRESST.

Wolf, M.K., Kao, J., Herman, J.L., Bachman, L.F., Bailey, A.L., Bachman, P.L., Farnsworth, T., and Chang, S.M. (2008). Issues in Assessing English Language Learners: English Language Proficiency Measures and Accommodation Uses—Literature Review. Los Angeles: University of California, CRESST.

Proceeding

Encarnacao, A., Espinosa, P.D., Au, L., Chung, G.K.W.K., Johnson, L., and Kaiser, W.J. (2008). Individualized, Interactive Instruction (3I): An Online Formative Assessment and Instructional Tool. In Proceedings of the Annual Meeting of the American Society of Engineering Education (Session AC 2007-1524) (pp. 1-17). Honolulu, HI: American Society for Engineering Education.

Working paper

Phelan, J., Kang, T., Niemi, D.N., Vendlinski, T., and Choi, K. (2009). Some Aspects of the Technical Quality of Formative Assessments in Middle School Mathematics (CRESST 750). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing Working Paper.

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Data and Assessments, English Learners (EL), Mathematics, Policies and Standards


