IES Grant

Title: Innovative Computer-Based Formative Assessment via a Development, Delivery, Scoring, and Report-Generative System
Center: NCER
Year: 2012
Principal Investigator: Wilson, Mark
Awardee: University of California, Berkeley
Program: Science, Technology, Engineering, and Mathematics (STEM) Education
Award Period: 4 years (3/1/2012 – 2/29/2016)
Award Amount: $1,426,540
Type: Measurement
Award Number: R305A120217
Description:

Co-Principal Investigator: Richard Lehrer, Vanderbilt University

Purpose: Formative assessments, when used correctly and consistently, can improve student learning. Furthermore, computerized formative assessments can provide teachers with rich diagnostic information about students' learning in a timely manner, allowing them to plan instruction accordingly. Many online assessment products currently exist, but these products typically do not take students' learning progressions into consideration, incorporate complex item formats that evaluate constructed responses, or include sophisticated item response modeling techniques. To address this need, the current study will develop and validate a computerized formative assessment system focused on statistics and modeling to improve the quality and usefulness of the assessment data provided to teachers.

Project Activities: The researchers will develop and validate a set of new computer-based online assessments, embedded in the Assessing Data Modeling and Statistical Reasoning curriculum, that are designed to provide diagnostic information about middle school students' learning of statistics and modeling. After an initial iterative development process, the team will complete a series of studies to determine the reliability and validity of the new online assessment system.

Products: Products include a fully developed and validated computerized formative assessment system focusing on statistics and modeling in middle school, a website for teachers to access the materials, and peer-reviewed publications.

Structured Abstract

Setting: The study will be conducted in urban, suburban, and rural middle schools in Arkansas.

Sample: The study will include a diverse sample of 9 middle school teachers and approximately 450 of their students.

Assessment: The researchers will develop and validate computer-based formative assessments for six constructs from the Assessing Data Modeling and Statistical Reasoning (ADM) curriculum that together constitute a learning progression for statistics and modeling from late elementary through the middle school grades. The six constructs are: data display; meta-representational competence; conceptions of statistics; chance; modeling variability; and informal inference. The assessments will include computerized modifications of existing paper-and-pencil tasks, along with new computer-based assessment tasks (e.g., card-sort problems, interactive graph problems). Student performance on the paper-and-pencil and computer-based tasks will be compared at both the task and test levels to ensure that the two formats yield comparable results.

Research Design and Methods: The assessments will undergo an iterative process of development and refinement. The researchers will focus on: (1) refining the Berkeley Evaluation and Assessment Research Software Environment (BEAR SE) to support the development, calibration, use, and training of the new assessment system; (2) embedding the relevant ADM curriculum materials into BEAR SE, constructing computer-deliverable and computer-scorable equivalents of the current paper-and-pencil items, and developing new computerized reports and support materials for teachers; and (3) investigating the usefulness of the new software in the context of the ADM curriculum. Nine teachers familiar with the ADM curriculum will be selected to participate in the assessment development activities. The first usability and validity study of the computer-based assessments and software tools will be conducted with curriculum developers, teachers, and students. In this phase, the researchers will begin to gather evidence of reliability and validity, including by interviewing students as they work with items in BEAR SE. Once revisions to the new ADM technology enhancements and online formative assessments are complete, the system will be re-tested with 9 teachers and their students (approximately 450 students) to assess its reliability and validity.

Control Condition: Due to the nature of the research design, there is no control condition.

Key Measures: The key measures for the study include students' responses to the assessments, as well as student and teacher interviews, observations, and questionnaire responses.

Data Analytic Strategy: Analyses of reliability and validity will be carried out using item response modeling of student data with the Multidimensional Random Coefficients Multinomial Logit (MRCML) model. Wright maps of the item response levels will be produced, along with differential item functioning analyses by gender and ethnicity. The relationships among the constructs that make up the learning progression will be examined by estimating the associations between the constructs and by comparing the overall difficulty of the achievement levels across the different constructs.
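For reference, a sketch of the general form of the MRCML model is given below. The notation is illustrative only; the specific scoring and design matrices that the researchers will define for the six ADM constructs are not described in this abstract.

P(X_{ik} = 1 \mid \boldsymbol{\theta}) =
  \frac{\exp\left( \mathbf{b}_{ik}^{\top} \boldsymbol{\theta} + \mathbf{a}_{ik}^{\top} \boldsymbol{\xi} \right)}
       {\sum_{j=1}^{K_i} \exp\left( \mathbf{b}_{ij}^{\top} \boldsymbol{\theta} + \mathbf{a}_{ij}^{\top} \boldsymbol{\xi} \right)}

where X_{ik} indicates a response in category k of item i, \boldsymbol{\theta} is the vector of latent proficiencies (one dimension per construct in the learning progression), \boldsymbol{\xi} is the vector of item parameters, \mathbf{b}_{ik} is the scoring vector that maps category k of item i onto the latent dimensions, and \mathbf{a}_{ik} is the design vector that selects the item parameters governing that category.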

Related IES Projects: Assessing Data Modeling and Statistical Reasoning (R305K060091) and Data Modeling Supports the Development of Statistical Reasoning (R305A110685)

Products and Publications

Book chapter

Lehrer, R., Kim, M.-J., Ayers, E., and Wilson, M. (2014). Toward Establishing a Learning Progression to Support the Development of Statistical Reasoning. In A. Maloney, J. Confrey, and K. Nguyen (Eds.), Learning Over Time: Learning Trajectories in Mathematics Education (pp. 31–60). Charlotte, NC: Information Age Publishing.

Journal article, monograph, or newsletter

Fisher, W. P., Jr., and Wilson, M. (2015). Building a Productive Trading Zone in Educational Assessment Research and Practice. Pensamiento Educativo: Revista de Investigacion Educacional Latinoamericana, 52 (2): 55–78.

Irribarra, D.T., Freund, R., Fisher, W., and Wilson, M. (2015). Metrological Traceability in Education: A Practical Online System for Measuring and Managing Middle School Mathematics Instruction. Journal of Physics: Conference Series, 588.

Wilson, M., Mari, L., Maul, A., and Torres Irribarra, D. (2015). A Comparison of Measurement Concepts Across Physical Science and Social Science Domains: Instrument Design, Calibration, and Measurement. Journal of Physics: Conference Series, 588.

