Project Activities
This evaluation will use two cluster randomized controlled trials to determine whether the Indiana Diagnostic Assessment Intervention is meeting its intended aims of modifying teacher instruction and raising student test scores. It will also use non-experimental methods to explore the effect of implementation fidelity.
Structured Abstract
Setting
The evaluation will take place in public elementary schools across the state of Indiana.
Sample
The Indiana Department of Education expects that the Diagnostic Assessment Intervention will eventually be used with all K-8 students in the state. The two randomized controlled trials within this evaluation will include 100 schools that volunteered for early implementation of the intervention and have no prior experience with similar interventions. The secondary data analysis component of this evaluation will use assessment data on all K-8 students.
Intervention
Indiana began the roll-out of the Diagnostic Assessment program in the summer of 2008 by training teachers from more than 570 schools teaching some 230,000 K–8 students. Additional schools will volunteer to participate in the Diagnostic Assessments during each of the next several years, so that essentially all elementary schools and students statewide will be active participants by 2013–14. The program includes two sets of commercially available interim assessments that are to be used three or four times a year with each student. In grades K–2, teachers will use the mCLASS: Reading 3D (which incorporates the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) and running records) and the mCLASS: Math (which uses curriculum-based assessments and diagnostic interviews) from Wireless Generation. Both assessments run on personal digital assistants, and the data are synchronized daily with a classroom computer that maintains a connection to the vendor's server. In grades 3–8, an online assessment system called Acuity from CTB/McGraw-Hill is used, with multiple-choice tests in reading and math; results are synchronized automatically with the vendor's servers. Teachers are to access reports on each student to diagnose strengths and weaknesses and to adjust instruction accordingly. Several times per year, the Indiana Department of Education receives data on all participating students from the vendors' servers for research purposes.
Initial research
Two cluster randomized controlled trials will be used. In Year 1, 50 of the schools volunteering to use the Diagnostic Assessment Intervention (and with no prior history of using the intervention or highly similar products) will be randomly assigned to either the treatment group (30 schools) or the control group (20 schools). The treatment group will receive the intervention and use it throughout the Year 1 school year, while each control school continues its customary instructional regime; the control schools will adopt the intervention in the following school year. In Year 2, a second experiment will begin with another 50 schools and follow the same format. The Indiana Department of Education has agreed to provide individual student outcomes on the Indiana Statewide Testing for Educational Progress-Plus (ISTEP+) assessment, as well as data from the Wireless Generation and Acuity assessments. Detailed information on teacher instructional practices will be collected at grades 2 and 5 using teacher surveys and logs. Teacher use of the interim assessments and the generated reports will be obtained from the research data sets provided by the vendors to the Indiana Department of Education.
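A minimal sketch of the Year 1 school-level random assignment, assuming a plain list of volunteer school identifiers (the `school_ids` values, the function name, and the fixed seed are illustrative only; the abstract does not say whether assignment was blocked or stratified):

```python
import random

def assign_schools(school_ids, n_treatment=30, seed=None):
    """Randomly assign volunteer schools to treatment or control.

    Returns a dict mapping each school ID to "treatment" or "control".
    """
    rng = random.Random(seed)
    shuffled = list(school_ids)   # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    treatment = set(shuffled[:n_treatment])
    return {sid: ("treatment" if sid in treatment else "control")
            for sid in shuffled}

# Year 1: 50 volunteer schools split 30 treatment / 20 control.
schools = [f"school_{i:02d}" for i in range(1, 51)]
assignment = assign_schools(schools, n_treatment=30, seed=2008)
print(sum(status == "treatment" for status in assignment.values()))  # 30
```

Because assignment happens at the school (cluster) level, all students and teachers within a school share the same condition, which is why the outcome analyses described below use multilevel models.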
Control condition
Schools randomly assigned to the control group will continue their customary instructional regimes.
Key measures
The study will use grade 3–8 students' scores from the Indiana Statewide Testing for Educational Progress-Plus (ISTEP+) in reading and math, along with results from the Wireless Generation and Acuity interim assessments, to measure student achievement. Teachers will complete sections of the Surveys of Enacted Curriculum (SEC) online for English language arts and math, covering the content taught and their expectations for student learning of that content. Teachers will also complete instructional logs for each of eight students multiple times during the school year. Both the SEC and the logs will be used to measure teachers' instructional practices.
Data analytic strategy
The results from the experiments will be analyzed using multilevel statistical models (students nested within schools, teachers nested within schools, and, where feasible, students nested within teachers within schools) to assess the effects of the intervention. Multiple prior student scores, available from Indiana's longitudinal data systems, will be included in these analyses to increase the precision of the estimates.
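As a purely illustrative sketch of the simplest of these models (students nested within schools, with a random intercept for school and a prior score as a covariate), the following uses `statsmodels`; the data file and column names (`istep_score`, `treatment`, `prior_score`, `school_id`) are assumptions for this sketch, not the study's actual variables:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis file: one row per student, with the school-level
# treatment indicator and a prior ISTEP+ score as a covariate.
df = pd.read_csv("year1_analysis_file.csv")

# Two-level random-intercept model: students nested within schools.
# Including prior achievement increases the precision of the estimated
# treatment effect, as described in the data analytic strategy.
model = smf.mixedlm(
    "istep_score ~ treatment + prior_score",
    data=df,
    groups="school_id",
)
result = model.fit()
print(result.summary())  # the coefficient on `treatment` is the effect of interest
```

A three-level version (students within teachers within schools) would add a teacher-level variance component (for example, via the `vc_formula` argument), but the two-level form conveys the basic structure.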
Products and publications
Publications:
Journal article, monograph, or newsletter
Konstantopoulos, S., Li, W., Miller, S.R., and van der Ploeg, A. (2016). Effects of Interim Assessments Across the Achievement Distribution: Evidence From an Experiment. Educational and Psychological Measurement, 76 (4): 587-608.
Konstantopoulos, S., Li, W., Miller, S., and van der Ploeg, A. (2019). Using Quantile Regression to Estimate Intervention Effects Beyond the Mean. Educational and Psychological Measurement, 0013164419837321.
Konstantopoulos, S., Miller, S.R., and van der Ploeg, A. (2013). The Impact of Indiana's System of Interim Assessments on Mathematics and Reading Achievement. Educational Evaluation and Policy Analysis, 35 (4): 481-499.
Konstantopoulos, S., Miller, S.R., van der Ploeg, A. and Li, W. (2016). Effects of Interim Assessments on Student Achievement: Evidence from a Large-Scale Experiment. Journal of Research on Educational Effectiveness, 9 (1): 188-208.
Williams, R.T., Swanlund, A., Miller, S., Konstantopoulos, S., Eno, J., van der Ploeg, A., and Meyers, C. (2014). Measuring Instructional Differentiation in a Large-Scale Experiment. Educational and Psychological Measurement, 74 (2): 263-279.