Search Results: (1-12 of 12 records)
|Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Assessment Item Level File (ILF), Read Me
This ReadMe provides guidance and documentation for users of the Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Assessment Item Level File (ILF) (NCES 2023-014), made available to researchers under a restricted-use license. Other supporting documentation includes MGLS_Math_and_Reading_Items_User_Guide.xlsx, MGLS_MS1_Math_Item_Images.pdf, MGLS_MS2_Math_Item_Images.pdf, MGLS_MS1_MS2_Reading_Sample_Item_Type_Images.pdf, MGLS_MS1_MS2_EF_HeartsFlowers_Instructions.pptx, and MGLS_MS2_EF_Spatial_2-back_Instructions.pptx.
|MGLS 2017 Assessment Item Level Files (ILF)
The Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) measured student achievement in mathematics and reading along with executive function. The MGLS:2017 ILF contains the item-level data from these direct measures, which can be used in psychometric research to replicate or enhance the scoring found in the MGLS:2017 RUF or to create new scores. The ILF contains two .csv files representing the two rounds of data collection: the Main Study (MS) Base Year (MS1) file and the Main Study Follow-up (MS2) file.
|Variability in Pretest-Posttest Correlation Coefficients by Student Achievement Level
State assessments are increasingly used as outcome measures for education evaluations. The scaling of state assessments produces variability in measurement error, with the conditional standard error of measurement increasing as average student ability moves toward the tails of the achievement distribution. This report examines the variability in pretest-posttest correlation coefficients of state assessment data for samples of low-performing, average-performing, and proficient students to illustrate how sample characteristics (including the measurement error of observed scores) affect pretest-posttest correlation coefficients. As an application, this report highlights how statistical power can be attenuated when correlation coefficients vary according to sample characteristics. Achievement data from four states and two large districts in both English/Language Arts and Mathematics for three recent years are examined. The results confirm that pretest-posttest correlation coefficients are smaller for samples of low performers, reducing statistical power for impact studies. Substantial variation across state assessments was also found. These findings suggest that it may be useful to assess the pretest-posttest correlation coefficients of state assessments for an intervention’s target population during the planning phase of a study.
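The attenuation the report describes can be illustrated with a small simulation. The sketch below is hypothetical and not the report's data or model: it assumes a true pretest-posttest correlation, a conditional standard error of measurement (CSEM) that grows toward the tails of the ability distribution, and a low-performing subsample defined as the bottom quartile of observed pretest scores.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration (not the report's actual data or model):
# latent pretest ability, with posttest ability correlated 0.9 with it.
n = 100_000
theta = rng.normal(0, 1, n)
post_true = 0.9 * theta + rng.normal(0, np.sqrt(1 - 0.9**2), n)

# Conditional standard error of measurement (CSEM) grows toward the
# tails of the achievement distribution, as the abstract describes.
csem = 0.3 + 0.2 * theta**2
pre_obs = theta + rng.normal(0, 1, n) * csem
post_obs = post_true + rng.normal(0, 1, n) * csem

full_r = np.corrcoef(pre_obs, post_obs)[0, 1]

# Restrict to a low-performing sample: bottom quartile on observed pretest.
low = pre_obs < np.quantile(pre_obs, 0.25)
low_r = np.corrcoef(pre_obs[low], post_obs[low])[0, 1]

print(f"full-sample r = {full_r:.2f}, low-performer r = {low_r:.2f}")
```

Restriction of range plus larger measurement error in the tail both shrink the pretest-posttest correlation for the low-performing subsample, which is the mechanism behind the reduced statistical power the report documents.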
|Education Longitudinal Study of 2002 (ELS:2002) Base-Year to Second Follow-up Data File Documentation (including Field Test report)
The Data File Documentation reports on the procedures and methodologies employed during the Education Longitudinal Study of 2002 base year, first, and second follow-ups, with special emphasis on the second follow-up (2006). The document is designed to provide guidance for users of the restricted-use data as released in Electronic Codebook (ECB) format (NCES 2008346) and also contains information that will be valuable to users of the public-use Data Analysis System (DAS). Included in the documentation are the following: an overview of the study and its predecessor studies; an account of instrumentation and high school records (academic transcript) collection; documentation of the sample design, weighting, imputation, and design effects; a summary of the data collection methodology and results, including detailed response rates; a description of data preparation and processing activities, including linkage to ancillary external data sets and creation of composite variables; and an overview of data file structure and contents. Numerous appendices supply a glossary of terms and a synopsis of the field test, and cover topics such as cross-cohort comparison and occupational coding scheme crosswalks.
The Field Test report for the ELS:2002/04 second follow-up data collection can be found in Appendix C of the second follow-up data file documentation manual.
|Trends in International Mathematics and Science Study (TIMSS) 2003 Nonresponse Bias Analysis
This technical report explores the extent of potential bias introduced into the U.S. TIMSS study through nonresponse on the part of schools. Data from the third cycle of TIMSS, conducted in April–June 2003, are the basis for the analyses. The analyses compare selected characteristics likely to reflect participation bias between participating and nonparticipating schools. Two forms of analysis were undertaken: a test of the independence of each school characteristic and participation status, and a logistic regression examining the conditional independence of selected school characteristics as predictors of participation. The investigation into nonresponse bias at the school level for the U.S. TIMSS 2003 samples for grades 4 and 8 shows that no statistically significant relationship was detected between participation status and the majority of the school characteristics available for analysis.
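The first analysis the abstract names, a test of independence between a school characteristic and participation status, amounts to a chi-square test on a contingency table. The counts below are invented for illustration and are not the TIMSS data:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts (not the TIMSS data): schools cross-classified by
# one characteristic (e.g., urban vs. rural) and participation status.
table = np.array([[120, 30],   # urban: participated, did not participate
                  [140, 40]])  # rural: participated, did not participate

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# A large p-value gives no evidence of an association between the
# characteristic and participation, the pattern the report found for
# most school characteristics.
```

For a 2x2 table, `chi2_contingency` applies the Yates continuity correction by default; the logistic-regression step in the report would instead model participation status on several characteristics jointly.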
|The Measurement of Instructional Background Indicators: Cognitive Laboratory Investigations of the Responses of Fourth and Eighth Grade Students and Teachers to Questionnaire Items
To improve the National Assessment of Educational Progress (NAEP), cognitive lab interviews were conducted with 4th- and 8th-grade students and their teachers to evaluate the validity of responses to background and instructional questions. Frequent discrepancies were found between students' and teachers' responses to the same questions. Causes are analyzed, and the paper contains recommendations for collecting more valid information.
|The Measurement of Home Background Indicators
This report describes cognitive laboratory investigations of how 4th- and 8th-grade students respond to home background questions, and whether they know the information being requested.
|Measurement Error Studies at the National Center for Education Statistics
This report focuses on illustrating an important part of NCES' commitment to the evaluation of the quality of its survey data through systematic, ongoing efforts to monitor the components of error in its data products, making after-the-fact corrections as necessary, and constantly improving the survey process designs to eliminate errors before they occur. It reviews a sample of past and current measurement error studies conducted by NCES and summarizes the results of each study, drawing upon relevant NCES publications.
|High School and Beyond Third Follow-Up (1986) Sample Design Report
The High School and Beyond third follow-up survey was conducted during the spring of 1986. This report fully documents major technical aspects of the third follow-up sample selection and implementation, describes the weighting procedures, examines the possible impact of nonresponse on sample estimates, and evaluates the precision of estimates derived from the sample.
|The National Longitudinal Study of the High School Class of 1972 (NLS-72) Fifth Follow-Up (1986) Sample Design Report
This report is the methodology report for the 1986 follow-up of the National Longitudinal Study of the High School Class of 1972 (NLS-72). Chapter 2 summarizes the base year sample selection procedures and describes in detail the selection procedures for the fifth follow-up survey. Chapter 3 describes the calculation of sample case weights that adjust for differential probabilities of selection and for nonresponse within weighting cells. In order to provide full technical information, the nonresponse adjustment factors for all weighting cells are included in appendix A. Chapter 4 examines the possible impact of survey nonresponse, a potential source of bias. The amount of bias depends on the proportion of nonrespondents and the magnitude of any difference between respondents and nonrespondents on variables of interest. Chapter 4 presents a description of nonresponse rates among various subclasses of the fifth follow-up sample. Chapter 5 describes procedures for computing sampling errors and design effects. The NLS-72 sample, because it is a clustered, stratified, and disproportionately allocated sample, presents some special difficulties in estimating actual sampling errors. Chapter 5 discusses the approach NORC has taken to this problem. Sampling errors and design effects are presented for a set of proportions for the entire sample and for important domains or subgroups. Finally, several "rules of thumb" are offered for estimating standard errors under various circumstances.
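The design effects discussed in Chapter 5 capture how much clustering inflates the variance of an estimate relative to a simple random sample of the same size. The sketch below is a generic illustration under assumed parameters, not the NLS-72 design: it compares the Kish approximation DEFF = 1 + (m - 1) * rho against an empirical ratio from simulated clustered data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch (not NLS-72 data): a clustered sample of schools,
# each contributing m students, with intraclass correlation rho.
n_clusters, m, rho = 200, 25, 0.05
school_effect = rng.normal(0, np.sqrt(rho), n_clusters)
y = school_effect[:, None] + rng.normal(0, np.sqrt(1 - rho), (n_clusters, m))

# Kish approximation: the design effect for an equal-size cluster sample,
# i.e., the factor by which clustering inflates the variance of the mean.
deff_approx = 1 + (m - 1) * rho

# Empirical check: variance of the clustered-sample mean vs. the variance
# a simple random sample of n_clusters * m students would give.
srs_var = y.var(ddof=1) / (n_clusters * m)
clust_var = y.mean(axis=1).var(ddof=1) / n_clusters
print(f"approx DEFF = {deff_approx:.2f}, empirical = {clust_var / srs_var:.2f}")
```

A rule of thumb of the kind the report offers follows directly: multiply the simple-random-sample standard error by the square root of the design effect.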
|Quality of Responses of High School Students to Questionnaire Items, High School and Beyond
This methodological report describes the quality of responses of high school students to High School and Beyond questionnaire items in terms of reliability and validity.
|National Longitudinal Study Of The High School Class Of 1972 Sample Design Efficiency Study: Effects Of Stratification, Clustering, And Unequal Weighting On The Variances Of NLS Statistics
This report describes the effects of stratification, clustering, and unequal weighting on the variances of statistics from the National Longitudinal Study of the High School Class of 1972 (NLS-72).