IES Blog

Institute of Education Sciences

Working Toward a Successful National Data Collection: The ECLS Field Test

The National Center for Education Statistics (NCES) conducts some of the most complex education surveys in the world, and we work hard to make these surveys as effective and efficient as possible. One way we make sure our surveys are successful is by conducting multiple tests before we fully launch a national data collection.

Even before a field test, NCES develops survey materials and procedures using much smaller-scale cognitive laboratory testing and focus groups. These early development steps help ensure that materials are clear and procedures are understood before we conduct field testing with larger and more representative groups of respondents. We then launch field tests to check data-collection operations and survey processes and procedures. Field tests are small-scale surveys that include a range of respondents and are designed to try out the survey questionnaires and administration procedures in a real-world situation prior to the launch of a major study. Their results allow us to make any necessary adjustments before starting the national data collection. Field tests also allow us to evaluate specific survey items and ensure that they are valid and reliable. Without a field test, we would risk spending the public’s time and money on large data-collection efforts that do not produce the intended information.

NCES is about to begin the Early Childhood Longitudinal Study, Kindergarten Class of 2022–23 (ECLS-K:2023) with a field test early this year. The ECLS-K:2023 will focus on children’s early school experiences, beginning with preschool and continuing through fifth grade. From the spring of 2022 through the spring of 2028, we will collect national study data from children and their parents, teachers, and school administrators to answer questions about children’s early learning and development, transition into kindergarten and beyond, and experiences in the elementary grades. 

Although the ECLS-K:2023 will be similar in many ways to prior ECLS kindergarten studies, we are adding a round of data collection prior to the children’s kindergarten year—the national spring 2022 preschool round. For this preschool survey, we’ll send an invitation to participate to a sample of residential addresses within selected areas of the United States. Potential participants will first be asked to fill out a brief screener questionnaire. If they report that an ECLS-eligible child is in the household, they will be asked additional important questions about early childhood topics, such as their child’s literacy, language, math, and social skills; activities done with the child in the home (e.g., singing songs, playing games, reading); and characteristics of any early care and education (i.e., child care) arrangements for the child.   

Because the ECLS-K:2023 preschool data need to be comprehensive and reliable so that they can inform public discussions and policies related to early elementary education, it’s crucial that we test our procedures and questions for this new preschool round by conducting a field test in early 2020.  

If you receive a letter about participating in the 2020 ECLS field test, you have been selected to represent thousands of households like yours and to provide NCES with the data we need to decide how best to conduct the ECLS-K:2023. The participation of all the selected households who receive our mailings, even those without children, is essential for a successful field test and, ultimately, a successful ECLS-K:2023.

If you are selected for the ECLS field test and have any questions about participating, please visit the participant information page.

For more information on the ECLS-K:2023 or its 2020 field test, please email the ECLS study team.

For information about other ECLS program studies, please visit https://nces.ed.gov/ecls/.

 

By Jill Carlivati McCarroll

Learning from CTE Research Partnerships: How Michigan Built Trust with Researchers to Better Understand State Data

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Austin Estes, Senior Policy Associate at Advance CTE, and Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The first interview was with Jill Kroll of the Michigan Department of Education and Dan Kreisman of Georgia State University (and Director of CTEx). [Note: this interview has been edited for length; you can find the full interview transcript here].

 

Jill Kroll, Michigan Department of Education
Dan Kreisman, Georgia State University

 

The first question we have is about the projects that you work on together: what were some of the research questions you came up with, and how did you come to settle on those research questions?

Jill – I first connected with Dan and with Brian Jacob at the University of Michigan when I saw Brian present to our P-20 council about some research that he was doing connecting the wage record data for five community colleges. I was like “Gee, is there any way you can do something similar with the statewide secondary student data?” And he said it was possible. So I worked within our department procedures to find out how we could go about establishing a relationship that would allow this opportunity.

Dan – That led to a whole bunch of other discussions of things that we thought were interesting. So, to say that there is a set of research questions is not the way I view our relationship. We talk with folks in Jill’s office regularly to hear what questions are pressing for them, and then we try to help facilitate answering those and then see where those lead us. I think one of the important things is we try to think about where there are policy levers, so we want to say “If we answer this question, how can the state or the districts use that information to further their mission of providing CTE programming to students in Michigan?”

Jill – I’ve been really happy with the extent to which Dan and the research team have consistently focused on the “so what?” Rather than focusing on vague research questions of interest only to other researchers, they have emphasized their interest in doing research that has practical application, that can be used by educators in the field.

Could you share an example of how you’ve been able to use some of this evidence and research to change policy, or at least to shape your understanding on some decisions that you’re making at the state level?

Jill - When we were starting to work on our Perkins V [the Strengthening Career and Technical Education for the 21st Century Act] state plan, we had a short time to determine what we wanted to consider for our secondary indicator of program quality. Because Brian, Dan, and their students had been working with this data for so many years, they had the capacity to very quickly do the matching and come up with an approximation for us about what postsecondary credit attainment would look like, and what strengths and weaknesses they saw in the data. It would have been really difficult for our office, or even multiple state agencies, to have been able to work that quickly and give it the critical analysis that they did.

The other thing they did when we were making the decision for that indicator was to look at the data we had for work-based learning and tell us what could be done with it. What came out of that was that the data was not in any form that could be analyzed (text and PDFs). This was really revealing to our State Director Brian Pyles, and it led him to set a policy that we are going to build a consistent way of collecting data on work-based learning. So that is another piece where it influenced practice and policy. One of the most exciting and valuable things that I find about the partnership is that Dan and the other researchers have a lot more capacity to analyze the data in a way that we just don’t have the time to do. Sometimes we don’t have the expertise, and sometimes we just don’t look at the data in the same way.

Dan – And there’s a flip side: without their input, we often are looking at data and can’t make heads or tails of something. And we can get on the phone or write an email to someone over there and say “Hey we’re seeing this thing. Can you tell me what that means?” And they will come back with “Oh, the system changed” or “There was this one policy,” and “Here’s what you have to do to make it fit everything else.” And this happens all the time. We would be completely lost without this open channel that we have to their office.

I think it’s important not to dismiss the power of good descriptive work. Lots of times, the questions that states are grappling with can be illuminated with some really careful, good descriptive work. If you step back for a minute, you can say, “This is what we’re seeing; this is the big picture,” and that information has often been as valuable as the more causally oriented research we try to do.

Jill – I agree, and I want to follow up on the whole issue of how important trust is. I cannot emphasize enough how important it is to me that Dan and the other researchers come to us with those questions, that they check in with us. That’s absolutely critical. Anyone who works with any kind of data knows that it’s just so complex. If you link tables wrong, or misunderstand a data field, you can come to a completely wrong decision. So that communication and that interaction and trust are key to accurate outcomes.

As you’re both looking ahead, what’s next on the agenda? What are some of the research questions and priorities you have for this partnership?

Dan – Number one is tracking students into the labor market. That’s our biggest outstanding question: the degree to which CTE programs are preparing students for college, the labor market, and careers. In terms of other projects, one of the things we’re interested in is technical assessments. We’re also part of a consortium of several states – that’s the CTEx group. We meet together annually, and that allows us to harmonize things across states to see how trends are similar, how enrollment rates work, all sorts of different questions across multiple states.

Jill – One of the things we’re talking about right now is that we don’t have, in an accessible form, data on access to a particular program. We know that career centers serve certain districts, but if someone asked, “If student A is going to Central High School, what programs do they have access to?” we don’t have a good way of answering that at the moment. We’ve had a couple of discussions about how we can work together to basically build a dataset that clarifies that. That would be mutually beneficial, and it would take resources from both of us to do something like that.

Thinking back on this partnership, is there any advice you would give to other State Directors or CTE researchers?

Dan – Building a strong relationship is the first thing you have to do. And part of that is spending time face to face talking about questions, moving around ideas, looking at data together. We had the benefit of a long windup period. We spent at least a year just talking about questions and putting together data before we even started doing any analyses. We also had buy-in from Jill’s office up and down the line from folks who were doing the research to people who were in policymaking roles. And without all of that, none of this would even have been possible.

And the second part is to not downplay the value of just providing good information. A lot of us on the research side don’t realize how little time folks in the state offices have to take a step back and say, “What’s going on with our data? Let’s look at the big picture.” One of the things we can provide is that big picture, handed to them in a digestible way. Doing that is a really good way to start building that trust: they see the value of what you can do early on, and then you can start to get into more difficult or longer-term questions.

Jill – The first advice I would give is: Do it! Partner with researchers. I can’t say enough positive about it. The second is: Follow department procedures and be transparent with department leadership. You know that windup might be really, really slow while you jog through the channels that you need to in your department to do things by the book, but I think it pays off in the long run.

My third one is: Be transparent and open with school districts. Share what you’re doing and invite their input. As anybody who works with state data probably knows, you’re always a little hesitant about what the public will think about a given use of data. The way that Dan and the postdocs and graduate students have openly shared their work with our CTE administrators has really helped; I have not heard any doubts from districts.

The full transcript can be accessed in Advance CTE’s Learning that Works Resource Center. Other blog posts in this series can be viewed here.

New Report Highlights Progress and Challenges in U.S. High School Dropout and Completion Rates

A new NCES report has some good news about overall high school dropout and completion rates, but it also highlights some areas of concern.

Using a broad range of data, the recently released Trends in High School Dropout and Completion Rates in the United States report shows that the educational attainment of young adults has risen in recent decades. The public high school graduation rate is up, and the status dropout rate (the percentage of 16- to 24-year-olds who are not enrolled in school and have not completed high school) is down. Despite these encouraging trends, there are significant disparities in educational attainment among young adults in the United States. The report shines new light on these disparities by analyzing detailed data from the U.S. Census Bureau.

For large population groups, the report provides status dropout rates calculated using annual data from the American Community Survey (ACS), administered by the U.S. Census Bureau. For example, in 2017, some 5.4 percent of 16- to 24-year-olds were not enrolled in school and lacked a high school diploma or equivalent credential.

For smaller population groups, there are not enough ACS respondents during any given year to allow for precise and reliable estimates of the high school status dropout rate. For these demographic subgroups, NCES pools the data from 5 years of the ACS in order to obtain enough respondents to accurately describe patterns in the dropout rate.
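To make the effect of pooling concrete, here is a minimal Python sketch with hypothetical counts. It is not the Census Bureau's actual estimation procedure (the real 5-year estimates are built from pooled, weighted microdata with replicate-weight variance estimation), but it shows why combining 5 years of respondents shrinks the standard error and can bring a small-group estimate within NCES reporting standards:

    # Hypothetical numbers for illustration only. Real ACS estimates use
    # survey weights and replicate-weight variance estimation, not the
    # simple-random-sample formulas below.

    def rate_se_cv(dropouts, respondents):
        """Estimated proportion, its standard error, and its coefficient
        of variation (CV = SE / estimate) under simple random sampling."""
        p = dropouts / respondents
        se = (p * (1 - p) / respondents) ** 0.5
        return p, se, se / p

    # One year of data for a very small subgroup: the CV exceeds the
    # 50 percent reporting standard, so the estimate could not be shown.
    p1, se1, cv1 = rate_se_cv(dropouts=3, respondents=80)

    # Five pooled years: about 5x the respondents, so the standard error
    # shrinks by about sqrt(5) and the CV clears the reporting standard.
    p5, se5, cv5 = rate_se_cv(dropouts=3 * 5, respondents=80 * 5)

    print(f"1-year: rate = {p1:.1%}, SE = {se1:.1%}, CV = {cv1:.0%}")
    print(f"5-year: rate = {p5:.1%}, SE = {se5:.1%}, CV = {cv5:.0%}")

With roughly 5 times the respondents, the standard error falls by about the square root of 5, which is why the subgroup and state estimates shown below rely on the pooled 2013–2017 data.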

For example, while the overall status dropout rate for Asian 16- to 24-year-olds was below the national average in 2017, the rates for specific subgroups of Asian young adults varied widely. Based on 5 years of ACS data, high school status dropout rates among Asian 16- to 24-year-olds ranged from 1.1 percent for individuals of Korean descent to 23.2 percent for individuals of Burmese descent. These rates represent the “average” status dropout rate for the period from 2013 to 2017. They offer greater precision than the 1-year estimates, but the 5-year time span might make them difficult to interpret at first glance. 

Figure 1. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by selected Asian subgroups: 2013–2017

‡ Reporting standards not met. Either there are too few cases for a reliable estimate or the coefficient of variation (CV) is 50 percent or greater.
Error bars represent 95 percent confidence intervals: if the estimation procedure were repeated many times, 95 percent of the calculated confidence intervals would contain the true status dropout rate for the population group.
NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations. Asian subgroups exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017.

The 5-year ACS data can also be used to describe status dropout rates for smaller geographic areas with more precision than the annual ACS data. For example, the average 2013–2017 status dropout rates ranged from 3.8 percent in Massachusetts to 9.6 percent in Louisiana. The 5-year ACS data allowed us to calculate more accurate status dropout rates for each state and, in many cases, for racial/ethnic subgroups within the state. Access the complete state-level dropout rates by race/ethnicity here.
 
Figure 2. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by state: 2013–2017

NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017. See table 2.3.

For more information about high school dropout and completion rates, check out the recently released Trends in High School Dropout and Completion Rates in the United States report. For more information about the 5-year ACS datasets, visit https://www.census.gov/programs-surveys/acs/guidance/estimates.html.

 

By Joel McFarland