IES Blog

Institute of Education Sciences

The High School and Beyond Midlife Study

Over the years, NCES has conducted several longitudinal studies that collect information on a representative cohort of high school students and follow the students’ outcomes through postsecondary education and/or entry into the workforce. These studies have led to important research on the educational trajectories of young adults.

But what happens after that? A recent data collection provides some answers by following up with survey participants later in life.

In 2014–15, the High School and Beyond (HS&B) Midlife Study collected information from a cohort of individuals in their early- to mid-50s, all of whom had first completed an HS&B survey in 1980 when they were in high school. By linking high school survey data with information collected 35 years later, this new collection offers an exciting opportunity to conduct research on the long-term outcomes of education.

Some preliminary research using the HS&B Midlife Study shows that high school and college experiences continue to play important roles in individuals’ lives into midlife.

 

Education (Grodsky and Doren 2015)

  • Between the ages of 28 and 50, a majority of cohort members (61 percent) enrolled in some form of formal education, and many earned higher-level degrees in the process. By age 50,
     
    • 12 percent had earned a master’s, graduate, or professional degree, compared with 4 percent at age 28.
       
    • 36 percent had earned a bachelor’s or graduate degree, compared with 27 percent at age 28.
       
    • 36 percent had earned only a high school diploma or less, compared with 54 percent at age 28.
       
  • Gaps in educational attainment by gender, race/ethnicity, and parental education observed in early adulthood remained largely unchanged in midlife, with a notable exception:
     
    • A higher proportion of cohort members whose parents had higher levels of education enrolled in graduate school between the ages of 28 and 50, which may be related to high school academic achievement (e.g., grades, test scores).

 

Labor Force Participation (Bosky 2019)

  • Men and women who took college preparatory math coursework in high school (i.e., Algebra II or higher) had lower unemployment at midlife, even after controlling for whether they completed a bachelor’s degree. In addition,
     
    • Women who earned higher GPAs were employed at higher rates.
       
    • Men who scored higher on math achievement tests were employed at higher rates.
       
  • At midlife, the percentage of workers who held jobs with low pay and/or no health or retirement benefits was higher for women than for men, even among workers with similar levels of educational attainment. This gender gap was smaller among people who had taken advanced math coursework in high school (i.e., Algebra II or above).
     
  • Across levels of education, higher percentages of women than men experienced economic insecurity at midlife, as indicated by their perceived ability to pay for a large unexpected expense in the near-term. The percentage of women experiencing midlife economic insecurity was lower for those with a college degree than for those without a college degree. Also,
     
    • For people without a college degree, higher math achievement test scores were associated with lower rates of economic insecurity, even after controlling for work, health, and family characteristics at midlife.
       
    • A lower percentage of women who had taken college preparatory math coursework in high school were economically insecure at midlife, regardless of whether they had completed a bachelor’s degree.
       
    • A lower percentage of married women than unmarried women were economically insecure. This gap was largest among women without a college degree.

 

Health

  • Adolescents who took more advanced coursework in high school reported better health and physical functioning at midlife (Carroll et al. 2017).
     
  • Earning a bachelor’s degree by age 28 predicted body weight at midlife, although the relationship differed by sex (Pattison 2019).
     
  • Mortality risk was higher among the following groups (Warren et al. 2017):
     
    • People who had not taken college preparatory math coursework in high school.
       
    • People with more frequent absences from high school.
       

Survey data from the HS&B Midlife Study are now available for researchers. To protect the privacy of survey respondents, the dataset is available only to researchers who hold a restricted-use data license. For more information about the survey, visit https://sites.utexas.edu/hsb/, and for more information on the restricted-use data program, visit https://nces.ed.gov/pubsearch/licenses.asp.

 

Funding Acknowledgement

The 2014–2015 HS&B Midlife Study was supported by a combination of government and nongovernment sources, including the Alfred P. Sloan Foundation (Grant 2012-10-27), the Institute of Education Sciences of the U.S. Department of Education (Grant R305U140001), and the National Science Foundation (Grants HRD1348527 and HRD1348557). It also benefited from direct funding from NORC at the University of Chicago and support provided by the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) to the University of Texas at Austin (R24-HD042849), the University of Wisconsin-Madison (P2C-HD047873), and the University of Minnesota (P2C-HD041023).

 

References

Bosky, A.L. (2019). Academic Preparation in High School and Gendered Exposure to Economic Insecurity at Midlife (Doctoral dissertation). Retrieved from https://repositories.lib.utexas.edu/bitstream/handle/2152/76122/BOSKY-DISSERTATION-2019.pdf?sequence=1&isAllowed=y.

Carroll, J.M., Muller, C., Grodsky, E., and Warren, J.R. (2017). Tracking Health Inequalities from High School to Midlife. Social Forces, 96(2): 591–628. doi: 10.1093/sf/sox065.

Grodsky, E., and Doren, C. (2015). Coming in to Focus: Education and Stratification at Midlife. Invited lecture presented at Columbia University, New York, March 26, 2015.

Pattison, E. (2019). Educational Stratification and Obesity in Midlife: Considering the Role of Sex, Social Class, and Race/Ethnicity (Doctoral dissertation). Retrieved from https://repositories.lib.utexas.edu/bitstream/handle/2152/76097/PATTISON-DISSERTATION-2019.pdf?sequence=1&isAllowed=y.

Warren, J.R., Milesi, C., Grigorian, K., Humphries, M., Muller, C., and Grodsky, E. (2017). Do Inferences About Mortality Rates and Disparities Vary by Source of Mortality Information? Annals of Epidemiology, 27(2): 121–127. doi: 10.1016/j.annepidem.2016.11.003.

 

By Chandra Muller, University of Texas at Austin, and Elise Christopher, NCES

Building the Evidence Base for BEST in CLASS – Teacher Training to Support Young Learners with the Most Challenging Classroom Behavior

Classroom teachers of young children face a seemingly never-ending challenge – how to manage disruptive behavior while simultaneously teaching effectively and supporting the needs of every student in the classroom. Researchers at Virginia Commonwealth University and the University of Florida have received five IES research grants over the past decade – three through the National Center for Special Education Research (NCSER) and two from the National Center for Education Research (NCER) – to develop and test a model of training and professional development, including coaching, for early childhood and early elementary school teachers on how best to support children who engage in disruptive and otherwise challenging classroom behaviors.

With their first IES grant in 2008, Drs. Maureen Conroy and Kevin Sutherland developed the original BEST in CLASS model for early childhood teachers. The goal of BEST in CLASS - PK is to increase the quantity and quality of specific instructional practices used with young children (ages 3–5) who engage in high rates of challenging behavior, with the ultimate aim of preventing and reducing problem behavior. Professional development consists of a six-hour workshop that uses didactic and interactive learning activities supported by video examples and practice opportunities. Following the workshop, teachers receive a training manual and 14 weeks of practice-based coaching in the classroom.

The results of this promising development work led to a 2011 IES Efficacy study to test the impact of BEST in CLASS - PK on teacher practices and child outcomes. Based on positive findings from that Efficacy study, the team was awarded two additional Development and Innovation grants – one in 2015 to adapt BEST in CLASS - PK for early elementary school classrooms (BEST in CLASS - Elementary) and another in 2016 to develop a web-based version of BEST in CLASS - PK to increase accessibility and scalability. Drs. Sutherland and Conroy are currently in the second year of an Efficacy study testing whether the positive effects of BEST in CLASS in preschool settings replicate in early elementary classrooms.

Written by Emily Doolittle, NCER Team Lead for Social Behavioral Research, and Jacquelyn Buckley, NCSER Team Lead for Disability Research

Working Toward a Successful National Data Collection: The ECLS Field Test

The National Center for Education Statistics (NCES) conducts some of the most complex education surveys in the world, and we work hard to make these surveys as effective and efficient as possible. One way we make sure our surveys are successful is by conducting multiple tests before we fully launch a national data collection.

Even prior to a field test, NCES develops survey materials and procedures using much smaller-scale cognitive laboratory testing and focus groups. These initial development steps help ensure that materials are clear and procedures are understood before we conduct field testing with larger and more representative groups of respondents. Field tests are small-scale surveys that include a range of respondents and are designed to test the survey questionnaires and administration procedures in a real-world situation prior to the launch of a major study. Their results allow us to make any necessary adjustments before starting the national data collection. Field tests also allow us to test specific survey items and ensure that they are valid and reliable. Without a field test, we could risk spending the public’s time and money on large data-collection efforts that do not produce the intended information.

NCES is about to begin the Early Childhood Longitudinal Study, Kindergarten Class of 2022–23 (ECLS-K:2023) with a field test early this year. The ECLS-K:2023 will focus on children’s early school experiences, beginning with preschool and continuing through fifth grade. From the spring of 2022 through the spring of 2028, we will collect national study data from children and their parents, teachers, and school administrators to answer questions about children’s early learning and development, transition into kindergarten and beyond, and experiences in the elementary grades. 

Although the ECLS-K:2023 will be similar in many ways to prior ECLS kindergarten studies, we are adding a round of data collection prior to the children’s kindergarten year—the national spring 2022 preschool round. For this preschool survey, we’ll send an invitation to participate to a sample of residential addresses within selected areas of the United States. Potential participants will first be asked to fill out a brief screener questionnaire. If they report that an ECLS-eligible child is in the household, they will be asked additional important questions about early childhood topics, such as their child’s literacy, language, math, and social skills; activities done with the child in the home (e.g., singing songs, playing games, reading); and characteristics of any early care and education (i.e., child care) arrangements for the child.   

Because the ECLS-K:2023 preschool data need to be comprehensive and reliable so that they can inform public discussions and policies related to early elementary education, it’s crucial that we test our procedures and questions for this new preschool round by conducting a field test in early 2020.  

If you receive a letter about participating in the 2020 ECLS field test, your household has been selected to represent thousands of households like yours and to provide NCES with the data we need to decide how best to conduct the ECLS-K:2023. The participation of all the selected households who receive our mailings, even those without children, is essential for a successful field test and, ultimately, a successful ECLS-K:2023.

If you are selected for the ECLS field test and have any questions about participating, please visit the participant information page.

For more information on the ECLS-K:2023 or its 2020 field test, please email the ECLS study team.

For information about other ECLS program studies, please visit https://nces.ed.gov/ecls/.

 

By Jill Carlivati McCarroll

Learning from CTE Research Partnerships: How Michigan Built Trust with Researchers to Better Understand State Data

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Austin Estes, Senior Policy Associate at Advance CTE, and Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The first interview was with Jill Kroll of the Michigan Department of Education and Dan Kreisman of Georgia State University (and Director of CTEx). [Note: this interview has been edited for length; you can find the full interview transcript here].

 

Jill Kroll, Michigan Department of Education
Dan Kreisman, Georgia State University

 

The first question we have is about the projects that you work on together: what were some of the research questions you came up with, and how did you come to settle on those research questions?

Jill – I first connected with Dan and with Brian Jacob at University of Michigan when I saw Brian present to our P-20 council about some research that he was doing connecting the wage record data for five community colleges. I was like “Gee, is there any way you can do something similar with the statewide secondary student data?” And he said it was possible. So I worked within our department procedures to find out how we could go about establishing a relationship that would allow this opportunity.

Dan – That led to a whole bunch of other discussions of things that we thought were interesting. So, to say that there is a set of research questions is not the way I view our relationship. We talk with folks in Jill’s office regularly to hear what questions are pressing for them, and then we try to help facilitate answering those and then see where those lead us. I think one of the important things is we try to think about where there are policy levers, so we want to say “If we answer this question, how can the state or the districts use that information to further their mission of providing CTE programming to students in Michigan?”

Jill – I’ve been really happy with the extent to which Dan and the research team have consistently focused on the “so what?” Rather than focusing on vague research questions of interest only to other researchers, they have emphasized their interest in doing research that has practical application, that can be used by educators in the field.

Could you share an example of how you’ve been able to use some of this evidence and research to change policy, or at least to shape your understanding on some decisions that you’re making at the state level?

Jill - When we were starting to work on our Perkins V [the Strengthening Career and Technical Education for the 21st Century Act] state plan, we had a short time to determine what we wanted to consider for our secondary indicator of program quality. Because Brian, Dan, and their students had been working with this data for so many years, they had the capacity to very quickly do the matching and come up with an approximation for us about what postsecondary credit attainment would look like, and what strengths and weaknesses they saw in the data. It would have been really difficult for our office, or even multiple state agencies, to have been able to work that quickly and give it the critical analysis that they did.

The other thing they did when we were making the decision for that indicator is look at the data that we had for work-based learning and tell us what could be done with it. What came out of that was that the data was not in any form that could be analyzed (text and PDFs). This was really revealing to our State Director Brian Pyles, and it led him to set a policy that we are going to build a consistent way of collecting data on work-based learning. So that is another piece where it influenced practice and policy. One of the most exciting and valuable things that I find about the partnership is that Dan and the other researchers have a lot more capacity to analyze the data in a way that we just don’t have the time to do. Sometimes we don’t have the expertise, and sometimes we just don’t look at the data in the same way.

Dan – And there’s a flip side: without their input, we are often looking at data and can’t make heads or tails of something. And we can get on the phone or write an email to someone over there and say “Hey, we’re seeing this thing. Can you tell me what that means?” And they will come back with “Oh, the system changed” or “There was this one policy,” and “Here’s what you have to do to make it fit everything else.” And this happens all the time. We would be completely lost without this open channel that we have to their office.

I think it’s important not to dismiss the power of good descriptive work. Lots of times, the questions that states are grappling with can often be illuminated with some really careful and good descriptive work. You can say, “This is what we’re seeing, this is the big picture,” if you step back for a minute, and that information lots of times has been as valuable as the stuff we try to do that is more causally oriented in our research.

Jill – I agree, and I want to follow up on the whole issue of how important trust is. I cannot emphasize enough how important it is to me that Dan and the other researchers come to us with those questions, that they check in with us. That’s absolutely critical. Anyone who works with any kind of data knows that it’s just so complex. If you link tables wrong, or misunderstand a data field, you can come to a completely wrong decision. So that communication and that interaction and trust are key to accurate outcomes.

As you’re both looking ahead, what’s next on the agenda? What are some of the research questions and priorities you have for this partnership?

Dan – Number one is tracking students into the labor market. That’s our biggest and most outstanding question. And the degree to which CTE programs are preparing students for college and the labor market and careers. In terms of other projects, one of the things we’re interested in is technical assessments. We’re also part of a consortium of several states – that’s the CTEx group. We meet annually together, and that allows us to harmonize things across states to see how trends are similar, how enrollment rates work, all sorts of different questions across multiple states.

Jill – One of the things we’re talking about right now is that we don’t have, in an accessible form, data on access to a particular program. We know that career centers serve certain districts, but if someone asked, “If student A is going to Central High School, what programs do they have access to?” we don’t have a good way of answering that at the moment. We’ve had a couple of discussions about how we can work together to build basically a dataset that clarifies that. That would be mutually beneficial and would take resources from both sides to do something like that.

Thinking back on this partnership, is there any advice you would give to other State Directors or CTE researchers?

Dan – Building a strong relationship is the first thing you have to do. And part of that is spending time face to face talking about questions, moving around ideas, looking at data together. We had the benefit of a long windup period. We spent at least a year just talking about questions and putting together data before we even started doing any analyses. We also had buy-in from Jill’s office up and down the line from folks who were doing the research to people who were in policymaking roles. And without all of that, none of this would even have been possible.

And the second part is to not downplay the value of just providing good information. A lot of us on the research side don’t realize how little time folks in the state offices have to take a step back and say, “What’s going on with our data? Let’s look at the big picture.” One of the things we can provide is that big picture, handed to them in a digestible way. Doing that first is a really good way to start building trust, because they see the value of what you can do early on. And then you can start to get into more difficult or longer-term questions.

Jill – The first advice I would give is: Do it! Partner with researchers. I can’t say enough positive about it. The second is: Follow department procedures and be transparent with department leadership. You know that windup might be really, really slow while you jog through the channels that you need to in your department to do things by the book, but I think it pays off in the long run.

My third one is: Be transparent and open with school districts. Share what you’re doing and invite their input. Anybody who works with state data would probably know, you’re always a little hesitant about what the public would think about this use of data. The way that Dan and the postdocs and graduate students have openly shared the work that they’ve done with our CTE administrators has really helped, in that I have not gotten any doubt from districts.

The full transcript can be accessed in Advance CTE’s Learning that Works Resource Center. Other blog posts in this series can be viewed here.

New Report Highlights Progress and Challenges in U.S. High School Dropout and Completion Rates

A new NCES report has some good news about overall high school dropout and completion rates, but it also highlights some areas of concern.

Using a broad range of data, the recently released Trends in High School Dropout and Completion Rates in the United States report shows that the educational attainment of young adults has risen in recent decades. The public high school graduation rate is up, and the status dropout rate (the percentage of 16- to 24-year-olds who are not enrolled in school and have not completed high school) is down. Despite these encouraging trends, there are significant disparities in educational attainment among young adults in the United States. The report shines new light on these disparities by analyzing detailed data from the U.S. Census Bureau.

For large population groups, the report provides status dropout rates calculated using annual data from the American Community Survey (ACS), administered by the U.S. Census Bureau. For example, in 2017, some 5.4 percent of 16- to 24-year-olds who were not enrolled in high school lacked a high school diploma or equivalent credential.
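In concept, the status dropout rate is simply the share of 16- to 24-year-olds who are neither enrolled in school nor holders of a high school credential. The minimal sketch below illustrates that definition with made-up, unweighted records; the actual ACS estimates are produced with survey weights and are not computed this way.

```python
# Hypothetical illustration of how a "status dropout rate" is computed
# from person-level records. The records are invented; real ACS
# estimation applies survey weights and replicate-weight variance methods.
people = [
    # (age, enrolled_in_school, has_hs_credential)
    (17, True,  False),
    (19, False, False),   # status dropout: not enrolled, no credential
    (22, False, True),
    (24, False, False),   # status dropout
    (30, False, False),   # outside the 16-24 age range; excluded
]

# Restrict to the 16-24 cohort, then count those who are neither
# enrolled nor holders of a high school credential.
cohort = [p for p in people if 16 <= p[0] <= 24]
dropouts = [p for p in cohort if not p[1] and not p[2]]
rate = 100 * len(dropouts) / len(cohort)
print(f"Status dropout rate: {rate:.1f}%")  # 2 of 4 -> 50.0%
```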

For smaller population groups, there are not enough ACS respondents during any given year to allow for precise and reliable estimates of the high school status dropout rate. For these demographic subgroups, NCES pools the data from 5 years of the ACS in order to obtain enough respondents to accurately describe patterns in the dropout rate.

For example, while the overall status dropout rate for Asian 16- to 24-year-olds was below the national average in 2017, the rates for specific subgroups of Asian young adults varied widely. Based on 5 years of ACS data, high school status dropout rates among Asian 16- to 24-year-olds ranged from 1.1 percent for individuals of Korean descent to 23.2 percent for individuals of Burmese descent. These rates represent the “average” status dropout rate for the period from 2013 to 2017. They offer greater precision than the 1-year estimates, but the 5-year time span might make them difficult to interpret at first glance. 
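The statistical intuition behind pooling is straightforward: combining five years of data roughly quintuples the sample size, which shrinks the standard error of an estimated proportion by about the square root of five. The sketch below uses invented sample sizes and counts, and a simple-random-sample formula rather than the ACS's weighted replicate-variance methods, purely to show the effect.

```python
import math

# Sketch of why pooling 5 years of survey data tightens an estimate.
# The yearly sample sizes and dropout counts are made up for illustration.
years = [
    # (sample_size, dropouts_observed)
    (400, 22), (410, 25), (390, 20), (420, 24), (380, 19),
]

def rate_and_se(n, x):
    """Proportion and its standard error under a simple-random-sample model."""
    p = x / n
    se = math.sqrt(p * (1 - p) / n)
    return p, se

# One year alone: small n, wide uncertainty.
p1, se1 = rate_and_se(*years[0])

# Five years pooled: about 5x the sample, so the standard error
# shrinks by roughly a factor of sqrt(5).
n_pool = sum(n for n, _ in years)
x_pool = sum(x for _, x in years)
p5, se5 = rate_and_se(n_pool, x_pool)

print(f"1-year estimate: {100*p1:.1f}% +/- {196*se1:.1f} pp (95% CI half-width)")
print(f"5-year estimate: {100*p5:.1f}% +/- {196*se5:.1f} pp")
```

The trade-off is exactly the one noted above: the pooled figure is more precise, but it describes an average over the whole 2013–2017 window rather than any single year.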

 


Figure 1. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by selected Asian subgroups: 2013–2017

‡ Reporting standards not met. Either there are too few cases for a reliable estimate or the coefficient of variation (CV) is 50 percent or greater.
If the estimation procedure were repeated many times, 95 percent of the calculated confidence intervals would contain the true status dropout rate for the population group.
NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations. Asian subgroups exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017.


 

The 5-year ACS data can also be used to describe status dropout rates for smaller geographic areas with more precision than the annual ACS data. For example, the average 2013–2017 status dropout rates ranged from 3.8 percent in Massachusetts to 9.6 percent in Louisiana. The 5-year ACS data allowed us to calculate more accurate status dropout rates for each state and, in many cases, for racial/ethnic subgroups within the state. Access the complete state-level dropout rates by race/ethnicity here.
 


Figure 2. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by state: 2013–2017

NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017. See table 2.3.


 

For more information about high school dropout and completion rates, check out the recently released Trends in High School Dropout and Completion Rates in the United States report. For more information about the 5-year ACS datasets, visit https://www.census.gov/programs-surveys/acs/guidance/estimates.html.

 

By Joel McFarland