IES Blog

Institute of Education Sciences

Partnering with Researchers Can Help State Leaders Build the Case for CTE

In Massachusetts, Career/Vocational Technical Education (CVTE) schools are renowned for offering rigorous, high-quality programs of study across a variety of disciplines. While CVTE graduates have long experienced high rates of academic and career success, state leaders in Massachusetts wanted to know whether these outcomes result directly from the CVTE model. In 2017, the Massachusetts Department of Elementary and Secondary Education partnered with Shaun Dougherty (then a researcher at the University of Connecticut) and learned that CVTE students are significantly more likely to graduate from high school and earn an industry-recognized credential than similar students who applied but were not admitted.

Demand for rigorous research on Career Technical Education (CTE) has increased as more policymakers ask questions about its impact on college and career readiness. State CTE Directors may be interested in the same questions as researchers (such as “Does CTE improve educational and career outcomes? Do different programs help different students? What types of programs offer students the highest economic returns?”) but may not think to seek out and collaborate with researchers, or may not know how to prioritize among the many research requests they receive.

This blog series, a partnership between Advance CTE and the Institute of Education Sciences (IES), seeks to break down the barriers between State CTE Directors and researchers and to encourage partnerships that can benefit both.

What Can Research with State Data Tell Us?

Research can be a powerful tool to help State CTE Directors understand what’s working, what isn’t working, and what needs to change. The findings described below provide examples of how strong partnerships between researchers and state policymakers can result in actionable research (click on state name for link to full article).

  • In Arkansas, students with greater exposure to CTE are more likely to graduate from high school, enroll in a two-year college, be employed, and earn higher wages. The study, which was rigorous but not causal, also found that students taking more CTE classes are just as likely to pursue a four-year degree as their peers, and that CTE provides the greatest boost to boys and students from low-income families.
  • Boys who attended CTE high schools in Connecticut experienced higher graduation rates and post-graduation earnings than similar students who did not attend CTE high schools. Further follow-ups using both postsecondary and labor data could provide information about college completion and employment and earnings for different occupational sectors.
  • CTE concentrators in Texas had greater enrollment and persistence in college than their peers. Although rates of CTE concentration decreased, student participation in at least some CTE programming, as well as the number of CTE credits earned, increased between the 2008 and 2014 cohorts. Unsurprisingly, the study also found differences by CTE program of study: Education & Training; Finance; Health Science; and Science, Technology, Engineering & Mathematics (STEM) were most strongly associated with postsecondary enrollment, particularly in baccalaureate programs.

How Can States Use CTE Research to Improve Policy and Practice?

Here are a few things states can do today to start building a CTE research base:

  • Create a codebook of CTE variables in your state’s data system: Include K-12, postsecondary, and labor force variables if you have them. Define each variable clearly: What does it measure, at what level (student, program, or district) is it collected, and for how many years has it been collected? Are the measures comparable across years and across datasets? (A minimal illustration follows this list.)
  • Maximize opportunities to collect longitudinal data: Databases that span education levels and connect to workforce outcomes allow researchers to conduct rigorous studies of long-term outcomes.
  • Identify universities in your state with strong education, economics, or public policy departments: Make a list of the questions that policymakers in your state most want answered, and then approach universities with these proactively. Reach out to the chair(s) of these departments to connect with faculty who may be interested in partnering to answer the questions. Universities can often apply for a research grant that covers part or all of the funding for state personnel to work on the research project. IES, which provides funding of this nature, opens its next grant competition in summer 2020.
  • Reach out to your Regional Educational Lab (REL) or the REL Career Readiness Research Alliance to inquire about partnering on CTE research: The mission of these IES-funded labs is to provide research and evidence to help educators in the states in their region. For example, REL Central is currently working with four states to replicate the Arkansas study described above (see “Review of Career and Technical Education in Four States”).
  • Stay up to date on the latest research findings in CTE: New research is regularly posted on the CTE Research Network and other websites. This can help you get ideas for what types of research you would like to conduct in your state. Another good source of inspiration is the recommendations of the CTE technical workgroup, which was convened by IES in late 2017 to guide future CTE research directions.
  • Become familiar with how researchers approach CTE research: Learn why it is so challenging to measure CTE’s impact. The CTE Research Network will hold research trainings for different audiences, including state agency staff, beginning in summer 2020. Stay tuned!
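To make the codebook suggestion above concrete, here is a minimal, hypothetical sketch in Python of what a shareable CTE codebook might look like. The variable names, levels, year ranges, and source files below are illustrative assumptions, not drawn from any particular state’s data system.

    # Hypothetical CTE codebook: each entry documents one variable, the level at
    # which it is measured, the years it covers, its source file, and a definition.
    import csv

    codebook = [
        {"variable": "cte_concentrator", "level": "student", "years": "2012-2019",
         "source": "k12_enrollment",
         "definition": "1 if the student completed two or more courses in a single CTE program of study"},
        {"variable": "cte_credits_earned", "level": "student", "years": "2012-2019",
         "source": "k12_transcripts",
         "definition": "Total CTE course credits earned in grades 9-12"},
        {"variable": "industry_credential", "level": "student", "years": "2015-2019",
         "source": "cte_program_file",
         "definition": "1 if an industry-recognized credential was earned by graduation"},
        {"variable": "postsec_enrolled_2yr", "level": "student", "years": "2014-2019",
         "source": "postsecondary_match",
         "definition": "1 if enrolled in a two-year college within 12 months of high school graduation"},
        {"variable": "quarterly_wages", "level": "student", "years": "2016-2019",
         "source": "ui_wage_records",
         "definition": "Quarterly earnings from linked unemployment insurance wage records"},
    ]

    # Write the codebook to a CSV file that can be shared with research partners.
    with open("cte_codebook.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=codebook[0].keys())
        writer.writeheader()
        writer.writerows(codebook)

Even a simple table like this lets a prospective research partner see at a glance which outcomes can be studied, over which years, and whether K-12, postsecondary, and workforce records can be linked.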

Over the next several months, Advance CTE and IES will publish a series of Q&A blog posts with researchers and state CTE leaders talking about how their partnerships developed and what states can do to advance CTE research.

This blog series was co-authored by Corinne Alfeld at IES (corinne.alfeld@ed.gov) and Austin Estes from Advance CTE (aestes@careertech.org), with thanks to Steve Klein of Education Northwest for editorial suggestions. IES began funding research grants in CTE in 2017 and established a CTE Research Network in 2018. IES hopes to encourage more research on CTE in the coming years in order to increase the evidence base and guide program and policy decisions. At the same time, Advance CTE has been providing resources to help states improve their CTE data quality and use data more effectively to improve CTE program quality and equity.

Updates from the CTE Research Network!

“Does Career and Technical Education (CTE) work?” and “For whom does CTE work and how?” are questions on many policymakers’ and education leaders’ minds and ones that the CTE Research Network aims to answer. The mission of the Network, as described in a previous blog post, is to increase the amount of causal evidence in CTE that can inform practice and policy. The Network’s members, who are researchers funded by IES to examine the impact of CTE, have been busy trying to answer all of these questions.

This blog post describes three Network updates:

  • Shaun Dougherty, of Vanderbilt University, and his colleagues at the University of Connecticut have been studying the effects of attending a CTE-focused high school among 60,000 students in Connecticut as part of their Network project. They recently reported that:
    • When compared with males attending traditional high schools, males who attended CTE schools were 10 percentage points more likely to graduate from high school and were earning 31 percent more by age 23. The authors noted that the more CTE courses a traditional high school offers, the smaller the difference that attending a CTE high school makes.
    • Analyses of potential mechanisms behind these findings reveal that male students attending a technical high school have higher 9th grade attendance rates and higher 10th grade test scores. However, they are 8 percentage points less likely to attend college (though some evidence indicates that the negative impact on college attendance fades over time).
    • Attending a CTE high school had no impact on female students. Further, the effects did not differ by student attributes such as race and ethnicity, free lunch eligibility, or residence in a poor, central-city school district.

The study results are being disseminated widely in the media, including via the Brookings Brown Center Chalkboard, The Conversation, and the National Bureau of Economic Research.

  • In other news, the CTE Research Network has welcomed a fourth IES-funded project, led by Julie Edmunds. Edmunds’ team is studying dual enrollment pathways in North Carolina, and one of the pathways focuses on CTE.
  • Finally, the two co-PIs for the Network Lead, Kathy Hughes and Shaun Dougherty, recently participated in a Q&A in Techniques magazine about the purpose of the CTE Network, how the Network will help the field of CTE, and how each of their careers has led them to this work.

The Network Lead has launched a new website where you can find information about ongoing work and sign up to receive the Network’s newsletter.

This post was written by Corinne Alfeld, the NCER-IES program officer responsible for the CTE research topic and the CTE Research Network. Contact her at Corinne.Alfeld@ed.gov with questions.

New Study on U.S. Eighth-Grade Students’ Computer Literacy

In the 21st-century global economy, computer literacy and skills are an important part of an education that prepares students to compete in the workplace. The results of a recent assessment show us how U.S. students compare to some of their international peers in the areas of computer information literacy and computational thinking.

In 2018, the U.S. participated for the first time in the International Computer and Information Literacy Study (ICILS), along with 13 other education systems around the globe. The ICILS is a computer-based international assessment of eighth-grade students that measures outcomes in two domains: computer and information literacy (CIL)[1] and computational thinking (CT).[2] It compares U.S. students’ skills and experiences using technology to those of students in other education systems and provides information on teachers’ experiences, school resources, and other factors that may influence students’ CIL and CT skills.

ICILS is sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and is conducted in the United States by the National Center for Education Statistics (NCES).

The newly released U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report provides information on how U.S. students performed on the assessment compared with students in other education systems and describes students’ and teachers’ experiences with computers.


U.S. Students’ Performance

In 2018, U.S. eighth-grade students’ average score in CIL was higher than the average of participating education systems[3] (figure 1), while the U.S. average score in CT was not measurably different from the average of participating education systems.

Figure 1. Average computer and information literacy (CIL) scores of eighth-grade students, by education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

⁵ Data collected at the beginning of the school year.

NOTE: The ICILS computer and information literacy (CIL) scale ranges from 100 to 700. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average CIL scores, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.

Given the importance of students’ home environments in developing CIL and CT skills (Fraillon et al. 2019), students were asked about how many computers (desktop or laptop) they had at home. In the United States, eighth-grade students with two or more computers at home performed better in both CIL and CT than their U.S. peers with fewer computers (figure 2). This pattern was also observed in all participating countries and education systems.

Figure 2. Average computational thinking (CT) scores of eighth-grade students, by student-reported number of computers at home and education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

NOTE: The ICILS computational thinking (CT) scale ranges from 100 to 700. The number of computers at home includes desktop and laptop computers. Students with fewer than two computers include students reporting having “none” or “one” computer. Students with two or more computers include students reporting having “two” or “three or more” computers. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average scores of students with two or more computers at home, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.

U.S. Students’ Technology Experiences

Among U.S. eighth-grade students, 72 percent reported using the Internet to do research in 2018, and 56 percent reported completing worksheets or exercises using information and communications technology (ICT)[4] every school day or at least once a week. Both of these percentages were higher than the respective ICILS averages (figure 3). The learning activities least frequently reported by U.S. eighth-grade students were using coding software to complete assignments (15 percent) and making video or audio productions (13 percent).

Figure 3. Percentage of eighth-grade students who reported using information and communications technology (ICT) every school day or at least once a week, by activity: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

NOTE: The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Activities are ordered by the percentages of U.S. students reporting using information and communications technology (ICT) for the activities, from largest to smallest.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.

Browse the full U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report to learn more about how U.S. students compare with their international peers in their computer literacy skills and experiences.

By Yan Wang, AIR, and Linda Hamilton, NCES

[1] CIL refers to “an individual's ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society” (Fraillon et al. 2019).

[2] CT refers to “an individual’s ability to recognize aspects of real-world problems which are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer” (Fraillon et al. 2019). CT was an optional component in 2018. Nine out of 14 ICILS countries participated in CT in 2018.

[3] U.S. results are not included in the ICILS international average because the U.S. school-level response rate of 77 percent was below the international requirement for a participation rate of 85 percent.

[4] Information and communications technology (ICT) can refer to desktop computers, notebook or laptop computers, netbook computers, tablet devices, or smartphones (except when being used for talking and texting).

Reference

Fraillon, J., Ainley, J., Schulz, W., Duckworth, D., and Friedman, T. (2019). IEA International Computer and Information Literacy Study 2018: Assessment Framework. Cham, Switzerland: Springer. Retrieved October 7, 2019, from https://link.springer.com/book/10.1007%2F978-3-030-19389-8.

New 2019 Reading and Mathematics Assessment Data on 4th- and 8th-Grade Students

The average reading score for U.S. 4th- and 8th-grade students decreased between 2017 and 2019. Changes in mathematics scores were mixed during this period, with an increase at grade 4 and a decrease at grade 8. These data are from the National Assessment of Educational Progress (NAEP)—also known as The Nation’s Report Card. NAEP is the largest nationally representative and continuing assessment of what students in the United States know and can do in various subject areas and is frequently referred to as the “gold standard” of student assessments.

In 4th-grade reading, the average scale score in 2019 was 220, one point lower than in 2017 (figure 1). In 8th-grade reading, the average scale score was 263, three points lower than in 2017 (figure 2). Compared with a decade ago in 2009, the 2019 average reading scale scores at each grade were not significantly different, but they were higher than the scale scores in 1992, the first time the reading assessment was administered.

Figure 1. Average National Assessment of Educational Progress (NAEP) reading scale scores of 4th-grade students: Selected years, 1992–2019

* Significantly different (p < .05) from 2019

--- Accommodations not permitted

— Accommodations permitted

Figure 2. Average National Assessment of Educational Progress (NAEP) reading scale scores of 8th-grade students: Selected years, 1992–2019

* Significantly different (p < .05) from 2019

--- Accommodations not permitted

— Accommodations permitted

In 4th-grade mathematics, the average scale score in 2019 was 241, one point higher than in 2017 (figure 3). In 8th-grade mathematics, the average scale score in 2019 was 282, one point lower than in 2017 (figure 4). As in reading, the 2019 average mathematics scale scores at both grades were not significantly different from those in 2009. Mathematics scale scores at both grades were higher in 2019 than in 1990, the first time the mathematics assessments were administered.

Figure 3. Average National Assessment of Educational Progress (NAEP) mathematics scale scores of 4th-grade students: Selected years, 1990–2019

* Significantly different (p < .05) from 2019

--- Accommodations not permitted

— Accommodations permitted

Figure 4. Average National Assessment of Educational Progress (NAEP) mathematics scale scores of 8th-grade students: Selected years, 1990–2019

* Significantly different (p < .05) from 2019

--- Accommodations not permitted

— Accommodations permitted

The Nation’s Report Card also presents data by demographic group (such as race/ethnicity and gender), school type, and region. White and Black 4th- and 8th-grade students scored lower in reading in 2019 than in 2017. Hispanic and American Indian/Alaska Native 8th-grade students also scored lower in reading in 2019 than in 2017. In mathematics, 4th-grade Hispanic students scored higher in 2019 than in 2017, and 8th-grade American Indian/Alaska Native students scored lower in 2019 than in 2017. From 2017 to 2019, males’ scores increased in mathematics at grade 4 but decreased in reading at both grades.

NCES administered the 2019 NAEP mathematics and reading assessments to almost 600,000 4th- and 8th-graders in public and private schools in all 50 states, the District of Columbia, the U.S. Department of Defense schools, and 27 urban districts. Samples of schools and students are drawn from each state and from the District of Columbia and Department of Defense schools.

Visit https://nces.ed.gov/nationsreportcard/ to view the report.

Cost Considered “Very Important” to Parents Who Chose Relatives as Caregivers for Young Children

When it comes to choosing a child care arrangement, cost is a big factor in the choices parents make, according to recently released data from the National Center for Education Statistics (NCES).

Every 3 years, NCES conducts the Early Childhood Program Participation (ECPP) component of the National Household Education Surveys Program (NHES) to answer questions about young children’s care and education before starting kindergarten. The ECPP survey reported that 60 percent of children under age 5 who were not yet in kindergarten participated in at least one weekly nonparental care arrangement in 2016. Of those receiving nonparental care,

  • 42 percent received only center-based care;
  • 25 percent received only relative care;
  • 20 percent received multiple types of care; and
  • 12 percent received only nonrelative care.

When asked what factors influenced their choice of child care arrangements, 51 percent of parents ranked the cost as “very important” when selecting an arrangement in 2016. This percentage was higher among parents of children in relative care (63 percent) than among parents of children in multiple types of care arrangements (50 percent) and parents of children only in center-based care (47 percent).

Overall, in 2016, some 39 percent of parents with children in nonparental care reported that they had difficulty finding child care. This rate was lowest for parents of children only in relative care (23 percent) and highest for parents of children only in nonrelative care (53 percent). However, among parents who had difficulty trying to find child care, cost was a larger concern for those with children only in relative care than it was for those with children in other arrangements (see figure 1).

Figure 1. Percentage of children under age 5 whose parents reported that cost was the primary reason for difficulty finding child care arrangements, by type of arrangement: 2016

NOTE: Data are for children participating in at least one weekly nonparental care arrangement. Excludes children enrolled in kindergarten.

SOURCE: U.S. Department of Education, National Center for Education Statistics, The Costs of Child Care: Results From the 2016 Early Childhood Program Participation Survey (ECPP-NHES:2016).

In 2016, fees were less common and costs were generally lower for parents with children in relative care than for parents with children in other types of nonparental care arrangements. Thirty-two percent of parents with children in at least one care arrangement were not charged fees for care, and 58 percent of those children were in relative care. Among children in relative care, 80 percent were cared for by grandparents. When parents paid grandparents for their children’s care, they paid an average of $4.86 per hour, less than the average across all types of care arrangements ($6.93 per hour).

For more detailed information about costs of child care, see The Costs of Child Care: Results From the 2016 Early Childhood Program Participation Survey (ECPP-NHES:2016).

By Tracae McClure and Sarah Grady