IES Blog

Institute of Education Sciences

Learning from CTE Research Partnerships: How Michigan Built Trust with Researchers to Better Understand State Data

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Austin Estes, Senior Policy Associate at Advance CTE, and Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The first interview was with Jill Kroll of the Michigan Department of Education and Dan Kreisman of Georgia State University (and Director of CTEx). [Note: this interview has been edited for length; you can find the full interview transcript here].

 

Jill Kroll, Michigan Department of Education
Dan Kreisman, Georgia State University

 

The first question we have is about the projects that you work on together: what were some of the research questions you came up with, and how did you come to settle on those research questions?

Jill – I first connected with Dan and with Brian Jacob at the University of Michigan when I saw Brian present to our P-20 council about some research he was doing connecting the wage record data for five community colleges. I was like, “Gee, is there any way you can do something similar with the statewide secondary student data?” And he said it was possible. So I worked within our department procedures to find out how we could go about establishing a relationship that would allow this opportunity.

Dan – That led to a whole bunch of other discussions of things that we thought were interesting. So, to say that there is a set of research questions is not the way I view our relationship. We talk with folks in Jill’s office regularly to hear what questions are pressing for them, and then we try to help facilitate answering those and then see where those lead us. I think one of the important things is we try to think about where there are policy levers, so we want to say “If we answer this question, how can the state or the districts use that information to further their mission of providing CTE programming to students in Michigan?”

Jill – I’ve been really happy with the extent to which Dan and the research team have consistently focused on the “so what?” Rather than focusing on vague research questions of interest only to other researchers, they have emphasized their interest in doing research that has practical application, that can be used by educators in the field.

Could you share an example of how you’ve been able to use some of this evidence and research to change policy, or at least to shape your understanding on some decisions that you’re making at the state level?

Jill – When we were starting to work on our Perkins V [the Strengthening Career and Technical Education for the 21st Century Act] state plan, we had a short time to determine what we wanted to consider for our secondary indicator of program quality. Because Brian, Dan, and their students had been working with this data for so many years, they had the capacity to very quickly do the matching and come up with an approximation for us about what postsecondary credit attainment would look like, and what strengths and weaknesses they saw in the data. It would have been really difficult for our office, or even multiple state agencies, to have been able to work that quickly and give it the critical analysis that they did.

The other thing they did when we were making the decision for that indicator is look at the data that we had for work-based learning and tell us what could be done with it. What came out of that was that the data existed only as text and PDFs and was not in any form that could be analyzed. This was really revealing to our State Director Brian Pyles, and it led him to set a policy that we are going to build a consistent way of collecting data on work-based learning. So that is another piece where the partnership influenced practice and policy. One of the most exciting and valuable things that I find about the partnership is that Dan and the other researchers have a lot more capacity to analyze the data in a way that we just don’t have the time to do. Sometimes we don’t have the expertise, and sometimes we just don’t look at the data in the same way.

Dan – And there’s a flip side: without their input, we are often looking at data and can’t make heads or tails of something. And we can get on the phone or write an email to someone over there and say, “Hey, we’re seeing this thing. Can you tell me what that means?” And they will come back with “Oh, the system changed” or “There was this one policy,” and “Here’s what you have to do to make it fit everything else.” This happens all the time. We would be completely lost without this open channel that we have to their office.

I think it’s important not to dismiss the power of good descriptive work. The questions that states are grappling with can often be illuminated with really careful descriptive work. You can say, “This is what we’re seeing, this is the big picture,” if you step back for a minute, and that information has often been as valuable as the more causally oriented work we do in our research.

Jill – I agree, and I want to follow up on the whole issue of how important trust is. I cannot emphasize enough how important it is to me that Dan and the other researchers come to us with those questions, that they check in with us. That’s absolutely critical. Anyone who works with any kind of data knows that it’s just so complex. If you link tables wrong, or misunderstand a data field, you can come to a completely wrong decision. So that communication and that interaction and trust are key to accurate outcomes.

As you’re both looking ahead, what’s next on the agenda? What are some of the research questions and priorities you have for this partnership?

Dan – Number one is tracking students into the labor market. That’s our biggest outstanding question: the degree to which CTE programs are preparing students for college, for careers, and for the labor market. In terms of other projects, one of the things we’re interested in is technical assessments. We’re also part of a consortium of several states – that’s the CTEx group. We meet annually, and that allows us to harmonize things across states to see how trends are similar, how enrollment rates work, all sorts of different questions across multiple states.

Jill – One of the things we’re talking about right now is that we don’t have, in an accessible form, data on access to a particular program. We know that career centers serve certain districts, but if someone asked, “If student A is going to Central High School, what programs do they have access to?” we don’t have a good way of answering that at the moment. We’ve had a couple of discussions about how we can work together to basically build a dataset that clarifies that. That would be mutually beneficial, and it would take resources from both sides to do something like that.

Thinking back on this partnership, is there any advice you would give to other State Directors or CTE researchers?

Dan – Building a strong relationship is the first thing you have to do. And part of that is spending time face to face talking about questions, moving around ideas, looking at data together. We had the benefit of a long windup period. We spent at least a year just talking about questions and putting together data before we even started doing any analyses. We also had buy-in from Jill’s office up and down the line from folks who were doing the research to people who were in policymaking roles. And without all of that, none of this would even have been possible.

And the second part is to not downplay the value of just providing good information. A lot of us on the research side don’t realize how little time folks in the state offices have to take a step back and say, “What’s going on with our data? Let’s look at the big picture.” One of the things we can provide is that big picture, handed to them in a digestible way. Doing that is a really good way to start building trust: they see the value of what you can do early on, and then you can start to get into more difficult or longer-term questions.

Jill – The first advice I would give is: Do it! Partner with researchers. I can’t say enough positive about it. The second is: Follow department procedures and be transparent with department leadership. That windup might be really, really slow while you work through the channels you need to in your department to do things by the book, but I think it pays off in the long run.

My third one is: Be transparent and open with school districts. Share what you’re doing and invite their input. Anybody who works with state data knows that you’re always a little hesitant about what the public will think about a given use of data. The way that Dan and the postdocs and graduate students have openly shared their work with our CTE administrators has really helped; I have not heard any doubts from districts.

The full transcript can be accessed in Advance CTE’s Learning that Works Resource Center. Other blog posts in this series can be viewed here.

New Report Highlights Progress and Challenges in U.S. High School Dropout and Completion Rates

A new NCES report has some good news about overall high school dropout and completion rates, but it also highlights some areas of concern.

Using a broad range of data, the recently released Trends in High School Dropout and Completion Rates in the United States report shows that the educational attainment of young adults has risen in recent decades. The public high school graduation rate is up, and the status dropout rate (the percentage of 16- to 24-year-olds who are not enrolled in school and have not completed high school) is down. Despite these encouraging trends, there are significant disparities in educational attainment among young adults in the United States. The report shines new light on these disparities by analyzing detailed data from the U.S. Census Bureau.

For large population groups, the report provides status dropout rates calculated using annual data from the American Community Survey (ACS), administered by the U.S. Census Bureau. For example, in 2017, some 5.4 percent of 16- to 24-year-olds were not enrolled in school and lacked a high school diploma or equivalent credential.

For smaller population groups, there are not enough ACS respondents during any given year to allow for precise and reliable estimates of the high school status dropout rate. For these demographic subgroups, NCES pools the data from 5 years of the ACS in order to obtain enough respondents to accurately describe patterns in the dropout rate.

For example, while the overall status dropout rate for Asian 16- to 24-year-olds was below the national average in 2017, the rates for specific subgroups of Asian young adults varied widely. Based on 5 years of ACS data, high school status dropout rates among Asian 16- to 24-year-olds ranged from 1.1 percent for individuals of Korean descent to 23.2 percent for individuals of Burmese descent. These rates represent the “average” status dropout rate for the period from 2013 to 2017. They offer greater precision than the 1-year estimates, but the 5-year time span might make them difficult to interpret at first glance. 
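To make the precision argument concrete, here is a minimal sketch, in Python, of how pooling survey years shrinks the uncertainty of a rate estimate. The respondent counts are synthetic and the calculation is unweighted; the ACS itself uses survey weights and replicate-weight variance estimation, so this illustrates the principle rather than NCES's actual procedure.

import math

# Synthetic (dropouts, respondents) counts for five hypothetical survey years.
years = [(30, 2400), (24, 2150), (28, 2300), (22, 2050), (26, 2250)]

dropouts = sum(d for d, n in years)
respondents = sum(n for d, n in years)

p = dropouts / respondents                 # pooled status dropout rate
se = math.sqrt(p * (1 - p) / respondents)  # standard error of a proportion
low, high = p - 1.96 * se, p + 1.96 * se   # approximate 95 percent confidence interval

print(f"Pooled rate: {p:.1%} (95% CI: {low:.1%} to {high:.1%})")

# With five years pooled, the sample is roughly five times larger, so the
# standard error falls to about 1/sqrt(5) (roughly 45 percent) of its one-year value.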

 


Figure 1. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by selected Asian subgroups: 2013–2017

‡ Reporting standards not met. Either there are too few cases for a reliable estimate or the coefficient of variation (CV) is 50 percent or greater.
If the estimation procedure were repeated many times, 95 percent of the calculated confidence intervals would contain the true status dropout rate for the population group.
NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations. Asian subgroups exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017.


 

The 5-year ACS data can also be used to describe status dropout rates for smaller geographic areas with more precision than the annual ACS data. For example, the average 2013–2017 status dropout rates ranged from 3.8 percent in Massachusetts to 9.6 percent in Louisiana. The 5-year ACS data allowed us to calculate more accurate status dropout rates for each state and, in many cases, for racial/ethnic subgroups within the state. Access the complete state-level dropout rates by race/ethnicity here.
 


Figure 2. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by state: 2013–2017

NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017. See table 2.3.


 

For more information about high school dropout and completion rates, check out the recently released Trends in High School Dropout and Completion Rates in the United States report. For more information about the 5-year ACS datasets, visit https://www.census.gov/programs-surveys/acs/guidance/estimates.html.

 

By Joel McFarland

NCES’s Top Hits of 2019

As 2019 comes to an end, we’re taking stock of NCES’s most downloaded reports; most viewed indicators, Fast Facts, and blog posts; and most engaging tweets over the past year. As you reflect on 2019 and kick off 2020, we encourage you to take a few minutes to explore the wide range of education data NCES produces.

 

Top Five Reports, by PDF downloads

1. Condition of Education 2019 (8,526)

2. Condition of Education 2018 (5,789)

3. Status and Trends in the Education of Racial and Ethnic Groups 2018 (4,743)

4. Student Reports of Bullying: Results From the 2015 School Crime Supplement to the National Crime Victimization Survey (4,587)

5. Digest of Education Statistics 2017 (4,554)

 

Top Five Indicators from the Condition of Education, by number of web sessions

1. Children and Youth With Disabilities (86,084)

2. Public High School Graduation Rates (68,977)

3. Undergraduate Enrollment (58,494)

4. English Language Learners in Public Schools (50,789)

5. Education Expenditures by Country (43,474)

 

Top Five Fast Facts, by number of web sessions

1. Back to School Statistics (227,510)

2. College Graduate Rates (109,617)

3. Tuition Costs of Colleges and Universities (107,895)

4. College Endowments (71,056)

5. High School Dropout Rates (67,408)

 

Top Five Blog Posts, by number of web sessions

1. Free or Reduced Price Lunch: A Proxy for Poverty? (5,522)

2. Explore Data on Mental Health Services in K–12 Public Schools for Mental Health Awareness Month (4,311)

3. Educational Attainment Differences by Students’ Socioeconomic Status (3,903)

4. Education and Training Opportunities in America’s Prisons (3,877)

5. Measuring Student Safety: Bullying Rates at School (3,706)

 

Top Five Tweets, by number of impressions

1. Condition of Education (45,408 impressions)

 

2. School Choice in the United States (44,097 impressions)

 

3. NAEP Music and Visual Arts Assessment (32,440 impressions)

 

4. International Education Week (29,997 impressions)

 

5. Pop Quiz (25,188 impressions)

 

Be sure to check our blog site and the NCES website in 2020 to keep up to date with NCES’s latest activities and releases. You can also follow NCES on Twitter, Facebook, and LinkedIn for daily updates and content.

 

By Thomas Snyder

Higher Rates of Homeschooled Students Than Enrolled Students Participated in Family Learning Activities in 2016

About 3 percent of the school-age population—around 1.7 million students—was homeschooled in 2016. We know that homeschooled students have different educational experiences than students who are enrolled in public or private schools, and recently released data explore some of those differences.

The Parent and Family Involvement in Education survey of the National Household Education Surveys Program (NHES) provides information on homeschooled and public and private school students based on a nationally representative sample. Parents provide information about their children’s formal education and learning activities outside of school.

The survey asks about six broad types of family learning activities that students experienced in the month prior to the survey. The 2016 results indicate that homeschooled students were more likely than their peers enrolled in public or private schools to participate in five of these six activities.

In 2016, higher percentages of homeschooled students than of students enrolled in public or private schools visited a library; a bookstore; an art gallery, museum, or historical site; and a zoo or aquarium in the month prior to completion of the survey (figure 1). A higher percentage of homeschooled students also attended an event sponsored by a community, religious, or ethnic group with their parents in the month prior to completion of the survey. The one activity for which there was no measurable difference between homeschooled students and students enrolled in public or private schools was going to a play, concert, or other live show.
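As an aside on what “no measurable difference” means here: two survey percentages differ measurably only when the gap between them is large relative to its sampling error. The sketch below, in Python, illustrates the idea with a simple two-proportion z-test on hypothetical counts; NCES’s published comparisons rely on design-adjusted standard errors from the survey weights, which this simplified example does not replicate.

import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)  # combined proportion under the null
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical shares of students attending a live show in the past month.
z = two_proportion_z(0.42, 500, 0.40, 4500)
print(f"z = {z:.2f}")  # |z| below 1.96 -> no measurable difference at the 95% level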

 


Figure 1. Percentage of 5- to 17-year-old students participating in selected family learning activities in the past month, by homeschool and enrollment status: 2016

 

NOTE: Includes 5- to 17-year-old students in grades or grade equivalents of kindergarten through grade 12. Homeschooled students are school-age children who receive instruction at home instead of at a public or private school either all or most of the time. Excludes students who were enrolled in public or private school more than 25 hours per week and students who were homeschooled only because of temporary illness. Selected activities with the child may have included any member of the household.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2016.


 

The NHES data do not tell us why these differences exist. Parents’ available time and parenting styles may be factors, but more research is needed to understand these differences.

A recent report, Homeschooling in the United States: Results from the 2012 and 2016 Parent and Family Involvement Survey (PFI-NHES:2012 and 2016), provides the full complement of data from the NHES about homeschoolers’ experiences in 2016. In addition to family learning activities, the report provides information about the following:
 

  • Homeschooler demographics

  • Reasons for homeschooling

  • Providers of homeschool instruction

  • Amount of time homeschoolers spent attending public schools, private schools, or college

  • Participation in local homeschool group activities

  • Homeschool teaching styles

  • Sources of homeschool curriculum and books

  • Online coursetaking of homeschool students

  • Homeschool subject areas

  • Parent expectations of homeschooled students’ future education
     

For more information on the National Household Education Surveys Program, please go to https://nces.ed.gov/nhes/.

 

By Sarah Grady

An Example of the Unquantifiable Effect of Research on Practice

At IES, we continue to think about ways to positively impact education practice through research. It is relatively straightforward to count and share the publications and research outputs produced by our grants. A bigger challenge is measuring the impact IES-funded research has on implementing evidence-based practice after the research project is complete. So we were thrilled when we received the following letter from Patrice Bain—a middle school teacher, author, education specialist, speaker, and consultant—who has worked closely with IES for many years.

I used to think of government agencies as impersonal bureaucracies often hidden from the public eye. One agency, IES, not only proved me wrong, it positively changed my life.

In 2006, Drs. Henry Roediger III and Mark McDaniel from Washington University in St. Louis obtained a grant from IES to research how students learn in an authentic classroom. The classroom where this research began was mine. And this is where the life-changing impact began.

The IES grant paid for technology to be used in my school’s classrooms and research assistants to aid our teachers. Heading up the research at my school was Pooja Agarwal, and thus began a collaboration lasting over a decade.

In 2007, IES invited me to be the sole K-12 educator to co-author a practice guide. The large organization, to me, now had a face: Elizabeth Albro, who warmly welcomed me. I clearly recall sitting at a large table in Washington, DC, surrounded by my cognitive science superheroes: Drs. Hal Pashler, Mark McDaniel, Brian Bottge, Art Graesser, Janet Metcalfe, and Ken Koedinger. Each talked about important research that would impact learning in classrooms, and I knew my newly-expanded teaching repertoire now would be based in the science of learning. The final result of our meetings became the highly cited practice guide Organizing Instruction and Study to Improve Student Learning. In addition, information from this guide was featured on the website Doing What Works.

As Pooja and I delved into how retrieval, spacing, and metacognition played a role in student learning at my school, I was contacted by REL Mid-Atlantic, a part of IES that offers research-based professional development in Delaware, Washington, DC, New Jersey, Pennsylvania, and Maryland. Touting the benefits of teaching using the seven recommendations in Organizing Instruction and Study, I gave professional development presentations in the Mid-Atlantic region with Drs. Hal Pashler, Ken Koedinger, and Nate Kornell.

Pooja and I also gave several presentations that included the research happening in my classroom. With IES funding, that research became a multi-year project involving over 1,500 middle and high school students. With each passing year, as the research on learning became more refined, I was able to develop strategies utilizing retrieval, spacing, and metacognition. Pooja and I continued our collaboration. I was seeing success in the eyes of my students: I wasn’t just teaching content, I was teaching them how to learn.

A wealth of information on the science of learning seemed to be making a mark. Yet learning myths—those based on anecdotes and fads—were still circulating. To combat this, IES’s National Center for Education Research (NCER) invited me to be on a working task group to tackle Neuromyths vs. Neurotruths. Once again, as I sat around a table in Washington, DC with learning superheroes, we explored how to begin to dispel prevalent myths of learning.

Because of IES and the opportunities I was given, I wanted to shout from a mountaintop that we can transform teaching. I’ve seen it. I’ve done it in my classroom. I realized a book was brewing within me. I’m not sure how the decision occurred, but I knew my collaboration with Pooja Agarwal was worthy of documenting. And so it began. We wanted to write a practical, evidence- and research-based book. Books had been written by cognitive scientists; books had been written by teachers. However, ours would be the first written by a cognitive scientist and an educator together.

Powerful Teaching: Unleash the Science of Learning was released in June 2019. The ideas have resonated with educators across the globe. We are transforming education.

And it all started with IES approving a grant.