IES Blog

Institute of Education Sciences

Updates on Research Center Efforts to Increase Diversity, Equity, Inclusion, and Accessibility

As we begin a new school year, NCER and NCSER wanted to share with our community some of the work we have been doing—and are seeking to do more of—in relation to diversity, equity, inclusion, and accessibility (DEIA). We plan to provide occasional updates via this blog to share progress and keep the conversations going.

Actions on Diversity

At the end of 2020, IES convened a Technical Working Group (TWG) to get feedback on ways that the research centers could improve our investments focused on DEIA. Under the leadership of Drs. Katina Stapleton and Christina Chhin, we convened a stellar panel that participated in a robust conversation. That conversation and the recommendations from the panel are available in this summary document. We are already implementing some of the recommendations and wanted to share steps that we have taken and our plans for next steps to advance DEIA in IES-funded research.

  1. One of the first steps that we took in response to the TWG recommendations was to take a close look at our Requests for Applications (RFAs), identify potential barriers to applicants from underrepresented groups, and revise and/or add language that more clearly articulated our commitment to DEIA, both in terms of those that conduct the research and in the populations studied. These changes were reflected in our FY 2022 RFAs, and we will continue to revise and improve our application materials.
  2. IES has been committed to building expertise among a broad range of scholars in the education sciences for nearly two decades. The TWG noted, however, that there is a pressing need to provide funds for early career investigators who may be working at minority serving institutions (MSIs), teaching-intensive institutions, and/or institutions with limited opportunities for research mentorship. In response, IES launched an Early Career Mentoring for Faculty at MSIs research program. This new program extends our FY 2016 training investment in MSIs, the Pathways to the Education Sciences Training program, which we recompeted in FY 2021. The Pathways program is designed to encourage undergraduate, post-baccalaureate, and master's-level students in groups that are historically underrepresented in doctoral education to pursue graduate study relevant to education research. Currently, there are seven IES-funded Pathways training programs in the United States, hosted by MSIs and their partners. We are excited to see who applied in this first round of the Early Career Mentoring program and anticipate investing in this program in FY 2023 and beyond.
  3. The TWG also recommended that IES intentionally reach out to the MSI community to ensure that they know about the opportunities available at IES. We held our first such event since the TWG on September 7, 2021, where IES hosted a virtual listening session at HBCU week. More than 250 scholars attended that session and provided valuable feedback on barriers to HBCU-based researchers applying for research funding from IES. We are in the process of scheduling additional listening sessions with other communities of researchers to provide more opportunities for input from diverse stakeholders and underrepresented groups.
  4. The TWG also recommended that IES take a deeper look at the demographic and institutional data of applicants to our grants programs to identify which groups of researchers and institutions are underrepresented. Data indicate that the percentage of applications received from MSIs between 2013 and 2020 was very small—4% of applications to NCER and 1% to NCSER. Of the applications that were funded, 10% of NCER’s awards were made to MSIs and none of NCSER’s awards were made to MSIs. IES also reviewed the demographic information that FY 2021 NCER and NCSER grant applicants and awardees voluntarily submitted and found the following among those who reported:
    • Gender (response rate of approximately 82%) - The majority of principal investigators who applied for (62%) and received funding (59%) from IES identified as female.
    • Race (response rate of approximately 75%) - The majority of principal investigators who applied for (78%) and received funding (88%) from IES identified as White, while 22% of applicants and 13% of awardees identified as non-White or multi-racial.
    • Ethnicity (response rate of approximately 72%) - The majority of principal investigators who applied for (95%) and received funding (97%) identified as non-Hispanic.
    • Disability (response rate of approximately 70%) - The majority of principal investigators who applied for (97%) and received funding (96%) identified as not having a disability.

These data underscore the need for IES to continue to broaden and diversify the education research pipeline, including institutions and researchers, and to better support the needs of underrepresented researchers in the education community. However, tracking our progress has proven to be a challenge. Responding to the demographic survey was voluntary, so a significant number of applicants chose not to respond to particular questions. We strongly encourage all our grant applicants to respond to the demographic survey so that we will be better able to track our progress in improving diversity in our grant programs.

Addressing Misconceptions that Limit Diversity in IES Applicants

TWG panel members and attendees at the HBCU session highlighted a series of misconceptions that the education sciences community holds about the funding process at IES and recommended that IES identify communication strategies to address these misconceptions. IES hears that message loud and clear and wants to address at least a few of those misconceptions here.

Myth: IES only funds randomized controlled trials, limiting the range of researchers and institutions that can be competitive for IES grants.

Reality: IES funds a range of research, including measurement work, exploratory research, intervention development and testing, and efficacy and replication studies. We also support a wide range of methods, including various experimental and quasi-experimental designs and mixed-methods approaches that combine quantitative and qualitative data.

Myth: IES doesn’t support course buyout or summer salary.

Reality: IES supports grant personnel time to carry out research related activities. This can include course buyout and summer salary. Principal investigators on grants coordinate their budget planning with sponsored projects officers to ensure that their budgets comply with institutional guidelines as well as federal guidelines.

Myth: IES program officers are too busy to help novice applicants.

Reality: Because IES program officers are not involved in the peer review of applications, they can provide in-depth technical assistance and advice throughout the application process. They can even review drafts of proposals prior to submission! IES program officers can be your best resource in helping you submit a competitive grant proposal.

 

If you’d like to learn more about DEIA at IES, please see our Diversity Statement. You can also subscribe to our Newsflash and follow us on Twitter (@IESResearch) for announcements of future listening sessions. Please send any feedback or suggestions to NCER.Commissioner@ed.gov (National Center for Education Research) or NCSER.Commissioner@ed.gov (National Center for Special Education Research). Also, watch this blog over the next few months to read about the wide range of IES grantees and fellows from diverse backgrounds and career paths. Next up is our Hispanic Heritage Month (Sept. 15-Oct. 15, 2021) blog series.


Christina Chhin (Christina.Chhin@ed.gov), Katina Stapleton (Katina.Stapleton@ed.gov), and Katie Taylor (Katherine.Taylor@ed.gov) assisted Commissioners Albro and McLaughlin in writing this blog.

Working to Understand the Policy Process in the Development of Michigan’s Read by Grade Three Law

In recent decades, there has been an emphasis on quantitative, causal research in education policy. These methods are best suited for answering questions about the effects of a policy and whether it achieved its intended outcomes. While the question “Did it work?” remains critical, there is a need for research that also asks, “Why did it work? For whom? In what contexts?” To answer these types of questions, researchers must incorporate rigorous qualitative methods into their quantitative studies. Education research organizations like the Association for Education Finance and Policy have explicitly invited proposals using qualitative and mixed methodologies in an effort to elevate research addressing a range of critical education policy questions. Funding organizations, including IES, encourage applicants to incorporate qualitative methods into their research. In this guest blog, Amy Cummings, Craig De Voto, and Katharine Strunk discuss how they are using qualitative methods in their evaluation of a state education policy.

 

In our IES-funded study, we use qualitative, survey, and administrative data to understand the implementation and impact of Michigan’s early literacy law—the Read by Grade Three Law. Like policies passed in 19 other states, the Read by Grade Three Law aims to improve K-3 student literacy skills and mandates retention for those who do not meet a predetermined benchmark on the state’s third-grade English language arts assessment. Although the retention component of these policies remains controversial, similar laws are under consideration in several other states, including Alaska, Kentucky, and New Mexico. Below are some of the ways we have integrated qualitative methods in our evaluation study to better understand the policy process in the development of the Read by Grade Three Law.

Collecting qualitative sources helped us understand how the policy came to be, thereby shaping our data collection for examining the law’s implementation and subsequent effects. In our first working paper stemming from this study, we interviewed 24 state-level stakeholders (policymakers, state department of education officials, early literacy leaders) involved in the development of the law and coded state policy documents related to early literacy to assess the similarity between Michigan’s policy and those of other states. Understanding the various components of the Law and how they ended up in the policy led us to ensure that we asked educators about their perceptions and implementation of these components in surveys that are also part of our evaluation. For example, because our interviews made clear the extent to which the inclusion of the retention component of the Law was controversial during its development, we included questions in the survey to assess educators’ perceptions and intended implementation of this component of the Law. In addition, the interviews confirmed the importance of our plan to use state administrative retention and assessment data to evaluate the effect of retention on student literacy outcomes.

To trace the Read by Grade Three Law’s conception, development, and passage, we analyzed these qualitative data using two theories of the policy process: Multiple Streams Framework (MSF) and policy transfer. MSF says that policy issues emerge on government agendas through three streams: problem, policy, and political. When these streams join, a policy window is opened during which there is a greater opportunity for passing legislation. Meanwhile, policy transfer highlights how policies enacted in one place are often used in the development of policies in another.

We found that events in the problem and political streams created conditions ripe for the passage of an early literacy policy in Michigan:

  • A national sentiment around improving early literacy, including a retention-based third-grade literacy policy model that had been deemed successful in Florida
  • A pressing problem, evidenced by the state’s consistently below-average fourth-grade reading scores on the National Assessment of Educational Progress
  • A court case addressing persistently low test scores in a Detroit-area district
  • Previous attempts by the state to improve early literacy

As a result of these events, policy entrepreneurs—those willing to invest resources to get their preferred policy passed—took advantage of political conditions in the state and worked with policymakers to advance a retention-based third-grade literacy policy model. The figure below illustrates interviewee accounts of the Read by Grade Three Law’s development. Our policy document analysis further reveals that Michigan’s and Florida’s policies are very similar, diverging on only nine of the 50 elements we coded.

 

 

Although this study focuses on the development and passage of Michigan’s early literacy law, our findings highlight both practical and theoretical elements of the policy process that can be useful to researchers and policymakers. To this end, we show how particular conditions, coupled with the efforts of policy entrepreneurs, spurred Michigan’s consideration of such a policy. It is conceivable that many state education policies beyond early literacy have taken shape under similar circumstances: a national sentiment combined with influential brokers outside government. In this way, our mixed-methods study provides a practical model of what elements might manifest to enact policy change more broadly.

From a theoretical standpoint, this research also extends our understanding of the policy process by showing that MSF and the theory of policy transfer can work together. We learned that policy entrepreneurs can play a vital role in transferring policy from one place to another by capitalizing on conditions in a target location and coming with a specific policy proposal at the ready.

There is, of course, more to be learned about the intersection between different theories of the policy process, as well as how external organizations as opposed to individuals operate as policy entrepreneurs. As the number of education advocacy organizations continues to grow and these groups become increasingly active in shaping policy, this will be an exciting avenue for researchers to continue to explore.

This study is just one example of how qualitative research can be used in education policy research and shows how engaging in such work can be both practically and theoretically valuable. The most comprehensive evaluations will use different methodologies in concert with one another to understand education policies, because ultimately, how policies are conceptualized and developed has important implications for their effectiveness.


Amy Cummings is an education policy PhD student and graduate research assistant at the Education Policy Innovation Collaborative (EPIC) at Michigan State University (MSU).

Craig De Voto is a visiting research assistant professor in the Learning Sciences Research Institute at the University of Illinois at Chicago and an EPIC affiliated researcher.

Katharine O. Strunk is the faculty director of EPIC, the Clifford E. Erickson Distinguished Chair in Education, and a professor of education policy and by courtesy economics at MSU.

Data Collection for Cost Analysis in an Efficacy Trial

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

In one of our periodic conversations about addressing cost analysis challenges for an efficacy trial, the Cost Analysis in Practice (CAP) Project and Promoting Accelerated Reading Comprehension of Text-Local (PACT-L) teams took on a number of questions related to data collection. The PACT-L cost analysts have a particularly daunting task with over 100 schools spread across multiple districts participating in a social studies and reading comprehension intervention. These schools will be served over the course of three cohorts. Here, we highlight some of the issues discussed and our advice.

Do we need to collect information about resource use in every district in our study?

For an efficacy study, you should collect data from all districts at least for the first cohort to assess the variation in resource use. If there isn’t much variation, then you can justify limiting data collection to a sample for subsequent cohorts.

Do we need to collect data from every school within each district?

Similar to the previous question, you would ideally collect data from every participating school within each district and assess variability across schools. If funding for cost analysis is limited, you may be able to justify collecting data from a stratified random sample of schools within each district, based on study-relevant characteristics, and presenting a range of costs to reflect differences. Note that “district” and “school” here reflect one common setup in an educational randomized controlled trial; other blocking and clustering units can stand in for other study designs and contexts.
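As a rough sketch of the stratified sampling approach described above, the following Python snippet groups schools by a study-relevant characteristic and draws a simple random sample within each stratum. The school names and the `size` stratification variable are hypothetical, purely for illustration:

```python
import random
from collections import defaultdict

def stratified_sample(schools, stratum_key, n_per_stratum, seed=0):
    """Draw a simple random sample of schools within each stratum.

    schools: list of dicts describing participating schools
    stratum_key: study-relevant characteristic to stratify on
    n_per_stratum: number of schools to sample from each stratum
    """
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    strata = defaultdict(list)
    for school in schools:
        strata[school[stratum_key]].append(school)
    sample = []
    for _, members in sorted(strata.items()):
        k = min(n_per_stratum, len(members))  # stratum may be small
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical example: stratify by school size category
schools = [
    {"name": "School A", "size": "small"},
    {"name": "School B", "size": "small"},
    {"name": "School C", "size": "large"},
    {"name": "School D", "size": "large"},
    {"name": "School E", "size": "large"},
]
print([s["name"] for s in stratified_sample(schools, "size", 2)])
```

In practice the strata would come from the study's blocking variables (for example, district, grade span, or enrollment), and sample sizes per stratum would be set by the cost-analysis budget.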

How often should we collect cost data? 

The frequency of data collection depends on the intervention, the length of implementation, and the types of resources (“ingredients”) needed. People’s time is usually the most important resource used for educational interventions, often accounting for 90% of total costs, so that is where you should spend the most effort collecting data. Unfortunately, people are notoriously bad at reporting their time use, so ask for time use as often as you can (daily, weekly). Make it as easy as possible for people to respond, and offer financial incentives if possible. For efficacy trials in particular, be sure to collect cost data for each year of implementation so that you accurately capture the resources needed to produce the observed effects.

What’s the best way to collect time use data?

There are a few ways to collect time use data. The PACT-L team has had success with 2-question time logs (see Table 1) administered at the end of each history lesson during the fall quarter, plus a slightly longer 7-question final log (see Table 2).

 

Table 1. Two-question time log. Copyright © 2021 American Institutes for Research.
1. Approximately, how many days did you spend teaching your [NAME OF THE UNIT] unit?  ____ total days
2. Approximately, how many hours of time outside class did you spend on the following activities for [NAME OF UNIT] unit? 

Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)

   a. Developing lesson plans _____ hour(s)
   b. Grading student assignments _____ hour(s)
   c. Developing curricular materials, student assignments, or student assessments _____ hour(s)
   d. Providing additional assistance to students _____ hour(s)
   e. Other activities (e.g., coordinating with other staff; communicating with parents) related to unit _____ hour(s)

 

Table 2. Additional questions for the final log. Copyright © 2021 American Institutes for Research.
3. Just thinking of summer and fall, how many hours of professional development or training (e.g., trainings, coursework, coaching) have you received so far this year to prepare for teaching your American History classes? _____ Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
4. So far this year, did each student receive a school-provided textbook (either printed or in a digital form) for this history class? ______Yes     ______No
5. So far this year, did each student receive published materials other than a textbook (e.g., readings, worksheets, activities) for your American history classes? ______Yes     ______No
6. So far this year, what percentage of class time did you use the following materials for your American History classes? Record the average percentage of class time you used each type of material (responses must total 100%)
   a. A hardcopy textbook provided by the school _____%
   b. Published materials that were provided to you, other than a textbook (e.g., readings, worksheets, activities) _____%
   c. Other curricular materials that you located/provided yourself _____%
   d. Technology-based curricular materials or software (e.g., books online, online activities) _____%
       Total 100%
7. So far this year, how many hours during a typical week did the following people help you with your American history course? Please answer for all that apply. Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
   a. Teaching assistant _____ hours during a typical week
   b. Special education teacher _____ hours during a typical week
   c. English learner teacher _____ hours during a typical week
   d. Principal or assistant principal _____ hours during a typical week
   e. Other administrative staff _____ hours during a typical week
   f. Coach _____ hours during a typical week
   g. Volunteer _____ hours during a typical week

 

The PACT-L team also provided financial incentives. If you cannot use time logs, interviews of a random sample of participants will likely yield more accurate information than surveys of all participants because the interviewer can prompt the interviewee and clarify responses that don’t make sense (see the CAP Project Template for Cost Analysis Interview Protocol under Collecting and Analyzing Cost Data). In our experience, participants enjoy interviews about how they spend their time more than trying to enter time estimates in restricted survey questions. There is also good precedent for collecting time use through interviews: the American Time Use Survey is administered by trained interviewers who follow a scripted protocol lasting about 20 minutes.

Does it improve accuracy to collect time use in hours or as a percentage of total time?

Both methods of collecting time use can lead to less-than-useful estimates, such as the teacher whose percentages of time on various activities added up to 233%, or the coach who miraculously spent 200 hours training teachers in one week. Either way, always be clear about the relevant time period. For example, “Over the last 7 days, how many hours did you spend…” or “Of the 40 hours you worked last week, what percentage were spent on…” Mutually exclusive multiple-choice answers can also help ensure reasonable responses. For example, the answer options could be “no time; less than an hour; 1-2 hours; 3-5 hours; more than 5 hours.”
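Implausible responses like the 233% teacher or the 200-hour coach can be caught with simple range checks at data-cleaning time. A minimal sketch in Python follows; the field names and the 80-hour weekly threshold are illustrative assumptions, not part of any CAP or PACT-L instrument:

```python
def flag_time_use(response, max_weekly_hours=80):
    """Return a list of data-quality flags for one time-use response.

    response: dict with optional keys
      - "activity_pcts": percentages across mutually exclusive activities
      - "weekly_hours": reported hours in the reference week
    """
    flags = []
    pcts = response.get("activity_pcts")
    if pcts is not None and abs(sum(pcts) - 100) > 1:  # allow rounding slack
        flags.append(f"percentages sum to {sum(pcts)}%, not 100%")
    hours = response.get("weekly_hours")
    if hours is not None and hours > max_weekly_hours:
        flags.append(f"{hours} hours exceeds plausible weekly maximum")
    return flags

# The 233% teacher and the 200-hour coach from the examples above:
print(flag_time_use({"activity_pcts": [100, 83, 50]}))
print(flag_time_use({"weekly_hours": 200}))
```

Flagged responses can then be routed back to participants for clarification, which is one reason interviews (where the prompt happens in real time) tend to yield cleaner data than surveys.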

What about other ingredients besides time?

Because ingredients such as materials and facilities usually represent a smaller share of total costs for educational interventions and are often more stable over time (for example, the number of hours a teacher spends preparing to deliver an intervention may fluctuate from week to week, but classrooms tend to be available for a consistent amount of time each week), the burden of gathering data on these resources is often lower. Once or twice per year, you can add a few questions about facilities, materials and equipment, and other resources such as parental time or travel to a survey, or better yet to an interview, or better still to both. One challenge is that even though these resources may have less of an impact on bottom-line costs, they can involve quantities that are more difficult for participants to estimate than their own time, such as the square footage of their office.
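Once quantities of each ingredient have been collected, the ingredients method prices each resource and sums quantity times unit price. The sketch below illustrates the arithmetic; the ingredient names, quantities, and prices are hypothetical, and a real analysis would draw prices from wage data and facilities cost schedules:

```python
def total_cost(ingredients):
    """Sum quantity x unit price over all ingredients (the ingredients method)."""
    return sum(item["quantity"] * item["unit_price"] for item in ingredients)

# Hypothetical intervention: teacher time dominates total cost, as the post notes
ingredients = [
    {"name": "teacher time (hours)", "quantity": 120, "unit_price": 45.0},
    {"name": "classroom use (hours)", "quantity": 90, "unit_price": 10.0},
    {"name": "printed materials (sets)", "quantity": 30, "unit_price": 8.0},
]
print(total_cost(ingredients))  # 120*45 + 90*10 + 30*8
```

Even in this toy example, personnel time accounts for the bulk of the total, which is why the time-use data discussed above deserve the most collection effort.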

If you have additional questions about collecting data for your own cost analysis and would like free technical assistance from the IES-funded CAP Project, submit a request here. The CAP Project team is always game for a new challenge and happy to help other researchers brainstorm data collection strategies that would be appropriate for your analysis.


Robert D. Shand is Assistant Professor in the School of Education at American University

Iliana Brodziak is a senior research analyst at the American Institutes for Research

Exploring the Growing Impact of Career Pathways

Career pathways programs for workforce development are spreading across the country at both the secondary and postsecondary levels. Based on a synthesis of studies examining career pathways programs that integrate postsecondary career-technical education (CTE), the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide presents five recommendations for implementing evidence-based practices:

  1. Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment.
  2. Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways.
  3. Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways.
  4. Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways.
  5. Develop and continuously leverage partnerships to prepare students and advance their labor market success.

Led by the WWC’s postsecondary contractor, Abt Associates, this practice guide was created by an expert panel of researchers and practitioners to provide examples of career pathways strategies and components and guidance to implement them; advise on strategies to overcome potential obstacles; and summarize evidence associated with rigorous research studies that met WWC standards.

As a long-time researcher of postsecondary CTE and many other important aspects of community college education, I welcome the opportunity to reflect on these five recommendations. I hope that my blog will help readers understand how this new practice guide fits into a larger landscape of research focusing on programs, policies, and practices aligned with the career pathways framework. Far from new, the notion of career pathways goes back several decades; thus, it is not surprising that we see an evolution in research to measure students’ education and employment outcomes. And still, there is a need for more rigorous studies of career pathways.

The Abt team located about 16,000 studies that were potentially relevant to the practice guide. Those studies used a wide variety of methods, data (quantitative and qualitative), and analysis procedures. However, only 61 of them were eligible for review against the WWC standards, and only 21 of those met the standards. Interestingly, most of those 21 studies focused on non-degree postsecondary credentials rather than on college degrees, with policies and programs associated with workforce development and adult education well represented. Thus, lessons from the practice guide speak more directly to career pathways programs that culminate in credentials below the associate degree level than to programs leading to the associate or baccalaureate degree.

This dearth of rigorous career pathways research is problematic, as educational institutions of all types, including community colleges, seek to deliver positive, equitable outcomes to students during and beyond the COVID-19 pandemic.

Focus on Career Pathways

After examining the evidence from the studies that met the WWC standards, it was clear that the evidence converged around career pathways programs following requirements in the Strengthening Career and Technical Education for the 21st Century Act and Workforce Innovation and Opportunity Act (WIOA). In alignment with the WIOA definition of career pathways, the set of studies in the practice guide examine a “combination of rigorous and high-quality education, training, and other services” that align with the skill needs of industries in the region or state and accelerate participants’ educational and career advancement, to the extent practicable.

As defined by WIOA, career pathways support learners in pursuing their education and career goals, lead to at least one postsecondary credential, and provide entry or advancement in a particular occupation or occupational cluster. Because a growing number of community colleges employ a career pathways approach, as advocated by the federal legislation, it made sense to focus the practice guide on rigorous results and evidence-based recommendations that may help to move career pathway design and implementation forward.

The Five Recommendations

Recommendation 1: Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment. Our panel advocated for the intentional design and structure of career pathways for good reason. Whereas all educational institutions enroll students in courses and programs, career pathways prioritize the student’s entire educational experience, from access and entry, to completion and credentialing, and on to employment and career advancement. This purposeful approach to supporting student attainment is theorized to lead to positive student outcomes.

Applying the meta-analysis process required by the WWC, we determined from the 21 studies whether career pathways were achieving this crucial goal. We found that nine of the studies showed overall statistically significant, positive results on industry-recognized credential attainment. Of the 12 studies supporting this recommendation, most measured non-degree credentials; only two measured degree attainment—an important point to recognize, because these are the studies that have been conducted thus far.

This very small number of rigorous studies measuring degree attainment leaves open the question of whether career pathways increase postsecondary degree attainment—specifically the predominant credential in the community college context, the associate degree—and calls for greater investment in research on student completion of associate degrees (as well as baccalaureate degrees, a growing phenomenon in the United States).

Recommendation 2: Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways. Studies that met WWC standards showed a positive impact of career pathways on college credit accumulation and industry-recognized credential attainment. Only one study measured postsecondary degree attainment relative to contextualized and basic skills instruction, and it reported statistically significant, negative results. However, descriptive and correlational studies suggest that contextualized and basic skills instruction contribute to positive educational outcomes for students enrolled in Adult Basic Education as well as in postsecondary CTE and workforce training.

That the results of rigorous research complement descriptive studies, some of which provide rich details on program implementation, is useful information for scaling up community college career pathways. Having said this, we still need to know more about how contextualized basic skills instruction—and other applied instructional interventions—affects the outcomes of students, especially those from racially minoritized groups, those with low incomes, and those who are the first generation in their families to attend college, all of whom are purported to be well served by the career pathways approach.

Recommendation 3: Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways. Studies supporting this recommendation focused on five education outcomes: industry-recognized credential attainment, academic performance, technical skill proficiency, credit accumulation, and postsecondary degree attainment. As seen with the previous two recommendations, results on industry-recognized credential attainment were statistically significant and positive. Results on academic performance, technical skill proficiency, and credit accumulation were indeterminate, meaning findings could be positive or negative but were not statistically significant.

What is important to reiterate here is that nearly all the studies that met WWC standards focused on non-degree credentials, providing limited information about the education outcome of postsecondary degree attainment. To be clear, our panel is not saying career pathways should focus exclusively on non-degree credentials; rather, results on postsecondary degree attainment are not definitive. Even so, the positive findings linking flexible scheduling to non-degree credential attainment are important to know now, as the country deals with the pandemic.

Community colleges nationwide are rethinking instructional delivery to better meet students’ dire health, family, and employment needs. Rigorous research on career pathways interventions, such as flexible delivery, is needed, particularly studies involving diverse student populations. In times of economic and social struggle, it is essential that community college career pathways produce the equitable outcomes they purport to provide.

Recommendation 4: Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways. The rigorous studies meeting WWC standards and measuring outcomes relative to comprehensive student supports focused on the education outcome domain only. Similar to the previous recommendation on flexible scheduling, findings on industry-recognized credential attainment were statistically significant and positive. In addition, findings on credit accumulation were statistically significant and positive, reinforcing findings from other studies showing that holistic supports improve student outcomes. For example, a meta-analysis of rigorous evaluations of the Trade Adjustment Assistance Community College and Career Training grants reported favorable results on educational outcomes for holistic supports, including counseling and advising, case management, and various other support services.

Consistent with the recommendations in this practice guide, a growing body of evidence favors integrating comprehensive student supports with career pathways. These supports are intended to meet the needs of the diverse population of students who attend community colleges, so they should demonstrate equitable results on educational outcomes. More rigorous research is needed to measure whether and how career pathways provide access, opportunity, and outcomes for racially minoritized, low-income, and other underserved student groups. These studies should ascertain the impact of student supports on both education and employment outcomes, recognizing that students seek a high-quality credential and a good job that offers economic security and career mobility.

Recommendation 5: Develop and continuously leverage partnerships to prepare students and advance their labor market success. This recommendation specifically emphasizes labor market success, based on studies that examine labor market outcomes only. Supporting this recommendation were findings from studies of four labor market outcomes: short-term employment, short-term earnings, medium-term employment, and medium-term earnings. (The studies did not include long-term findings.)

Overall, statistically significant and positive outcomes were found in the meta-analysis for short-term employment, short-term earnings, and medium-term earnings. However, for medium-term employment, the meta-analysis results were indeterminate. To clarify, this does not mean employment-focused partnerships do not lead to labor market success; instead, it points to a dearth of research that tracks students through training and into employment long enough to measure longer-term outcomes.

Even so, these initial findings from the meta-analysis are promising and suggest that developing and leveraging such partnerships may help move the needle on short- and medium-term employment outcomes. Longitudinal research that tracks students for periods sufficient to know whether long-term employment and earnings are affected should be a priority in the future.

Moving Forward

As I reflect on the research that I have conducted on career pathways over the years, I am gratified to see mounting evidence of positive student outcomes. As a first-generation college student myself, it has always made sense to me to demystify the college education process. Helping learners understand the entire educational journey, from start to finish, is bound to help them see how what they are learning may contribute to future education and career choices. I went to college not knowing what it would be like or whether I would be able to succeed, and I benefited from faculty and advisors who helped me see how my future could progress.

For other students like me who enter college without the benefit of family members sharing their stories of college-going, and for those who have to balance school with work and family care-taking responsibilities, it is important to know how a college education, including postsecondary CTE, can lead to positive educational and employment outcomes. Student groups underserved by postsecondary education deserve our most resolute and far-reaching efforts.

To this end, additional rigorous evidence on the impact of postsecondary CTE on college degree attainment could help to inform career pathways design, funding, and implementation. Also, as I reflected on the five recommendations, I was struck by the modest amount of research on medium-term labor market outcomes and the lack of any studies of long-term labor market outcomes. When the focus of career pathways is creating a path to living-wage employment and career advancement over the long term, it isn’t enough to know that students’ immediate employment outcomes were improved. When many students attending community colleges are already working, it isn’t even clear what immediate employment means.

If the outcome of interest for the majority of community college students who are adults and working is whether they get a better job and higher salary than they were getting pre-education, more nuanced measures and longer follow-up periods are needed than those provided by any of the research reviewed for this practice guide. It seems to me that finding more evidence of medium- and long-term outcomes could also provide more useful evidence of how career pathways work for diverse learner groups who are under-studied at the present time.

I was honored to help develop the practice guide with Hope Cotner, Grant Goold, Eric Heiser, Darlene Miller, and Michelle Van Noy. What an enormously gratifying experience it was to work with these professionals, the WWC team at Abt, and the Institute of Education Sciences staff. Working on this practice guide has left me feeling more optimistic about what we could learn with a more sizeable federal investment in research on postsecondary CTE in general, and on career pathways specifically. Rigorous evidence is needed to test models, explore interventions, and understand results for the plethora of learner groups who attend community colleges.

As the nation struggles to pull out of the pandemic that continues to rage in pockets across the country, it is the right time to invest in research that helps prepare students for good jobs that advance living-wage careers over a lifetime. A true commitment to equity in CTE programming is necessary for the nation, and now is the time to invest.

_____________________________________________________________________________________________________________

Debra D. Bragg, PhD, is president of Bragg & Associates, Inc., and the founder of research centers focusing on community college education at the University of Illinois at Urbana-Champaign and the University of Washington. She spent the first 15 years of her career in academe studying postsecondary CTE for federally funded research centers, having devoted her entire research agenda to improving education- and employment-focused policies, programs, and practices to create more equitable outcomes for community college students. She served as an expert panelist for the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide.

New Analysis Reveals Differences in Parents’ Satisfaction With Their Child’s School Across Racial/Ethnic Groups

The National Household Education Surveys (NHES) program collects nationally representative, descriptive data on the educational activities of children and families in the United States. Specifically, NHES’s Parent and Family Involvement in Education (PFI) survey collects data about how families of K–12 students connect to their child’s school. Parents are asked questions about their involvement in and satisfaction with their child’s school as well as school choice.

This blog expands on the PFI First Look report, and more analysis of race and ethnicity in education and early childhood is available in new web tables.

The results from the 2019 PFI survey—which was administered before the coronavirus pandemic—show differences across racial/ethnic groups[1] in parents’ satisfaction with their child’s school. Overall, White students tended to have parents who were “very satisfied” with their child’s schools, teachers, and academic standards at the highest rates.

Satisfaction with schools

In 2019, about two-thirds (67 percent) of White students had parents who were “very satisfied” with their child’s school (figure 1). This percentage was higher than the percentages for Hispanic students (64 percent), Asian or Pacific Islander students (61 percent), Black students (59 percent), and “Other race” students[2] (57 percent).

A higher percentage of Hispanic students had parents who were “very satisfied” with their child’s school (64 percent) than did Black students (59 percent) and “Other race” students (57 percent).


Figure 1. Percentage of students enrolled in kindergarten through 12th grade whose parent/guardian reported being "very satisfied" with the student’s school, by student’s race/ethnicity: 2019

1 "Other race" includes non-Hispanic students of Two or more races and non-Hispanic students whose parents did not choose any race from the categories provided on the race item in the questionnaire.
NOTE: Race categories exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2019.


Satisfaction with teachers

Sixty-six percent of White students had parents who were “very satisfied” with their child’s teachers in 2019 (figure 2). This percentage was higher than the percentages for Hispanic students (62 percent), Black students (60 percent), “Other race” students (58 percent), and American Indian or Alaska Native students (49 percent). The percentage for Asian or Pacific Islander students was not measurably different from the percentages for any other racial/ethnic group.

Figure 2. Percentage of students enrolled in kindergarten through 12th grade whose parent/guardian reported being "very satisfied" with the student’s teachers, by student’s race/ethnicity: 2019

1 "Other race" includes non-Hispanic students of Two or more races and non-Hispanic students whose parents did not choose any race from the categories provided on the race item in the questionnaire.
NOTE: Race categories exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2019.


Satisfaction with academic standards

In 2019, about 64 percent of White students had parents who were “very satisfied” with the academic standards of their child’s school (figure 3). This percentage was higher than the percentages for Black students and Hispanic students (60 percent each), Asian or Pacific Islander students (56 percent), and “Other race” students (55 percent). The percentage for American Indian or Alaska Native students was not measurably different from the percentages for any other racial/ethnic group.

Figure 3. Percentage of students enrolled in kindergarten through 12th grade whose parent/guardian reported being "very satisfied" with the academic standards of the student's school, by student’s race/ethnicity: 2019

1 "Other race" includes non-Hispanic students of Two or more races and non-Hispanic students whose parents did not choose any race from the categories provided on the race item in the questionnaire.
NOTE: Race categories exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2019.
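
Throughout this analysis, "not measurably different" means that the apparent gap between two groups' percentages was not statistically significant once sampling error was taken into account. NCES computes design-adjusted standard errors using the survey's replicate weights, but the underlying idea can be illustrated with a simple unweighted two-proportion z-test. The sample sizes below are hypothetical, chosen only for illustration, and are not actual PFI counts:

```python
from math import sqrt, erf

def two_proportion_z_test(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two sample proportions."""
    # Pool the proportions under the null hypothesis of no difference
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    # Standard error of the difference between the two proportions
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 67 percent of 2,000 students vs. 64 percent of 1,500
z, p = two_proportion_z_test(0.67, 2000, 0.64, 1500)
print(round(z, 2), round(p, 3))  # z ≈ 1.85, p ≈ 0.064: not significant at .05
```

With these illustrative sample sizes, a 3-percentage-point gap would be reported as "not measurably different" at the conventional .05 level; the same gap with larger samples could be statistically significant, which is why the blog's comparisons depend on the survey's actual sample sizes and design effects.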


Explore the NHES Table Library to find more data about differences in parents’ satisfaction with their child’s school.


[1] Race categories exclude students of Hispanic ethnicity, who are all included in the Hispanic category.
[2] "Other race" includes non-Hispanic students of Two or more races and non-Hispanic students whose parents did not choose any race from the categories provided on the race item in the questionnaire.

 

By Rachel Hanson and Jiashan Cui, AIR