NCEE Blog

National Center for Education Evaluation and Regional Assistance

Regional Educational Laboratories Develop New Tools for Educators Based on WWC Practice Guides

Whenever we get the chance to share information about the Regional Educational Laboratory (REL) program with the public, we’re asked, “How are the 2022-2027 RELs different from past REL cycles?” In this blog, we focus on one major new effort that each REL is undertaking: the creation of a toolkit for educators based on one of the What Works Clearinghouse (WWC) Practice Guides. Each REL toolkit will include a set of resources that educators can use to implement and institutionalize evidence-based recommendations from a WWC Practice Guide. Importantly, each REL is co-developing its resources with educators, school and district leaders, or postsecondary faculty and staff to ensure the toolkits’ relevance and actionability. Following the toolkit development phase, RELs will partner with educators not involved in developing the toolkits to test the usability of each toolkit and its efficacy in improving student and teacher outcomes. The RELs have current partners for toolkit development and usability testing but are looking for partner schools, districts, and postsecondary institutions in which to test the efficacy of the toolkits. These efficacy-testing partners will be among the first to benefit from the evidence-based toolkits.

Why this investment of REL and partner time and resources? WWC Practice Guides are among IES’ premier resources for translating evidence on effective practice into accessible and usable strategies for educators. Each Guide is based on a synthesis of the most rigorous research on teaching a particular subject or achieving a particular education goal. Each Guide is also based on the input of a panel of expert practitioners and researchers and includes—

  • Key recommendations for educational practice based on a synthesis of rigorous research
  • Supporting evidence for each recommendation
  • Steps to carry out each recommendation
  • Examples of the practices
  • Discussions of common implementation challenges and strategies for overcoming those challenges

WWC Practice Guide Associated with Each REL Toolkit:

  • Appalachia: Teaching Math to Young Children
  • Central: Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students
  • Mid-Atlantic: Teaching Elementary School Students to Be Effective Writers
  • Midwest: Developing Effective Fractions Instruction for Kindergarten Through 8th Grade
  • Northeast & Islands: Assisting Students Struggling with Mathematics: Intervention in the Elementary Grades
  • Northwest: Using Technology to Support Postsecondary Student Learning
  • Pacific: Teaching Secondary Students to Write Effectively
  • Southwest: Providing Reading Interventions for Students in Grades 4–9
  • Southeast: Assisting Students Struggling with Reading: Response to Intervention (RtI) and Multi-Tier Intervention in the Primary Grades
  • West: Improving Reading Comprehension in Kindergarten Through 3rd Grade

RELs Emphasize Active Learning to Support Implementation of Evidence-Based Practices

Although WWC Practice Guides are some of IES’ most popular products, we also know that teachers and leaders cannot master a new practice simply by reading about it. Instead, they need to engage in active learning by observing the new practice, discussing it, implementing it, receiving feedback on their implementation, and continuing to improve. The REL toolkits are designed to support educators in the creation and implementation of a professional learning community (PLC) focused on the evidence-based practices outlined in a WWC Practice Guide. In these PLCs, educators will learn about the Practice Guide recommendations by reading about the practices, discussing them with colleagues, and developing plans for implementing the practices in their classrooms. Educators will also put those plans into action and then debrief on those implementation experiences. To support this work, the toolkits will include PLC guides, workbooks, self-study guides, and rubrics. Some toolkits will also include videos of teachers effectively implementing the practices.

Each toolkit will also include the following:

  • An initial diagnostic and ongoing monitoring instrument for assessing instructional practices against the practices recommended in the WWC Practice Guide
  • A tool that enables teachers, teacher leaders, and administrators to assess the extent to which their school, district, or postsecondary institution supports the implementation and ongoing monitoring of the evidence-based practice recommendations
  • A discussion of implementation steps for institutionalizing supports that help educators, building leaders, and other administrators adopt the evidence-based practices and sustain them over time

Some RELs have already started usability testing of their toolkits. Across 2025 and 2026, nine of our 10 RELs will publish final versions of their toolkits and efficacy studies of those toolkits. Both will be freely available on the REL website.[1] Visit our Newsflash page and sign up to receive newsflashes from the RELs and the IES center that houses the program—the National Center for Education Evaluation and Regional Assistance (NCEE).[2]

Partner with RELs: Help IES Study REL Toolkits

RELs will soon recruit partner schools, districts, and postsecondary institutions in their regions to conduct the toolkit efficacy studies. If you are interested in having your school, district, or institution participate in an efficacy study and benefit from being one of the first users of these toolkits, please email us at Elizabeth.Eisner@ed.gov or Chris.Boccanfuso@ed.gov. The efficacy study for each REL’s toolkit must take place within that REL’s region. Not sure which REL region is yours? Check out the “About the RELs” page on the IES website or the map visualization on our program homepage.

If you have other questions, concerns, or ideas about this work, please reach out to us. We welcome your input so that you can help IES and the RELs make the toolkits as useful and effective as possible.

Past REL Professional Development Resources based on WWC Practice Guides:

The RELs have a successful track record of creating professional development resources that complement WWC Practice Guides. For example, see:

Professional Learning Community: Improving Mathematical Problem Solving for Students in Grades 4 Through 8 Facilitator’s Guide (REL Southeast).

Professional learning communities facilitator’s guide for the What Works Clearinghouse practice guide: Foundational skills to support reading for understanding in kindergarten through 3rd grade (REL Southeast).

Professional Learning Communities Facilitator's Guide for the What Works Clearinghouse Practice Guide Teaching Academic Content and Literacy to English Learners in Elementary and Middle School (REL Southwest).

The new toolkits will expand the number of WWC Practice Guides for which the RELs develop professional development resources and will also provide instruments for assessing instructional practice and implementing institutional supports. 

Liz Eisner, Associate Commissioner for Knowledge Use

Chris Boccanfuso, REL Branch Chief


[1] REL Southwest’s contract started 11 months after the contracts of the other 9 RELs, so the REL Southwest toolkit will be released in 2027.

[2] You can also sign up for Newsflashes from IES and its other three centers—NCES, NCER, & NCSER.

Taking a pause…

In a first for ERIC, the supply of education research content the program aspires to index regularly exceeds its capacity to do so. For the past 15 years, ERIC has consistently indexed 4,000 records a month. This pace has allowed us to index all approved sources without significant delays. However, over the past two years, the volume of content published in our approved sources has doubled. This has created a backlog of publishable content, and as a result ERIC cannot index new work in a timely fashion.

As part of our periodic collection analyses, we have been investigating potential causes of this backlog. We have found that several journals are publishing far more content than when they were originally selected to be included in ERIC. For example, one journal was publishing fewer than 20 articles per year when approved, but now is publishing over 850 per year. This is close to a 5000% increase in production.

An increased volume of published work in the education sciences—as a whole—is a good thing. However, when an individual journal dramatically increases the number of articles it publishes, it is noteworthy. In those instances, ERIC wants to ensure that the journal is still adhering to the standards and criteria it met when originally included in the index, including rigorously applying the peer-review process, if applicable, and maintaining its original aim and scope. Both are important to ensuring that work contained in ERIC is of high quality and that a wide range of key topics in education can be indexed.

ERIC’s Selection Policy already requires an ongoing review of currently indexed sources, including identifying sources that may no longer meet the Policy’s standards and criteria. As part of that review, ERIC will now flag for further scrutiny any journal that publishes more than twice as many articles per year as it did in the year it was accepted. ERIC will then assess whether the increase is temporary and associated with a unique event, like a special issue, or whether it reflects an ongoing trend. If the review indicates the increase is persistent, the ERIC team will recommend that indexing of that journal be paused. If a pause is approved, ERIC will stop indexing subsequent issues for a two-year period. The journal will also be removed from ERIC’s Journals List, because this list contains only actively indexed journals.
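To make the rule concrete, here is a minimal sketch in Python of the flagging-and-pause logic as we describe it above. The function names and article counts are illustrative assumptions, not ERIC's actual tooling or data; the only inputs the rule needs are a journal's output in its acceptance year and its output in the two most recent monitored years.

```python
# Illustrative sketch of the flagging-and-pause rule described above.
# Function names and sample counts are hypothetical, not ERIC's systems.

def flag_for_review(baseline_count: int, current_count: int) -> bool:
    """Flag a journal whose annual output is more than double its
    output in the year it was accepted into ERIC."""
    return current_count > 2 * baseline_count

def recommend_pause(baseline_count: int, recent_counts: list[int]) -> bool:
    """Recommend a two-year indexing pause only if the increase holds
    across the two most recent monitored years, i.e., it is persistent
    rather than a one-time event such as a special issue."""
    last_two = recent_counts[-2:]
    return len(last_two) == 2 and all(
        flag_for_review(baseline_count, count) for count in last_two
    )

# Numbers in the spirit of the blog's example: a journal accepted when it
# published fewer than 20 articles a year that now publishes over 850.
baseline = 17            # hypothetical count in the acceptance year
recent = [620, 850]      # hypothetical counts for the last two years

print(flag_for_review(baseline, recent[-1]))  # True: output more than doubled
print(recommend_pause(baseline, recent))      # True: the increase is persistent
print(f"{(recent[-1] - baseline) / baseline:.0%} increase")  # 4900% increase
```

A journal that doubled its output for a single special issue would trip flag_for_review in that year but not recommend_pause, which is the distinction the review process is designed to draw.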

After the two-year pause, ERIC will re-review the journal. If ERIC reinstates the journal, ERIC will notify the journal and will index any content published during the two-year pause. If the journal is not reinstated, ERIC will not index any issues published during the pause.

This decision has led to a few questions:

  • Why can’t ERIC index everything? ERIC is funded by the U.S. Department of Education’s Institute of Education Sciences and has limited resources. It must prioritize the highest quality education research, include content across all topic areas, and can index only a set number of records per month.
  • What is the concern with the increased volume? Increased volume may signal that the journal has changed scope to accommodate a broader set of articles or that quality assurance processes have been affected in a rush-to-publish environment. Particularly among journals that were considered to be peer reviewed when originally accepted into ERIC, the publication of articles shortly after submission may signal a substantive change in a journal’s quality assurance process such that it no longer meets the criteria needed to receive that designation. There is also a concern that if a journal produces far more records than estimated, the collection will become skewed in a way that favors one topic area over another.
  • How will ERIC identify the journals to pause? As part of the source selection process, ERIC will monitor two years of current publishing and compare the number of articles published to the number published during the year the journal was selected for ERIC.
  • Why is the pause for two years? The ERIC Selection Policy says that sources may be reviewed after 2 years (24 months). To be consistent with this policy, we will automatically review paused sources after this same time frame.
  • How will I know if my journal is paused? ERIC will email the journal representatives to inform them of this decision in the coming weeks.
  • Can my journal appeal the decision to be paused? Journals may not appeal the decision to pause indexing. However, if a journal is not automatically reinstated at the end of the pause period, it may apply for re-review 24 months later.
  • What is ERIC looking for in the automatic review at the end of the pause period? ERIC will conduct a full review of the journal and consider the two years of published content against the criteria set forth in the ERIC Selection Policy.
  • Can authors submit their article published in a “paused” journal via online submission during this period? No, authors must wait for a re-review of the journal to be conducted after the two-year pause. If the journal is selected again, the articles from the paused issues will be indexed in ERIC.

How State Education Agencies Can Leverage Their Regional Educational Laboratory to Support Students’ Academic, Social, and Mental Health Needs

(A Dear Colleague Letter sent to Chief State School Officers on February 23, 2023.)

Dear Colleague:

As state and local education agency leaders reflect upon the successes and challenges of the 2022-2023 school year—and the opportunity that summer 2023 presents to further support students’ academic, social, and mental health needs—I am writing today to encourage you to take full advantage of the services offered by your Regional Educational Laboratory (REL).

The REL Program, sponsored by the U.S. Department of Education’s Institute of Education Sciences, supports educators and policymakers at the state and local levels in the use of data and evidence-based practices to improve student outcomes. All REL services are provided free of charge and are designed in partnership with state and local partners to meet their specific needs. Each REL is led by a Director with deep expertise in education policy, practice, and research who can help you navigate how best to leverage REL supports to address your state’s most pressing needs. A list of REL Directors, including their contact information, is attached.

Your REL can support a wide range of state and local initiatives. They include:

  • Analyzing student progress and outcome data (e.g., achievement, chronic absenteeism, graduation rate, English language proficiency) to understand the ongoing effects of the COVID-19 pandemic. Your REL can analyze longitudinal student data provided by the state or district partners to better understand the trajectory of student performance prior to the pandemic, during the pandemic, and today. When disaggregated by student group, school characteristics, or other relevant features, these analyses can support decision-makers in focusing resources, monitoring improvement, and adjusting implementation efforts. RELs Midwest and Mid-Atlantic recently provided similar services for their state and district-level partners.
  • Supporting the identification of existing, or the design of new, evidence-based practices to meet students’ academic, social, and mental health needs. Your REL can support state and local efforts to identify practices that prior evidence suggests can promote learning and development. REL Southeast recently published a review on the effectiveness of early literacy interventions across several domains in response to a request from partners regionwide. When high-quality evidence does not exist, or existing practices are not well-aligned to state or local needs, RELs can support efforts to design and pilot research-based innovations.
  • Coaching state and local education agency staff on the use of data to improve the ongoing implementation of education policies, programs, and practices. Your REL offers coaching and training services for state and local leaders on data-driven approaches to continuous quality improvement. These services are particularly beneficial when a program is relatively new to a state or district and leaders need timely feedback to ensure an evidence-based practice is well-implemented at scale. REL Southwest recently supported the Oklahoma State Department of Education’s (OSDE) rollout of Oklahoma Excel, a data-driven and job-embedded professional development program for educators in participating districts. A 2-part video series provides background on the program and describes the supports REL Southwest provided to the OSDE staff who administer it.
  • Evaluating the impact of state or local interventions on important student outcomes. Your REL can support the rigorous evaluation of well-implemented policies, programs, or practices to document those efforts’ impacts on important student outcomes. For example, a 2021 REL Northwest study examined the implementation and impact of full-day kindergarten in Oregon in light of a funding structure shift that incentivized districts to offer the programming. When a rigorous evaluation is not feasible, your REL can advise you on credible, alternative approaches to understanding the outcomes associated with a policy or program.
  • Coaching state or local education agency staff on the use of existing REL tools and resources. Through their work with state and local partners, RELs have developed a wide range of actionable resources designed to support the implementation of evidence-based practices. Your REL can coach state and local education agency staff on how to customize and use tools developed elsewhere to meet your needs. Examples include REL Appalachia’s Community Math Night Facilitators’ Toolkit and REL Southeast’s Professional Learning Community on Emergent Literacy.
  • Providing expert guidance to senior state or local education agency leaders. Finally, your REL can leverage its network of internal and external experts to offer guidance on data- and evidence-driven approaches to addressing problems of policy and practice. This “Ask-an-Expert” service is available to senior leaders of both state and local education agencies. A recent REL Appalachia “Ask an Expert” response to a Tennessee-based partner shared best practices for administering and using data from Kindergarten readiness screeners.

REL Directors are routinely in contact with senior education agency leadership as part of their ongoing work to better understand the kinds of supports that might benefit states in their region. However, if you or senior members of your leadership team have not yet had the opportunity to meet with your REL Director (or have not done so recently), please consider contacting them at your convenience. I am also glad to facilitate that connection at your request.

Finally, I would be remiss if I did not mention the critical relationship between your REL and the Regional Comprehensive Center (RCC) that serves your state. Sponsored by the Department’s Office of Elementary and Secondary Education, RCCs support state education agencies in their efforts to implement evidence-based policy and practice and realize the goals set in their Consolidated State Plans.

If you have any questions about the REL Program, please do not hesitate to contact me or a senior member of my team.

Sincerely,

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance
Matthew.Soldner@ed.gov 

Note: This blog reflects slight edits to the letter sent to Chief State School Officers. References to an attached brochure and a contact list for REL Program staff have been removed.

Leading with Evidence: Celebrating Evidence-Based Policymaking at the U.S. Department of Education

Matthew Soldner,

Commissioner, National Center for Education Evaluation and Regional Assistance &

Chief Evaluation Officer, U.S. Department of Education

(Remarks made on November 17, 2022, at Results for America’s 2022 Invest in What Works Federal Standards of Excellence events.)

If you walk into almost any kindergarten classroom today, you’ll almost always find a “Star Chart.” It’s usually displayed with pride next to the chalkboard in the front of the room. On it, the name of every student in the class. And next to each name, a long row of stars, each signifying a task earnestly mastered by a young learner. (More than forty years later, I can still remember the pride associated with finally getting a gold star for “can tie shoes,” which I was seriously delayed in getting due to an overabundance of shoes secured with Velcro in my youth.)

Today, I’m very proud to acknowledge that the U.S. Department of Education has received its own gold star. It was bestowed by Results for America, which advocates for the use of evidence in federal, state, and local policymaking to improve outcomes for students, families, and communities, as part of its 2022 Federal Standards of Excellence program. Each year, participating federal agencies are evaluated on their progress in using high-quality evidence as a “north star” in policymaking. This year, Education was recognized alongside two other agencies – the Millennium Challenge Corporation and the U.S. Agency for International Development – as a top scorer, earning the “Gold” designation.

ED’s national leadership in using evidence to inform policymaking has been a journey that now spans more than two decades. Its roots can be traced to the 107th Congress, which in 2001 reauthorized the Elementary and Secondary Education Act of 1965 as the No Child Left Behind Act and, in 2002, passed the Education Sciences Reform Act. The latter authorized the founding of a group toward which I’m somewhat partial: the Institute of Education Sciences (IES). (I would invite you to join my colleagues and me in celebrating #IESat20, now through mid-2023!) But no single event, and no distinct component of the Department, is individually responsible for our success building evidence about “what works” in education and putting that evidence to work to better serve learners, educators, and communities. I often say “evidence-building is a team sport at the Department of Education,” and it truly does take the commitment of talented professionals from across the organization to make it a reality.

This year, that team has been particularly busy. Department-wide, we have supported states, school districts, and institutions of higher education in their continued efforts to meet the challenge of pandemic recovery. Much of that work has focused on the use of evidence-based practices to accelerate learning for all students, making the most of historic investments in education such as the Coronavirus Aid, Relief, and Economic Security (CARES) and American Rescue Plan (ARP) Acts. Key partners in that work include IES’ Regional Educational Laboratories, operated by the National Center for Education Evaluation and Regional Assistance (NCEE);  the Office of Elementary and Secondary Education’s Comprehensive Centers; the Office of Special Education and Rehabilitative Services’ Technical Assistance and Research Centers; and the Office of Planning, Evaluation, and Policy Development’s Grant Policy Office.

Elsewhere in the Department, the emphasis has been on evidence-building. Here, IES has taken a particular leadership role. The National Center for Education Statistics’ (NCES’) School Pulse Panel is a critical new component in our evidence-building infrastructure. The Pulse allows us to more rapidly collect and report descriptive information about conditions on the ground in our nation’s schools, addressing topics from the extent of staffing shortages to the programs schools are offering to support learning acceleration. That and other information supports a vibrant research and development infrastructure, led by the National Centers for Education Research (NCER) and Special Education Research (NCSER). In addition to their regular education grant programs, both Centers ran special competitions in Fiscal Year 2022 specifically designed to support pandemic recovery, including those aimed at better leveraging longitudinal data to support state recovery policymaking and building evidence about the approaches states and districts used to address the pandemic, and, when possible, their effectiveness.  

In Fiscal Year 2023, more good work is already underway.

First, I would be remiss if I did not acknowledge an important investment this most recent Congress has made in the evidence-building work of the Department: authority as part of the Consolidated Appropriations Act of 2022 to reserve up to one-half of one percent from selected programs authorized by the Higher Education Act of 1965, as amended, to support high-quality research and evaluation related to the Department’s postsecondary programs. A similar set-aside for the Department’s K-12 programs dramatically catalyzed our ability to build and use evidence there – and I have every confidence this new authority, if continued, would do the same for our postsecondary portfolio.

Second, and consistent with my belief that “evidence building is a team sport,” I want to take a moment to encourage you to join the team! As an initial step, I’d like to invite you to join me and special guests from the Department in our new webinar series creatively entitled “Evidence-based Policymaking at ED: Introducing the U.S. Department of Education’s Inaugural Learning Agenda.” Across three installments, we’ll discuss the Department’s evidence-building priorities in three areas: the educator workforce; meeting students’ social, emotional, and academic needs; and increasing postsecondary value. In each, we’ll dig a bit deeper into each topic and its nexus with an equitable recovery from the COVID-19 pandemic.

I hope – but cannot promise – that Education will, this time next year, report that we’ve earned another gold star for building and using evidence in service of smart policymaking. What I can promise is that, because both educators and education policymakers will continue to need high-quality evidence to do their best work on behalf of the nation’s learners, we will do our best to help them meet challenges both old and new. Thanks to Results for America for today’s recognition, and to all those who support our nation’s students, educators, and communities every day.

 

How the 2017-2022 Cohort of RELs Supported the Use of Evidence in Education


This winter is a special season that comes along once every five years for the Regional Educational Laboratory (REL) program at IES. It’s a winter when the REL team manages the end of five-year REL contracts and oversees the launch of nine new REL contracts.[i]  During this exciting time, we actively reflect on the successes and lessons of the previous contracts—the 2017-2022 REL cohort—and channel those reflections into our work with the 2022-2027 REL cohort. 

As I collaborate with the REL team on the new RELs, I want to share some of the successes of the RELs that finished their work early this year. We expect the new RELs to build on these successes and to engage in new, innovative work that I will discuss in a future blog.

As we look back at the large body of work that the last cohort of RELs produced, I want to share some exciting results. Over three-quarters of participants in REL researcher-practitioner partnerships who responded to the REL Stakeholder Feedback Survey (SFS) reported that they used or were in the process of using the research or data that they learned about through the REL partnerships. On average across the last three years, an additional 17 percent reported that they were making plans to use research or data presented by the REL:

Responses to the REL Partnership Stakeholder Feedback Survey (SFS):

  • 2019 (695 respondents): 77 percent used or were in the process of using the research or data they learned about through REL partnerships, 19 percent were making plans to use it, and 4 percent had no plans to use it.
  • 2020 (397 respondents): 81 percent used or were in the process of using it, 17 percent were making plans to use it, and 2 percent had no plans to use it.
  • 2021 (582 respondents): 82 percent used or were in the process of using it, 15 percent were making plans to use it, and 3 percent had no plans to use it.

While these survey results are promising, I want to provide a more vivid picture of how the RELs partnered with stakeholders to use evidence to improve teaching and learning. Read on to learn how REL work has been integral to education policy and practice across the country.

REL Mid-Atlantic and REL Southeast both engaged in projects that supported efforts to safely educate students during the pandemic:

  • In Pennsylvania, REL Mid-Atlantic helped the Pennsylvania Department of Education (PDE) provide evidence to inform the reopening of schools in the state during the first year of the COVID-19 pandemic. REL Mid-Atlantic worked with PDE to produce an extensive memo that included (1) a rapid review of existing evidence on public-health and educational issues relevant to the reopening of schools, (2) findings from interviews with a cross-section of stakeholders from across Pennsylvania to assess concerns and challenges related to reopening, and (3) agent-based modeling simulations of the potential spread of COVID-19 under alternative approaches to reopening schools.  The two largest school districts in the state—the School District of Philadelphia and the Pittsburgh Public School District—along with at least 25 other school districts and one Catholic archdiocese drew on the findings in the memo to make decisions about whether and how to reopen schools. 
  • Shortly after two of REL Southeast’s four teacher guides were released in early 2020, schools across the country shut down due to the COVID-19 pandemic. The REL realized that the content of the guides—originally created to support teachers in working with families to facilitate their children’s literacy development—would be immediately useful to parents across the country who were suddenly thrust into the role of teacher for their children at home. The content of the guides was based on the What Works Clearinghouse Practice Guide: Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade.

REL Southeast made all the content, which included videos and activities, available on the REL website so that parents could easily access and use it to support their children during that difficult time. The landing page for the content—Supporting Your Child's Reading at Home—has been visited nearly 130,000 times since April of 2020. And the landing pages for the four guides for teachers—A Kindergarten Teacher's Guide, A First Grade Teacher's Guide, A Second Grade Teacher's Guide, and A Third Grade Teacher's Guide—have each been accessed between 1,300 and 7,500 times since their release.

REL West and REL Midwest worked with states in their regions to support student health and to identify and recruit more teachers, two topics that proved particularly important as a result of the pandemic:

  • Robla Elementary School District (RESD) and several other districts in California’s Central Valley began offering telemedicine services during the 2017/18 school year as part of a broader “whole-child” strategy for improving student health, well-being, and attendance. Telemedicine is the remote evaluation, diagnosis and treatment of patients using telecommunications technology. RESD contracted with and paid Hazel Health, a telemedicine provider that operates virtual health clinics in school settings.  The telemedicine visits were free to students and families and did not require scheduled appointments. To learn more about the implementation of the program and whether it was associated with students staying in school throughout the day, RESD enlisted REL West for assistance.

REL West's study of the telemedicine services found that districtwide, a little over one-quarter of students used the services at least once over two years, with nine percent of students using telemedicine multiple times. Non-communicable physical illnesses/conditions such as stomach aches, headaches, allergies, and asthma were consistently the most common reason for school-based telemedicine visits across the two years of implementation. Ninety-four percent of all telemedicine visits resulted in students returning to class and receiving, on average, three more hours of instruction in the school day. Approximately 39 percent of Black students used telemedicine services compared with 17 percent of Asian students. Due to these findings, the district decided to continue with the program. The telemedicine provider is working to identify possible reasons for the differences in use by different student groups to ensure that all students are comfortable accessing the services.

  • Even before the COVID-19 pandemic, Michigan was experiencing teacher shortages in several subjects and geographic areas. This led Michigan members of the REL Midwest Alliance for Teacher Preparation to partner with the REL to examine why certified teachers were not teaching and what incentives could motivate them to return to the classroom. The REL Midwest study found that salary and certification/recertification requirements were among the most frequent barriers to teachers entering or remaining in the teaching profession.

As a result, the Michigan Department of Education launched the “Welcome Back Proud Michigan Educator” campaign, which seeks to recruit nonteaching educators into the teacher workforce. The first wave of the campaign, which began in April 2021, recruited educators with expired teaching certificates by reducing—and in some cases eliminating—professional learning requirements for recertification. The second wave, which began in October 2021, recruited teachers who had a valid certificate but were not teaching in public schools. As of January 2022, 218 educators had been recertified or issued a teaching permit, and 27 educators were in the pipeline to reinstate their teaching credentials. Of those with valid certificates, 123 educators started in a teaching position in fall 2021 and an additional 244 educators took a non-teaching assignment, such as day-to-day substitute teaching.

Concerns about the lack of equity in educational opportunities and in disciplinary practices led stakeholders to partner with REL Appalachia and REL Northwest:

  • Throughout the country, students are often encouraged to study Algebra I in middle school so that they can take more advanced math courses in high school and can graduate with a college-ready diploma. Concerned that economically disadvantaged students and English learners might be taking Algebra I later than their peers and earning college preparatory diplomas at lower rates than other students, Virginia’s Department of Education asked REL Appalachia for assistance analyzing the state’s data. The REL researchers found that the department’s hypotheses were correct: among all 5th graders rated as “advanced proficient” on the state’s math assessment, economically disadvantaged and English learner students were less likely to take Algebra I before 9th grade and less likely to earn a college preparatory diploma. As a result of these findings, the Virginia Department of Education asked the REL to work with school districts across the state to analyze data to identify student course-taking patterns and to further examine district-level policies and practices that may be contributing to the inequitable course-taking patterns and outcomes.
  • REL Northwest undertook several projects with the Equity in School Discipline (ESD) collaborative: a cross-state collaborative of districts, state education agencies, community-based organizations, and institutions of higher education in Oregon and Washington committed to increasing equity in school climate and discipline policies and practices. ESD sought to reduce the use of exclusionary discipline practices and to eliminate disproportionately high rates of exclusion for students who identify as American Indian, Black, and Hispanic. REL Northwest and ESD district leaders in four districts partnered to co-design and pilot training materials to help school and district teams increase equity in school discipline practices. REL Northwest also produced a tool so other districts and states can identify discipline disproportionality.

REL Pacific helped to make a language program more evidence-based:

  • Recognizing the role of the school in sustaining Kosrae’s cultural and linguistic heritage and preparing students for the global world, Kosrae Department of Education (KDOE) leaders reached out to REL Pacific for support in creating a new language immersion policy and program that better supports the goal of building student proficiency in both Kosraean and English. REL Pacific supported KDOE by providing coaching on the research behind effective bilingual education models, policy implementation frameworks, and language assessments. REL Pacific and Region 18 Comprehensive Center (RC18) subsequently collaborated to provide complementary supports to ensure KDOE had increased capacity to implement its bilingual language policy in schools across the island. As REL Pacific continued support in best practices in bilingual instruction, classroom observation, and teacher professional learning, RC18 provided supports such as bilingual materials development and financing options for the new policy. KDOE began piloting the new policy in two elementary schools in the fall of 2021.

REL Central supported Nebraska by providing evidence-based resources and training to support the implementation of new legislation:

  • In 2018, the Nebraska Reading Improvement Act was passed to decrease the number of struggling readers in grade 3 across the state. The Nebraska State Board of Education and the Nebraska Department of Education (NDE) enlisted REL Central’s support in providing the state’s elementary school teachers with evidence-based practices for teaching reading. To meet this need, REL Central reviewed strategies in eight What Works Clearinghouse practice guides on reading, writing, and literacy instruction and distilled the information into summary documents aligned with the state’s initiative. Each document is featured on NDE’s NebraskaREADS website; each describes a practice guide recommendation, explains how it should be implemented, and discusses the appropriate grade level or target student population (for example, English learners). REL Central also provided trainings to support regional education service unit staff and school-based educators in reviewing, selecting, and testing evidence-based reading strategies.

Finally, through applied research studies, REL Northeast and Islands and REL Southwest helped education leaders answer important questions about whether students in certain localities had equitable access to important services. These studies informed leaders’ decisions about state programs or indicators:

  • In an effort to increase the percentage of children ready for kindergarten, Vermont passed Act 166 in 2014, which provided access to high-quality prekindergarten (pre-K) for all 3- and 4-year-olds and for 5-year-olds not yet in kindergarten. As universal pre-K began in the 2016/17 school year, officials were concerned about unequal distribution and availability of high-quality pre-K programs across the state. The Vermont Agency of Education, the Agency of Human Services’ Department for Children and Families, and Building Bright Futures (Vermont’s early childhood advisory council) participated in the Vermont Universal PreK Research Partnership with REL Northeast & Islands to answer these important questions. Through one study, the REL found that although the majority of pre-K children were enrolled in the highest quality programs, some children had less access to high-quality programs in their home districts. These findings led the Vermont legislature to maintain a provision that allows families to enroll their children in programs outside their home district.
  • Texas House Bill 3 (HB3), a comprehensive reform of the state’s school finance system passed in 2019, established a college, career, and military readiness outcomes bonus, which provides extra funding to districts for each annual graduate demonstrating college, career, or military readiness under the state accountability system. Leaders at the Texas Education Agency (TEA) were concerned that it may be hard for small and rural districts to demonstrate career readiness through the required accountability measure. Through a partnership with TEA, REL Southwest conducted a study that found that there were no substantive differences by district size or locale with respect to the percentage of students meeting the career readiness standard. Further, the study found that students who fell into two of the alternative career readiness options—CTE completers and work-based learners—had higher rates of college enrollment than graduates who met the existing career readiness accountability standard. The study also indicated that CTE completers had higher rates of either college persistence or of credential attainment after high school than graduates who met the existing career readiness accountability standard. These findings led the Commissioner of Education to recommend, and the Texas legislature to create, a new measure of career readiness in the state accountability system that met the needs of the districts across the state.

From these examples, one takeaway is clear: REL work can make a difference. RELs supported educators’ and policymakers’ efforts to improve educational programs, policies, and outcomes through the use of research and evidence-based practice between 2017 and 2022. The new RELs will continue this work and, as I will write about in a future blog, they will also undertake some new types of projects. Until then, please visit the new REL website or reach out to me at Elizabeth.Eisner@ed.gov if you have questions about the REL program and how it can help your community.

Liz Eisner is the associate commissioner of the Knowledge Use Division at the National Center for Education Evaluation and Regional Assistance.


[i] One REL contract—REL Southwest (REL SW)—is on a different schedule. The current REL SW contract ends in late November of 2022 and the next REL SW contract will begin the day after the current contract ends. The contracts that just ended were the 2017-2022 contracts and the contracts that just started are the 2022-2027 contracts.

Is believing in yourself enough? Growth mindset and social belonging interventions for postsecondary students

The WWC recently reviewed the strength of evidence for two types of interventions designed to help students succeed in college: one report focuses on growth mindset interventions and another on social belonging. The WWC found that (1) neither type of intervention had a discernible effect on full-time college enrollment, (2) social belonging interventions had mixed effects on progressing in college and academic achievement, and (3) growth mindset interventions had potentially positive effects on academic achievement. We asked Greg Walton, an Associate Professor at Stanford University, IES-funded researcher, and expert on these kinds of interventions, to discuss what college faculty, administrators, and students should make of these findings.  

Can you walk through how growth mindset interventions and social belonging interventions with postsecondary students work? Were the interventions reviewed by the WWC typical interventions in this space?

Growth mindset interventions focus on the underlying “implicit” beliefs students have about the nature of intelligence: Is intelligence fixed or can it grow? These beliefs inform how students make sense of everyday academic challenges in school. If you think that intelligence is fixed, that you either have it or you don’t, then a setback like a poor grade can seem to be evidence that you don’t have what it takes. That can make students avoid academic challenges, withdraw, and ultimately learn and achieve less. Growth mindset interventions offer students the view that intelligence can grow with effort, hard work, good strategies, and help from others. The theory is that that mindset can help students see setbacks simply as evidence that they haven’t learned the material yet, or that their strategies haven’t been successful yet, and thus to sustain their efforts. These interventions typically start by sharing information from neuroscience about how the brain grows “like a muscle” during learning, especially when students work on challenging material. Then students might read stories from older students who used a growth mindset to persist through challenges. Finally, they may be asked to describe this idea to help younger students struggling in school, a technique termed “saying-is-believing.” That makes the experience active rather than passive and positions students as benefactors rather than beneficiaries, which would be potentially stigmatizing.

Social-belonging interventions target “belonging uncertainty,” a persistent doubt students can feel about whether “people like me” can belong in a school setting. This doubt arises most strongly for people who belong to groups that have historically faced exclusion in school settings, negative stereotypes that pose them as less able and less deserving of educational opportunities, or who are underrepresented in a school context. When students experience this doubt, everyday challenges such as feeling lonely, being excluded, or getting critical feedback can seem like evidence that they don’t belong in general. Social-belonging interventions share stories from older students who describe how they worried at first about whether they belonged in a new school and how these worries dissipated with time as they developed friendships and study partners, joined student groups, and formed mentor relationships. Belonging interventions offer students the view that it’s normal to worry about belonging at first in a new school but this gets better with time. Like growth mindset interventions, belonging interventions use written exercises to give students the opportunity to reflect on the intervention message and advocate for it to younger students. The theory is that this message can help students sustain a sense of belonging and stay engaged in school even when they face challenges, and that that helps students develop friendships and mentor relationships that support higher rates of achievement.

Social-belonging interventions were designed specifically to address circumstances in which people face underrepresentation or negative stereotypes in school. Even if all students have reasons to worry whether they belong in school, only some students have reason to question whether “people like me” belong. I am a White person whose parents both graduated from college. So, when I went to college, I felt homesick but I didn’t wonder whether “people like me” could belong.

That said, belonging concerns are felt by almost everyone, and in some cases belonging interventions have produced main effects (benefits for all students) rather than interactions predicated on group identity (e.g., Borman et al., 2019 for evidence from students in grade 6). However, most trials find greater benefits for students who face underrepresentation or negative stereotypes in specific settings. One study found that women in more gender-diverse engineering majors (averaging 33% women) showed no achievement gap with men in the first year and no benefit from a belonging intervention. But women in male-dominated majors (averaging 10% women) showed a large achievement gap in first year performance, but that gap was closed by the intervention (Walton et al., 2015; see also Binning et al., 2020) [Editor’s note: These two latter studies did not meet WWC standards for internal validity. Although this suggests caution in drawing conclusions from the studies, failing to meet WWC standards does not imply that an intervention is ineffective.]

Taken together, a fixed-mindset of intelligence and belonging uncertainty can be like a toxic tornado for students, swirling into each other and creating cascading self-doubt. I’m describing these interventions separately because they grew up independently in the literature, and the WWC’s two reports look at each separately. But for students, they are often experienced together.

It’s also important to state that, although the interventions reviewed by the WWC are typical of those conducted with postsecondary students, these are highly active areas with new trials reported regularly. Studies have explored new populations and college contexts (e.g., Murphy et al., 2020) and are increasingly focused on identifying boundary conditions that determine where we should and should not predict effects (see Bryan, Tipton, & Yeager, 2020). It is also noteworthy how few studies have examined the critical question of progress in college (3 in each report). We need much more research here, exploring effectiveness, implementation strategies, and boundary conditions. Further, research is increasingly complementing direct-to-student interventions by exploring how we can support practices in school that support growth mindset and belonging (Murphy et al., 2021). For example, recent research shows that highlighting pro-diversity peer norms—namely that most students endorse diversity—can facilitate more inclusive treatment among college students and, in turn, reduce achievement gaps between privileged and marginalized students (Murrar et al., 2020).

What are the key components that are needed for a social belonging or growth mindset intervention to have a good chance of working? What elements need to be in place to help students improve academically or to stay enrolled in college?

I would distinguish two layers of this question.

One layer is what it takes for a discrete exercise focused on belonging or growth mindset—such as those in the trials reviewed by the WWC—to help students. In general, we should consider what, how, when, and where.

What is it you want to offer students? It should give students an authentic and adaptive way to make sense of common challenges they face, a way of thinking they can use to achieve their goals in college. Simple exhortations such as, “I know you can do it” or “You belong!” do not effectively impart a growth mindset or a sense of belonging, as Carol Dweck and I have written. Instead, it is useful to use high-quality materials developed and validated in research. Examples of materials available online are here and here.

How will you convey this? The goal of these interventions is to address foundational beliefs students have about school, such as “Can I do it?” and “Can people like me belong here?” It’s not to do something else, like to build a skill. That means the experience need not take long—typically, interventions last 30-60 minutes—but it should be immersive and interactive. You want students to deeply reflect on the ideas you present and connect these ideas to their lived experience.

That said, the more you can implement approaches that are scalable within an institutional context, the more students you can potentially help. That’s one reason recent trials that reach large samples have focused on online modules (e.g., LaCosse et al., 2020; Yeager, Walton, & Brady et al., 2016). Students can log on individually and complete materials at near-zero marginal cost. However, these approaches also have challenges, as online modules may not be as engrossing as in-person experiences. As we have moved from delivering these interventions in one-on-one, in-person experiences to larger studies with materials delivered online, we have found that students spend less time on the same materials and write less in response to prompts. Another alternative is having students meet in person in groups to participate in these interventions or discuss their content (see Binning et al., 2020; Murphy et al., 2020), but that may be more difficult to implement on a large scale. So, there can be trade-offs between reaching scale and creating deep and impactful experiences.

When should you do this? In general, it is valuable if an intervention happens earlier rather than later, so it can alter trajectories going forward. However, it may be optimal to deliver interventions soon after students have encountered some challenges, but before they have taken steps in response to those challenges that are hard to reverse (e.g., dropping out). In general, social-psychological interventions are more sensitive to timing than to dosage. Growth mindset and belonging interventions have been delivered from the summer before college (Yeager, Walton, Brady, et al., 2016), to the first academic term (Walton et al., 2015), to the second (Walton & Cohen, 2011).

Where should you deliver interventions? This brings us to the second layer. So far, I’ve addressed the first layer, where you are focused on a discrete experience or set of experiences. The second layer is that growth mindset and belonging interventions will be most effective in contexts in which (1) the message offered is legitimate and authentic (locally true) and (2) students have real opportunities to get academic support and to develop a sense of belonging. In the end, to produce the most robust change, we must create cultures in schools in which adaptive ideas about ability and belonging are normal and reinforced. There are many ways that institutions signal to students, even inadvertently, messages about the nature of intelligence and who belongs. In welcoming a new class to campus, do we extol the past achievements of a few, which may only heighten imposter syndrome among everyone else? Can we instead talk about what students can do in the future and who they can become? In welcoming students to class, do faculty communicate that they expect to weed out large numbers of students? Or do they design assignments and evaluations to support students’ learning and growth (Canning et al., 2019)? Another question involves how well colleges foster opportunities for students to develop in-group pride and identity. Tiffany Brannon at UCLA finds that African American students do better in college when they have more opportunities to participate in events that celebrate and explore Black culture (Brannon & Lin, 2021). Some resources to help researchers and practitioners create cultures of growth and belonging for all students are available at the Student Experience Project, co-led by the College Transition Collaborative (https://collegetransitioncollaborative.org/student-experience/).

Recently, you and your colleagues have distinguished between people with different characteristics - and environments with different characteristics. You’ve argued that researchers should be looking more closely at the contexts, or what you’ve called “psychological affordances” in which these interventions might have different effects. Why is this work important? Why should educators be paying attention?

Social-psychological interventions operate within complex systems. Those systems invariably determine the specific effect any intervention has. To understand this, my colleagues and I have found it useful to consider the affordances of a school context: What does a context make possible (Walton & Yeager, 2020)? For instance, no psychological intervention will help English-language speakers learn Chinese if they aren’t receiving instruction in Chinese.

We distinguish two kinds of affordances. One is structural: What is it that different institutions make possible for students to do? As an example, in a forthcoming study, Shannon Brady, Parker Goyer, David Yeager, and I tracked college outcomes of students randomly assigned to a social belonging intervention or a control condition at the end of high school. The intervention raised the rate of bachelor’s degree completion for students who first enrolled in more selective 4-year institutions from 26% to 43%. These are institutions that tend to have higher retention and graduation rates and tend to spend more per student on instruction and student services than less selective 4-year institutions. They thus afford higher 4-year completion rates. At the same time, the same belonging intervention had no effect on bachelor’s degree completion rates for students who first enrolled in less selective 4-year institutions.

The second kind of affordance is psychological: What is it that students can believe in a school context? Is the cultural context in which an intervention is delivered one in which the way of thinking offered by the intervention can take hold and thrive? Or is it one that makes that way of thinking illegitimate, inauthentic, or not useful? A large-scale social-belonging intervention delivered online to students in 21 diverse colleges and universities increased first-year full-time completion rates for students from historically underperforming groups, but only in colleges that afforded, or fostered, a sense of belonging to members of those groups. Let’s break this down: In some college contexts, students from historically underperforming groups (who were not exposed to the intervention) realized a high sense of belonging by the end of the first year. Here the belonging message was “locally true” (true here, for people like me). Although we don’t know exactly why this was the case, presumably in these schools students from the given group had more opportunities to develop friendships, to join student groups, and to form meaningful relationships with instructors. In other colleges, students did not attain this high sense of belonging by the end of the first year. Only in the first case did the belonging intervention raise first-year completion rates (Walton, Murphy et al., in prep; described in Walton & Yeager, 2020).

In both cases, the belonging intervention helped students take advantage of opportunities available to them, whether to graduate or to belong. An important implication is that it may be necessary to address both students’ beliefs and whether contexts support more positive beliefs. That’s helpful, because it gives us a precise way to think about how to make contexts more supportive: To what extent do they make adaptive beliefs about intelligence and belonging legitimate and authentic and, if they do not, what can we do about this?

It sounds like you’re saying postsecondary leaders who want to foster greater student success and reduce gaps in retention and academic performance may want to consider these kinds of interventions, in part because they are relatively inexpensive to deliver to large numbers of students. But they should also consider how hospitable their campus is to students who might initially struggle in college.

For example, to reinforce a growth mindset, universities need to make academic support resources truly accessible; to reinforce a sense of belonging, universities might look for multiple ways to communicate that successful students from all kinds of backgrounds have initially experienced self-doubt, and that feeling like you don’t belong is a fairly normal and temporary part of adjusting to college.

That’s right. Growth mindset and belonging are about both student beliefs or ways of thinking and institutional practices—either alone may not be enough. So, to support a growth mindset, institutions should both (1) convey that all students can learn and grow with effort, good strategies, and support from others and (2) back that up by creating learning environments designed to support growth, including adequate academic supports and classes that focus on fostering growth rather than identifying who is allegedly smart and who is not. To support belonging, institutions should (1) acknowledge that nearly all new college students worry at first about whether they belong and that this is normal and improves with time, and (2) create classroom and out-of-classroom environments in which all of the diverse students we serve can develop strong friendships and mentoring relationships and find communities in which they belong.

Thanks very much, Greg.

Read the WWC’s summary of evidence for these interventions in the Growth Mindset Intervention Report and the Social Belonging Intervention Report. Find related resources at The College Transition Collaborative (https://collegetransitioncollaborative.org/) or the Project for Education Research That Scales (https://www.perts.net/).

Carter Epstein, Senior Associate at Abt Associates, produced this blog with Greg Walton, Associate Professor of Psychology at Stanford University.

Note: The discussion above reflects the opinions of Greg Walton and does not necessarily reflect the opinions of the Institute of Education Sciences or the What Works Clearinghouse. Some of the studies cited above have not been reviewed by the What Works Clearinghouse.

REFERENCES

Binning, K. R., Kaufmann, N., McGreevy, E. M., Fotuhi, O., Chen, S., Marshman, E., Kalender, Z. Y., Limeri, L., Betancur, L., & Singh, C. (2020). Changing social contexts to foster equity in college science courses: An ecological-belonging intervention. Psychological Science, 31, 1059-1070. https://doi.org/10.1177/0956797620929984

Borman, G. D., Rozek, C. S., Pyne, J., & Hanselman, P. (2019). Reappraising academic and social adversity improves middle school students’ academic achievement, behavior, and well-being. Proceedings of the National Academy of Sciences, 116(33), 16286-16291. https://doi.org/10.1073/pnas.1820317116

Brady, S. T., Walton, G. M., Goyer, J. P., & Yeager, D. S. (in prep). Where does a brief belonging intervention increase the attainment of a college degree? The role of institutional affordances. Manuscript in preparation.

Bryan, C. J., Tipton, E., & Yeager, D. S. (2021). Behavioural science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour, 5(8), 980–989. https://doi.org/10.1038/s41562-021-01143-3

Bryk, A. S., Grunow, A., Gomez, L. M., & LeMahieu, P. G. (2015). Learning to improve: How America’s schools can get better at getting better. Harvard Education Press.

Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2). https://www.science.org/doi/10.1126/sciadv.aau4734

Dweck, C. (2016, January 11). Recognizing and overcoming false growth mindset. Edutopia. https://www.edutopia.org/blog/recognizing-overcoming-false-growth-mindset-carol-dweck

Murphy, M. C., Fryberg, S. A., Brady, L. M., Canning, E. A., & Hecht, C. A. (2021, August 25). Global Mindset Initiative Paper 1: Growth mindset cultures and teacher practices. https://ssrn.com/abstract=3911594

Murrar, S., Campbell, M. R., & Brauer, M. (2020). Exposure to peers’ pro-diversity attitudes increases inclusion and reduces the achievement gap. Nature Human Behaviour, 4, 889–897. https://doi.org/10.1038/s41562-020-0899-5

Walton, G. M. (2021, November 9). Stop telling students, “You belong!” Three ways to make a sense of belonging real and valuable. Education Week. https://www.edweek.org/leadership/opinion-stop-telling-students-you-belong/2021/11

Walton, G. M., Logel, C., Peach, J. M., Spencer, S. J., & Zanna, M. P. (2015). Two brief interventions to mitigate a “chilly climate” transform women’s experience, relationships, and achievement in engineering. Journal of Educational Psychology, 107(2), 468–485. https://eric.ed.gov/?id=EJ1061905

Walton, G. M., Murphy, M. C., Logel, C., Yeager, D. S., Goyer, J. P., Brady, S. T., . . . Krol, N. (in preparation). Where and with whom does a brief social-belonging intervention raise college achievement? Manuscript in preparation.

Walton, G. M., & Yeager, D. S. (2020). Seed and soil: Psychological affordances in contexts help to explain where wise interventions succeed or fail. Current Directions in Psychological Science, 29, 219-226. http://gregorywalton-stanford.weebly.com/uploads/4/9/4/4/49448111/waltonyeager_2020.pdf

Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D., Keane, L., Kamentz, D., Ritter, G., Duckworth, A. L., Urstein, R., Gomez, E. M., Markus, H. R., Cohen, G. L., & Dweck, C. S. (2016). Teaching a lay theory before college narrows achievement gaps at scale. Proceedings of the National Academy of Sciences of the United States of America, 113(24), E3341-E3348. https://doi.org/10.1073/pnas.1524360113

Exploring the Growing Impact of Career Pathways

Career pathways programs for workforce development are spreading across the country at both the secondary and postsecondary levels. Based on a synthesis of studies examining career pathways programs that integrate postsecondary career-technical education (CTE), the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide presents five recommendations for implementing evidence-based practices:

  1. Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment.
  2. Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways.
  3. Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways.
  4. Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways.
  5. Develop and continuously leverage partnerships to prepare students and advance their labor market success.

Led by the WWC’s postsecondary contractor, Abt Associates, this practice guide was created by an expert panel of researchers and practitioners to provide examples of career pathways strategies and components and guidance to implement them; advise on strategies to overcome potential obstacles; and summarize evidence associated with rigorous research studies that met WWC standards.

As a long-time researcher of postsecondary CTE and many other important aspects of community college education, I welcome the opportunity to reflect on these five recommendations. I hope that my blog will help readers understand how this new practice guide fits into a larger landscape of research focusing on programs, policies, and practices aligned with the career pathways framework. Far from new, the notion of career pathways goes back several decades; thus, it is not surprising that we see an evolution in research to measure students’ education and employment outcomes. And still, there is a need for more rigorous studies of career pathways.

The Abt team located about 16,000 studies that were potentially relevant to the practice guide. Those studies used a wide variety of methods, data (quantitative and qualitative), and analysis procedures. However, only 61 of them were eligible for review against the WWC standards, and only 21 of those met the standards. Interestingly, most of those 21 studies focused on non-degree postsecondary credentials, rather than on college degrees, with policies and programs associated with workforce development and adult education well represented. Thus, lessons from the practice guide speak more directly to career pathways programs that culminate in credentials below the associate degree level than to programs leading to the associate or baccalaureate degree level.

This dearth of rigorous career pathways research is problematic, as educational institutions of all types, including community colleges, seek to deliver positive, equitable outcomes to students during and beyond the COVID-19 pandemic.

Focus on Career Pathways

After we examined the evidence from the studies that met the WWC standards, it was clear that the evidence converged around career pathways programs following requirements in the Strengthening Career and Technical Education for the 21st Century Act and the Workforce Innovation and Opportunity Act (WIOA). In alignment with the WIOA definition of career pathways, the set of studies in the practice guide examines a “combination of rigorous and high-quality education, training, and other services” that align with the skill needs of industries in the region or state and accelerate participants’ educational and career advancement, to the extent practicable.

As defined by WIOA, career pathways support learners in pursuing their education and career goals, lead to at least one postsecondary credential, and provide entry or advancement in a particular occupation or occupational cluster. Because a growing number of community colleges employ a career pathways approach, as advocated by the federal legislation, it made sense to focus the practice guide on rigorous results and evidence-based recommendations that may help to move career pathway design and implementation forward.

The Five Recommendations

Recommendation 1: Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment. Our panel advocated for the intentional design and structure of career pathways for good reason. Whereas all educational institutions enroll students in courses and programs, career pathways prioritize the student’s entire educational experience, from access and entry, to completion and credentialing, and on to employment and career advancement. This purposeful approach to supporting student attainment is theorized to lead to positive student outcomes.

Applying the meta-analysis process required by the WWC, we determined from the 21 studies whether career pathways were achieving this crucial goal. Nine of the studies showed overall statistically significant, positive results on industry-recognized credential attainment. Of the 12 studies supporting this recommendation, most measured non-degree credentials; only two measured degree attainment—an important point to recognize, because these are the studies that have been conducted thus far.
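
For readers curious about the mechanics of that pooling step, here is a minimal sketch in Python. The effect sizes and standard errors are invented for illustration, and the generic fixed-effect, inverse-variance pooling shown is only a stand-in for the WWC’s actual meta-analytic procedure, not a reproduction of it.

    # Pool study effect sizes with inverse-variance weights (illustrative only).
    import math

    # (effect size, standard error) for four hypothetical studies
    studies = [(0.25, 0.10), (0.15, 0.08), (0.40, 0.12), (0.05, 0.09)]

    # Weight each study by the inverse of its sampling variance.
    weights = [1 / se ** 2 for _, se in studies]
    pooled = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    z = pooled / pooled_se  # test of the pooled effect against zero

    print(f"pooled effect = {pooled:.3f}, SE = {pooled_se:.3f}, z = {z:.2f}")

A pooled estimate with |z| above roughly 1.96 would be statistically significant at the conventional .05 level; anything smaller falls into the “indeterminate” category discussed below.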

This very small number of rigorous studies measuring degree attainment leaves open the question of whether career pathways increase postsecondary degree attainment—specifically the predominant credential in the community college context, the associate degree—and calls for greater investment in research on student completion of associate degrees (as well as baccalaureate degrees, a growing phenomenon in the United States).

Recommendation 2: Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways. Studies that met WWC standards showed a positive impact of career pathways on college credit accumulation and industry-recognized credential attainment. Only one study measured postsecondary degree attainment relative to contextualized and basic skills instruction, and it reported statistically significant and negative results. However, descriptive and correlational studies suggest that contextualized and basic skills instruction contribute to positive educational outcomes for students enrolled in Adult Basic Education in addition to postsecondary CTE and workforce training.

The fact that results of rigorous research complement descriptive studies, some of which provide rich details on program implementation, is useful for scaling up community college career pathways. Having said this, we still need to know more about how contextualized basic skills instruction—and other applied instructional interventions—affect the outcomes of students, especially those from racially minoritized groups, those with low incomes, and those who are the first generation in their families to attend college, all groups purported to be well served by the career pathways approach.

Recommendation 3: Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways. Studies supporting this recommendation focused on five education outcomes: industry-recognized credential attainment, academic performance, technical skill proficiency, credit accumulation, and postsecondary degree attainment. As seen with the previous two recommendations, results on industry-recognized credential attainment were statistically significant and positive. Results on academic performance, technical skill proficiency, and credit accumulation were indeterminate, meaning findings could be positive or negative but were not statistically significant.

What is important to reiterate here is that nearly all the studies that met the WWC standards focused on non-degree credentials, providing limited information about results on the education outcome of postsecondary degree attainment. To be clear, our panel is not saying career pathways should focus exclusively on non-degree credentials; rather, results on postsecondary degree attainment are not definitive. Even so, it is important to know now, as the country deals with the pandemic, that findings linking flexible scheduling to non-degree credential attainment are positive.

Community colleges nationwide are rethinking instructional delivery to better meet students’ dire health, family, and employment needs. Rigorous research on career pathways interventions, such as flexible delivery, is needed, particularly studies involving diverse student populations. In times of economic and social struggle, it is essential that community college career pathways produce the equitable outcomes they purport to provide.

Recommendation 4: Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways. The rigorous studies meeting WWC standards and measuring outcomes relative to comprehensive student supports focused on the education outcome domain only. As with the previous recommendation on flexible scheduling, findings on industry-recognized credential attainment were statistically significant and positive. Unlike with flexible scheduling, however, findings on credit accumulation were also statistically significant and positive, reinforcing findings generated by other studies showing that holistic supports improve student outcomes. For example, a meta-analysis of studies of the Trade Adjustment Assistance Community College and Career Training grants that used rigorous evaluation designs reported favorable results for holistic supports in counseling and advising, case management, and various other support services and educational outcomes.

Consistent with the recommendations in this practice guide, a growing body of evidence favors integrating comprehensive student supports with career pathways. These supports are intended to meet the needs of the diverse population of students who attend community colleges; so, they should demonstrate equitable results on educational outcomes. More rigorous research is needed to measure whether and how career pathways provide access, opportunity, and outcomes for racially minoritized, low-income, and other underserved student groups. These studies should ascertain the impact of student supports on both education and employment outcomes, recognizing that students seek a high-quality credential and a good job that offers economic security and career mobility.

Recommendation 5: Develop and continuously leverage partnerships to prepare students and advance their labor market success. This recommendation specifically emphasizes labor market success, based on studies that examine labor market outcomes only. Supporting this recommendation were findings from studies of four labor market outcomes: short-term employment, short-term earnings, medium-term employment, and medium-term earnings. (The studies did not include long-term findings.)

Overall, statistically significant and positive outcomes were found in the meta-analysis for short-term employment, short-term earnings, and medium-term earnings. However, for medium-term employment, the meta-analysis results were indeterminate. To clarify, this does not mean employment-focused partnerships do not lead to labor market success; instead, it points to a dearth of research that tracks students through training and into employment long enough to measure long-term outcomes.

Even so, these initial findings from the meta-analysis are promising and suggest that developing and leveraging such partnerships may help move the needle on short- and medium-term employment outcomes. Longitudinal research that tracks students for periods sufficient to know whether long-term employment and earnings are affected should be a priority in the future.

Moving Forward

As I reflect on the research that I have conducted on career pathways over the years, I am gratified to see mounting evidence of positive student outcomes. As a first-generation college student myself, it has always made sense to me to demystify the college education process. Helping learners understand the entire educational journey, from start to finish, is bound to help them see how what they are learning may contribute to future education and career choices. I went to college not knowing what it would be like or whether I would be able to succeed, and I benefited from faculty and advisors who helped me see how my future could progress.

For other students like me who enter college without the benefit of family members sharing their stories of college-going, and for those who have to balance school with work and family care-taking responsibilities, it is important to know how a college education, including postsecondary CTE, can lead to positive educational and employment outcomes. Student groups underserved by postsecondary education deserve our most resolute and far-reaching efforts.

To this end, additional rigorous evidence on the impact of postsecondary CTE on college degree attainment could help to inform career pathways design, funding, and implementation. Also, as I reflected on the five recommendations, I was struck by the modest amount of research on medium-term labor market outcomes and the lack of any studies of long-term labor market outcomes. When the focus of career pathways is creating a path to living-wage employment and career advancement over the long term, it isn’t enough to know that students’ immediate employment outcomes improved. And because many students attending community colleges are already working, it isn’t even clear what “immediate employment” means.

If the outcome of interest for the majority of community college students who are adults and working is whether they get a better job and higher salary than they were getting pre-education, more nuanced measures and longer follow-up periods are needed than those provided by any of the research reviewed for this practice guide. It seems to me that finding more evidence of medium- and long-term outcomes could also provide more useful evidence of how career pathways work for diverse learner groups who are under-studied at the present time.

I was honored to help develop the practice guide with Hope Cotner, Grant Goold, Eric Heiser, Darlene Miller, and Michelle Van Noy. What an enormously gratifying experience it was to work with these professionals, the WWC team at Abt, and the Institute of Education Sciences staff. Working on this practice guide has left me feeling more optimistic about what we could learn with a more sizeable federal investment in research on postsecondary CTE in general, and on career pathways specifically. Rigorous evidence is needed to test models, explore interventions, and understand results for the plethora of learner groups who attend community colleges.

As the nation struggles to pull out of the pandemic that continues to rage in pockets across the country, it is the right time to invest in research that helps prepare students for good jobs that advance living-wage careers over a lifetime. A true commitment to equity in CTE programming is necessary for the nation, and now is the time to invest.

_____________________________________________________________________________________________________________

Debra D. Bragg, PhD, is president of Bragg & Associates, Inc., and the founder of research centers focusing on community college education at the University of Illinois at Urbana-Champaign and the University of Washington. She spent the first 15 years of her career in academe studying postsecondary CTE for federally funded research centers, having devoted her entire research agenda to improving education- and employment-focused policies, programs, and practices to create more equitable outcomes for community college students. She served as an expert panelist for the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide.

An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning—or have already started to put in place—your summer learning programs. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place just to make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a thorough answer will ask “so, what did we accomplish last summer?” (Depending upon who they are, and what they care about, that question can vary. Twists can include “what did students learn” or business officers’ pragmatic “what bang did we get for that buck.”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example:
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether one or more populations are disproportionately underrepresented in your program and the implications for the work both now and next summer.
    • How strong was attendance? Prior research has suggested that students benefit the most from summer programs when they are “high attenders.” (Twenty or more days out of programs’ typical 25 to 30 total days.) Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24 percent, 25 to 49 percent, 50 to 74 percent, or 75 percent or more of program days? (A minimal sketch of these calculations appears after this list.)
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might be hoping that more summer learning participants get a score of “on-grade level” at the outset of the school year than their non-participating peers, promising evidence that the program may have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what was learned as a result of your analyses can be used to strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
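
To make steps 5 and 6 concrete, here is a minimal sketch in Python (using the pandas library) of the kind of attendance and outcome analysis described above. Every file name, column name, and cut point is an invented placeholder; your student information system’s export will look different.

    import pandas as pd

    # Step 5's log: one row per student per day attended (columns: student_id, date).
    attendance = pd.read_csv("summer_attendance.csv")
    # District roster with subgroup and outcome data
    # (columns: student_id, subgroup, on_grade_level).
    roster = pd.read_csv("roster.csv")

    PROGRAM_DAYS = 25  # total days the program ran

    # Days attended per participant, merged onto the full roster.
    days = (
        attendance.groupby("student_id")["date"]
        .nunique()
        .rename("days_attended")
        .reset_index()
    )
    students = roster.merge(days, on="student_id", how="left")
    students["days_attended"] = students["days_attended"].fillna(0)
    students["participant"] = students["days_attended"] > 0

    # Attendance intensity buckets from step 6.
    students["intensity"] = pd.cut(
        students["days_attended"] / PROGRAM_DAYS,
        bins=[0, 0.25, 0.50, 0.75, 1.01],
        right=False,
        labels=["0-24%", "25-49%", "50-74%", "75%+"],
    )

    # Who did (and did not) participate, by subgroup, against the full roster.
    print(pd.crosstab(students["subgroup"], students["participant"], normalize="index"))

    # How strong was attendance among participants?
    print(students.loc[students["participant"], "intensity"].value_counts(normalize=True))

    # How did the outcome vary between participants and non-participants, by subgroup?
    print(students.groupby(["participant", "subgroup"])["on_grade_level"].mean())

The three printed summaries mirror the first three bullets under step 6: participation by subgroup, attendance intensity, and a participant versus non-participant outcome comparison.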

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

Teachers Should Not Be Left Wondering What Works

The past two school years have posed many new and unexpected challenges for students and teachers. One thing that has not changed much is that educators continue to need quick access to evidence on strategies that can best support students. The What Works Clearinghouse (WWC), an initiative of the U.S. Department of Education’s Institute of Education Sciences, aims to meet these needs with ready-to-use practices supported by evidence. The WWC Practice Guides describe these practices and how to implement them, most recently in the new guide for assisting students struggling in mathematics. These Practice Guides contain the classroom strategies and tips that are most likely to help improve student outcomes.

More than two dozen free Practice Guides address challenges educators face in teaching math, reading, and writing; supporting positive student behavior; and preventing dropout. The recommendations in Practice Guides are based on evidence from well-designed and well-implemented studies, the experiences of practitioners, and the expert opinions of a panel of nationally recognized experts.

Ann Jolly, an instructional program manager at the Charlotte-Mecklenburg Schools’ Program for Exceptional Children, has used WWC Practice Guides for years. She describes her experiences using the WWC resources below. Her experiences may help teachers or instructional leaders understand how to better incorporate evidence-based practices into their own practice.


The COVID-19 pandemic has us all wondering where the time goes. We want to use the most promising evidence-based practices to support our students. However, as expressed by one teacher who understands how easy it is to forget about trying out something new in the face of day-to-day demands, “Yeah, you just get busy teaching…”

Whether you are a new teacher trying to figure out how to balance teaching, lesson planning, grading, and other duties, or a veteran who is “busy teaching,” you should check out the WWC. The WWC, created by the U.S. Department of Education, is an easy-to-navigate website with valuable resources. I know that, as teachers, we are constantly seeking out resources that will enable us to provide the best instruction to our students. The WWC can help by searching for research, reviewing studies for quality, and summarizing findings, so that busy teachers like us can focus on our students! Here’s a quick look at some of the WWC resources I have used to make a difference in my school and district as an instructional leader collaborating with teachers and families.

When I needed help boosting reading comprehension among my special education students, I used the WWC Practice Guide Improving Reading Comprehension in Kindergarten Through 3rd Grade. This guide provided me with recommendations of practices and other relevant information that the WWC gathered to support classroom instruction. For example, I was able to quickly see that teaching students how to use reading comprehension strategies had the strongest evidence, so I knew to focus on that. The guide gave me easy-to-understand resources about how to bring the strategies into my classroom, plus videos and reference tools with examples. These were easy to digest and I was able to immediately implement the recommendations in my classroom.

When I needed strategies to support literacy at home and in school, I used the WWC Practice Guide Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade and its supplemental resources. Not only does the guide include a wealth of information for teachers, but companion documents include a summary of recommendations, a Professional Learning Communities Facilitator’s Guide, and Tips for Supporting Reading Skills at Home. I used the last tool to develop a presentation for parents. Parents took notes and asked questions as they made connections between the guide and the practices they could use at home with their children. Finding opportunities like this one to build relationships between teachers and parents may be even more important now, during a pandemic, than it was when I held this workshop. 

When my school was looking for strategies to improve student behavior, I facilitated a book club with school staff using the WWC Practice Guide Reducing Behavior Problems in the Elementary School Classroom. I began the club after noticing that other teachers were coming to me for suggestions about a common pattern of behaviors interfering with student learning. This WWC guide offered several strategies to share. Although we started by discussing a specific behavioral issue and a recommended practice to address it, we eventually worked through the whole guide, chapter by chapter. The WWC Practice Guide gave us a free resource with powerful evidence-based strategies and practices for us to try. Teachers across grade levels and content areas actively collaborated through the book club and were able to build a common language and understanding about schoolwide practices. One of the great embedded features in WWC Practice Guides is the “Obstacles” or “Roadblocks.” This feature acknowledges perceived and actual barriers to implementing evidence-based practices and suggests solutions to overcome them!

The WWC has created a wide range of other Practice Guides, covering students from early childhood through high school graduation (and beyond). The most recent products include Assisting Students Struggling with Mathematics: Intervention in the Elementary Grades, a Practice Guide for educators in grades K to 6 that provides ready-to-use strategies for assisting struggling students. Some of my colleagues have used the guides on Teaching Secondary Students to Write Effectively, Teaching Math to Young Children, and Using Student Achievement Data to Support Instructional Decision Making. So many more Practice Guides are available!

I also encourage you to sign up now for the WWC News Flash and add the WWC to your social media network on Twitter, Facebook, or YouTube to easily keep up with the most current information. Research evidence on “what works” in education is there just for you. When you have a question, rely on the WWC…and don’t be left wondering what works!

This blog was written by Ann C. Jolly, Instructional Program Manager, Programs for Exceptional Children at Charlotte-Mecklenburg Schools, with Dana Rotz, Mathematica.

When “More Research is Needed” Is the Key Finding: Improving the Evidence Base for Nonacademic Interventions for Postsecondary Success in Rural and High-Poverty Contexts

Stakeholders in rural and high-poverty districts in Regional Educational Laboratory (REL) Appalachia’s region have noticed a troubling trend: many students graduate from high school academically well prepared but fail to enroll in college or enroll in college only to struggle and drop out within the first year. Stakeholders believe these high-performing students may face nonacademic challenges to postsecondary success, such as completing financial aid paperwork, securing transportation and housing at colleges far from home, or adjusting to campus life. To address these challenges, education leaders are looking for interventions that address nonacademic competencies: the knowledge, skills, and behaviors that enable students to navigate the social, cultural, and other implicit demands of postsecondary study.

To fill this need, REL Appalachia researchers conducted a review of the existing evidence of the impact of nonacademic interventions – that is, those designed to address nonacademic competencies – on postsecondary enrollment, persistence, and completion. The review had a particular focus on identifying interventions that also have evidence of effectiveness in communities serving students similar to those in Appalachia—high-poverty, rural students. Only one intervention, Upward Bound, demonstrated impact in rural, high-poverty communities. The review showed that Upward Bound, as implemented in the early 1990s, benefited high-poverty rural students’ college enrollment, with no demonstrated impact on persistence or completion.

Schools and communities need access to nonacademic interventions that benefit students served in high-poverty rural communities. Researchers: read on to learn more about the methods used in the evidence review, its findings, and steps you can take to support rural and high-poverty communities in improving enrollment and success in postsecondary education!

Nonacademic challenges to postsecondary success for rural students

All students face nonacademic challenges to postsecondary success, but rural populations and high-poverty populations in particular may benefit from interventions addressing those challenges because they enroll in and complete college at significantly lower rates than their nonrural or low-poverty peers. Although academic challenges contribute to this gap, rural and high-poverty populations also face unique nonacademic challenges to postsecondary enrollment and success. For example, rural students are less likely to encounter college-educated role models and high-poverty students often face inadequate college counseling at their schools (see research here, here, and here). As a result, rural and high-poverty students may have inadequate access to knowledgeable adults who can help them understand the steps needed to enroll or prepare them for the challenges of persisting in postsecondary education.  Nonacademic interventions can support students in developing the knowledge, skills, and behaviors necessary to overcome these challenges and improve postsecondary enrollment and success for rural and high-poverty students.

The need for evidence-based interventions

To support decisionmakers at rural and high-poverty schools in identifying evidence-based nonacademic interventions, researchers at REL Appalachia conducted an extensive search of the published research. The search looked for rigorous studies of nonacademic interventions with evidence of positive impact on college enrollment, persistence, performance, and completion for students attending rural schools or identified as high poverty. The purpose of the project was to identify a suite of interventions to recommend to these education leaders.

The results of our review indicate there may be gaps in the evidence available to all decisionmakers who are trying to help their students succeed in postsecondary education. The search first identified any studies that focused on postsecondary outcomes of nonacademic interventions serving students ages 5–19. Of the 1,777 studies with the relevant keywords, only 65 focused on the postsecondary outcomes of nonacademic interventions. Next, we evaluated these 65 studies against the What Works Clearinghouse (WWC) design standards, which assess the quality of evaluation study designs. Only 17 studies met WWC’s rigorous study design standards with or without reservations. Finally, researchers from REL Appalachia identified studies that showed positive impacts on students overall, and studies that looked at rural students and students identified as high poverty in particular. Only eight studies showed positive, statistically significant impacts on students’ postsecondary enrollment or success overall. Of the eight studies that showed positive impacts of nonacademic interventions on postsecondary outcomes, only three focused on high-poverty populations, and only one reported specifically on rural populations.

[Figure: Number of studies remaining at each stage of screening. The original searches returned 1,777 unique studies; 65 focused on postsecondary outcomes of nonacademic interventions with students ages 5 to 19; 17 of those also met WWC standards; and 8 met all criteria and showed a positive effect on postsecondary outcomes.]

Without additional research that focuses on low-income and rural contexts, schools and districts are left to implement programs with limited or no evidence of effectiveness. For example, the Quantum Opportunity Program (QOP) provides mentors to students as part of a long-term after-school program. However, WWC reviews of QOP studies (here and here) showed indeterminate effects of the program on postsecondary outcomes. The lack of evidence should not detract from the important role QOP has in serving students, but it leaves open the question of whether those efforts are having the intended effects. With few clear alternatives, schools and districts continue to implement programs with limited evidence of effectiveness.

Action steps

Nationwide, 19 percent of U.S. public school students are enrolled in a rural school, and 24 percent are enrolled in a high-poverty school. To help districts and schools provide effective supports to those students, researchers can provide high-quality evidence on the effectiveness of nonacademic interventions in these contexts.

Carry out more studies on specific interventions designed to improve nonacademic competencies. REL Appalachia’s review found that the research on nonacademic competencies often focuses on defining the competencies themselves, rather than on studying interventions designed to develop the competencies. Of the 1,777 unique studies identified in our review, only 65 (3 percent) studied outcomes of interventions designed to improve nonacademic competencies. From these, we identified only 17 studies, representing nine interventions, with sufficiently rigorous designs to examine evidence of effectiveness.

The limited availability of rigorous evaluations of interventions suggests that, as researchers, we need to increase our focus on evaluating new interventions as they are developed or tested. Decisionmakers rarely design their own programs or interventions from scratch; they need to be able to identify existing programs and policies that are within their power to implement and have been proven effective in similar communities. Researchers can help decisionmakers select and implement successful interventions by providing evidence on whether interventions that develop students’ nonacademic competencies have positive effects on students’ postsecondary outcomes.

Design studies to generalize to rural and high-poverty populations. As researchers, we can also increase our focus on rural and high-poverty populations. REL Appalachia’s review found only three studies that focused on a high-poverty population and one that focused on a rural population. As researchers, we can address this gap in two ways: (a) we can carry out more studies specifically focused on rural and high-poverty areas; and (b) when using large national datasets or multi-site studies, we can consider rural and high-poverty populations in our sampling and disaggregate our results for these populations.
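
As one concrete illustration of point (b), the sketch below disaggregates a simple impact estimate from a hypothetical evaluation file. All column names are invented, and the unadjusted difference in means is only a placeholder for whatever estimator a given study design actually calls for.

    import pandas as pd

    # Hypothetical analysis file; assumed columns: treated (0/1),
    # enrolled_college (0/1), rural (0/1), high_poverty (0/1).
    df = pd.read_csv("study_sample.csv")

    def impact(d: pd.DataFrame) -> float:
        """Unadjusted treatment-control difference in college enrollment rates."""
        means = d.groupby("treated")["enrolled_college"].mean()
        return means.get(1, float("nan")) - means.get(0, float("nan"))

    # Overall estimate, then the same estimate within each focal subgroup.
    print("overall impact:", round(impact(df), 3))
    for flag in ["rural", "high_poverty"]:
        sub = df[df[flag] == 1]
        print(f"{flag} subgroup (n={len(sub)}):", round(impact(sub), 3))

Reporting the subgroup estimates alongside the overall estimate, with their sample sizes, is what lets decisionmakers in rural and high-poverty districts judge whether the evidence actually speaks to their students.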

Summary

Stakeholders in rural and high-poverty contexts are looking for nonacademic interventions that will be effective with their students. To that end, REL Appalachia carried out an extensive review of evidence-based interventions. The review found few rigorous studies of nonacademic interventions, and even fewer that examined findings for students identified as high poverty or in rural settings. Without additional research, schools and districts serving rural and high-poverty populations may implement interventions that are not designed for their circumstances and may not achieve intended outcomes. As a result, resources may be wasted while rural and high-poverty students receive inadequate support for postsecondary success. In addition to investing in rigorous studies, which can take a long time to complete, researchers and practitioners can also collaborate to implement short-term research methods to identify early indicators of the success of these programs. For example, researchers may be able to support schools and districts in developing descriptive studies examining change over time or change in formative assessment outcomes.

Researchers have a role in helping more high school graduates from rural communities enroll, persist, and succeed in postsecondary education.

Rural and high-poverty schools and districts have unique strengths and challenges, and the lack of information about how interventions perform in those contexts presents a dilemma for decisionmakers: do nothing, or else muddle through with existing evidence, investing in interventions that don’t address local needs. As researchers, we can help resolve this dilemma by providing rigorous evidence about effective interventions tailored to rural and high-poverty contexts, as well as supporting practitioners in using more accessible methods to investigate the short-term outcomes of the programs they are already implementing.

by Rebecca A. Schmidt and CJ Park, Regional Educational Laboratory Appalachia