NCEE Blog

National Center for Education Evaluation and Regional Assistance

Leading with Evidence: Celebrating Evidence-Based Policymaking at the U.S. Department of Education

Matthew Soldner,

Commissioner, National Center for Education Evaluation and Regional Assistance &

Chief Evaluation Officer, U.S. Department of Education

(Remarks made on November 17, 2022, at Results for America’s 2022 Invest in What Works Federal Standards of Excellence events.)

If you walk into almost any kindergarten classroom today, you’ll find a “Star Chart.” It’s usually displayed with pride next to the chalkboard at the front of the room. On it, the name of every student in the class. And next to each name, a long row of stars, each signifying a task earnestly mastered by a young learner. (More than forty years later, I can still remember the pride of finally earning a gold star for “can tie shoes,” a milestone I was seriously delayed in reaching due to an overabundance of shoes secured with Velcro in my youth.)

Today, I’m very proud to acknowledge that the U.S. Department of Education has received its own gold star. It was bestowed by Results for America, which advocates for the use of evidence in federal, state, and local policymaking to improve outcomes for students, families, and communities, as part of its 2022 Federal Standards of Excellence program. Each year, participating federal agencies are evaluated on their progress in using high-quality evidence as a “north star” in policymaking. This year, Education was recognized alongside two other agencies – the Millennium Challenge Corporation and the U.S. Agency for International Development – as a top scorer, earning the “Gold” designation.

ED’s national leadership in using evidence to inform policymaking has been a journey that now spans more than two decades. Its roots can be traced to the 107th Congress, which in 2001 reauthorized the Elementary and Secondary Education Act of 1965 as the No Child Left Behind Act and, in 2002, passed the Education Sciences Reform Act. The latter authorized the founding of a group toward which I’m somewhat partial: the Institute of Education Sciences (IES). (I would invite you to join my colleagues and me in celebrating #IESat20, now through mid-2023!) But no single event, and no distinct component of the Department, is individually responsible for our success building evidence about “what works” in education and putting that evidence to work to better serve learners, educators, and communities. I often say “evidence-building is a team sport at the Department of Education,” and it truly does take the commitment of talented professionals from across the organization to make it a reality.

This year, that team has been particularly busy. Department-wide, we have supported states, school districts, and institutions of higher education in their continued efforts to meet the challenge of pandemic recovery. Much of that work has focused on the use of evidence-based practices to accelerate learning for all students, making the most of historic investments in education such as the Coronavirus Aid, Relief, and Economic Security (CARES) and American Rescue Plan (ARP) Acts. Key partners in that work include IES’ Regional Educational Laboratories, operated by the National Center for Education Evaluation and Regional Assistance (NCEE); the Office of Elementary and Secondary Education’s Comprehensive Centers; the Office of Special Education and Rehabilitative Services’ Technical Assistance and Research Centers; and the Office of Planning, Evaluation, and Policy Development’s Grant Policy Office.

Elsewhere in the Department, the emphasis has been on evidence-building. Here, IES has taken a particular leadership role. The National Center for Education Statistics’ (NCES’) School Pulse Panel is a critical new component in our evidence-building infrastructure. The Pulse allows us to more rapidly collect and report descriptive information about conditions on the ground in our nation’s schools, addressing topics from the extent of staffing shortages to the programs schools are offering to support learning acceleration. That and other information supports a vibrant research and development infrastructure, led by the National Centers for Education Research (NCER) and Special Education Research (NCSER). In addition to their regular education grant programs, both Centers ran special competitions in Fiscal Year 2022 specifically designed to support pandemic recovery, including competitions aimed at better leveraging longitudinal data to support state recovery policymaking and at building evidence about the approaches states and districts used to address the pandemic and, when possible, their effectiveness.

In Fiscal Year 2023, more good work is already underway.

First, I would be remiss if I did not acknowledge an important investment this most recent Congress has made in the evidence-building work of the Department: authority as part of the Consolidated Appropriations Act of 2022 to reserve up to one-half of one percent from selected programs authorized by the Higher Education Act of 1965, as amended, to support high-quality research and evaluation related to the Department’s postsecondary programs. A similar set-aside for the Department’s K-12 programs dramatically catalyzed our ability to build and use evidence there – and I have every confidence this new authority, if continued, would do the same for our postsecondary portfolio.

Second, and consistent with my belief that “evidence building is a team sport,” I want to take a moment to encourage you to join the team! As an initial step, I’d like to invite you to join me and special guests from the Department in our new webinar series creatively entitled “Evidence-based Policymaking at ED: Introducing the U.S. Department of Education’s Inaugural Learning Agenda.” Across three installments, we’ll discuss the Department’s evidence-building priorities in three areas: the educator workforce; meeting students’ social, emotional, and academic needs; and increasing postsecondary value. In each installment, we’ll dig a bit deeper into the topic and its nexus with an equitable recovery from the COVID-19 pandemic.

I hope – but cannot promise – that Education will, this time next year, report that we’ve earned another gold star for building and using evidence in service of smart policymaking. What I can promise is that, because both educators and education policymakers will continue to need high-quality evidence to do their best work on behalf of the nation’s learners, we will do our best to help them meet challenges both old and new. Thanks to Results for America for today’s recognition, and to all those who support our nation’s students, educators, and communities every day.

 

How the 2017-2022 Cohort of RELs Supported the Use of Evidence in Education


This winter is a special season that comes along once every five years for the Regional Educational Laboratory (REL) program at IES. It’s a winter when the REL team manages the end of five-year REL contracts and oversees the launch of nine new REL contracts.[i]  During this exciting time, we actively reflect on the successes and lessons of the previous contracts—the 2017-2022 REL cohort—and channel those reflections into our work with the 2022-2027 REL cohort. 

As I collaborate with the REL team on the new RELs, I want to share some of the successes of the RELs that finished their work early this year. We expect the new RELs to build on these successes and to engage in new, innovative work that I will discuss in a future blog.

As we look back at the large body of work that the last cohort of RELs produced, I want to share some exciting results. Over three-quarters of participants in REL researcher-practitioner partnerships who responded to the REL Stakeholder Feedback Survey (SFS) reported that they used or were in the process of using the research or data that they learned about through the REL partnerships. On average across the last three years, an additional 17 percent reported that they were making plans to use research or data presented by the REL:

Responses to the REL Partnership Stakeholder Feedback Survey (SFS):

Year | Respondents | Used or in process of using | Making plans to use | No plans to use
2019 | 695 | 77% | 19% | 4%
2020 | 397 | 81% | 17% | 2%
2021 | 582 | 82% | 15% | 3%
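(For readers who want to check the arithmetic, here is a quick sketch in Python using the figures from the table above. The “additional 17 percent” cited earlier matches a simple unweighted average of the “making plans” column across the three years; whether the original figure was computed this way is an assumption.)

```python
# SFS responses by year: (respondents, % used/using, % making plans, % no plans)
sfs = {
    2019: (695, 77, 19, 4),
    2020: (397, 81, 17, 2),
    2021: (582, 82, 15, 3),
}

# Unweighted average of the "making plans to use" share across years.
planning_avg = sum(row[2] for row in sfs.values()) / len(sfs)
print(f"Average share making plans to use: {planning_avg:.0f}%")  # -> 17%
```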

While these survey results are promising, I want to provide a more vivid picture of how the RELs partnered with stakeholders to use evidence to improve teaching and learning. Read on to learn how REL work has been integral to education policy and practice across the country.

REL Mid-Atlantic and REL Southeast both engaged in projects that supported efforts to safely educate students during the pandemic:

  • In Pennsylvania, REL Mid-Atlantic helped the Pennsylvania Department of Education (PDE) provide evidence to inform the reopening of schools in the state during the first year of the COVID-19 pandemic. REL Mid-Atlantic worked with PDE to produce an extensive memo that included (1) a rapid review of existing evidence on public-health and educational issues relevant to the reopening of schools, (2) findings from interviews with a cross-section of stakeholders from across Pennsylvania to assess concerns and challenges related to reopening, and (3) agent-based modeling simulations of the potential spread of COVID-19 under alternative approaches to reopening schools (a simplified sketch of this kind of simulation appears after this list). The two largest school districts in the state—the School District of Philadelphia and the Pittsburgh Public School District—along with at least 25 other school districts and one Catholic archdiocese drew on the findings in the memo to make decisions about whether and how to reopen schools.
  • Shortly after two of REL Southeast’s four teacher guides were released in early 2020, schools across the country shut down due to the COVID-19 pandemic. The REL realized that the content of the guides—originally created to support teachers in working with families to facilitate their children’s literacy development—would be immediately useful to parents across the country who were suddenly thrust into the role of teacher for their children at home. The content of the guides was based on the What Works Clearinghouse Practice Guide: Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade.

REL Southeast made all the content, which included videos and activities, available on the REL website so that parents could easily access them and use them to support their children during that difficult time. The landing page for the content—Supporting Your Child’s Reading at Home—has been visited nearly 130,000 times since April of 2020. And landing pages for the four guides for teachers—A Kindergarten Teacher’s Guide, A First Grade Teacher’s Guide, A Second Grade Teacher’s Guide, and A Third Grade Teacher’s Guide—have each been accessed between 1,300 and 7,500 times since their release.
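For readers curious about what “agent-based modeling” involves, here is a deliberately simplified sketch of the general technique in Python: simulated students interact each day, and infection spreads probabilistically through those contacts. Every parameter and the scenario labels below are hypothetical, chosen only for illustration; the actual simulations in the REL Mid-Atlantic memo were far more detailed.

```python
import random

def simulate_school(n_students=500, contacts_per_day=8, p_transmit=0.02,
                    infectious_days=10, initial_infected=2, days=60, seed=1):
    """Toy agent-based simulation of infection spread in one school.

    States: 0 = susceptible, >0 = days of infectiousness remaining,
    -1 = recovered. All parameter values are illustrative only.
    """
    rng = random.Random(seed)
    state = [0] * n_students
    for i in rng.sample(range(n_students), initial_infected):
        state[i] = infectious_days

    for _ in range(days):
        newly_infected = set()
        for i, s in enumerate(state):
            if s > 0:  # each infectious student meets random contacts
                for j in rng.sample(range(n_students), contacts_per_day):
                    if state[j] == 0 and rng.random() < p_transmit:
                        newly_infected.add(j)
        for i in range(n_students):
            if state[i] > 0:  # progress toward recovery
                state[i] = state[i] - 1 if state[i] > 1 else -1
        for j in newly_infected:
            state[j] = infectious_days
    return sum(1 for s in state if s != 0)  # students ever infected

# Compare hypothetical reopening scenarios by varying daily in-school contacts.
for label, contacts in [("full in-person", 8), ("hybrid", 4), ("mostly remote", 1)]:
    print(f"{label:>15}: {simulate_school(contacts_per_day=contacts)} ever infected")
```

Here, varying the number of daily contacts stands in for different reopening approaches; real models also represent classrooms, households, masking, community transmission, and more.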

REL West and REL Midwest both worked with states in their regions to support student health and to identify and recruit more teachers, topics that proved particularly important as a result of the pandemic:

  • Robla Elementary School District (RESD) and several other districts in California’s Central Valley began offering telemedicine services during the 2017/18 school year as part of a broader “whole-child” strategy for improving student health, well-being, and attendance. Telemedicine is the remote evaluation, diagnosis, and treatment of patients using telecommunications technology. RESD contracted with and paid Hazel Health, a telemedicine provider that operates virtual health clinics in school settings. The telemedicine visits were free to students and families and did not require scheduled appointments. To learn more about the implementation of the program and whether it was associated with students staying in school throughout the day, RESD enlisted REL West for assistance.

REL West's study of the telemedicine services found that districtwide, a little over one-quarter of students used the services at least once over two years, with nine percent of students using telemedicine multiple times. Non-communicable physical illnesses/conditions such as stomach aches, headaches, allergies, and asthma were consistently the most common reason for school-based telemedicine visits across the two years of implementation. Ninety-four percent of all telemedicine visits resulted in students returning to class and receiving, on average, three more hours of instruction in the school day. Approximately 39 percent of Black students used telemedicine services compared with 17 percent of Asian students. Due to these findings, the district decided to continue with the program. The telemedicine provider is working to identify possible reasons for the differences in use by different student groups to ensure that all students are comfortable accessing the services.

  • Even before the COVID-19 pandemic, Michigan was experiencing teacher shortages in several subjects and geographic areas. This led Michigan members of the REL Midwest Alliance for Teacher Preparation to partner with the REL to examine why certified educators were not teaching and what incentives could motivate them to return to the classroom. The REL Midwest study found that salary and certification/recertification requirements were among the most frequent barriers to teachers entering or remaining in the teaching profession.

As a result, the Michigan Department of Education launched the “Welcome Back Proud Michigan Educator” campaign, which seeks to recruit nonteaching educators into the teacher workforce. The first wave of the campaign, which began in April 2021, recruited educators with expired teaching certificates by reducing—and in some cases eliminating—professional learning requirements for recertification. The second wave, which began in October 2021, recruited teachers who had a valid certificate but were not teaching in public schools. As of January 2022, 218 educators had been recertified or issued a teaching permit, and 27 educators were in the pipeline to reinstate their teaching credentials. Of those with valid certificates, 123 educators started in a teaching position in fall 2021 and an additional 244 educators took a non-teaching assignment, such as day-to-day substitute teaching.

Concerns about the lack of equity in educational opportunities and in disciplinary practices led stakeholders to partner with REL Appalachia and REL Northwest:

  • Throughout the country, students are often encouraged to study Algebra I in middle school so that they can take more advanced math courses in high school and can graduate with a college-ready diploma. Concerned that economically disadvantaged students and English learners might be taking Algebra I later than their peers and earning college preparatory diplomas at lower rates than other students, Virginia’s Department of Education asked REL Appalachia for assistance analyzing the state’s data. The REL researchers confirmed the department’s hypotheses: among all 5th graders rated as “advanced proficient” on the state’s math assessment, economically disadvantaged and English learner students were less likely to take Algebra I before 9th grade and less likely to earn a college preparatory diploma. As a result of these findings, the Virginia Department of Education asked the REL to work with school districts across the state to analyze data to identify student course-taking patterns and to further examine district-level policies and practices that may be contributing to the inequitable course-taking patterns and outcomes.
  • REL Northwest undertook several projects with the Equity in School Discipline (ESD) collaborative: a cross-state collaborative of districts, state education agencies, community-based organizations, and institutions of higher education in Oregon and Washington committed to increasing equity in school climate and discipline policies and practices. ESD sought to reduce the use of exclusionary discipline practices and to eliminate disproportionately high rates of exclusion for students who identify as American Indian, Black, and Hispanic. REL Northwest and ESD district leaders in four districts partnered to co-design and pilot training materials to help school and district teams increase equity in school discipline practices. REL Northwest also produced a tool so other districts and states can identify discipline disproportionality.

REL Pacific helped to make a language program more evidence-based:

  • Recognizing the role of the school in sustaining Kosrae’s cultural and linguistic heritage and preparing students for a globalized world, Kosrae Department of Education (KDOE) leaders reached out to REL Pacific for support in creating a new language immersion policy and program that better supports the goal of building student proficiency in both Kosraean and English. REL Pacific supported KDOE by providing coaching on the research behind effective bilingual education models, policy implementation frameworks, and language assessments. REL Pacific and the Region 18 Comprehensive Center (RC18) subsequently collaborated to provide complementary supports to ensure KDOE had increased capacity to implement its bilingual language policy in schools across the island. As REL Pacific continued to provide support on best practices in bilingual instruction, classroom observation, and teacher professional learning, RC18 provided supports such as bilingual materials development and financing options for the new policy. KDOE began piloting the new policy in two elementary schools in the fall of 2021.

REL Central supported Nebraska by providing evidence-based resources and training to support the implementation of new legislation:

  • In 2018, the Nebraska Reading Improvement Act was passed to decrease the number of struggling readers in grade 3 across the state. The Nebraska State Board of Education and the Nebraska Department of Education (NDE) enlisted REL Central’s support in providing the state’s elementary school teachers with evidence-based practices for the teaching of reading. To meet this need, REL Central reviewed strategies in eight What Works Clearinghouse practice guides on reading, writing, and literacy instruction and distilled the information into summary documents that were aligned with the state’s initiative. Each document is featured on NDE’s NebraskaREADS website; each describes a practice guide recommendation, explains how it should be implemented, and discusses the appropriate grade level or target student population (for example, English learners). REL Central also provided trainings to support regional education service unit staff and school-based educators in reviewing, selecting, and testing evidence-based reading strategies.

Finally, through applied research studies, REL Northeast and Islands and REL Southwest helped education leaders answer important questions about whether students in certain localities had equitable access to important services. These studies informed leaders’ decisions about state programs or indicators:

  • In an effort to increase the percentage of children ready for kindergarten, Vermont passed Act 166 in 2014, which provided access to high-quality prekindergarten (pre-K) for all 3- and 4-year-olds and for 5-year-olds not yet in kindergarten. As universal pre-K began in the 2016/17 school year, officials were concerned about unequal distribution and availability of high-quality pre-K programs across the state. The Vermont Agency of Education, the Agency of Human Services’ Department for Children and Families, and Building Bright Futures (Vermont’s early childhood advisory council) participated in the Vermont Universal PreK Research Partnership with REL Northeast & Islands to answer these important questions. Through one study, the REL found that although the majority of pre-K children were enrolled in the highest quality programs, some children had less access to high-quality programs in their home districts. These findings led the Vermont legislature to maintain a provision that allows families to enroll their children in programs outside their home district.
  • Texas House Bill 3 (HB3), a comprehensive reform of the state’s school finance system passed in 2019, established a college, career, and military readiness outcomes bonus, which provides extra funding to districts for each annual graduate demonstrating college, career, or military readiness under the state accountability system. Leaders at the Texas Education Agency (TEA) were concerned that it might be hard for small and rural districts to demonstrate career readiness through the required accountability measure. Through a partnership with TEA, REL Southwest conducted a study that found no substantive differences by district size or locale in the percentage of students meeting the career readiness standard. Further, the study found that students who fell into two of the alternative career readiness options—career and technical education (CTE) completers and work-based learners—had higher rates of college enrollment than graduates who met the existing career readiness accountability standard. The study also indicated that CTE completers had higher rates of college persistence or credential attainment after high school than graduates who met the existing standard. These findings led the Commissioner of Education to recommend, and the Texas legislature to create, a new measure of career readiness in the state accountability system that met the needs of districts across the state.

From these examples, one takeaway is clear: REL work can make a difference. RELs supported educators’ and policymakers’ efforts to improve educational programs, policies, and outcomes through the use of research and evidence-based practice between 2017 and 2022. The new RELs will continue this work and, as I will write about in a future blog, they will also undertake some new types of projects. Until then, please visit the new REL website or reach out to me at Elizabeth.Eisner@ed.gov if you have questions about the REL program and how it can help your community.

Liz Eisner is the associate commissioner of the Knowledge Use Division at the National Center for Education Evaluation and Regional Assistance.


[i] One REL contract—REL Southwest (REL SW)—is on a different schedule. The current REL SW contract ends in late November of 2022 and the next REL SW contract will begin the day after the current contract ends. The contracts that just ended were the 2017-2022 contracts and the contracts that just started are the 2022-2027 contracts.

Is believing in yourself enough? Growth mindset and social belonging interventions for postsecondary students

The WWC recently reviewed the strength of evidence for two types of interventions designed to help students succeed in college: one report focuses on growth mindset interventions and another on social belonging. The WWC found that (1) neither type of intervention had a discernible effect on full-time college enrollment, (2) social belonging interventions had mixed effects on progressing in college and academic achievement, and (3) growth mindset interventions had potentially positive effects on academic achievement. We asked Greg Walton, an Associate Professor at Stanford University, IES-funded researcher, and expert on these kinds of interventions, to discuss what college faculty, administrators, and students should make of these findings.  

Can you walk through how growth mindset interventions and social belonging interventions with postsecondary students work? Were the interventions reviewed by the WWC typical interventions in this space?

Growth mindset interventions focus on the underlying “implicit” beliefs students have about the nature of intelligence: Is intelligence fixed or can it grow? These beliefs inform how students make sense of everyday academic challenges in school. If you think that intelligence is fixed, that you either have it or you don’t, then a setback like a poor grade can seem to be evidence that you don’t have what it takes. That can make students avoid academic challenges, withdraw, and ultimately learn and achieve less. Growth mindset interventions offer students the view that intelligence can grow with effort, hard work, good strategies, and help from others. The theory is that this mindset can help students see setbacks simply as evidence that they haven’t learned the material yet, or that their strategies haven’t been successful yet, and thus to sustain their efforts. These interventions typically start by sharing information from neuroscience about how the brain grows “like a muscle” during learning, especially when students work on challenging material. Then students might read stories from older students who used a growth mindset to persist through challenges. Finally, they may be asked to describe this idea to help younger students struggling in school, a technique termed “saying-is-believing.” That makes the experience active rather than passive and positions students as benefactors rather than beneficiaries, which would be potentially stigmatizing.

Social-belonging interventions target “belonging uncertainty,” a persistent doubt students can feel about whether “people like me” can belong in a school setting. This doubt arises most strongly for people who belong to groups that have historically faced exclusion in school settings, negative stereotypes that pose them as less able and less deserving of educational opportunities, or who are underrepresented in a school context. When students experience this doubt, everyday challenges such as feeling lonely, being excluded, or getting critical feedback can seem like evidence that they don’t belong in general. Social-belonging interventions share stories from older students who describe how they worried at first about whether they belonged in a new school and how these worries dissipated with time as they developed friendships and study partners, joined student groups, and formed mentor relationships. Belonging interventions offer students the view that it’s normal to worry about belonging at first in a new school but this gets better with time. Like growth mindset interventions, belonging interventions use written exercises to give students the opportunity to reflect on the intervention message and advocate for it to younger students. The theory is that this message can help students sustain a sense of belonging and stay engaged in school even when they face challenges, and that this helps students develop friendships and mentor relationships that support higher rates of achievement.

Social-belonging interventions were designed specifically to address circumstances in which people face underrepresentation or negative stereotypes in school. Even if all students have reasons to worry whether they belong in school, only some students have reason to question whether “people like me” belong. I am a White person whose parents both graduated from college. So, when I went to college, I felt homesick but I didn’t wonder whether “people like me” could belong.

That said, belonging concerns are felt by almost everyone, and in some cases belonging interventions have produced main effects (benefits for all students) rather than interactions predicated on group identity (e.g., Borman et al., 2019, for evidence from students in grade 6). However, most trials find greater benefits for students who face underrepresentation or negative stereotypes in specific settings. One study found that women in more gender-diverse engineering majors (averaging 33% women) showed no achievement gap with men in the first year and no benefit from a belonging intervention. Women in male-dominated majors (averaging 10% women), however, showed a large achievement gap in first-year performance, a gap that was closed by the intervention (Walton et al., 2015; see also Binning et al., 2020). [Editor’s note: These two latter studies did not meet WWC standards for internal validity. Although this suggests caution in drawing conclusions from the studies, failing to meet WWC standards does not imply that an intervention is ineffective.]

Taken together, a fixed-mindset of intelligence and belonging uncertainty can be like a toxic tornado for students, swirling into each other and creating cascading self-doubt. I’m describing these interventions separately because they grew up independently in the literature, and the WWC’s two reports look at each separately. But for students, they are often experienced together.

It’s also important to state that, although the interventions reviewed by the WWC are typical of those conducted with postsecondary students, these are highly active areas with new trials reported regularly. Studies have explored new populations and college contexts (e.g., Murphy et al., 2020) and are increasingly focused on identifying boundary conditions that determine where we should and should not predict effects (see Bryan, Tipton, & Yeager, 2021). It is also noteworthy how few studies have examined the critical question of progress in college (3 in each report). We need much more research here, exploring effectiveness, implementation strategies, and boundary conditions. Further, research is increasingly complementing direct-to-student interventions by exploring how we can support practices in school that support growth mindset and belonging (Murphy et al., 2021). For example, recent research shows that highlighting pro-diversity peer norms—namely that most students endorse diversity—can facilitate more inclusive treatment among college students and, in turn, reduce achievement gaps between privileged and marginalized students (Murrar et al., 2020).

What are the key components that are needed for a social belonging or growth mindset intervention to have a good chance of working? What elements need to be in place to help students improve academically or to stay enrolled in college?

I would distinguish two layers of this question.

One layer is what it takes for a discrete exercise focused on belonging or growth mindset—such as those tested in the trials reviewed by the WWC—to help students. In general, we should consider what, how, when, and where.

What is it you want to offer students? It should give students an authentic and adaptive way to make sense of common challenges they face, a way of thinking they can use to achieve their goals in college. Simple exhortations such as, “I know you can do it” or “You belong!” do not effectively impart a growth mindset or a sense of belonging, as Carol Dweck and I have written. Instead, it is useful to use high-quality materials developed and validated in research; examples of such materials are available online.

How will you convey this? The goal of these interventions is to address foundational beliefs students have about school, such as “Can I do it?” and “Can people like me belong here?” It’s not to do something else, like build a skill. That means the experience need not take long—typically, interventions last 30-60 minutes—but it should be immersive and interactive. You want students to deeply reflect on the ideas you present and connect these ideas to their lived experience.

That said, the more you can implement approaches that are scalable within an institutional context, the more students you can potentially help. That’s one reason recent trials that reach large samples have focused on online modules (e.g., LaCosse et al., 2020; Yeager, Walton, & Brady et al., 2016). Students can log on individually and complete materials at near-zero marginal cost. However, these approaches also have challenges, as online modules may not be as engrossing as in-person experiences. As we have moved from delivering these interventions in one-on-one, in-person experiences to larger studies with materials delivered online, we have found that students spend less time on the same materials and write less in response to prompts. Another alternative is having students meet in person in groups to participate in these interventions or discuss their content (see Binning et al., 2020; Murphy et al., 2020), but that may be more difficult to implement on a large scale. So, there can be trade-offs between reaching scale and creating deep and impactful experiences.

When should you do this? In general, it is valuable if an intervention happens earlier rather than later, so it can alter trajectories going forward. However, it may be optimal to deliver interventions soon after students have encountered some challenges, but before they have taken steps in response to those challenges that are hard to reverse (e.g., dropping out). In general, social-psychological interventions are more sensitive to timing than to dosage. Growth mindset and belonging interventions have been delivered from the summer before college (Yeager, Walton, Brady, et al., 2016), to the first academic term (Walton et al., 2015), to the second (Walton & Cohen, 2011).

Where should you deliver interventions? This brings us to the second layer. So far, I’ve addressed the first layer, where you are focused on a discrete experience or set of experiences. The second layer is that growth mindset and belonging interventions will be most effective in contexts in which (1) the message offered is legitimate and authentic (locally true) and (2) students have real opportunities to get academic support and to develop a sense of belonging. In the end, to produce the most robust change, we must create cultures in schools in which adaptive ideas about ability and belonging are normal and reinforced. There are many ways that institutions signal to students, even inadvertently, messages about the nature of intelligence and who belongs. In welcoming a new class to campus, do we extol the past achievements of a few, which may only heighten imposter syndrome among everyone else? Can we instead talk about what students can do in the future and who they can become? In welcoming students to class, do faculty communicate that they expect to weed out large numbers of students? Or do they design assignments and evaluations to support students’ learning and growth (Canning et al., 2019)? Another question involves how well colleges foster opportunities for students to develop in-group pride and identity. Tiffany Brannon at UCLA finds that African American students do better in college when they have more opportunities to participate in events that celebrate and explore Black culture (Brannon & Lin, 2021). Some resources to help researchers and practitioners create cultures of growth and belonging for all students are available at the Student Experience Project, co-led by the College Transition Collaborative (https://collegetransitioncollaborative.org/student-experience/).

Recently, you and your colleagues have distinguished between people with different characteristics and environments with different characteristics. You’ve argued that researchers should be looking more closely at the contexts (or what you’ve called “psychological affordances”) in which these interventions might have different effects. Why is this work important? Why should educators be paying attention?

Social-psychological interventions operate within complex systems. Those systems invariably determine the specific effect any intervention has. To understand this, my colleagues and I have found it useful to consider the affordances of a school context: What does a context make possible (Walton & Yeager, 2020)? For instance, no psychological intervention will help English-language speakers learn Chinese if they aren’t receiving instruction in Chinese.

We distinguish two kinds of affordances. One is structural: What is it that different institutions make possible for students to do? As an example, in a forthcoming study, Shannon Brady, Parker Goyer, David Yeager, and I tracked college outcomes of students randomly assigned to a social belonging intervention or a control condition at the end of high school. The intervention raised the rate of bachelor’s degree completion for students who first enrolled in more selective 4-year institutions from 26% to 43%. These are institutions that tend to have higher retention and graduation rates and tend to spend more per student on instruction and student services than less selective 4-year institutions. They thus afford higher 4-year completion rates. At the same time, the same belonging intervention had no effect on bachelor’s degree completion rates for students who first enrolled in less selective 4-year institutions.

The second kind of affordance is psychological: What is it that students can believe in a school context? Is the cultural context in which an intervention is delivered one in which the way of thinking offered by the intervention can take hold and thrive? Or is it one that makes that way of thinking illegitimate, inauthentic, or not useful? A large-scale social-belonging intervention delivered online to students in 21 diverse colleges and universities increased first-year full-time completion rates for students from historically underperforming groups, but only in colleges that afforded, or fostered, a sense of belonging to members of those groups. Let’s break this down: In some college contexts, students from historically underperforming groups (who were not exposed to the intervention) realized a high sense of belonging by the end of the first year. Here the belonging message was “locally true” (true here, for people like me). Although we don’t know exactly why this was the case, presumably in these schools students from the given group had more opportunities to develop friendships, to join student groups, and to form meaningful relationships with instructors. In other colleges, students did not attain this high sense of belonging by the end of the first year. Only in the first case did the belonging intervention raise first-year completion rates (Walton, Murphy et al., in prep; described in Walton & Yeager, 2020).
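For the methodologically inclined, the “effect only in supportive contexts” pattern corresponds to a treatment-by-context interaction in a statistical model. Here is a minimal sketch in Python with simulated data; the variable names, effect sizes, and sample sizes are hypothetical and are not drawn from the studies discussed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000

# Hypothetical setup: students randomized to a belonging intervention
# within colleges that either do or do not "afford" belonging.
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),
    "affords_belonging": rng.integers(0, 2, n),
})
# Simulate an outcome where the intervention works only in supportive contexts.
df["completed"] = (
    0.5
    + 0.10 * df["treated"] * df["affords_belonging"]  # effect only where afforded
    + rng.normal(0, 0.25, n)
)

# The treated:affords_belonging interaction term captures the "seed and soil"
# pattern: a seed (intervention) only grows in fertile soil (context).
model = smf.ols("completed ~ treated * affords_belonging", data=df).fit()
print(model.summary().tables[1])
```

In a real analysis, a significant interaction coefficient (rather than a main effect of treatment alone) is what signals that the intervention's benefit depends on the context.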

In both cases, the belonging intervention helped students take advantage of opportunities available to them, whether to graduate or to belong. An important implication is that it may be necessary to address both students’ beliefs and whether contexts support more positive beliefs. That’s helpful, because it gives us a precise way to think about how to make contexts more supportive: To what extent do they make adaptive beliefs about intelligence and belonging legitimate and authentic and, if they do not, what can we do about this?

It sounds like you’re saying postsecondary leaders who want to foster greater student success and reduce gaps in retention and academic performance may want to consider these kinds of interventions, in part because they are relatively inexpensive to deliver to large numbers of students. But they should also consider how hospitable their campus is to students who might initially struggle in college.

For example, to reinforce a growth mindset, universities need to make academic support resources truly accessible; to reinforce a sense of belonging, universities might look for multiple ways to communicate that successful students of all kinds of backgrounds have initially experienced self-doubt, and that feeling like you don’t belong is a fairly normal and temporary part of adjusting to college.

That’s right. Growth mindset and belonging are about both student beliefs or ways of thinking and institutional practices—either alone may not be enough. So, to support a growth mindset, institutions should both (1) convey that all students can learn and grow with effort, good strategies, and support from others and (2) back that up by creating learning environments designed to support growth, including adequate academic supports, and classes that focus on fostering growth rather than identifying who is allegedly smart and who is not. To support belonging, institutions should (1) acknowledge that nearly all new college students worry at first about whether they belong, that this is normal and improves with time and (2) create classroom and out-of-classroom environments in which all of the diverse students we serve can develop strong friendships and mentoring relationships and find communities in which they belong.

Thanks very much, Greg.

 

Read the WWC’s summary of evidence for these interventions in the Growth Mindset Intervention Report and the Social Belonging Intervention Report. Find related resources at The College Transition Collaborative (https://collegetransitioncollaborative.org/) or the Project for Education Research That Scales (https://www.perts.net/).

 

Carter Epstein, Senior Associate at Abt Associates, produced this blog with Greg Walton, Associate Professor of Psychology at Stanford University.

 

Note: The discussion above reflects the opinions of Greg Walton and does not necessarily reflect the opinions of the Institute of Education Sciences or the What Works Clearinghouse. Some of the studies cited above have not been reviewed by the What Works Clearinghouse.

 

REFERENCES

Binning, K. R., Kaufmann, N., McGreevy, E. M., Fotuhi, O., Chen, S., Marshman, E., Kalender, Z. Y., Limeri, L., Betancur, L., & Singh, C. (2020). Changing social contexts to foster equity in college science courses: An ecological-belonging intervention. Psychological Science, 31, 1059-1070. https://doi.org/10.1177/0956797620929984

Borman, G. D., Rozek, C. S., Pyne, J., & Hanselman, P. (2019). Reappraising academic and social adversity improves middle school students’ academic achievement, behavior, and well-being. Proceedings of the National Academy of Sciences, 116(33), 16286-16291. https://doi.org/10.1073/pnas.1820317116

Brady, S. T., Walton, G. M., Goyer, J. P., & Yeager, D. S. (in preparation). Where does a brief belonging intervention increase the attainment of a college degree? The role of institutional affordances. Manuscript in preparation.

Bryan, C. J., Tipton, E., & Yeager, D. S. (2021). Behavioural science is unlikely to change the world without a heterogeneity revolution. Nature Human Behaviour, 5(8), 980–989. https://doi.org/10.1038/s41562-021-01143-3

Bryk, A. S., Grunow, A., Gomez, L. M., & LeMahieu, P. G. (2015). Learning to improve: How America’s schools can get better at getting better. Harvard Education Press.

Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2). https://www.science.org/doi/10.1126/sciadv.aau4734

Dweck, C. (2016, January 11). Recognizing and overcoming false growth mindset. Edutopia. https://www.edutopia.org/blog/recognizing-overcoming-false-growth-mindset-carol-dweck

Murphy, M. C., Fryberg, S. A., Brady, L. M., Canning, E. A., & Hecht, C. A. (2021, August 25). Global Mindset Initiative Paper 1: Growth mindset cultures and teacher practices. https://ssrn.com/abstract=3911594

Murrar, S., Campbell, M. R., & Brauer, M. (2020). Exposure to peers’ pro-diversity attitudes increases inclusion and reduces the achievement gap. Nature Human Behaviour, 4, 889–897. https://doi.org/10.1038/s41562-020-0899-5

Walton, G.M. (2021, November 9). Stop telling students, “You belong!” Three ways to make a sense of belonging real and valuable. Education Week. https://www.edweek.org/leadership/opinion-stop-telling-students-you-belong/2021/11

Walton, G. M., Logel, C., Peach, J. M., Spencer, S. J., & Zanna, M. P. (2015). Two brief interventions to mitigate a “chilly climate” transform women’s experience, relationships, and achievement in engineering. Journal of Educational Psychology, 107(2), 468–485. https://eric.ed.gov/?id=EJ1061905

Walton, G. M., Murphy, M. C., Logel, C., Yeager, D. S., Goyer, J. P., Brady, S. T., . . . Krol, N. (in preparation). Where and with whom does a brief social-belonging intervention raise college achievement? Manuscript in preparation.

Walton, G. M. & Yeager, D. S. (2020). Seed and soil: Psychological affordances in contexts help to explain where wise interventions succeed or fail. Current Directions in Psychological Science, 29, 219-226. http://gregorywalton-stanford.weebly.com/uploads/4/9/4/4/49448111/waltonyeager_2020.pdf

Yeager, D. S., Walton, G. M., Brady, S. T., Akcinar, E. N., Paunesku, D., Keane, L., Kamentz, D., Ritter, G., Duckworth, A. L., Urstein, R., Gomez, E. M., Markus, H. R., Cohen, G. L., & Dweck, C. S. (2016). Teaching a lay theory before college narrows achievement gaps at scale. Proceedings of the National Academy of Sciences of the United States of America, 113(24), E3341-E3348. https://doi.org/10.1073/pnas.1524360113

 

 

Exploring the Growing Impact of Career Pathways

Career pathways programs for workforce development are spreading across the country at both the secondary and postsecondary levels. Based on a synthesis of studies examining career pathways programs that integrate postsecondary career-technical education (CTE), the What Works Clearinghouse (WWC) practice guide Designing and Delivering Career Pathways at Community Colleges presents five recommendations for implementing evidence-based practices:

  1. Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment.
  2. Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways.
  3. Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways.
  4. Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways.
  5. Develop and continuously leverage partnerships to prepare students and advance their labor market success.

Led by the WWC’s postsecondary contractor, Abt Associates, this practice guide was created by an expert panel of researchers and practitioners to provide examples of career pathways strategies and components and guidance to implement them; advise on strategies to overcome potential obstacles; and summarize evidence associated with rigorous research studies that met WWC standards.

As a long-time researcher of postsecondary CTE and many other important aspects of community college education, I welcome the opportunity to reflect on these five recommendations. I hope that my blog will help readers understand how this new practice guide fits into a larger landscape of research focusing on programs, policies, and practices aligned with the career pathways framework. Far from new, the notion of career pathways goes back several decades; thus, it is not surprising that we see an evolution in research to measure students’ education and employment outcomes. And still, there is a need for more rigorous studies of career pathways.

The Abt team located about 16,000 studies that were potentially relevant to the practice guide. Those studies used a wide variety of methods, data (quantitative and qualitative), and analysis procedures. However, only 61 of them were eligible for review against the WWC standards, and only 21 of those met the standards. Interestingly, most of those 21 studies focused on non-degree postsecondary credentials, rather than on college degrees, with policies and programs associated with workforce development and adult education well represented. Thus, lessons from the practice guide speak more directly to career pathways programs that culminate in credentials below the associate degree level than to those leading to the associate or baccalaureate degree level.

This dearth of rigorous career pathways research is problematic, as educational institutions of all types, including community colleges, seek to deliver positive, equitable outcomes to students during and beyond the COVID-19 pandemic.

Focus on Career Pathways

After examining the evidence from the studies that met the WWC standards, it was clear that the evidence converged around career pathways programs following requirements in the Strengthening Career and Technical Education for the 21st Century Act and Workforce Innovation and Opportunity Act (WIOA). In alignment with the WIOA definition of career pathways, the set of studies in the practice guide examine a “combination of rigorous and high-quality education, training, and other services” that align with the skill needs of industries in the region or state and accelerate participants’ educational and career advancement, to the extent practicable.

As defined by WIOA, career pathways support learners in pursuing their education and career goals, lead to at least one postsecondary credential, and provide entry or advancement in a particular occupation or occupational cluster. Because a growing number of community colleges employ a career pathways approach, as advocated by the federal legislation, it made sense to focus the practice guide on rigorous results and evidence-based recommendations that may help to move career pathway design and implementation forward.

The Five Recommendations

Recommendation 1: Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment. Our panel advocated for the intentional design and structure of career pathways for good reason. Whereas all educational institutions enroll students in courses and programs, career pathways prioritize the student’s entire educational experience, from access and entry, to completion and credentialing, and on to employment and career advancement. This purposeful approach to supporting student attainment is theorized to lead to positive student outcomes.

Applying the meta-analysis process required by the WWC, we determined from the 21 studies whether career pathways were achieving this crucial goal. We found that nine of the studies showed overall statistically significant, positive results on industry-recognized credential attainment. Of the 12 studies supporting this recommendation, most measured non-degree credentials; only two measured degree attainment—an important point to recognize, because these are the studies that have been conducted thus far.
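For readers unfamiliar with how a meta-analysis pools results, the core idea is to combine effect sizes across studies, weighting more precise studies more heavily. Here is a minimal sketch in Python of the standard inverse-variance approach; the effect sizes and standard errors below are invented for illustration, and the WWC's actual procedures involve additional steps beyond this basic calculation.

```python
import math

# Hypothetical (effect size, standard error) pairs, one per study.
studies = [(0.25, 0.10), (0.15, 0.08), (0.30, 0.12), (0.05, 0.09)]

# Inverse-variance weighting: studies with smaller standard errors get
# proportionally more weight in the pooled estimate.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
z = pooled / pooled_se

print(f"Pooled effect size: {pooled:.3f} (SE = {pooled_se:.3f}, z = {z:.2f})")
# A |z| above roughly 1.96 corresponds to significance at p < .05, the kind
# of threshold behind "statistically significant, positive results."
```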

This very small number of rigorous studies measuring degree attainment leaves open the question of whether career pathways increase postsecondary degree attainment—specifically the predominant credential in the community college context, the associate degree—and calls for greater investment in research on student completion of associate degrees (as well as baccalaureate degrees, a growing phenomenon in the United States).

Recommendation 2: Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways. Studies that met WWC standards showed a positive impact of career pathways on college credit accumulation and industry-recognized credential attainment. Only one study measured postsecondary degree attainment relative to contextualized and basic skills instruction, and it reported statistically significant and negative results. However, descriptive and correlational studies suggest that contextualized and basic skills instruction contribute to positive educational outcomes for students enrolled in Adult Basic Education in addition to postsecondary CTE and workforce training.

That the results of rigorous research complement descriptive studies, some of which provide rich details on program implementation, is useful information for scaling up community college career pathways. Having said this, we still need to know more about how contextualized, basic skills instruction—and other applied instructional interventions—affect the outcomes of students, especially those from racially minoritized groups, with low incomes, and who are the first generation to attend college, all purported to be well served by the career pathways approach.

Recommendation 3: Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways. Studies supporting this recommendation focused on five education outcomes: industry-recognized credential attainment, academic performance, technical skill proficiency, credit accumulation, and postsecondary degree attainment. As seen with the previous two recommendations, results on industry-recognized credential attainment were statistically significant and positive. Results on academic performance, technical skill proficiency, and credit accumulation were indeterminate, meaning findings could be positive or negative but were not statistically significant.

What is important to reiterate here is that nearly all the studies that met the WWC standards focused on non-degree credentials, providing limited information about results on the education outcome of postsecondary degree attainment. To be clear, our panel is not saying career pathways should focus exclusively on non-degree credentials; rather, results on postsecondary degree attainment are simply not definitive. Even so, the positive findings linking flexible scheduling and non-degree credential attainment are important to know now, as the country deals with the pandemic.

Community colleges nationwide are rethinking instructional delivery to better meet students’ dire health, family, and employment needs. Rigorous research on career pathways interventions, such as flexible delivery, is needed, particularly studies involving diverse student populations. In times of economic and social struggle, it is essential that community college career pathways produce the equitable outcomes they purport to provide.

Recommendation 4: Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways. The rigorous studies meeting WWC standards and measuring outcomes relative to comprehensive student supports focused on the education outcome domain only. Similar to the previous recommendation on flexible scheduling, findings on industry-recognized credential attainment were statistically significant and positive. Unlike with flexible scheduling, however, findings on credit accumulation were also statistically significant and positive, reinforcing findings generated by other studies showing that holistic supports improve student outcomes. For example, a meta-analysis of studies of the Trade Adjustment Assistance Community College and Career Training grants that used rigorous evaluation designs reported favorable results for holistic supports in counseling and advising, case management, and various other support services and educational outcomes.

Consistent with the recommendations in this practice guide, a growing body of evidence favors integrating comprehensive student supports with career pathways. These supports are intended to meet the needs of the diverse population of students who attend community colleges; so, they should demonstrate equitable results on educational outcomes. More rigorous research is needed to measure whether and how career pathways provide access, opportunity, and outcomes for racially minoritized, low-income, and other underserved student groups. These studies should ascertain the impact of student supports on both education and employment outcomes, recognizing that students seek a high-quality credential and a good job that offers economic security and career mobility.

Recommendation 5: Develop and continuously leverage partnerships to prepare students and advance their labor market success. This recommendation specifically emphasizes labor market success, based on studies that examine labor market outcomes only. Supporting this recommendation were findings from studies of four labor market outcomes: short-term employment, short-term earnings, medium-term employment, and medium-term earnings. (The studies did not include long-term findings.)

Overall, statistically significant and positive outcomes were found in the meta-analysis for short-term employment, short-term earnings, and medium-term earnings. However, for medium-term employment, the meta-analysis results were indeterminate. To clarify, this does not mean employment-focused partnerships do not lead to labor market success; instead, it points to a dearth of research that tracks students through training and into employment long enough to measure medium- and long-term outcomes.

Even so, these initial findings from the meta-analysis are promising and suggest that developing and leveraging such partnerships may help move the needle on short- and medium-term employment outcomes. Longitudinal research that tracks students for periods sufficient to know whether long-term employment and earnings are affected should be a priority in the future.

Moving Forward

As I reflect on the research that I have conducted on career pathways over the years, I am gratified to see mounting evidence of positive student outcomes. As a first-generation college student myself, it has always made sense to me to demystify the college education process. Helping learners understand the entire educational journey, from start to finish, is bound to help them see how what they are learning may contribute to future education and career choices. I went to college not knowing what it would be like or whether I would be able to succeed, and I benefited from faculty and advisors who helped me see how my future could progress.

For other students like me who enter college without the benefit of family members sharing their stories of college-going, and for those who have to balance school with work and family care-taking responsibilities, it is important to know how a college education, including postsecondary CTE, can lead to positive educational and employment outcomes. Student groups underserved by postsecondary education deserve our most resolute and far-reaching efforts.

To this end, additional rigorous evidence on the impact of postsecondary CTE on college degree attainment could help to inform career pathways design, funding, and implementation. Also, as I reflected on the five recommendations, I was struck by the modest amount of research on medium-term labor market outcomes and the lack of any studies of long-term labor market outcomes. When the focus of career pathways is creating a path to living-wage employment and career advancement over the long term, it isn’t enough to know that students’ immediate employment outcomes were improved. And because many students attending community colleges are already working, it isn’t even clear what “immediate employment” means.

If the outcome of interest for the majority of community college students, who are adults and working, is whether they get a better job and a higher salary than they had before their education, more nuanced measures and longer follow-up periods are needed than those provided by any of the research reviewed for this practice guide. It seems to me that building evidence on medium- and long-term outcomes could also tell us more about how career pathways work for diverse learner groups who are under-studied at present.

I was honored to help develop the practice guide with Hope Cotner, Grant Goold, Eric Heiser, Darlene Miller, and Michelle Van Noy. What an enormously gratifying experience it was to work with these professionals, the WWC team at Abt, and the Institute of Education Sciences staff. Working on this practice guide has left me feeling more optimistic about what we could learn with a more sizeable federal investment in research on postsecondary CTE in general, and on career pathways specifically. Rigorous evidence is needed to test models, explore interventions, and understand results for the plethora of learner groups who attend community colleges.

As the nation struggles to pull out of the pandemic that continues to rage in pockets across the country, it is the right time to invest in research that helps prepare students for good jobs that advance living-wage careers over a lifetime. A true commitment to equity in CTE programming is necessary for the nation, and now is the time to invest.

_____________________________________________________________________________________________________________

Debra D. Bragg, PhD, is president of Bragg & Associates, Inc., and the founder of research centers focusing on community college education at the University of Illinois at Urbana-Champaign and the University of Washington. She spent the first 15 years of her career in academe studying postsecondary CTE for federally funded research centers, having devoted her entire research agenda to improving education- and employment-focused policies, programs, and practices to create more equitable outcomes for community college students. She served as an expert panelist for the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide.


An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning—or have already started to put in place—your plans for summer learning. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place to even make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.   

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a thorough answer will ask “so, what did we accomplish last summer?” (Depending upon who they are, and what they care about, that question can vary. Twists can include “what did students learn” or business officers’ pragmatic “what bang did we get for that buck.”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example:
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether one or more populations are disproportionately underrepresented in your program and what that implies for the work both now and next summer.
    • How strong was attendance? Prior research has suggested that students benefit the most from summer programs when they are “high attenders.” (Twenty or more days out of programs’ typical 25 to 30 total days.) Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24 percent, 25 to 49 percent, 50 to 74 percent, and 75 percent or more of program days? (For a concrete illustration, see the sketch after this list.)
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might be hoping that more summer learning participants get a score of “on-grade level” at the outset of the school year than their non-participating peers, potentially promising evidence that the program might have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.     
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what was learned as a result of your analyses can be used to strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
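
To make steps 5 and 6 concrete, here is a minimal sketch in Python (using pandas) of the attendance and equity analyses described above. Every name in it (the column names, the subgroup labels, the 30-day program length, the fall “on grade level” flag) is a hypothetical placeholder; substitute whatever your student information system actually exports. The comparison at the end is descriptive, not a causal estimate of program impact.

```python
import pandas as pd

PROGRAM_DAYS = 30  # assumed program length; typical programs run 25 to 30 days

# Step 5: one row per student per day attended, exported from your tracker.
attendance = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 3, 3],
    "date": pd.to_datetime(["2021-07-06", "2021-07-07", "2021-07-08",
                            "2021-07-06", "2021-07-06", "2021-07-07"]),
})

# District roster: participants and non-participants, with a fall outcome.
students = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "subgroup": ["A", "B", "A", "B", "A"],  # e.g., race/ethnicity or SES
    "on_grade_level_fall": [True, False, True, True, False],
})

# Attendance intensity: share of program days attended, in the bands above.
days = attendance.groupby("student_id").size()
bands = pd.cut(days / PROGRAM_DAYS, bins=[0, 0.25, 0.50, 0.75, 1.01],
               right=False, labels=["0-24%", "25-49%", "50-74%", "75%+"])
print(bands.value_counts())

# Step 6: who did (and who did not) participate, by subgroup...
students["participant"] = students["student_id"].isin(days.index)
print(pd.crosstab(students["subgroup"], students["participant"],
                  normalize="index"))

# ...and how the fall outcome compares for participants vs. non-participants.
print(students.groupby(["participant", "subgroup"])["on_grade_level_fall"].mean())
```

With real data, the same handful of lines yields the participation tables, attendance-intensity bands, and participant/non-participant comparisons described in step 6, ready to disaggregate by whatever subgroups your roster includes.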

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

Teachers Should Not Be Left Wondering What Works

The past two school years have posed many new and unexpected challenges for students and teachers. One thing that has not changed much is that educators continue to need quick access to evidence on strategies that can best support students. The What Works Clearinghouse (WWC), an initiative of the U.S. Department of Education’s Institute of Education Sciences, aims to meet these needs with ready-to-use practices supported by evidence. The WWC Practice Guides describe these practices and how to implement them, most recently in the new guide for assisting students struggling in mathematics. These Practice Guides contain the classroom strategies and tips that are most likely to help improve student outcomes.

More than two dozen free Practice Guides address challenges educators face in teaching math, reading, and writing; supporting positive student behavior; and preventing dropout. The recommendations in Practice Guides are based on evidence from well-designed and well-implemented studies, the experiences of practitioners, and the expert opinions of a panel of nationally recognized experts.

Ann Jolly, an instructional program manager at the Charlotte-Mecklenburg Schools’ Program for Exceptional Children, has used WWC Practice Guides for years. She describes her experiences using the WWC resources below. Her experiences may help teachers or instructional leaders understand how to better incorporate evidence-based practices into their own practice.


The COVID-19 pandemic has us all wondering where the time goes. We want to use the most promising evidence-based practices to support our students. However, as expressed by one teacher who understands how easy it is to forget about trying out something new in the face of day-to-day demands, “Yeah, you just get busy teaching…”

Whether you are a new teacher trying to figure out how to balance teaching, lesson planning, grading, and other duties, or a veteran who is “busy teaching,” you should check out the WWC. The WWC, created by the U.S. Department of Education, is an easy-to-navigate website with valuable resources. I know that, as teachers, we are constantly seeking out resources that will enable us to provide the best instruction to our students. The WWC can help by searching for research, reviewing studies for quality, and summarizing findings, so that busy teachers like us can focus on our students! Here’s a quick look at some of the WWC resources I have used to make a difference in my school and district as an instructional leader collaborating with teachers and families.

When I needed help boosting reading comprehension among my special education students, I used the WWC Practice Guide Improving Reading Comprehension in Kindergarten Through 3rd Grade. This guide provided me with recommendations of practices and other relevant information that the WWC gathered to support classroom instruction. For example, I was able to quickly see that teaching students how to use reading comprehension strategies had the strongest evidence, so I knew to focus on that. The guide gave me easy-to-understand resources about how to bring the strategies into my classroom, plus videos and reference tools with examples. These were easy to digest and I was able to immediately implement the recommendations in my classroom.

When I needed strategies to support literacy at home and in school, I used the WWC Practice Guide Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade and its supplemental resources. Not only does the guide include a wealth of information for teachers, but companion documents include a summary of recommendations, a Professional Learning Communities Facilitator’s Guide, and Tips for Supporting Reading Skills at Home. I used the last tool to develop a presentation for parents. Parents took notes and asked questions as they made connections between the guide and the practices they could use at home with their children. Finding opportunities like this one to build relationships between teachers and parents may be even more important now, during a pandemic, than it was when I held this workshop. 

When my school was looking for strategies to improve student behavior, I facilitated a book club with school staff using the WWC Practice Guide Reducing Behavior Problems in the Elementary School Classroom. I began the club after noticing that other teachers were coming to me for suggestions about a common pattern of behaviors interfering with student learning. This WWC guide offered several strategies to share. Although we started by discussing a specific behavioral issue and a recommended practice to address it, we eventually worked through the whole guide, chapter by chapter. The WWC Practice Guide gave us a free resource with powerful evidence-based strategies and practices for us to try. Teachers across grade levels and content areas actively collaborated through the book club and were able to build a common language and understanding about schoolwide practices. One of the great embedded features in WWC Practice Guides is the “Obstacles” or “Roadblocks.” This feature acknowledges perceived and actual barriers to implementing evidence-based practices and suggests solutions to overcome them!

The WWC has created a wide range of other Practice Guides, covering students from early childhood through high school graduation (and beyond). The most recent products include Assisting Students Struggling with Mathematics: Intervention in the Elementary Grades, a Practice Guide for educators in grades K to 6 that provides ready-to-use strategies for assisting struggling students. Some of my colleagues have used the guides on Teaching Secondary Students to Write Effectively, Teaching Math to Young Children, and Using Student Achievement Data to Support Instructional Decision Making. So many more Practice Guides are available!

I also encourage you to sign up now for the WWC News Flash and add the WWC to your social media network on Twitter, Facebook, or YouTube to easily keep up with the most current information. Research evidence on “what works” in education is there just for you. When you have a question, rely on the WWC…and don’t be left wondering what works!

This blog was written by Ann C. Jolly, Instructional Program Manager, Programs for Exceptional Children at Charlotte-Mecklenburg Schools, with Dana Rotz, Mathematica.

When “More Research is Needed” Is the Key Finding: Improving the Evidence Base for Nonacademic Interventions for Postsecondary Success in Rural and High-Poverty Contexts

Stakeholders in rural and high-poverty districts in Regional Educational Laboratory (REL) Appalachia’s region have noticed a troubling trend: many students graduate from high school academically well prepared but fail to enroll in college or enroll in college only to struggle and drop out within the first year. Stakeholders believe these high-performing students may face nonacademic challenges to postsecondary success, such as completing financial aid paperwork, securing transportation and housing at colleges far from home, or adjusting to campus life. To address these challenges, education leaders are looking for interventions that address nonacademic competencies: the knowledge, skills, and behaviors that enable students to navigate the social, cultural, and other implicit demands of postsecondary study.

To fill this need, REL Appalachia researchers conducted a review of the existing evidence of the impact of nonacademic interventions – that is, those designed to address nonacademic competencies – on postsecondary enrollment, persistence, and completion. The review had a particular focus on identifying interventions that also have evidence of effectiveness in communities serving students similar to those in Appalachia—high-poverty, rural students. Only one intervention, Upward Bound, demonstrated impact in rural, high-poverty communities. The review showed that Upward Bound, as implemented in the early 1990s, benefited high-poverty rural students’ college enrollment, with no demonstrated impact on persistence or completion.

Schools and communities need access to nonacademic interventions that benefit students served in high-poverty rural communities. Researchers: read on to learn more about the methods used in the evidence review, its findings, and steps you can take to support rural and high-poverty communities in improving enrollment and success in postsecondary education!

Nonacademic challenges to postsecondary success for rural students

All students face nonacademic challenges to postsecondary success, but rural populations and high-poverty populations in particular may benefit from interventions addressing those challenges because they enroll in and complete college at significantly lower rates than their nonrural or low-poverty peers. Although academic challenges contribute to this gap, rural and high-poverty populations also face unique nonacademic challenges to postsecondary enrollment and success. For example, rural students are less likely to encounter college-educated role models and high-poverty students often face inadequate college counseling at their schools (see research here, here, and here). As a result, rural and high-poverty students may have inadequate access to knowledgeable adults who can help them understand the steps needed to enroll or prepare them for the challenges of persisting in postsecondary education.  Nonacademic interventions can support students in developing the knowledge, skills, and behaviors necessary to overcome these challenges and improve postsecondary enrollment and success for rural and high-poverty students.

The need for evidence-based interventions

To support decisionmakers at rural and high-poverty schools in identifying evidence-based nonacademic interventions, researchers at REL Appalachia conducted an extensive search of the published research. The search looked for rigorous studies of nonacademic interventions with evidence of positive impact on college enrollment, persistence, performance, and completion for students attending rural schools or identified as high poverty. The purpose of the project was to identify a suite of interventions to recommend to these education leaders.

The results of our review indicate there may be gaps in the evidence available to all decisionmakers who are trying to help their students succeed in postsecondary education. The search first identified any studies that focused on postsecondary outcomes of nonacademic interventions serving students ages 5–19. Of the 1,777 studies with the relevant keywords, only 65 focused on the postsecondary outcomes of nonacademic interventions. Next, we evaluated these 65 studies against the What Works Clearinghouse (WWC) design standards, which assess the quality of evaluation study designs. Only 17 studies met WWC’s rigorous study design standards with or without reservations. Finally, researchers from REL Appalachia identified studies that showed positive impacts on students overall, and studies that looked at rural students and students identified as high poverty in particular. Only eight studies showed positive, statistically significant impacts on students’ postsecondary enrollment or success overall. Of the eight studies that showed positive impacts of nonacademic interventions on postsecondary outcomes, only three focused on high-poverty populations, and only one reported specifically on rural populations.

[Figure: Study screening funnel. The original searches returned 1,777 unique studies; 65 focused on postsecondary outcomes of nonacademic interventions with students ages 5 to 19; 17 of those also met WWC standards; and 8 met all criteria and had a positive effect on postsecondary outcomes.]

 Without additional research that focuses on low-income and rural contexts, schools and districts are left to implement programs with limited or no evidence of effectiveness. For example, the Quantum Opportunity Program (QOP) provides mentors to students as part of a long-term after school program. However, WWC reviews of QOP studies (here and here) showed indeterminate effects of the program on postsecondary outcomes. The lack of evidence should not detract from the important role QOP has in serving students, but it leaves open the question of whether those efforts are having the intended effects. With few clear alternatives, schools and districts continue to implement programs with limited evidence of effectiveness.

Action steps

Nationwide, 19 percent of U.S. public school students are enrolled in a rural school, and 24 percent are enrolled in a high-poverty school. To help districts and schools provide effective supports to those students, researchers can provide high-quality evidence on the effectiveness of nonacademic interventions in these contexts.

Carry out more studies on specific interventions designed to improve nonacademic competencies. REL Appalachia’s review found that the research on nonacademic competencies often focuses on defining the competencies themselves, rather than on studying interventions designed to develop the competencies. Of the 1,777 unique studies identified in our review, only 65 (3 percent) studied outcomes of interventions designed to improve nonacademic competencies. From these, we identified only 17 studies, representing nine interventions, with sufficiently rigorous designs to examine evidence of effectiveness.

The limited availability of rigorous evaluations of interventions suggests that, as researchers, we need to increase our focus on evaluating new interventions as they are developed or tested. Decisionmakers rarely design their own programs or interventions from scratch; they need to be able to identify existing programs and policies that are within their power to implement and have been proven effective in similar communities. Researchers can help decisionmakers select and implement successful interventions by providing evidence on whether interventions that develop students’ nonacademic competencies have positive effects on students’ postsecondary outcomes.

Design studies to generalize to rural and high-poverty populations. As researchers, we can also increase our focus on rural and high-poverty populations. REL Appalachia’s review found only three studies that focused on a high-poverty population and one that focused on a rural population. As researchers, we can address this gap in two ways: (a) we can carry out more studies specifically focused on rural and high-poverty areas; and (b) when using large national datasets or multi-site studies, we can consider rural and high-poverty populations in our sampling and disaggregate our results for these populations.

Summary

Stakeholders in rural and high-poverty contexts are looking for nonacademic interventions that will be effective with their students. To that end, REL Appalachia carried out an extensive review of evidence-based interventions. The review found few rigorous studies of nonacademic interventions, and even fewer that examined findings for students identified as high poverty or in rural settings. Without additional research, schools and districts serving rural and high-poverty populations may implement interventions that are not designed for their circumstances and may not achieve intended outcomes. As a result, resources may be wasted while rural and high-poverty students receive inadequate support for postsecondary success.  In addition to investing in rigorous studies, which can take a long time to complete, researchers and practitioners can also collaborate to implement short-term research methods to identify early indicators of the success of these programs. For example, researchers may be able to support schools and districts in developing descriptive studies examining change over time or change in formative assessment outcomes.

Researchers have a role in helping more high school graduates from rural communities enroll, persist, and succeed in postsecondary education.

Rural and high-poverty schools and districts have unique strengths and challenges, and the lack of information about how interventions perform in those contexts presents a dilemma for decisionmakers: do nothing, or else muddle through with existing evidence, investing in interventions that don’t address local needs. As researchers, we can help resolve this dilemma by providing rigorous evidence about effective interventions tailored to rural and high-poverty contexts, as well as supporting practitioners in using more accessible methods to investigate the short-term outcomes of the programs they are already implementing.

by Rebecca A. Schmidt and CJ Park, Regional Educational Laboratory Appalachia

Advancing High-Quality Data and Evidence at the U.S. Department of Education

March 5, 2021: A post from Greg Fortelny, Chief Data Officer and Matt Soldner, Evaluation Officer, U.S. Department of Education

Last year, the education landscape changed dramatically as the effects of the coronavirus swept across the country. Overnight, families were confronted with the twin challenges of keeping their children, loved ones, and communities safe while establishing learning environments that enabled students to succeed and achieve. With each passing day, our schools are one step nearer recovery. But here at the U.S. Department of Education (ED), our work is far from done. Among the many lessons learned in the wake of the pandemic is that we must take full advantage of every opportunity to strengthen education systems and improve outcomes for all learners. From where we sit, making the most of those opportunities depends on two things: high-quality data and evidence.

Basing education policy and practice in strong evidence that is rooted in high-quality data can accelerate learning for all students, speeding efforts to recover from the pandemic’s effects. As the stewards of education data and evidence at ED, it is our charge from the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act) to improve the collection, analysis, and use of high-quality data and evidence. By doing so, we hope to help educators and policymakers at the federal, state, and local levels make the most effective decisions possible on behalf of the learners, families, and communities they serve.

In the two years since the passage of the Evidence Act, the Department’s Office of the Chief Data Officer (OCDO) has made progress in supporting ED’s mission to improve education outcomes by effectively leveraging data to support evidence-based policy and data-driven decision-making.  The Department’s Data Governance Board (DGB) was created to lead these efforts and, with the launch of its inaugural Data Strategy in December 2020, ED has established guidance and goals to go further to improve data quality and enable evidence-building in service of our nation’s learners.

The work of evidence-building is a collaborative effort, coordinated by ED’s Evaluation Officer housed at the Department’s Institute of Education Sciences. In this first phase of Evidence Act implementation, the Department has published a new agency-wide evaluation policy that governs the generation of its most rigorous evidence and is preparing to release its inaugural Annual Evaluation Plan. As part of the agency’s strategic planning process, ED will also develop and publish its first-ever Learning Agenda, documenting its evidence-building priorities for the next four years.

Even before the passage of the Evidence Act, ED made data and evidence a priority. For decades, ED has been collecting and publishing data on students, teachers, schools, colleges, grants, student aid, and more. Now, with the launch of ED’s Open Data Platform (ODP) in December 2020, educators, researchers, stakeholders, decision-makers, and the public can explore the array of taxpayer-funded education data and profiles through a user-friendly interface, with all data accessible from one central online repository.

At OCDO, we developed the ODP to link to research and ED data tools that serve to engage and inform the public through various displays of that publicly available data.  One rich example of these tools is the recently enhanced College Scorecard.  Visited by more than 1.4 million users in 2020, ED’s College Scorecard now enables students and their advocates to more easily search field of study identifiers and compare similar fields of study within an institution or across different institutions.  And with recent updates including  information on loan repayment rates and parent PLUS loan debt, prospective students now have even more data to make more informed enrollment decisions and to find the right postsecondary fit. 

In addition to making existing data more accessible to decision-makers, the Department invests in new discoveries in the education sciences that have the potential to dramatically improve student outcomes and strengthen education systems. For nearly 20 years, the Department’s Institute of Education Sciences (IES) has worked to bring rigorous, independent, and objective education statistics, research, and evaluation to bear on challenges from early childhood to adult and postsecondary education.   

Through its National Center for Education Evaluation and Regional Assistance (NCEE), IES supports several programs dedicated to improving the use of data and evidence in education practice. NCEE’s Regional Educational Laboratories (REL) program works in partnership with state and local educators and policymakers to develop and use research that improves academic outcomes for students. Its What Works Clearinghouse™ reviews existing research on education programs, practices, and policies to help families, teachers, and leaders answer the question “what works” in the nation’s schools, colleges, and universities. And, through its Evaluation Division, NCEE conducts independent, high-quality evaluations of education programs supported by federal funds.

In recent months, much of the work of both OCDO and IES has pivoted to address the effects of the coronavirus pandemic. At IES, we have developed a wide range of COVID-related resources for families, educators, and policymakers. And our National Center for Education Statistics has recently announced a new survey designed to collect vital data on schools’ approaches to learning during the pandemic, critical to safely reopening America’s schools and promoting educational equity.  

OCDO has also created valuable new resources in response to the pandemic. The new Education Stabilization Fund Public Transparency Portal provides public transparency and accountability for the over $30 billion in Elementary and Secondary School Emergency Relief, the Governor's Emergency Education Relief, and the Higher Education Emergency Relief funds established through the Coronavirus Aid, Relief, and Economic Security (CARES) Act. The grant funds were awarded to states, schools, and institutions of higher education last spring. Continuously updated to reflect new activity, this portal provides the public with accurate, reliable, and accessible data on one of the largest federal investments in education in our country’s history. The portal will soon include a similar accounting of the awards made to states, districts, and colleges through the $81.9 billion in Education Stabilization Funds authorized through the Coronavirus Response and Relief Supplemental Appropriations (CRRSA) Act, 2021.

Despite the challenges we face, there is optimism, like the spark of an engaged student or the light of an inspired educator, and we are eager to continue the work to serve learners through data. ED’s critical data priorities are to empower users and to leverage data to address the education equity gaps too often borne by our nation’s underprivileged students. Rigorous evaluation identifies effective policies and practices; open and transparent data furthers research and public trust. Leveraging data to inform decisions not only improves ED operations but also helps guide schools and families in their efforts to support students and improve education outcomes.

Your feedback is welcome; you can email us at data@ed.gov.

Subscribe to the Data Matters Blog at https://www.ed.gov/subscriptions, the NCEE blog at https://ies.ed.gov/blogs/ncee/, and follow OCDO on LinkedIn.

Yours Truly in Data and Evaluation,

Greg and Matt

P.S. Happy International Open Data Day Eve!

Introducing REL 2022

As I write this, my colleagues and I at the Regional Educational Laboratory (REL) Program are thinking about a single number: 535. No, we’re not concerned about 535 because it represents the number of voting members of Congress, though that would be a good guess. We’re also not thinking about Interstate 535, the “2.78-mile-long Auxiliary Interstate Highway spur of I-35 in the U.S. states of Minnesota and Wisconsin,” though now I’m intensely interested in why it might be that, at least according to Wikipedia, this road is “known locally as the ‘Can of Worms’ interchange.” Instead, my colleagues and I are excited about 535 because it represents the number of days between now and the start of the next cycle of the REL program, affectionately known as REL 2022.

Over a year ago, we began a process that culminates in the awarding of contracts to run each of our regional labs. We are excited to share our preliminary thoughts about the contours of REL 2022 through a Request for Information, or RFI, which we have posted here. I hope you will take time to read the RFI. If you have questions or suggestions after doing so, I hope you are moved to comment. Details on how to offer your feedback can be found in the RFI.

Importantly, we aren’t proposing to radically restructure the REL program. Instead, we are retooling some existing expectations and adding a few new features. Below, I’ve highlighted a few proposed changes that merit special attention.

The purpose of RELs is to improve student outcomes. Not to put too fine a point on it, but everything that takes place in REL 2022 should be in service of improving student outcomes. This does not mean that every REL project will, by itself, have a directly observable impact on achievement. But the work of any given REL, in concert with the efforts of those with whom it works, should be trained on a singular focus: bettering the lives of students through education. There is no other, better, or higher calling.

We accomplish our purpose by working in partnership with stakeholders to support their use of evidence-based practices. Evidence-based practice is “baked in” to the statute that authorizes the REL program, and the importance of building and using evidence in education—and government more generally—is reiterated throughout federal law. (See, for example, the Every Student Succeeds Act of 2015 and the Foundations for Evidence-based Policymaking Act of 2018.) However, our emphasis on evidence isn’t rooted in a statutory imperative. Instead, it’s based on a set of core beliefs about our work: that researchers and educators can strengthen education via the rigorous application of the scientific method; that resources, including money and time, are constrained and that efforts with demonstrated effectiveness should be prioritized; and that each and every student deserves the best of “what works” in education, no matter their circumstance.

Nothing changes if nothing changes. In the REL 2022 cycle, we are explicitly asking RELs to think of themselves as “change agents.” This expectation is, I believe, entirely new to the REL Program and is likely to be uncomfortable to some. For that reason, it is helpful to be clear about what we’re expecting and why. Here goes.

I daresay that, no matter how proud they might be of their students and their educators, there is not a state chief, a district superintendent, or a building principal who would report they are serving each of their students as well as they wish they could. (If you’re the one who does, please stop reading this blog and call me. I want to share your successes!) Each of those leaders has something they want to do better on behalf of their students and is contemplating, if not actively pursuing, change. It is our hope that RELs can join them in making change, with evidence in hand and research tools at the ready. REL reports, resources, and trainings are not ends unto themselves. They are means to enable the change efforts of local, state, and regional education leaders, working on behalf of students to improve important outcomes.

RELs work in partnership. Education research and technical assistance must be done in partnership with those it is meant to inform. Absent that, it is likely to fail to achieve its goals. At best, potentially positive impacts will be blunted. At worst, harm will be done. There’s a simple solution: collaboration that authentically engages stakeholders in all phases of project design and execution. That isn’t, I realize, as simple to do as it is to write.

As vendors consider the REL 2022 cycle, we ask that they keep two things in mind about what we’ve traditionally called partnerships. First, there are no necessary restrictions on who RELs can partner with when working with stakeholders to achieve stakeholder goals. Does it make sense to partner across levels of education within a state? Do it. Is there a state or national advocacy association that would accelerate a partner’s progress? Engage it. Is there a role for business or industry? Leverage it. A second and closely related concept is that there are no restrictions on partnerships’ functional forms. In general, it does not matter one whit to IES whether you prefer NICs, DBIR, or any other particular form of research partnership. What does? That RELs build projects in partnership—however and with whomever—intentionally, with the goal of supporting partners’ change efforts to achieve the goals they have identified.

We encourage deeper, not broader, work. We believe RELs are more likely to achieve success when they focus partnerships on clearly defined problems of policy or practice in specific geographies. A “Six-State Research Alliance on High School Graduation” can do important and meaningful work—but the process of agreeing on the work to be done and the targets to be met, seeing that work through to completion, and then achieving pre-specified goals is likely to be exceptionally difficult. The “South-Central Kansas Partnership for Kindergarten Readiness” or the “Maricopa County Alliance for Reducing Chronic Absenteeism in High Schools” may be more likely to achieve impact. This is not to say that lessons learned locally should not be shared regionally or nationally, or that groups with common interests might not form “communities of practice” or other networks for the purpose of sharing information or building connection. Rather, we ask RELs be strategic in scoping their highest-intensity work.

We define success as achieving measurable stakeholder goals. Evaluating the impact of research and technical assistance projects is notoriously hard. Often, program managers and the evaluators with whom they work are forced to satisfice, relying upon end-user self-reports of the quality, relevance, and usefulness of a provider’s work. Counts of outputs, such as report downloads and attendees served, are particularly common metrics reported in evaluation studies. Satisfaction is the coin of the realm. Lest I be accused of throwing stones inside my own glass house, let me be clear that we currently use these very measures to characterize the effectiveness of the current REL program.

In REL 2022, it is our intention to shift focus beyond outputs to emphasize outcomes. We will ask RELs to demonstrate, on a regular basis, that they are making progress toward the goals stakeholders set for important student outcomes at the outset of their work, with the acknowledgment that outputs are often critical to achieving a long-term goal and that satisfaction can be an important leading indicator. In 2027, the mark of success won’t be a glowing narrative from a state superintendent or school superintendent about the REL cycle just passed. Instead, it’ll be seeing that the quantifiable goals those leaders set for their work with the REL program were achieved.   

Putting RELs’ capacity for rigorous R&D to work. Finally, there is one manifestly new requirement for RELs as part of the 2022 cycle, one that I am particularly excited about because it brings together the best of two NCEE programs: the RELs and the What Works Clearinghouse™ (WWC). As part of the 2022 cycle, each REL will be required to develop—and then evaluate—a comprehensive toolkit based on a WWC Practice Guide, helping educators instantiate evidence-based practices in the classroom. RELs already have experience taking the content from Practice Guides and transforming them into tools for educators. Two examples include Professional Learning Community guides for both foundational reading and English learners. Similarly, North Carolina State University’s Friday Institute has looked to Practice Guides for inspiration to develop massive open online courses (MOOCs), including foundational reading and fractions. None have been evaluated for efficacy. Of course, the development and testing of these new toolkits will follow the expectations set above, including the expectation that strong and inclusive partnerships are at the root of all high-leverage work.

My NCEE colleagues and I are excited about the possibilities that REL 2022 represents. The REL program has a proud history and a strong track record of service to local, state, and regional stakeholders. We hope that, as you review the REL 2022 RFI, you’ll find the next iteration of the program continues in that tradition. As always, I welcome your feedback.

Matthew Soldner

Commissioner, National Center for Education Evaluation and Regional Assistance

“The How” of “What Works”: The Importance of Core Components in Education Research

Twenty-some odd years ago as a college junior, I screamed in horror watching a friend open a running dishwasher. She wanted to slip in a lightly used fork. I jumped to stop her, yelling “don’t open it, can’t you tell it’s full of water?” She paused briefly, turning to look at me with a “have you lost your mind” grimace, and yanked open the door.

Much to my surprise, nothing happened. A puff of steam. An errant drip, perhaps? But no cascade of soapy water. She slid the fork into the basket, closed the door, and hit a button. The machine started back up with a gurgle, and the kitchen floor was none the wetter.

Until that point in my life, I had no idea how a dishwasher worked. I had been around a dishwasher, but the house I lived in growing up didn’t have one. To me, washing the dishes meant filling the sink with soapy water, something akin to a washer in a laundry. I assumed dishwashers worked on the same principle, using gallons of water to slosh the dishes clean. Who knew?

Lest you think me completely inept, a counterpoint. My first car was a 1979 Ford Mustang. And I quickly learned how that very used car worked when the Mustang’s automatic choke conked out. As it happens, although a choke is necessary to start and run a gasoline engine, that it be “automatic” is not. My father Rube Goldberg-ed up a manual choke in about 15 minutes rather than paying to have it fixed.

My 14-year-old self learned how to tweak that choke “just so” so that I could get to school each morning. First, pull the choke all the way out to start the car, adjusting the fuel-air mixture ever so slightly. Then gingerly slide it back in, micron by micron, as the car warms up and you hit the road. A car doesn’t actually run on liquid gasoline, you see. Cars run on fuel vapor. And before the advent of fuel injection, fuel vapor was courtesy of your carburetor and its choke. Not a soul alive who didn’t know how a manual choke worked could have started that car.

You would be forgiven if, by now, you were wondering where I am going with all of this and how it relates to the evaluation of education interventions. To that end, I offer three thoughts for your consideration:

  1. Knowing that something works is different from knowing how something works.
  2. Knowing how something works is necessary to put that something to its best use.
  3. Most education research ignores the how of interventions, dramatically diminishing the usefulness of research to practitioners.

My first argument—that there is a distinction between knowing what works and how something works—is straightforward. Since it began, the What Works Clearinghouse™ has focused on identifying “what works” for educators and other stakeholders, mounting a full-court press on behalf of internal validity. Taken together, Version 4.1 of the WWC Standards and Procedures Handbooks total some 192 pages. As a result, we have substantially greater confidence today than we did a decade ago that when an intervention developer or researcher reports that something worked for a particular group of students, we know that it actually did.

In contrast, WWC standards do not address, and as far as I can tell have never addressed, the how of an intervention. By “the how” of an intervention, I’m referring to the parts of it that must be working, sometimes “just so,” if its efficacy claims are to be realized. For a dishwasher, it is something like: “a motor turns a wash arm, which sprays dishes with soapy water.” (It is not, as I had thought, “the dishwasher fills with soapy water that washes the mac and cheese down the drain.”) In the case of my Mustang, it was: “the choke controls the amount of air that mixes with fuel from the throttle, before heading to the cylinders.”

If you have been following the evolution of IES’ Standards for Excellence in Education Research, or SEER, and its principles, you recognize “the how” as core components. Most interventions consist of multiple core components that are—and perhaps must be—arrayed in a certain manner if the whole of the thing is to “work.” Depicted visually, core components and their relationships to one another and to the outcomes they are meant to affect form something between a logic model (often too simplistic) and a theory of change (often too complex).

(A word of caution: knowing how something works is also different from knowing why something works. I have been known to ask at work about “what’s in the arrows” that connect various boxes in a logic model. The why lives in those arrows. In the social sciences, those arrows are where theory resides.)
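
To make the distinction concrete, here is a purely illustrative sketch, in Python, of what documenting core components and the arrows between them might look like for a generic multi-tiered system of supports. The component names, descriptions, and links below are my own stand-ins, not drawn from any WWC review or SEER artifact.

```python
# Hypothetical core components of a generic multi-tiered system of supports
# (MTSS). Names and descriptions are illustrative stand-ins only.
core_components = {
    "universal_screening": "assess all students to identify those at risk",
    "tiered_instruction": "deliver increasingly intensive supports by need",
    "progress_monitoring": "periodically assess student response to supports",
    "data_based_decisions": "adjust tier placement using monitoring data",
}

# Each pair is an arrow: a hypothesis about HOW the components combine to
# produce the outcome. (The WHY inside each arrow is where theory lives.)
hypothesized_links = [
    ("universal_screening", "tiered_instruction"),
    ("tiered_instruction", "progress_monitoring"),
    ("progress_monitoring", "data_based_decisions"),
    ("data_based_decisions", "tiered_instruction"),  # the feedback loop
    ("tiered_instruction", "improved_student_outcomes"),
]

for source, target in hypothesized_links:
    print(f"{source} -> {target}")
```

Even a representation this simple forces the writer to say which pieces must be present and how they are hypothesized to combine, which is precisely the information most study write-ups omit.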

My second argument is that knowing how something works matters, at least if you want to use it as effectively as possible. This isn’t quite as axiomatic as the distinction between “it works” and “how it works,” I realize.

This morning, when starting my car, I didn’t have to think about the complex series of events leading up to me pulling out of the driveway. Key turn, foot down, car go. But when the key turns and the car doesn’t go, then knowing something about how the parts of a car are meant to work together is very, very helpful. Conveniently, most things in our lives, if they work at all, simply do.  

Inconveniently, we don’t have that same confidence when it comes to things in education. There are currently 10,677 individual studies in the What Works Clearinghouse (WWC) database. Of those, only about 11 percent meet the WWC’s internal validity standards. Among them, only 445 have at least one statistically significant positive finding. Because the WWC doesn’t consider results from studies that don’t have strong internal validity, it isn’t quite as simple as saying “only about 4 percent of things work in education.” Instead, we’re left with “89 percent of things aren’t tested rigorously enough to have confidence about whether they work, and when tested rigorously, only about 38 percent do.” Between the “file drawer” problem that plagues research generally and our own review of the results from IES efficacy trials, we have reason to believe the true efficacy rate of “what works” in education is much lower.
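
For readers who want to retrace that arithmetic, here is a quick back-of-the-envelope check in Python using only the rounded figures quoted above; the counts are as reported in this post, not a fresh pull from the WWC database.

```python
# Rough check of the percentages above, using the post's rounded figures.
total_studies = 10_677   # studies in the WWC database
share_meeting = 0.11     # ~11% meet the WWC's internal validity standards
positive = 445           # studies with >=1 significant positive finding

meets_standards = round(share_meeting * total_studies)          # ~1,174
print(f"{positive / total_studies:.1%} of all studies")         # ~4.2%
print(f"{positive / meets_standards:.1%} of rigorous studies")  # ~37.9%
print(f"{1 - share_meeting:.0%} not tested rigorously enough")  # 89%
```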

Many things cause an intervention to fail. Some interventions are simply wrong-headed. Some interventions do work, but for only some students. And other interventions would work, if only they were implemented well.

Knowing an intervention’s core components and the relationships among them would, I submit, be helpful in at least that third case. If you don’t know that a dishwasher’s wash arm spins, the large skillet on the bottom rack with its handle jutting to the sky might not strike you as the proximate cause of dirty glasses on the top rack. If you don’t know that a core component of multi-tiered systems of support is progress monitoring, you might not connect the dots between a decision to cut back on periodic student assessments and suboptimal student outcomes.

My third and final argument, that most education research ignores the how of interventions, is based in at least some empiricism. The argument itself is a bit of a journey. One that starts with a caveat, wends its way to dismay, and ends in disappointment.

Here’s the caveat: My take on the relative lack of how in most education research comes from my recent experience trying to surface “what works” in remote learning. This specific segment of education research may well be an outlier. But I somehow doubt it.

Why dismay? Well, as regular readers might recall, in late March I announced plans to support a rapid evidence synthesis on effective practices in remote learning. It seemed simple enough: crowd-source research relevant to the task, conduct WWC reviews of the highest-quality submissions, and then make those reviews available to meta-analysts and other researchers to surface generalizable principles that could be useful to educators and families.

My stated goal had been to release study reviews on June 1. That date has passed, and the focus of this post is not “New WWC Reviews of Remote Learning Released.” As such, you may have gathered something about my plan has gone awry. You would be right.

Simply put, things are taking longer than hoped. It is not for lack of effort. Our teams identified more than 930 studies, screened more than 700 of them, and surfaced 250 randomized trials or quasi-experiments. We have prioritized 35 of this last group for review. (For those of you who are thinking some version of “wow, it seems like it might be a waste to not look at 96 percent of the studies that were originally located,” I have some thoughts about that. We’ll have to save that discussion, though, for another blog.)
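
For the curious, the same kind of quick Python sketch, using only the counts reported above, shows the review funnel and where that 96 percent figure comes from.

    # The remote learning review funnel, using the counts reported above.
    identified = 930     # studies surfaced by crowd-sourcing
    screened = 700       # studies screened
    rigorous = 250       # randomized trials or quasi-experiments among them
    prioritized = 35     # studies prioritized for WWC review

    for label, count in [("identified", identified), ("screened", screened),
                         ("rigorous designs", rigorous), ("prioritized", prioritized)]:
        print(f"{label:>16}: {count:>4} ({count / identified:.0%} of those located)")
    print(f"Located studies not prioritized for review: {1 - prioritized / identified:.0%}")  # ~96%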

Our best guess for when those reviews will be widely available is now August 15. Why things are taking as long as they are is, as they say, “complicated.” The June 1 date was unlikely from the start, dependent as it was upon a series of best-case situations in times that are anything but. And at least some of the delay is driven by our emphasis on rigor and steps we take to ensure the quality of our work, something we would not short-change in any event.  

Not giving in to my dismay, however, I dug into the 930 studies in our remote learning database to see what I might be able to learn in the meantime. I found that 22 of those studies had already been reviewed by the WWC. “Good news,” I said to myself. “There are lessons to be learned among them, I’m sure.”

And indeed, there was a lesson to be learned, just not the one I was looking for. After reviewing the lot, I found virtually no actionable evidence. That’s not entirely fair. One of the 22 records was a duplicate, two were not relevant, two were not locatable, and one was behind a paywall that even my federal government IP address couldn’t get past. Because fifteen of the sixteen remaining studies examined name-brand products, there was one action I could take in most cases: buy the product the researcher had evaluated.

I went through each article, this time making an imperfect determination about whether the researcher described the intervention’s core components and, if so, arrayed them in a logic model. My codes for core components included one “yes,” two “bordering on yes,” six “yes-ish,” one “not really,” and six “no.” Not surprisingly, logic models were uncommon, with two studies earning a “yes” and two more tallied as “yes-ish.” (You can see now why I am not a qualitative researcher.)

In case there’s any doubt, herein lies my disappointment: if an educator had turned to one of these articles to eke out a tip or two about “what works” in remote learning, they would have been, on average, out of luck. If they did luck out and find an article that described the core components of the tested intervention, there was a vanishingly small chance there would be information on how to put those components together to form a whole. As for surfacing generalizable principles for educators and families across multiple studies? Not without some serious effort, I can assure you.

I have never been more convinced of the importance of core components being well-documented in education research than I am today. As they currently stand, the SEER principles for core components ask:

  • Did the researcher document the core components of an intervention, including its essential practices, structural elements, and the contexts in which it was implemented and tested?
  • Did the researcher offer a clear description of how the core components of an intervention are hypothesized to affect outcomes?
  • Did the researcher's analysis help us understand which components are most important in achieving impact?

More often than not, the singular answer to the questions above is a resounding “no.” That is to the detriment of consumers of research, no doubt. Educators, or even other researchers, cannot turn to the average journal article or research report and divine enough information about what was actually studied to draw lessons for classroom practice. (There are many reasons for this, of course. I welcome your thoughts on the matter.) More importantly, though, it is to the detriment of the supposed beneficiaries of research: our students. We must do better. If our work isn’t ultimately serving them, who is it serving, really?  

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance
Agency Evaluation Officer, U.S. Department of Education