NCEE Blog

National Center for Education Evaluation and Regional Assistance

Getting to Know ED: My Journey as a STEM Next Fellow at IES

This guest blog was contributed by Dr. Holly Miller, who currently serves as a STEM Next Opportunity Fund Fellow at the Institute of Education Sciences’ National Center for Education Evaluation and Regional Assistance.

Since August 2022, I’ve been serving as the STEM Next Opportunity Afterschool and Summer Learning Fellow at the U.S. Department of Education (ED). More specifically, I work within the Department’s Institute of Education Sciences (IES).

Upon arriving at IES, I was charged with a specific challenge: amplify how evidence-based practice in out-of-school time (OST) can support student learning and development. This mission was made all the more relevant by the need for states and districts to respond to the consequences of the COVID-19 pandemic, which, at the time, remained an official national emergency.

Perhaps naively, I hoped to walk in on Day One and find “The Official Compendium of Evidence-based Practices in Global Pandemics and Related Crises” that I could pull off the shelf and hand to educators. Unfortunately, I quickly discovered no such tome existed. And I began to realize that one of the biggest challenges I’d face in my new role was getting to know ED itself! To an outsider, the Department can seem like a huge machine. Getting to know it, though, can pay incredible dividends. As I came to learn, there are tons of great resources—if only you know where to look.

One of OST educators’ first stops in getting to know ED should be IES. For the uninitiated, IES is the Department’s statistics, research, and evaluation arm. The mission of IES is to provide scientific evidence on which to ground education practice and policy and to share this information in formats that are useful and accessible to educators, parents, policymakers, researchers, and the public. It is independent and non-partisan.

Across its four centers—the National Centers for Education Statistics, Education Evaluation, Education Research, and Special Education Research—IES conducts six broad types of work (http://ies.ed.gov):

1. Providing data to describe the “condition of education,” including students’ academic proficiency.

2. Conducting surveys and sponsoring research projects to understand where education needs improvement and how these improvements might be made.

3. Funding development and rigorous testing of new approaches for improving education outcomes for all students.

4. Conducting large-scale evaluations of federal education programs and policies.

5. Providing resources to increase the use of data and research in education decision-making, including independent reviews of research on “what works” in education through the What Works Clearinghouse.

6. Supporting the advancement of statistics and research through specialized training and development of methods and measures.

I could see that this work had the potential to benefit a variety of stakeholders—teachers, administrators, students, researchers, and policymakers. Still, I had so many unanswered questions. As a middle school teacher, I frequently told students, “The only dumb question is the one you don’t ask.” Therefore, as I surveyed the education research landscape at IES, I asked lots and lots of questions. My presence at IES was akin to that of a toddler at the zoo for the first time: “What are those? Why is that so big? Why don’t we have more of these? When do we eat?” After months of asking, my queries have been distilled into two essential questions:

  1. What has been the impact of the COVID-19 pandemic on students and educators?

  2. How can education research, like that conducted or sponsored by IES, help us understand—and address—those impacts?

What has been the impact of the COVID-19 pandemic?

The pandemic disrupted nearly every aspect of daily life in the United States, including the education system. One of the most alarming impacts of the pandemic on education has been the widening of pre-existing gaps in student achievement and the resources that students need to be successful.

We all know the statistics: students have lost a great deal of learning. The "Report on the Condition of Education" is a congressionally mandated annual report from the National Center for Education Statistics (NCES). Using the most recent data available from NCES and other sources, the report contains key indicators on the condition of education in the United States at all levels, from prekindergarten through postsecondary, as well as labor force outcomes and international comparisons. For example, the recently released Report on the Condition of Education 2023 shares that on both the 4th- and 8th-grade NAEP mathematics assessments, higher percentages of students performed below NAEP Basic in 2022 than in 2019 (Irwin et al., 2023). Declines have been particularly steep among students who have historically been underserved. The average NAEP mathematics scores in 2022 were generally lower for English learner (EL) students than for non-EL students; lower for students identified as having disabilities than for their peers without disabilities; and higher for students in low-poverty schools than for students in high-poverty schools. These patterns were similar to those observed for reading (Irwin et al., 2023).

This is surely due, at least in part, to differences in the resources students can access. Even before the pandemic, huge gaps in resources existed; the pandemic only made matters worse. According to a report by the U.S. Department of Education’s Office for Civil Rights (OCR) (2021), low-income students and students of color were disproportionately harmed by school shutdowns and remote learning practices. These students often lack access to reliable technology and internet service, making it difficult for them to participate fully in online classes and complete assignments. Additionally, many students rely on meals provided by schools, so the closure of physical school buildings led to food insecurity for some.

Also of note: the dramatic effect on student wellbeing. During the pandemic, mental health concerns such as fear, anxiety, and depression were common among the general public, especially children and older adults (Brooks et al., 2020; Pfefferbaum & North, 2020). Research on the pandemic’s impact on students’ mental health finds that they “showed increased fear, stress, and decreased happiness, and these were associated with their learning quality change” (Hu et al., 2022).

Furthermore, the impact of COVID-19 on educators is increasingly well-known. Educators had to make changes in short order, often with limited resources, and this had consequences. Educators faced increased stress due to the shift to remote instruction, and many reported struggling to maintain a work-life balance while working from home. Findings indicate that teachers reported greater mental health concerns than those in many other professions, and that remote teachers reported significantly higher levels of distress than those teaching in person (Kush et al., 2021). For some, it was too much, and they decided to leave the profession. Forty percent of public schools hiring for open teaching positions in special education in 2020–21 reported difficulty filling the opening, compared with 17 percent in 2011–12 (Irwin et al., 2023). Not only were teachers leaving the workforce, but potential teachers were second-guessing their career choice: the number of persons enrolled in traditional teacher preparation programs decreased by 30 percent between 2012–13 and 2019–20, and the number completing such programs decreased by 28 percent over the same period (Irwin et al., 2023).

All of us are looking for solutions to these problems. Given that I entered IES during the pandemic, I wanted to know how I could leverage its resources to help.

How can education research help?

First, I had to understand how IES, as a science agency, was structured to do the work of education research. My college textbook on education research (Newby, 2010) asserted that it should have three objectives: to explore issues and find answers to questions, to collect and disseminate information that shapes policy and decision-making, and to improve practice for practitioners.

It’s easy to see how the six broad areas of work at IES I listed above fit within those three objectives. For example, in normal (that is, pre-COVID) times, it’s the job of the National Center for Education Statistics (NCES) to collect and disseminate education-related statistics and information about student achievement to inform the work of researchers, policymakers, and other education decision-makers. IES’s two research centers, the National Centers for Education Research (NCER) and Special Education Research (NCSER), support researchers’ exploration of a wide range of education topics and their use of high-quality methods to answer important questions of policy and practice. Finally, the National Center for Education Evaluation and Regional Assistance (NCEE) conducts its own rigorous evaluations of federal policies and programs; supports states and districts in the use of data, evidence, and applied research to improve local practice; and disseminates information about “what works” through its What Works Clearinghouse (WWC). In the wake of the pandemic, IES had to quickly refocus its activities and resources to meet new demands across the education system. Here are just a few of the new questions that IES had to address amid the pandemic.

  • What’s happening in schools, and who is learning in-person versus virtually or in hybrid settings? In late 2021, NCES leveraged work being done as part of the National Assessment of Educational Progress (NAEP) to meet an immediate need to better understand schools’ policies about learning mode, masking, and social distancing. In the weeks that followed, the School Pulse Panel was created (https://ies.ed.gov/schoolsurvey/spp/). Initially, the School Pulse focused on collecting monthly information on the impact of the COVID-19 pandemic from a national sample of elementary, middle, high, and combined-grade public schools. Over time, its focus has broadened. While some survey questions are asked repeatedly to observe trends over time, others are unique each month. IES is now able to provide regular and near-real-time snapshots into “what’s happening” in the nation’s schools on a wide range of topics that matter to educators, policymakers, and families.

  • How can educators and caregivers support student learning in online, hybrid, and at-home settings? With schools closed and remote learning becoming the norm, educators and caregivers had to adapt their teaching methods and find new ways to engage students. As part of a mandate to provide assistance about “what works” in education, NCEE supported a series of efforts to bring together information for teachers navigating online and hybrid teaching environments and for caregivers who were providing instruction at home. NCEE commissioned work leading to the development of the “Best Practice in K-12 Online Teaching” minicourse, freely available from North Carolina State University, to support teachers new to online education in their transition to the medium, along with the literature review on which the minicourse is based. NCEE’s Regional Educational Laboratories developed nearly 200 pandemic-related resources. Notable examples include “Supporting Your Child’s Reading at Home” (https://ies.ed.gov/ncee/rel/Products/Region/southeast/Resource/100679), which focuses on the development of early literacy skills, and “Teaching Math to Young Children for Families and Caregivers” (https://ies.ed.gov/ncee/rel/Products/Region/central/Resource/100652).


Since its inception in 2002, IES and its centers have supported decision-makers—be they federal, state, or local—and educators in making use of high-quality evidence in their practice. The pandemic showed just how important IES, its resources, and its infrastructure can be.

In the pandemic’s wake, though, it seems to me that building even more evidence about “what works” is vital. The American Rescue Plan (ARP) provided historic levels of resources to expand educational opportunities and to ensure that education is better able to address the wide-ranging needs of students and their families – especially those who were disproportionately impacted by the pandemic. Many ARP investments, including those related to OST, have the requirement that programs be rooted in evidence-based practices. Because there are still things to learn about what makes strong programs, we can strengthen the field by building evidence that can address key problems of practice.

Conclusion

When I came to ED and IES, searching for information on how to use evidence-based practices to support COVID recovery within the context of OST, I was lost. As I’ve come to better understand the organization, I’ve learned that vast resources are available. Half of the battle was just figuring out “what lives where” within the Department! I hope this blog has given OST practitioners a bit of a roadmap to make their own process of discovery easier.

In Part Two of this series, I will explore how OST learning fits into ED, education research, and the post-pandemic education system. The latter has been profoundly affected, creating an opportunity for innovation and transformation in the delivery of education. The value of research in this context cannot be overstated. As a result, my next blog will pose two questions. First, I’ll ask what the role of OST in learning recovery can be in the years ahead. Then I’ll consider what evidence needs to be built to make the most of what OST can offer. I hope you’ll read it!

I’d love to hear your thoughts on this blog. Send them my way at holly.miller@ed.gov.


Citations

Brooks, S. K., Webster, R. K., Smith, L. E., Woodland, L., Wessely, S., Greenberg, N., & Rubin, G. J. (2020). The psychological impact of quarantine and how to reduce it: Rapid review of the evidence. The Lancet, 395(10227), 912–920.

Hu, K., Godfrey, K., Ren, Q., Wang, S., Yang, X., & Li, Q. (2022). The impact of the COVID-19 pandemic on college students in USA: Two years later. Psychiatry Research, 315, 114685.

Huck, C., & Zhang, J. (2021). Effects of the COVID-19 pandemic on K-12 education: A systematic literature review. New Waves-Educational Research and Development Journal, 24(1), 53–84.

Irwin, V., Wang, K., Tezil, T., Zhang, J., Filbey, A., Jung, J., ... & Parker, S. (2023). Report on the Condition of Education 2023 (NCES 2023-144). National Center for Education Statistics.

Kush, J. M., Badillo-Goicoechea, E., Musci, R. J., & Stuart, E. A. (2021). Teacher mental health during the COVID-19 pandemic: Informing policies to support teacher well-being and effective teaching practices.

Newby, P. (2010). Research Methods for Education. Pearson Education.

Pfefferbaum, B., & North, C. S. (2020). Mental health and the COVID-19 pandemic. New England Journal of Medicine, 383(6), 510–512.

U.S. Department of Education, Office for Civil Rights (OCR). (2021). Education in a Pandemic: The Disparate Impacts of COVID-19 on America's Students.

An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning—or have already started to put in place—your plans for summer learning. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place to even make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.   

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a thorough answer will ask “so, what did we accomplish last summer?” (Depending upon who they are, and what they care about, that question can vary. Twists can include “what did students learn” or business officers’ pragmatic “what bang did we get for that buck.”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID-19 has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example (a worked sketch follows this list):
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether one or more populations are disproportionately underrepresented in your program and the implications for the work both now and next summer.
    • How strong was attendance? Prior research has suggested that students benefit the most from summer programs when they are “high attenders” (twenty or more days out of a program’s typical 25 to 30 total days). Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24 percent, 25 to 49 percent, 50 to 74 percent, and 75 percent or more of program days?
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might hope that more summer learning participants score “on grade level” at the outset of the school year than their non-participating peers, promising evidence that the program may have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what your analyses revealed can strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
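To make the analyses in items 5 and 6 concrete, here is a minimal sketch in Python (pandas) of the attendance-intensity and participant-versus-non-participant tabulations described above. The file names, column names, and the 28-day program length are hypothetical placeholders for whatever your student information system exports; this illustrates the approach, not a prescribed tool.

```python
# A minimal sketch, assuming a daily attendance log and a district roster.
# All file names and column names below are hypothetical placeholders.
import pandas as pd

# Hypothetical daily attendance log: one row per student per day attended.
attendance = pd.read_csv("summer_attendance.csv")  # columns: student_id, date
program_days = 28  # assumed total number of days the program offered

# Attendance intensity: the share of offered days each participant attended.
days_attended = attendance.groupby("student_id")["date"].nunique()
intensity = days_attended / program_days

# Bucket participants into the bands discussed above.
bands = pd.cut(
    intensity,
    bins=[0, 0.25, 0.50, 0.75, 1.01],  # left-closed bins: 0-24%, 25-49%, ...
    labels=["0-24%", "25-49%", "50-74%", "75%+"],
    right=False,
)
print(bands.value_counts(normalize=True).sort_index())

# Outcomes by participation status, disaggregated by subgroup, using a
# hypothetical fall roster covering ALL students (participants or not).
roster = pd.read_csv("fall_roster.csv")  # columns: student_id, subgroup, on_grade_level
roster["participated"] = roster["student_id"].isin(days_attended.index)

# Share of students on grade level, by participation status and subgroup.
summary = (
    roster.groupby(["participated", "subgroup"])["on_grade_level"]
    .mean()
    .unstack("subgroup")
)
print(summary)
```

Note that a descriptive comparison like this is not an impact estimate: differences may reflect who chose to attend rather than what the program did. It does, however, produce exactly the kind of table that supports a crisp answer to “so, what did we accomplish last summer?”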

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

NCEE is hiring!

The U.S. Department of Education’s Institute of Education Sciences (IES) is seeking professionals in education-related fields to apply for an open position in the National Center for Education Evaluation and Regional Assistance (NCEE). Located in NCEE’s Evaluation Division, this position would support impact evaluations and policy implementation studies. Learn more about our work here: https://ies.ed.gov/ncee.

If you are even potentially interested in this sort of position, you are strongly encouraged to set up a profile in USAJobs (https://www.usajobs.gov/) and to upload your information now. As you build your profile, include all relevant research experience on your resume, whether acquired in a paid or unpaid position. The position will open in USAJobs on July 15, 2019, and will close as soon as 50 applications are received or on July 29, 2019, whichever is earlier. Getting everything in can take longer than you might expect, so please apply as soon as the position opens in USAJobs (look for vacancy number IES-2019-0023).


Regional Educational Laboratories: Connecting Research to Practice

By Joy Lesnick, Acting Commissioner, NCEE

Welcome to the NCEE Blog! 


We look forward to using this space to provide information and insights about the work of the National Center for Education Evaluation and Regional Assistance (NCEE). A part of the Institute of Education Sciences (IES), NCEE’s primary goal is providing practitioners and policymakers with research-based information they can use to make informed decisions. 

We do this in a variety of ways, including large-scale evaluations of education programs and practices supported by federal funds; independent reviews and syntheses of research on what works in education; a searchable database of research citations and articles (ERIC); and reference searches from the National Library of Education. We will explore more of this work in future blogs, but in this post I’d like to talk about an important part of NCEE—the Regional Educational Laboratories (RELs).

It’s a timely topic. Last week, the U.S. Department of Education released a solicitation for organizations seeking to become REL contractors beginning in 2017 (the five-year contracts for the current RELs will conclude at the end of 2016). The REL program is an important part of the IES infrastructure for bridging education research and practice. Through the RELs, IES seeks to ensure that research does not “sit on a shelf” but rather is broadly shared in ways that are relevant and engaging to policymakers and practitioners. The RELs also involve state and district staff in collaborative research projects focused on pressing problems of practice. An important aspect of the RELs’ work is supporting the use of research in education decision making – a charge that the Every Student Succeeds Act has made even more critical.

The RELs and their staff must be able to navigate comfortably between the two worlds of education research and education practice, and understand the norms and requirements of both.  As part of this navigating, RELs focus on: (1) balancing rigor and relevance; (2) differentiating support to stakeholders based on need; (3) providing information in the short term, and developing evidence over the long term; and (4) addressing local issues that can also benefit the nation.

While the RELs are guided by federal legislation, their work reflects – and responds to – the needs of their communities. Each REL has a governing board composed of state and local education leaders that sets priorities for REL work. Also, nearly all REL work is conducted in collaboration with research alliances: ongoing partnerships in which researchers and regional stakeholders work together over time to use research to address an education problem.

Since the current round of RELs was awarded in 2012, these labs and their partners have conducted meaningful research resulting in published reports and tools; held hundreds of online and in-person seminars and training events attended by practitioners across the country; and produced videos of their work that you can find on the REL Playlist on the IES YouTube site. Currently, the RELs have more than 100 projects in progress. RELs work on nearly every topic that is crucial to improving education—kindergarten readiness, parent engagement, discipline, STEM education, college and career readiness, teacher preparation and evaluation, and much more.

IES’s vision is that the 2017–2022 RELs will build on and extend the current priorities of high-quality research, genuine partnership, and effective communication, while also tackling high-leverage education problems.  High-leverage problems are those that: (1) if addressed could result in substantial improvements in education outcomes for many students or for key subgroups of students; (2) are priorities for regional policymakers, particularly at the state level; and (3) require research or research-related support to address well. Focusing on high-leverage problems increases the likelihood that REL support ultimately will contribute to improved student outcomes.

Visit the IES REL website to learn more about the 2012-2017 RELs and how you can connect with the REL that serves your region.  Visit the FedBizOpps website for information about the competition for the 2017-2022 RELs.