IES Blog

Institute of Education Sciences

New Education Data from the Household Pulse Survey

Recognizing the extraordinary information needs of policymakers during the coronavirus pandemic, NCES joined a partnership with the Census Bureau and four other federal statistical agencies to quickly develop a survey to gather key indicators of our nation’s response to the global pandemic. The resulting experimental 2020 Household Pulse Survey began development on March 23, 2020, and data collection began on April 23, 2020. This new survey provides weekly national and state estimates, which are released to the public in tabular formats one week after the end of data collection.

The Household Pulse Survey gathers information from adults about employment status, spending patterns, food security, housing, physical and mental health, access to health care, and educational disruption. The education component includes questions about the following:

  • The weekly time spent on educational activities by students in public and private elementary and secondary schools
  • The availability of computer equipment and the Internet for instructional purposes
  • The extent to which computer equipment and Internet access for students were provided or subsidized

Because this survey is designed to represent adults 18 years old and over, responses to the education questions describe students living in the households of those adults; the estimates are percentages of adults reporting, not percentages of students themselves.

In the Household Pulse Survey during the weeks of April 23 through May 5, adults reported that their average weekly time spent on teaching activities with elementary and secondary students in their household was 13.1 hours. These results differed by educational attainment: adults who had not completed high school reported a weekly average of 9.9 hours in teaching activities with children, whereas adults with a bachelor’s or higher degree reported 13.9 hours (figure 1). In terms of the average weekly time spent on live virtual contact between students in their household and their teachers, adults reported a lower average of 4.1 hours.



Adults’ reports about the school instruction model need to be interpreted carefully because respondents could choose multiple types of approaches. A higher percentage of adults with a bachelor’s or higher degree (84 percent) reported that classes for elementary and secondary students in their household had moved to a format using online resources than did adults who had completed some college or an associate’s degree (74 percent), adults who had completed only high school (64 percent), or adults who had not completed high school (57 percent).

Higher percentages of adults with higher levels of education than of adults with lower levels of education reported that computers and the Internet were always available for educational purposes for elementary and secondary students in their households (figure 2).



The percentage of adults who reported that the school district provided a computer or digital device for children in their households to use at home for educational purposes was higher for adults who had not completed high school (44 percent) than for adults with a bachelor’s or higher degree (33 percent). Also, a higher percentage of adults who had not completed high school than of adults with higher levels of educational attainment reported financial assistance for student Internet access.

It is important to note that the speed of the survey development and the pace of the data collection efforts have led to policies and procedures for the experimental Household Pulse Survey that are not always consistent with traditional federal survey operations. Data should be interpreted with proper caution.  

More information on the Household Pulse Survey, detailed statistical tables, and microdata sets are available at https://www.census.gov/householdpulsedata. The Household Pulse Survey site includes breakouts of the data by other characteristics, such as race/ethnicity. In addition to participating in the development of this new survey, NCES has also generated new analyses based on existing data that respond to new needs for policy information, such as the availability of the Internet for student learning.

 

By Xiaolei Wang, AIR

Using Mistakes as a Vehicle for Learning in Mathematics: From Research to Practice at Scale in Education

Every student makes mistakes. But not every student is given the opportunity to learn from mistakes. Left unaddressed, the mathematical misconceptions that underlie many mistakes can keep students from progressing in mathematics.

 

At the request of districts in the Minority Student Achievement Network (MSAN), a Strategic Education Research Partnership (SERP) team was convened in 2007 to address a widening achievement gap in Algebra I. The team was charged with identifying an intervention strategy, subject to several district constraints:

  1. The solution would need to be applied to all students in the regular classroom to avoid the stereotype threat associated with separating students based on performance and to protect the intervention from budget cuts that target supplemental, after-school, and summer programs first.
  2. A new curriculum was off the table because it would create upheaval for a time and would be followed by a decline in student performance during the period of adjustment.
  3. Extensive teacher training was considered undesirable because it would be costly and because algebra teachers consider themselves more expert in mathematics teaching than central office staff who would be requiring the training.

 

Julie Booth joined the partnership, and with funding from IES, led the iterative development and testing of worked example assignments that, with the input of teachers and administrators, fit within the routines of the classroom. The result—AlgebraByExample—consists of 42 uniquely designed assignments that address misconceptions, harness the power of explanation, and use mistakes as a vehicle for learning.

Typical math assignments require students to solve problems on their own. If a student’s work is incorrect, the student may never focus on what went wrong. ByExample assignments also give students problems to solve, but they first provide a solution to a similar problem that is marked right or wrong. Students are prompted with questions that target common misconceptions and errors before solving a similar problem on their own. Each assignment contains several strategically designed item pairs:

 

 

Designed in collaboration with teachers from districts in several states, the assignments can be easily incorporated into any Algebra I curriculum, and teachers can choose in what way and in what order to use them. The assignments were tested in randomized trials in classrooms in eight districts with more than 6,000 students. Not only did students using AlgebraByExample improve by an average of 7 percentage points on an assessment of standardized test items, but students at the lower end of the distribution improved the most. The PDF downloads of the assignments are freely available for anyone to use.

The success of AlgebraByExample led to further IES funding of MathByExample for Grades 4 and 5 and GeometryByExample for high school geometry.

 

Resources:

AlgebraByExample website

MathByExample website

Booth et al., 2015

NSF Stem for All Video Submission 2019

 

Interview with Dr. Suzanne Donovan (SERP), Dr. Julie Booth (Temple University), and Allie Huyghe (SERP), the developers of the ByExample interventions.

 

 

Was it part of the original plan to develop an intervention that could one day be used at scale in schools?

Yes. SERP partnerships begin with problems of practice nominated by district partners, but the partnership agreement distinguishes SERP from a consultant. The intention from the start is to frame the problem and design a solution that can be used at scale. SERP has developed in-house, user-centered design expertise so that resources (such as the ByExample products) developed through partnerships meet the needs of teachers and students. Products scale when they improve the experience of teachers and students. Both the model and the internal design capacity allow SERP to move from problem framing through research, development, and dissemination of a product with IES grant funding.

 

Describe the initial research and development that occurred.

Dr. Julie Booth drafted initial assignments drawing on the mathematics misconceptions literature. SERP held regular partnership meetings with teachers and administrators at which assignments were reviewed and additional misconceptions were nominated for attention in the assignments. Administrators agreed to randomization of the assignments across classrooms and within-teacher. Assignments were first tested in individual topic blocks and revised in accordance with student performance data, observations, and teacher feedback. A year-long pilot study was then conducted using the full set of assignments.

 

Beyond IES or ED grants, what additional funding was needed to develop the intervention?

For the ByExample work, additional funding was provided by the Goldman Sachs Foundation in the initial phase to support partnership formation, problem framing, and the solution generation. IES grants funded the research and development, along with initial dissemination activities to make the materials available to the public. We were also able to develop an online platform to allow for digital use with the IES grant funds.

 

What model was used for dissemination and sustainability?

The assignments are available as free downloads on SERP’s website, and as printed workbooks through SERP’s partner print-on-demand company. They have been publicized through online communications, journal articles, presentations at conferences of various types, social media, and word of mouth. There will be a small fee for use of the digital platform to support its maintenance, but the PDFs will remain as free downloads. We have been able to sustain the collaboration of the partnership team by responding to requests from educators to expand the approach to other grade levels and submitting additional proposals to IES that have been awarded.

 

What advice would you provide to researchers who are looking to move their research from the lab to market? What steps should they take? What resources should they look for?

First, I would note that it is difficult to persuade educators to use a product that solves a problem they don’t believe they have. Listen to educators and apply research expertise to address the challenges that they experience on a day-to-day basis. Design for ease of use by teachers. No matter how good your strategy or marketing is, if it’s too much work for an already busy teacher to use, you may get uptake by a few committed teachers, but not at scale. Finally, pay attention to where teachers get their information. For AlgebraByExample, we got a big boost from the Marshall Report, produced by a teacher for other teachers to call attention to usable research.  

 

In one sentence, what would you say is most needed for gaining traction and wide scale use by educators?

Design for the routines of the classroom.

 


Suzanne Donovan, PhD, is the founding Executive Director of the SERP Institute, an education research, development, and implementation organization incubated at the National Academies. SERP leads collaborations of educators, researchers, and designers to generate research-based, scalable, and sustainable solutions to critical problems of practice. 

Julie Booth, PhD, is a Professor of STEM Education and Psychology and the Deputy Dean of Academic and Faculty Affairs at Temple University’s College of Education and Human Development. Her work focuses on translating between cognitive science and education to better understand students’ learning and improve instruction, primarily in mathematics education. She is currently an Executive Editor for the Journal of Experimental Education.

Allie Huyghe is the Assistant Director of the SERP Institute, where she manages several projects, including the IES-funded MathbyExample and GeometryByExample projects. She is also closely involved with other SERP areas of work, participating in the design of materials from early development through release to the public.

 

This interview was produced by Christina Chhin (Christina.Chhin@ed.gov) and Edward Metz (Edward.Metz@ed.gov) of the Institute of Education Sciences. This is the fifth in an ongoing series of blog posts examining moving from university research to practice at scale in education.​

 

 

Introducing REL 2022

As I write this, my colleagues and I at the Regional Educational Laboratory (REL) Program are thinking about a single number: 535. No, we’re not concerned about 535 because it represents the number of voting members of Congress, though that would be a good guess. We’re also not thinking about Interstate 535, the “2.78-mile-long Auxiliary Interstate Highway spur of I-35 in the U.S. states of Minnesota and Wisconsin,” though now I’m intensely interested in why it might be that, at least according to Wikipedia, this road is “known locally as the ‘Can of Worms’ interchange.” Instead, my colleagues and I are excited about 535 because it represents the number of days between now and the start of the next cycle of the REL program, affectionately known as REL 2022.

Over a year ago, we began a process that culminates in the awarding of contracts to run each of our regional labs. We are excited to share our preliminary thoughts about the contours of REL 2022 through a Request for Information, or RFI, which we have posted here. I hope you will take the time to read the RFI. If you have questions or suggestions after doing so, I hope you are moved to comment. Details on how to offer your feedback can be found in the RFI.

Importantly, we aren’t proposing to radically restructure the REL program. Instead, we are retooling some existing expectations and adding a few new features. Below, I’ve highlighted a few proposed changes that merit special attention.

The purpose of RELs is to improve student outcomes. Not to put too fine a point on it, but everything that takes place in REL 2022 should be in service of improving student outcomes. This does not mean that every REL project will, by itself, have a directly observable impact on achievement. But the work of any given REL, in concert with the efforts of those with whom it works, should be trained on a singular focus: bettering the lives of students through education. There is no other, better, or higher calling.

We accomplish our purpose by working in partnership with stakeholders to support their use of evidence-based practices. Evidence-based practice is “baked in” to the statute that authorizes the REL program, and the importance of building and using evidence in education—and government more generally—is reiterated throughout federal law. (See, for example, the Every Student Succeeds Act of 2015 and the Foundations for Evidence-based Policymaking Act of 2018.) However, our emphasis on evidence isn’t rooted in a statutory imperative. Instead, it’s based on a set of core beliefs about our work: that researchers and educators can strengthen education via the rigorous application of the scientific method; that resources, including money and time, are constrained and that efforts with demonstrated effectiveness should be prioritized; and that each and every student deserves the best of “what works” in education, no matter their circumstance.

Nothing changes if nothing changes. In the REL 2022 cycle, we are explicitly asking RELs to think of themselves as “change agents.” This expectation is, I believe, entirely new to the REL Program and is likely to be uncomfortable to some. For that reason, it is helpful to be clear about what we’re expecting and why. Here goes.

I daresay that, no matter how proud they might be of their students and their educators, there is not a state chief, a district superintendent, or a building principal who would report they are serving each of their students as well as they wish they could. (If you’re the one who does, please stop reading this blog and call me. I want to share your successes!) Each of those leaders has something they want to do better on behalf of their students and is contemplating, if not actively pursuing, change. It is our hope that RELs can join them in making change, with evidence in hand and research tools at the ready. REL reports, resources, and trainings are not ends unto themselves. They are means to enable the change efforts of local, state, and regional education leaders, working on behalf of students to improve important outcomes.

RELs work in partnership. Education research and technical assistance must be done in partnership with those it is meant to inform. Absent that, it is likely to fail to achieve its goals. At best, potentially positive impacts will be blunted. At worst, harm will be done. There’s a simple solution: collaboration that authentically engages stakeholders in all phases of project design and execution. That isn’t, I realize, as simple to do as it is to write.

As vendors consider the REL 2022 cycle, we ask that they keep two things in mind about what we’ve traditionally called partnerships. First, there are no necessary restrictions on who RELs can partner with when working with stakeholders to achieve stakeholder goals. Does it make sense to partner across levels of education within a state? Do it. Is there a state or national advocacy association that would accelerate a partner’s progress? Engage it. Is there a role for business or industry? Leverage it. A second and closely related concept is that there are no restrictions on partnerships’ functional forms. In general, it does not matter one whit to IES whether you prefer NICs, DBIR, or any other particular form of research partnership. What does? That RELs build projects in partnership—however and with whomever—intentionally, with the goal of supporting partners’ change efforts to achieve the goals they have identified.

We encourage deeper, not broader, work. We believe RELs are more likely to achieve success when they focus partnerships on clearly defined problems of policy or practice in specific geographies. A “Six-State Research Alliance on High School Graduation” can do important and meaningful work—but the process of agreeing on the work to be done and the targets to be met, seeing that work through to completion, and then achieving pre-specified goals is likely to be exceptionally difficult. The “South-Central Kansas Partnership for Kindergarten Readiness” or the “Maricopa County Alliance for Reducing Chronic Absenteeism in High Schools” may be more likely to achieve impact. This is not to say that lessons learned locally should not be shared regionally or nationally, or that groups with common interests might not form “communities of practice” or other networks for the purpose of sharing information or building connection. Rather, we ask RELs be strategic in scoping their highest-intensity work.

We define success as achieving measurable stakeholder goals. Evaluating the impact of research and technical assistance projects is notoriously hard. Often, program managers and the evaluators with whom they work are forced to satisfice, relying upon end-user self-reports of the quality, relevance, and usefulness of a provider’s work. Counts of outputs, such as report downloads and attendees served, are particularly common metrics reported in evaluation studies. Satisfaction is the coin of the realm. Lest I be accused of throwing stones inside my own glass house, let me be clear that we currently use these very measures to characterize the effectiveness of the current REL program.

In REL 2022, it is our intention to shift focus beyond outputs to emphasize outcomes. We will ask RELs to demonstrate, on a regular basis, that they are making progress toward the goals stakeholders set for important student outcomes at the outset of their work, with the acknowledgment that outputs are often critical to achieving a long-term goal and that satisfaction can be an important leading indicator. In 2027, the mark of success won’t be a glowing narrative from a state superintendent or school superintendent about the REL cycle just passed. Instead, it’ll be seeing that the quantifiable goals those leaders set for their work with the REL program were achieved.   

Putting RELs’ capacity for rigorous R&D to work. Finally, there is one manifestly new requirement for RELs as part of the 2022 cycle, one that I am particularly excited about because it brings together the best of two NCEE programs: the RELs and the What Works Clearinghouse™ (WWC). As part of the 2022 cycle, each REL will be required to develop—and then evaluate—a comprehensive toolkit based on a WWC Practice Guide, helping educators instantiate evidence-based practices in the classroom. RELs already have experience taking the content from Practice Guides and transforming it into tools for educators. Two examples include Professional Learning Community guides for both foundational reading and English learners. Similarly, North Carolina State University’s Friday Institute has looked to Practice Guides for inspiration to develop massive open online courses (MOOCs), including foundational reading and fractions. None have been evaluated for efficacy. Of course, the development and testing of these new toolkits will follow the expectations set above, including the expectation that strong and inclusive partnerships are at the root of all high-leverage work.

My NCEE colleagues and I are excited about the possibilities that REL 2022 represents. The REL program has a proud history and a strong track record of service to local, state, and regional stakeholders. We hope that, as you review the REL 2022 RFI, you’ll find the next iteration of the program continues in that tradition. As always, I welcome your feedback.

Matthew Soldner

Commissioner, National Center for Education Evaluation and Regional Assistance

 

NHES Data Files Provide Researchers Supplemental Information on Survey Respondents’ Communities

Increasingly, researchers are merging survey data with data from external sources, such as administrative data or different surveys, to enhance analyses. Combining data across sources increases the usefulness of the data while minimizing the burden on survey respondents.

In September, the National Household Education Surveys Program (NHES) released restricted-use supplemental geocode data files that use sample respondents’ addresses to integrate the 2016 NHES Parent and Family Involvement in Education (PFI), Early Childhood Program Participation (ECPP), and Adult Training and Education (ATES) survey data with data from other collections. The supplemental geocode files include additional geographic identifiers, characteristics of respondents’ neighborhoods and local labor markets, radius-based measures of household proximity to job search assistance and educational opportunities, and, for surveys focused on children, school district identifiers based on home addresses and school district characteristics.
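Mechanically, linking a supplemental file of this kind to a survey file amounts to a keyed merge on a respondent identifier. The sketch below illustrates the idea in Python with pandas; the variable names (`BASMID`, `district_id`, `job_centers_within_5mi`) and toy values are entirely hypothetical stand-ins, since the actual restricted-use layouts are not reproduced here.

```python
import pandas as pd

# Toy survey records; variable names are hypothetical, not the real NHES layout.
survey = pd.DataFrame({
    "BASMID": [1001, 1002, 1003],                      # respondent identifier
    "school_choice": ["public", "private", "public"],
})

# Toy supplemental geocode records keyed on the same identifier.
geocode = pd.DataFrame({
    "BASMID": [1001, 1002, 1003],
    "district_id": ["A1", "B2", "A1"],
    "job_centers_within_5mi": [2, 0, 3],
})

# Left-merge so every survey respondent is kept; validate="1:1" guards
# against accidental duplicate keys in either file.
merged = survey.merge(geocode, on="BASMID", how="left", validate="1:1")
print(merged.shape)  # (3, 4): one row per respondent, columns from both files
```

In practice, restricted-use files carry additional handling and disclosure requirements, but the linkage logic is the same.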

The new data can complement researchers’ analyses of data from all three surveys. Researchers can expand their analyses of school choice and access to K–12 schooling options using the PFI survey data. Those interested in analyses of decisions about children’s early education can use the ECPP survey data to look at the availability of Head Start programs, preschools in private schools near children’s homes, and the prevalence of prekindergarten programs in local school districts. Researchers interested in nondegree credential attainment and training for work can use data from the ATES to find information on local labor markets and the number of American Job Centers near respondents’ homes.

The NHES:2016 restricted-use supplemental geocode files are available to restricted-use license holders to be used in conjunction with the NHES:2016 survey data files. To access the full set of NHES:2016 geocode supplemental restricted-use data files, apply for a restricted-use license. You can also browse the list of variables in the supplemental geocode files.

 

By Emily Isenberg and Sarah Grady, NCES

Equity: Alignment of Mission and Methods

Editor's Note: The following post originally appeared on the website of the IES-funded CTE Research Network. The grantee has given us permission to repost it on the IES blog.

Funded in 2018 by the Institute of Education Sciences (IES), the Career and Technical Education (CTE) Research Network aims to conduct and promote high-quality causal studies examining the impact of career and technical education. Aligned with the theme of the January 2020 IES Principal Investigators Meeting – Closing the Gaps for All Learners – the Network’s activities include working to deepen the field’s understanding of issues of equity and inequity in CTE research and evaluation.

 

The importance of understanding equity in CTE research

The Wisconsin Department of Public Instruction defines equity in the following way:

“Every student has access to the educational resources and rigor they need at the right moment in their education across race, gender, ethnicity, language, disability, sexual orientation, family background, and/or family income.”

An explicit focus on equity in CTE is particularly important considering that in the not-so-distant past, vocational education (a precursor to the term career and technical education, or CTE) often served as the track for youth deemed “unable to learn” or “not college material.” In many cases, vocational education was used to systematically relegate students—many of whom were low-income, Black or African American, Latinx, or American Indian—into low-wage jobs that offered limited opportunities for growth.

Today, the focus of CTE has expanded to include fields in science, technology, engineering, and mathematics (STEM) and represents for many young people an opportunity to graduate from high school and enter postsecondary education or the labor market with highly valued skills and certifications in numerous fields. As CTE has evolved, participation has become associated with a variety of positive outcomes. For example, researchers have found that CTE course taking is associated with higher high school graduation and postsecondary enrollment rates, higher labor market earnings, and better overall student outcomes.

While these positive CTE outcomes are promising, there is more to understand about the causal outcomes associated with CTE participation, especially among subgroups of students based on race, gender, socioeconomic level, and ability status. IES and the CTE Research Network are committed to deepening the field’s understanding of equity and inequity in CTE studies. Along with acknowledging the pernicious ways in which vocational education has historically been used to discriminate against some students and disaggregating outcome data by student subpopulation (an emphasis in recent Perkins V legislation), the network concludes that at a minimum, engaging in equity-minded research and evaluation requires:

  • Establishing diverse research teams: Research has shown that diversity on teams yields greater innovation, more productivity, and better financial results (Levine, 2020). With these benefits in mind, it is important to be intentional in creating diverse research teams that can bring new perspectives, voices, and approaches to studies that aim to identify, analyze and interpret equity data.
  • Adopting an equity mindset in research and evaluation: To inform the field’s understanding of how CTE may promote or inhibit equitable student outcomes, researchers must commit to recognizing their own biases and examining how those biases may influence their research designs and analyses. An equity mindset also requires capturing and analyzing patterns of inequities that appear in administrative and implementation data.
  • Exploring intersectionality: Adopting an equity mindset—as important for research as is using valid and reliable measures—also requires conducting analyses of CTE outcomes that go beyond merely examining differences between subpopulations. Rather, analyses should also examine intersectionality within subpopulations (for example, by gender and race), which affords the field a more nuanced understanding of how outcomes for members of the same subpopulation may vary by other dimensions of identity (such as gender or ability status). Such analyses can help the field understand what works and for whom—information that can help drive policy and practice.
  • Addressing the systems, policies, and procedures that promote inequities: Inequities do not exist in a vacuum. Thus, it is important to contextualize causal CTE studies, acknowledging how systems, policies, and procedures may create barriers to success for some students. Analyses that take an ecosystems approach—focusing on how the social, economic, and geographic environment shapes outcomes—provide valuable insight into the nature of inequities that exist and how these inequities might be overcome. Equally important is to identify the possible or probable causes of inequities to understand how race, gender, and other variables influence students’ experiences in CTE. Analyses must also extend beyond merely identifying average effect sizes to investigating variation in treatment experiences by subpopulations, an approach that provides valuable insights into how young people in different subpopulations fare relative to their peers in specific contexts. Using data and analysis in this way can provide the evidence needed to support policy recommendations aimed at closing equity gaps and creating the conditions that all students need to transition successfully into adulthood.
  • Engaging the communities that participate in our studies: Because evidence is critical for making data-driven decisions, it is important when designing causal studies to include the participating communities and other stakeholders in the knowledge generation and interpretation processes. These communities and stakeholders can also play an important role in informing researchers’ understanding of the specific causes of inequities identified in study findings. Research should be an inclusive process—the communities being studied and those directly affected by research findings should be included in the planning, implementation, and interpretation of research.
  • Asking what more is needed to promote equity: Embracing equity as a measure of success in education research will take time and will require a significant shift in the way research is conceptualized, designed, and conducted. However, to promote a more just society, it is imperative that researchers keep equity at the center of their work.
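The intersectional breakdown described above can be sketched in a few lines of Python. The records below are fabricated toy data for illustration only, not CTE study data, and the column names are hypothetical.

```python
import pandas as pd

# Fabricated toy records for illustration; not real study data.
df = pd.DataFrame({
    "gender": ["F", "F", "M", "M", "F", "M"],
    "race": ["Black", "White", "Black", "White", "Black", "White"],
    "completed_credential": [1, 0, 1, 1, 0, 0],
})

# A single-dimension breakdown reports one rate per group...
by_gender = df.groupby("gender")["completed_credential"].mean()

# ...while an intersectional breakdown (gender x race) surfaces variation
# hidden within each gender group.
intersectional = (
    df.groupby(["gender", "race"])["completed_credential"]
      .agg(["mean", "size"])
)
print(intersectional)
```

Even in this toy example, the two gender groups have similar overall rates while the gender-by-race cells differ sharply, which is exactly the kind of within-group variation a single-dimension analysis would miss.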

Although the CTE Research Network is funded to conduct causal studies, which can play a role in identifying inequities, we realize that other research methods also play a role in deepening the field’s understanding of such inequities. For example, qualitative and implementation research can be used to gain important insight into the contextual factors that shape or reinforce inequities and can also be used to engage stakeholders as informants on the topic. Therefore, building the field’s knowledge of these issues will require employing a range of data collection efforts.

In the meantime, the CTE Research Network is taking the following action steps to continue to advance our equity-minded approach to CTE research:

  • Developing a set of equity questions to consistently consider during network convenings
  • Elevating issues of equity in all network presentations
  • Sharing resources on equity to help network members think critically about how best to bring an equity lens to bear on research and evaluation studies
  • Creating and promoting opportunities to help diversify researchers engaged in causal CTE research

As a network, we believe these research practices will shine a light on (in)equity in CTE. Where inequities exist, we hope our work will inform education policymaking that aims not only to close existing equity gaps but also to prevent the perpetuation of inequities in CTE. We invite other researchers to join us in this effort by taking similar action steps as part of their own research and evaluation endeavors. The following resources can inform researchers’ understanding of equity issues in general and in CTE studies in particular:

 

References

Andrews, K., Parekh, J., & Peckoo, S. (2019). How to embed a racial and ethnic equity perspective in research: Practical guidance for the research process. Washington, DC: Child Trends.

Dougherty, S. M. (2016). Career and technical education in high school: Does it improve student outcomes? Washington, DC: Thomas B. Fordham Institute. https://eric.ed.gov/?id=ED570132

Hemelt, S. W., Lenard, M. A., & Paeplow, C. G. (2017). Building better bridges to life after high school: Experimental evidence on contemporary career academies. Washington, DC: National Center for Analysis of Longitudinal Data in Education Research. https://eric.ed.gov/?id=ED572934

Hodge, E., Dougherty, S., & Burris, C. (2020). Tracking and the future of career and technical education: How efforts to connect school and work can avoid the past mistakes of vocational education. Boulder, CO: National Education Policy Center. Retrieved from http://nepc.colorado.edu/publication/cte

Kemple, J. (2008). Career academies: Long-term impacts on work, education, and transitions to adulthood. New York: MDRC. Retrieved from https://www.mdrc.org/publication/career-academies-long-term-impacts-work-education-and-transitions-adulthood

Rosen, R., & Molina, F. (2019). Practitioner perspectives on equity in career and technical education. New York: MDRC. https://eric.ed.gov/?id=ED596458


Written by Equity in CTE Workgroup, on behalf of the CTE Research Network

This is the fifth in a series of blog posts that stems from the 2020 Annual Principal Investigators Meeting. The theme of the meeting was Closing the Gaps for All Learners and focused on IES’s objective to support research that improves equity in access to education and education outcomes. Other posts in this series include Addressing Persistent Disparities in Education Through IES Research, Why I Want to Become an Education Researcher, Diversify Education Sciences? Yes, We Can!, and Closing the Opportunity Gap Through Instructional Alternatives to Exclusionary Discipline.