IES Blog

Institute of Education Sciences

New Standards to Advance Equity in Education Research

One year ago, IES added a new equity standard and associated recommendations to its Standards for Excellence in Education Research (SEER). The intent of this standard, like that of the other eight SEER standards, is to complement IES’s focus on rigorous evidence building with guidance and supports for practices that have the potential to make research transformational. The addition of equity to SEER is part of IES’s ongoing mission to improve academic achievement and access to educational opportunities for all learners (see the IES Diversity Statement). IES is mindful, however, that to authentically and rigorously integrate equity into research, education researchers may need additional resources and tools. To that end, IES hosted a Technical Working Group (TWG) meeting of experts to gather input on the existing tools and resources the education community could use to implement the new SEER equity standard in their research and on notable gaps where new tools and resources are needed. A summary of the TWG panel discussion and recommendations is now available.

The TWG panel recommended several relevant resources and offered concrete suggestions for ways IES can support education researchers’ learning and growth, including training centers, coaching sessions, webinars, checklists, and new resource development, acknowledging that different researchers may need different kinds of support. The meeting summary includes a mix of recommended tools and resources along with important considerations, including best practices, for researchers as they work to embed equity in their research.

The new SEER equity standard and accompanying recommendations have been integrated throughout the current FY 2024 Request for Applications. By underscoring the importance of equity, IES aims to ensure that the research it supports is both rigorous and relevant to the needs of all learners.


This blog was written by NCER program officer Christina Chhin. If you have questions or feedback regarding the equity TWG, please contact Christina Chhin (Christina.Chhin@ed.gov) or Katina Stapleton (Katina.Stapleton@ed.gov), co-chair of the IES Diversity Council. If you have any questions or feedback regarding the equity standard or associated recommendations, please email NCEE.Feedback@ed.gov.

NCES Celebrates LGBTQ+ Pride Month

June is LGBTQ+ Pride Month, and NCES is proud to share some of the work we have undertaken to collect data on the characteristics and well-being of sexual and gender minority populations.

Inclusion of questions about sexual orientation and gender identity on federal surveys allows for better understanding of sexual and gender minority populations relative to the general population. These sexual orientation and gender identity (SOGI) data meet a critical need for information about trends within larger population groups, and insights gained from analyzing the data can inform the resources and interventions needed to better serve the community. Giving respondents the opportunity to describe themselves and bring their “whole self” to a questionnaire helps them to be seen and heard by researchers and policymakers.

Sometimes, NCES is asked why questions like these appear on an education survey. They can be sensitive questions for some people, after all. NCES asks these questions to understand the different experiences, equity, and outcomes related to education for sexual and gender minorities, just as it does for groups identified by other demographic characteristics, like race, ethnicity, household income, and what part of the country someone lives in. By sexual minorities, we mean people who report their sexual orientation to be something other than straight or heterosexual; by gender minorities, we mean people whose sex as recorded at birth is different from their gender.

Over the past 10 years, NCES has researched how best to ask respondents about their sexual orientation and gender identity, how respondents react to these questions, and the quality of the data NCES has collected on these characteristics.

At NCES, several studies include background questions for adults about their sexual orientation and gender identity. These are the High School Longitudinal Study of 2009 (HSLS:09) Second Follow-up in 2016, the Baccalaureate and Beyond Longitudinal Study (B&B) 08/18 and 16/21 collections, the National Postsecondary Student Aid Study (NPSAS) in 2020, and the Beginning Postsecondary Students Longitudinal Study (BPS) 2020/22 (see the table below for more details about these surveys).


 


The collection of these data allows NCES to describe the experiences of gender and sexual minority individuals. For example:

  • In 2020, postsecondary students who identified as genderqueer, gender nonconforming, or a different identity had difficulty finding safe and stable housing at three times the rate (9 percent) of students who identified as male or female (3 percent each).1
     
  • In 2018, about 10 years after completing a 2007–08 bachelor’s degree, graduates who were gender minorities2 described their financial situations. Graduates who were gender minorities were less likely to own a home (31 percent) or hold a retirement account (74 percent) than graduates who were not gender minorities (63 percent and 87 percent, respectively) (figure 1).3  

Figure 1. Percentage of 2007–08 bachelor’s degree recipients who owned a home, had a retirement account, reported negative net worth, and did not meet essential expenses in the past 12 months, by gender minority status in 2018

NOTE: “Retirement account” includes both employer-based retirement accounts such as 401(k), 403(b), and pensions, and non-employer-based retirement accounts such as individual retirement accounts. Respondents are considered to have negative net worth if they would still be in debt after selling all their major possessions, turning all their investments and other assets into cash, and paying off as many debts as they could. “Did not meet essential expenses” refers to being unable to meet essential living expenses such as mortgage or rent payments, utility bills, or important medical care. “Past 12 months” refers to any of the 12 months preceding the interview. Gender minority indicates whether the respondent’s gender identity differed from the sex assigned at birth. Gender identity categories include male; female; transgender, male-to-female; transgender, female-to-male; genderqueer or gender nonconforming; a different gender identity; and more than one gender identity.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/18 Baccalaureate and Beyond Longitudinal Study (B&B:08/18).


  • In the 2017–18 school year, 18 percent of public schools had a recognized student group that promoted the acceptance of students’ sexual orientation and gender identity, such as a Gay-Straight Alliance (GSA). This was an increase from the 2015–16 school year, in which 12 percent of schools reported having a GSA.4
     
  • Among 2007–08 bachelor’s degree recipients who had a full-time job in 2018, straight graduates reported higher average salaries than either lesbian/gay or bisexual graduates.

NCES is committed to collecting data about equity in education and describing the experiences of sexual and gender minority students, graduates, and educators.

To learn more about the research conducted at NCES and across the federal statistical system on the measurement of SOGI, please visit the Federal Committee on Statistical Methodology (FCSM) website and check out these two presentations from the FCSM 2022 Research and Policy Conference: How do you Describe Yourself in the Workplace? Asking Teachers about their Sexual Orientation and Gender Identity in a School Survey and Assessing Open-Ended Self-Reports of Sexual Orientation and Gender Identity: Is There Room For Improvement?.

 

By Maura Spiegelman and Elise Christopher, NCES


[1] U.S. Department of Education, National Center for Education Statistics, 2019–20 National Postsecondary Student Aid Study (NPSAS:20, preliminary data).

[2] On the NCES surveys mentioned above, gender identity categories include male; female; transgender, male-to-female; transgender, female-to-male; genderqueer or gender nonconforming; a different gender identity; and more than one gender identity.

[3] U.S. Department of Education, National Center for Education Statistics, 2008/18 Baccalaureate and Beyond Longitudinal Study (B&B:08/18).

[4] U.S. Department of Education, National Center for Education Statistics, 2015–16 and 2017–18 School Survey on Crime and Safety (SSOCS).

U.S. Is Unique in Score Gap Widening in Mathematics and Science at Both Grades 4 and 8: Prepandemic Evidence from TIMSS

Tracking differences between the performance of high- and low-performing students is one way of monitoring equity in education. These differences are referred to as achievement gaps or “score gaps,” and they may widen or narrow over time.

To provide the most up-to-date international data on this topic, NCES recently released Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS. This interactive web-based Stats in Brief uses data from the Trends in International Mathematics and Science Study (TIMSS) to explore changes between 2011 and 2019 in the score gaps between students at the 90th percentile (high performing) and the 10th percentile (low performing). The study—which examines data from 47 countries at grade 4, 36 countries at grade 8, and 29 countries at both grades—provides an important picture of prepandemic trends.
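To make the score gap measure concrete, here is a minimal Python sketch of how a 90th-minus-10th-percentile gap, and its change between two assessment years, could be computed. The simulated scores and the score_gap function are illustrative assumptions only; published TIMSS estimates are produced from plausible values with sampling weights and proper variance estimation, which this simplified sketch omits.

```python
import numpy as np

def score_gap(scores):
    """Return the 90th-minus-10th-percentile gap for one year's scores."""
    p90, p10 = np.percentile(scores, [90, 10])
    return p90 - p10

# Simulated score distributions for illustration only (not actual TIMSS data)
rng = np.random.default_rng(0)
scores_2011 = rng.normal(loc=540, scale=75, size=5000)
scores_2019 = rng.normal(loc=538, scale=85, size=5000)

gap_2011 = score_gap(scores_2011)
gap_2019 = score_gap(scores_2019)
change = gap_2019 - gap_2011  # positive = gap widened, negative = gap narrowed
print(f"2011 gap: {gap_2011:.0f}; 2019 gap: {gap_2019:.0f}; change: {change:+.0f}")
```

Because the gap is defined by two percentiles, a change in the gap can come from movement at the top of the distribution, the bottom, or both, which is the decomposition the brief examines.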

This Stats in Brief also provides new analyses of the patterns in score gap changes over the last decade. The focus on patterns sheds light on which part of the achievement distribution may be driving change, which is important for developing appropriate policy responses. 


Did score gaps change in the United States and other countries between 2011 and 2019?

In the United States, score gaps consistently widened between 2011 and 2019 (figure 1). In fact, the United States was the only country (of 29) where the score gap between high- and low-performing students widened in both mathematics and science at both grade 4 and grade 8.


Figure 1. Changes in score gaps between high- and low-performing U.S. students between 2011 and 2019

Horizontal bar chart showing changes in score gaps between high- and low-performing U.S. students between 2011 and 2019

* p < .05. Change in score gap is significant at the .05 level of statistical significance.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


For any given grade and subject combination, no more than a quarter of participating countries had a score gap that widened, and no more than a third had a score gap that narrowed—further highlighting the uniqueness of the U.S. results.


Did score gaps change because of high-performing students, low-performing students, or both?

At grade 4, score gaps widened in the United States between 2011 and 2019 due to decreases in low-performing students’ scores, while high-performing students’ scores did not measurably change (figure 2). This was true for both mathematics and science and for most of the countries where score gaps also widened.


Figure 2. Changes in scores of high- and low-performing U.S. students between 2011 and 2019

Horizontal bar chart showing changes in scores of high- and low-performing U.S. students between 2011 and 2019 and changes in the corresponding score gaps

* p < .05. 2019 score gap is significantly different from 2011 score gap.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


Low-performing U.S. students’ scores also dropped in both subjects at grade 8, but at this grade, they were accompanied by rises in high-performing students’ scores. This pattern—where the two ends of the distribution move in opposite directions—led to the United States’ relatively large changes in score gaps. Among the other countries with widening score gaps at grade 8, this pattern of divergence was not common in mathematics but was more common in science.

In contrast, in countries where the score gaps narrowed, low-performing students’ scores generally increased. In some cases, the scores of both low- and high-performing students increased, but the scores of low-performing students increased more.

Countries with narrowing score gaps typically also saw their average scores rise between 2011 and 2019, demonstrating improvements in both equity and achievement. This was almost never the case in countries where the scores of low-performing students dropped, highlighting the global importance of not letting this group of students fall behind.  


What else can we learn from this TIMSS Stats in Brief?

In addition to providing summary results (described above), this interactive Stats in Brief allows users to select a subject and grade to explore each of the study questions further (exhibit 1). Within each selection, users can choose either a more streamlined or a more expanded view of the cross-country figures and walk through the findings step-by-step while key parts of the figures are highlighted.


Exhibit 1. Preview of the Stats in Brief’s Features

Image of the TIMSS Stats in Brief web report


Explore NCES’s new interactive TIMSS Stats in Brief to learn more about how score gaps between high- and low-performing students have changed over time across countries.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on TIMSS data releases and resources.

 

By Maria Stephens and Ebru Erberber, AIR; and Lydia Malley, NCES

You’ve Been Asked to Participate in a Study

Dear reader,

You’ve been asked to participate in a study.

. . . I know what you’re thinking. Oh, great. Another request for my time. I am already so busy.

Hmm, if I participate, what is my information going to be used for? Well, the letter says that collecting data from me will help researchers study education, and it says something else about how the information I provide would “inform education policy . . .”

But what does that mean?

If you’re a parent, student, teacher, school administrator, or district leader, you may have gotten a request like this from me or a colleague at the National Center for Education Statistics (NCES). NCES is one of 13 federal agencies that conduct survey and assessment research to help federal, state, and local policymakers better understand public needs and challenges. It is the U.S. Department of Education’s (ED’s) statistical agency and fulfills a congressional mandate to collect, collate, analyze, and report statistics on the condition of American education. The law also directs NCES to do the same for education around the globe.

But how does my participation in a study actually support the role Congress has given NCES?

Good question. When NCES conducts a study, participants are asked to provide information about themselves, their students or child/children, teachers, households, classrooms, schools, colleges, or other education providers. What exactly you will be asked about is based on many considerations, including previous research and policy needs. For example, a current policy might be based on results from an earlier study, and we need to see whether those results are still relevant. Or the topic may not have been studied before, and data are needed to determine policy options. In some cases, Congress has charged NCES with collecting data so that it can better understand education in general.

Data collected from participants like you are combined so that research can be conducted at the group level. Individual information is not the focus of the research. Instead, NCES is interested in the experiences of groups of people or groups of institutions—like schools—based on the collected data. To protect respondents, personally identifiable information like your name (and other information that could identify you personally) is removed before data are analyzed and is never provided to others. This means that people who participate in NCES studies are grouped in different ways, such as by age or type of school attended, and their information is studied to identify patterns of experiences that people in these different groups may have had.
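As a rough illustration of what analysis at the group level looks like, the sketch below drops direct identifiers from a small set of made-up records and then summarizes an outcome by group. The column names and values are hypothetical, not actual NCES variable names, and this is not NCES’s processing pipeline; the point is simply that identifying information is removed and results are reported for groups, never for individuals.

```python
import pandas as pd

# Hypothetical respondent-level records; names and variables are invented
# for illustration and are not drawn from any NCES study.
responses = pd.DataFrame({
    "name":        ["A. Smith", "B. Jones", "C. Lee", "D. Kim"],
    "email":       ["a@example.org", "b@example.org", "c@example.org", "d@example.org"],
    "school_type": ["public", "public", "private", "public"],
    "age_group":   ["14-15", "16-17", "14-15", "16-17"],
    "homework_hours_per_week": [4, 7, 6, 9],
})

# Step 1: remove direct identifiers before any analysis.
deidentified = responses.drop(columns=["name", "email"])

# Step 2: study the data at the group level, for example average homework
# hours by school type and age group, rather than any one person's answers.
summary = (
    deidentified
    .groupby(["school_type", "age_group"])["homework_hours_per_week"]
    .mean()
)
print(summary)
```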

Let’s take a look at specific examples that show how data from NCES studies provide valuable information for policy decisions.

When policymakers are considering how data can inform policy—either in general or for a specific law under consideration—data from NCES studies play an important role. For example, policymakers concerned that students in their state/district/city often struggle to pay for college may be interested in this question:

“What can education data tell me about how to make college more affordable?”

Or policymakers further along in the law development process might have more specific ideas about how to help low-income students access college. They may have come across research linking programs such as dual enrollment—when high school students take college courses—to college access for underrepresented college students. An example of this research is provided in the What Works Clearinghouse (WWC) dual-enrollment report produced by ED’s Institute of Education Sciences (IES), which shows that dual-enrollment programs are effective at increasing students’ access to and enrollment in college and attainment of degrees. This was found to be the case especially for students typically underrepresented in higher education.

Then, these policymakers might need more specific questions answered about these programs, such as:

“What is the benefit of high school students from low-income households also taking college courses?”

Thanks to people who participate in NCES studies, we have the data to address such policy questions. Rigorous research using data from large datasets, compiled from many participants, can be used to identify differences in outcomes between groups. In the case of dual-enrollment programs, college outcomes for dual-enrollment participants from low-income households can be compared with those of dual-enrollment participants from higher-income households, and possible causes of those differences can be investigated.
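As a simple sketch of the kind of group comparison described above, the example below compares made-up degree-completion counts for dual-enrollment participants from low-income and higher-income households using a standard two-proportion z-test. All counts are hypothetical and for illustration only; an actual analysis of NCES survey data would also account for sampling weights and the survey design.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical counts for dual-enrollment participants (not real NCES data)
completed_low, total_low = 180, 300      # from low-income households
completed_high, total_high = 260, 350    # from higher-income households

rate_low = completed_low / total_low
rate_high = completed_high / total_high

# Two-proportion z-test using the pooled completion rate
pooled = (completed_low + completed_high) / (total_low + total_high)
se = sqrt(pooled * (1 - pooled) * (1 / total_low + 1 / total_high))
z = (rate_high - rate_low) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

print(f"Completion rate, low-income participants: {rate_low:.0%}")
print(f"Completion rate, higher-income participants: {rate_high:.0%}")
print(f"Difference: {rate_high - rate_low:.0%} (z = {z:.2f}, p = {p_value:.3f})")
```

A difference like this would then prompt the kind of follow-up investigation of possible causes described above.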

The results of these investigations may then inform enactment of laws or creation of programs to support students. In the case of dual enrollment, grant programs might be set up at the state level for districts and schools to increase students’ local access to dual-enrollment credit earning.

This was very close to what happened in 2012, when I was asked by analysts in ED’s Office of Planning, Evaluation, and Policy Development to produce statistical tables with data on students’ access to career and technical education (CTE) programs. Research, as reviewed in the WWC dual-enrollment report, was already demonstrating the benefits of dual enrollment for high school students. Around 2012, ED was considering a policy that would fund the expansion of dual enrollment specifically for CTE. The reason I was asked to provide tables on the topic was my understanding of two important NCES studies, the Education Longitudinal Study of 2002 (ELS:2002) and the High School Longitudinal Study of 2009 (HSLS:09). Data provided by participants in those studies were ideal for studying the question. The tables were used to evaluate policy options. Based on the results, ED, through the President, made a budget request to Congress to support dual-enrollment policies. Ultimately, dual-enrollment programs were included in the Strengthening Career and Technical Education for the 21st Century Act (Perkins V).  

The infographic below shows that this scenario—in which NCES data provided by participants like you informed policy—has played out on different scales for different policies many times over the past few decades. The examples included are just some of those from the NCES high school longitudinal studies. NCES data have been used countless times in the agency’s 154-year history to improve education for American students. Check out the full infographic (PDF) with other examples.


Excerpt of full infographic showing findings and actions for NCES studies on Equity, Dropout Prevention, and College and Career Readiness


However, it’s not always the case that a direct line can be drawn between data from NCES studies and any one policy. Research often informs policy indirectly by educating policymakers and the public they serve on critical topics. Sometimes, as in the dual-enrollment and CTE programs research question I investigated, it can take time before policy gets enacted or a new program rolls out. This does not lessen the importance of the research, nor the vital importance of the data participants provide that underpin it.

The examples in the infographic represent experiences of actual individuals who took the time to tell NCES about themselves by participating in a study.  

If you are asked to participate in an NCES study, please consider doing so. People like you, schools like yours, and households in your town do matter—and by participating, you are helping to inform decisions and improve education across the country.

 

By Elise Christopher, NCES

Culturally Responsive Language and Literacy Enrichment for Native American Children

As part of our recognition of Native American Heritage Month, we asked Diane Loeb to discuss her IES-funded research on culturally responsive language and literacy enrichment for Native American children.

Language development and exposure to early literacy are critical to a child’s academic success. Speaking and listening skills are necessary to navigate learning at every level of school. According to NCES, the American Indian/Alaska Native population has the highest percentage of students who receive services under the Individuals with Disabilities Education Act. There continues to be a significant need for Native American speech-language pathologists and audiologists, culturally sensitive assessment tools, and intervention approaches.

In 2006, I had the privilege of working with ten Native American college students who were recruited to the University of Kansas for the speech-language pathology and audiology master’s program. The students were from tribes across the country and varied greatly in their undergraduate preparation and world experiences. One thing they had in common is that they wanted to make a difference in the lives of others—in particular, those who needed help with their speech, language, hearing, and related difficulties. As a result of working with these amazing students, I learned about their families, their customs, and their dreams. I also became painfully aware of the historical trauma Native Americans experience as a result of genocide, colonialism, and racism. In the twentieth century, Native Americans were sent to boarding schools and deprived of their language, culture, and families.

As the students advanced in their academic studies and clinical work, it became clear to me that there were very few resources for identifying and intervening with language delay and language disorder in Native American children. Under- and over-identification for special education services were both real risks, given our limited understanding of Native American history, levels of family assimilation, and inter-tribal differences. Although there were a handful of articles related to conducting assessments, very few studies addressed culturally sensitive and responsive intervention, in which children’s cultural values and beliefs, experiences, and ways of learning guide the assessment and intervention. The lack of culturally responsive tools for Native Americans propelled me to write a grant proposal to IES for a culturally authentic intervention that would be meaningful, sensitive, and respectful of Native American culture.

As a result of the IES grant we received, we developed a culturally based language and vocabulary intervention for Native American kindergarten children at risk for speech and language impairment, as well as a training program for teachers and speech-language pathologists. Language and literacy lessons were based on positive stories about Native Americans in storybooks, and storytelling was taught through shared reading. Native American adults from the Native American school we were working with examined our materials to ensure that our activities were in line with the values and beliefs of the participating children. Pilot testing suggested that students made gains in literacy and language skills following the intervention.

My colleague, Grace McConnell, and I recently published an in-depth analysis of the narratives produced by the children in our initial studies. We found distinct trends in narrative structure and evaluative comments depending on student age and whether visual supports were provided. These findings highlight the importance of culturally responsive language and literacy interventions for Native American children, and there remains a great need for them. From my work, I have learned several lessons that may be useful to current and future researchers. The three most salient to me are:

  • Include members of the tribe with whom you are working as part of the process of developing assessments and interventions for children who are Native American. This helps to ensure that your assessments and interventions are culturally sensitive.
  • Develop authentic materials that are culturally relevant, sensitive, and meaningful. We found several books with positive cultural lessons, such as respecting the earth, working together, and harmony with others and nature.
  • Remember that tribes can differ substantially from one another and that families within a given tribe may differ in their cultural values and beliefs. When we designed literacy and language units around Native American storybooks, the stories often were related to specific tribes (such as Navajo or Apache). This gave us the opportunity to discuss different tribes in various parts of the country and gave the children a chance to learn about another tribe and compare its customs and beliefs with their own. Students also learned about different family practices within their own tribe by sharing their family experiences with other children.

Following my work with Native American students and children, I pursued grant and research opportunities focused on the development of children born preterm of all races/ethnicities. I am working with neonatologists and nurses on studies to improve the developmental outcomes of children born preterm. Approximately 25% of children born preterm are later diagnosed with language delay or language disorder. I am currently designing NICU interventions to facilitate language, cognitive, motor, and social interaction skills that support academic success. A future goal is to focus my intervention work with Native American infants born preterm and their families. Providing facilitation of language and literacy early in development for these at-risk infants may be key for their later academic success.

Diane Loeb (Diane_Loeb@Baylor.edu) is the Martin Family Endowed Chair of Communication Sciences and Disorders and Department Chair at the Robbins College of Health and Human Sciences at Baylor University in Waco, Texas. She is a first-generation college graduate. This research was conducted while she was an Associate Professor at the University of Kansas in Lawrence, KS.

This guest blog was produced by Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council, and Amy Sussman (Amy.Sussman@ed.gov), NCSER Program Officer.