IES Blog

Institute of Education Sciences

Representation Matters: Exploring the Role of Gender and Race on Educational Outcomes

This year, Inside IES Research is publishing a series of interviews showcasing a diverse group of IES-funded education researchers who are making significant contributions to education research, policy, and practice.

 

The process of education transmits sociocultural values to learners in addition to information and knowledge. How individuals are represented in curricula and instructional materials can teach students about their place in the world. This can either perpetuate existing systemic inequalities or, conversely, provide a crucial counternarrative to them. With an exploration grant from IES, Anjali Adukia (University of Chicago) and Alex Eble (Teachers College, Columbia University) are exploring how representation and messages about gender and race in elementary school books may influence students' education outcomes over time. The researchers will develop and use machine-learning tools that leverage text and image analysis techniques to identify gender- and race-based messages in commonly used elementary school books.

 

Interview with Anjali Adukia, University of Chicago

Tell us how your research contributes to a better understanding of the importance of diversity, equity, and/or inclusion in education.

In my work, I seek to understand how to reduce inequalities such that children from historically (or contemporaneously) marginalized backgrounds have equal opportunities to fully develop their potential. I examine factors that motivate and shape behavior, preferences, and educational decision-making, with a particular focus on early-life influences. Proceeding from the notion that children are less likely to be able to focus on learning until their basic needs are met, my research uses both econometric methods and qualitative approaches to understand the specific roles different basic needs play in making these decisions. My research, for example, has explored the role of safety and health (sanitation, violence), economic security (road construction, workfare), justice (restorative practices), and representation (children’s books), particularly for marginalized groups.

 

As a woman and a minority, how have your background and experiences shaped your career?

My research is informed and influenced by my own experiences. When I was a child, I never understood why there weren’t more characters that looked like me or when there were, why they had such limited storylines. For me personally, the motivation underlying our IES-funded project was borne out of my lived experience of always searching for content that reflected who I was. I think of representation as a fundamental need: if you don’t see yourself represented in the world around you, it can limit what you see as your potential; and similarly, if you don’t see others represented, it can limit what you see as their potential; and if you only see certain people represented, then this shapes your subconscious defaults.

It was a real watershed moment when I realized that academia allowed me to pursue many of my larger goals in life, in which I hope to meaningfully improve access to opportunities and outcomes for children – especially those from disadvantaged backgrounds. I hope to accomplish this in various ways, paying forward the many kindnesses generously given to me by: (1) producing rigorous policy-relevant evidence that expands our understanding of big questions and opens new avenues for inquiry; (2) translating my research such that it helps inform policymakers and practitioners in the design of school policies and practices; (3) understanding issues with a depth and sophistication that comes from “on-the-ground” insights, knowledge cultivated in multiple disciplines using different methodologies, introspection, humility, and courage; (4) directly working with government agencies, non-profit organizations, and community groups to positively inform policies; (5) contributing to the larger public discourse; and (6) training and advising students to have the fortitude to ask hard questions, to be able to defend different perspectives on issues, to learn that knowledge brings more questions than answers, and to be willing to take risks and fail (and in the process, I will certainly learn more from them than they will ever be able to learn from me).

 

What has been the biggest challenge you encountered and how did you overcome the challenge?

Life is always filled with challenges, but one challenge starting from when I was young was to feel comfortable in my own skin and to find legitimacy in my own voice. I grew up as an Indian-American daughter of Hindu immigrants in a rural, predominantly white and Christian setting. I was different from the other kids and did not always feel like I fit in. I remember literally trying to erase my skin hoping that it would make it lighter. I found the helpers, as my parents (and Mr. Rogers) would suggest, and tried to focus on the voices that lifted me up – my family, teachers, other mentors, those friends who loved me no matter my differences. My mother always told me to find the kindness, the good, the love in people; to find the common ground and to embrace and learn from the differences. I surrounded myself with love, focusing on what I had and on what I could do rather than what society was telling me I couldn’t do. I turned to concentrating on things that mattered to me, that drove me. I don’t think there is a single challenge in life that I overcame alone. I have been very lucky, and I am deeply grateful for the many gifts in my life, the many loved ones – family, friends, colleagues, mentors, healthcare workers – who have lifted me up, and the opportunities that came my way.

 

How can the broader education research community better support the needs of underrepresented, minority scholars?

The notion of what is considered to be an important question is often driven by the senior scholars in a field, for example, the people considered to be “giants.” Demographically, this small set of leading scholars has historically consisted of people from the most highly represented groups (particularly in economics). And because the field is thus shaped mainly by researchers from a “dominant” group background, the key questions being pursued may not always reflect the experiences or concerns of people from underrepresented backgrounds. Education research has pockets where these different perspectives are being considered, but it can continue to evolve by becoming more open to approaches thought to be less traditional or to questions not typically asked (or asked from a different point of view). Expanding the notion of what is considered important, rigorous research can be difficult and cause growing pains, but it will help expand our knowledge to incorporate more voices.

 

What advice would you give to emerging scholars from underrepresented, minority backgrounds who are pursuing a career in education research?

Keep a journal of questions that arise and topics that pique your curiosity and interest. Soon, you will find questions in the fabric of everyday life, and you will start to articulate the wonder you see in the world around you and what inspires you to action, to understand the universe further. I find that when I return to past writings and journal entries, I am reminded of questions that have ignited my fires and see some of the common themes that emerge over time. Find your voice and know that your voice and views will grow and evolve over time. There are so many interesting and important questions one can pursue. Most importantly, you have to be true to yourself, your own truth. Find circles of trust in which you can be vulnerable. Draw strength from your struggle. There is deep truth and knowledge within you.

 


Dr. Anjali Adukia is an Assistant Professor at the University of Chicago Harris School of Public Policy and the College.

This interview was produced by Christina Chhin (Christina.Chhin@ed.gov), Program Officer, National Center for Education Research.

 

Better Reading Comprehension When You Know That You Don’t Know

The more you already know about a topic, the easier it may be to comprehend and learn from texts about that topic. But knowledge has to start somewhere. So how can we help students learn from texts when they may have low background knowledge?

In their exploratory study, researchers from ETS found that lack of knowledge is not necessarily a barrier to comprehension. Rather, they suggest that students who can identify their lack of background knowledge are more likely to comprehend and learn new information than students who do not acknowledge they lack background knowledge. In other words, knowing that you might not know may lead to better outcomes.

To determine the role of background knowledge, the researchers pretested middle and high school students’ background knowledge through questions related to topics the students may have some but not complete knowledge of, such as ecology, immigration, and wind power. The pretest included an “I don’t know” option, along with correct and incorrect responses.

Students then took a scenario-based assessment in which they read multiple sources about each of the topics. This type of assessment mirrors real-world learning by encouraging readers to build their own interpretations of a topic, which helps researchers determine whether students comprehend what they read.

They found that students who selected “I don’t know” when answering background knowledge questions had a better understanding of the content than those who provided wrong answers to these questions. In fact, students who selected “I don’t know” rather than answering incorrectly were nearly three times as likely to learn from sources that provided the correct information as students who had answered the pretest incorrectly. Students who selected “I don’t know” may also learn more than students with comparably weak background knowledge. The researchers suggest that the “I don’t know” readers may have set different reading goals before engaging with the sources than those who guessed incorrectly.

 

Possible Implications for Teaching and Learning

The results from this work support the idea that having and building background knowledge is key. Thus, teachers may want to assess existing knowledge and address knowledge gaps prior to instruction.

Teachers may also want to provide an “I don’t know” option or options that allow students to rate their level of certainty. Doing so may help teachers distinguish students who recognize their own gaps in knowledge from those who may not be aware that they are wrong or that they simply do not know. This latter group of students may need more help in determining the accuracy of their judgments or may have incorrect knowledge that could interfere with learning.

The researchers further suggest that teachers may want to go beyond the role of background knowledge by teaching students how to set appropriate reading goals and use strategic reading approaches to learn new facts or correct existing misunderstandings.

 


The research reported here was conducted under NCER grant R305A150176: What Types of Knowledge Matters for What Types of Comprehension? Exploring the Role of Background Knowledge on Students' Ability to Learn from Multiple Texts.

This blog was written by Dr. Meredith Larson. Contact her for more information about this project.

New International Data Show Large and Widening Gaps Between High- and Low-Performing U.S. 4th- and 8th-Graders in Mathematics and Science

NCES recently released results from the 2019 Trends in International Mathematics and Science Study (TIMSS). TIMSS tests students in grades 4 and 8 in mathematics and science every 4 years. The results show that

  • Across both subjects and grades, the United States scored, on average, in the top quarter of the education systems that took part in TIMSS 2019.
    • Among the 64 education systems that participated at grade 4, the United States ranked 15th and 8th in average mathematics and science scores, respectively.
    • Among the 46 education systems that participated at grade 8, the United States ranked 11th in average scores for both subjects.
  • On average, U.S. scores did not change significantly between the 2011 and 2019 rounds of TIMSS.

Average scores are one measure of achievement in national and international studies. However, they provide a very narrow perspective on student performance. One way to look more broadly is to examine differences in scores (or “score gaps”) between high-performing students and low-performing students. Score gaps between high performers and low performers can be one indication of equity within an education system. Here, high performers are those who scored in the 90th percentile (or top 10 percent) within their education system, and low performers are those who scored in the 10th percentile (or bottom 10 percent) within their education system.
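As a rough illustration of this definition (using synthetic score distributions, not actual TIMSS data), the 90th-to-10th percentile gap can be computed directly from a system's scores. The sketch below also shows why the gap is a distinct lens from the average: a system can post a higher mean while having a narrower gap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical score distributions for two illustrative "education systems".
# TIMSS scores are reported on a 0-1,000 scale with a centerpoint of 500
# and a standard deviation of 100; the parameters below are made up.
system_a = rng.normal(loc=515, scale=90, size=5000)
system_b = rng.normal(loc=530, scale=70, size=5000)

def score_gap(scores):
    """90th-to-10th percentile score gap, as defined in the post."""
    p90, p10 = np.percentile(scores, [90, 10])
    return p90 - p10

# System B has the higher average score AND the narrower gap --
# the pattern the post describes for systems that outscored the U.S.
print(f"A: mean={system_a.mean():.0f}, gap={score_gap(system_a):.0f}")
print(f"B: mean={system_b.mean():.0f}, gap={score_gap(system_b):.0f}")
```

For a normal distribution the 90th-to-10th percentile gap is roughly 2.56 standard deviations, so the wide-gap system here lands near 230 points, in the same ballpark as the gaps reported for TIMSS grade 8 mathematics.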

In 2019, while some education systems had a higher average TIMSS score than the United States, none of these education systems had a wider score gap between their high and low performers than the United States. This was true across both subjects and grades.

Figure 1 shows an example of these findings using the grade 8 mathematics data. The figure shows that 17 education systems had average scores that were higher than, or not statistically different from, the U.S. average score.

  • Of these 17 education systems, 13 had smaller score gaps between their high and low performers than the United States. The score gaps in 4 education systems (Singapore, Chinese Taipei, the Republic of Korea, and Israel) were not statistically different from the score gap in the United States.
  • The score gaps between the high and low performers in these 17 education systems ranged from 170 points in Quebec, Canada, to 259 points in Israel. The U.S. score gap was 256 points.
  • If you are interested in the range of score gaps across all 46 education systems in the TIMSS 2019 grade 8 mathematics assessment, see Figure M2b of the TIMSS 2019 U.S. Highlights Web Report, released in December 2020. This report also includes these results for grade 8 science and for both subjects at grade 4.

Figure 1. Average scores and 90th to 10th percentile score gaps of grade 8 students on the TIMSS mathematics scale, by education system: 2019

NOTE: This figure presents only those education systems whose average scores were similar to or higher than the U.S. average score. Scores are reported on a scale of 0 to 1,000 with a TIMSS centerpoint of 500 and standard deviation of 100.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS), 2019.


From 2011 to 2019, U.S. average scores did not change significantly. However, the scores of low performers decreased, and score gaps between low and high performers grew wider in both subjects and grades. In addition, at grade 8, there was an increase in the scores of high performers in mathematics and science over the same period. These two changes contributed to the widening gaps at grade 8.

Figure 2 shows these results for the U.S. grade 8 mathematics data. Average scores in 2011 and 2019 were not significantly different. However, the score of high performers increased from 607 to 642 points between 2011 and 2019, while the score of low performers decreased from 409 to 385 points. As a result, the score gap widened from 198 to 256 points between 2011 and 2019. In addition, the 2019 score gap for grade 8 mathematics is significantly wider than the gaps for all previous administrations of TIMSS.


Figure 2. Trends in average scores and selected percentile scores of U.S. grade 8 students on the TIMSS mathematics scale: Selected years, 1995 to 2019

* p < .05. Significantly different from the 2019 estimate at the .05 level of statistical significance.

NOTE: Scores are reported on a scale of 0 to 1,000 with a TIMSS centerpoint of 500 and standard deviation of 100.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS), 1995, 1999, 2003, 2007, 2011, 2015, 2019.


These TIMSS findings provide insights regarding equity within the U.S. and other education systems. Similar results from the National Assessment of Educational Progress (NAEP) show that mathematics scores at both grades 4 and 8 decreased or did not change significantly between 2009 and 2019 for lower performing students, while scores increased for higher performing students. More national and international research on the gap between high- and low-performing students could help inform important education policy decisions that aim to address these growing performance gaps.

To learn more about TIMSS and the 2019 U.S. and international results, check out the TIMSS 2019 U.S. Highlights Web Report and the TIMSS 2019 International Results in Mathematics and Science. A recording is also available for a RISE Webinar from February 24, 2021 (What Do TIMSS and NAEP Tell Us About Gaps Between High- and Low-Performing 4th and 8th Graders?) that explores these topics further. 

 

By Katie Herz, AIR; Marissa Hall, AIR; and Lydia Malley, NCES

The Importance of Partnering with Practitioners in English Learner Research

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In FY 2014, Dr. Karen Thompson was awarded a grant for The Oregon English Learner Alliance: A Partnership to Explore Factors Associated with Variation in Outcomes for Current and Former English Learners in Oregon to determine best practices to support academic achievement among current and former English learners. Dr. Thompson and her colleagues wrote a guest blog post describing the work that the partnership undertook to better understand and improve the performance of English learners in Oregon. In this blog, we interviewed Dr. Thompson—three years after the end of the grant—to get her perspectives on the partnership, outcomes of their work, and where things currently stand.

 

What was the purpose of your research and what led you to do this work?

When I came to Oregon from California in 2012, there was growing momentum in the state to better understand and meet the needs of the state’s multilingual student population, particularly students classified as English learners (ELs). The state had developed an ambitious EL strategic plan, which included a variety of goals and action steps, such as identifying model programs and sharing best practices. I noticed that Oregon did not have publicly available information about the state’s former EL students. In prior work, other researchers and I had demonstrated that analyzing data only about students currently classified as English learners without also analyzing data about former EL students can provide incomplete and misleading information. Therefore, for Oregon to realize its goals and truly understand which programs and practices were most effectively educating its multilingual students, the state needed to make changes to its data systems. This was the seed that led to the Oregon Department of Education/Oregon State University English Language Learner Partnership. Our first goal was to simply determine how many former EL students there were in the state. Then, once the state had created a flag to identify former EL students, we were able to conduct a wide range of analyses to better understand opportunities and outcomes for both current and former EL students in ways that have informed state reporting practices and policy decisions.

 

How does this research differ from other work in the field? Why do you think partnerships with practitioners were necessary to carry out the work?

When we began our partnership, collecting and analyzing information about both current and former EL students was not common. Happily, more and more researchers and education agencies have now adopted these approaches, and we think our partnership has helped play a role in this important and illuminating shift.  

It was crucial to conduct this work via partnerships between researchers and practitioners. Practitioner partners had deep knowledge of the state’s current data systems, along with knowledge about which reporting and analysis practices could shift to incorporate new information about current and former EL students. Research partners had the bandwidth to conduct additional analyses and to lead external dissemination efforts. Our regular partnership meetings enabled our work to evolve in response to new needs. 

 

What do you think was the most important outcome of your work and why?

I think the most important outcome of our work is that educators across Oregon now have information about both their current and former English learner students and can use this data to inform policy and practice decisions. Other analyses we conducted have also informed state actions. For example, our analysis of how long it takes Oregon EL students to develop English proficiency and exit EL services informed the state’s EL progress indicator under the Every Student Succeeds Act.

 

What are the future directions for this work?

Our IES-funded partnership led to funding from the Spencer Foundation to do further research about EL students with disabilities in Oregon, which has impacted practices in the state. In addition, I am excited to be one of the collaborators in the new IES-funded National Research and Development Center to Improve Education for Secondary English Learners (PI: Aída Walquí, WestEd). As part of the Center’s research, I am working with colleagues at the University of Oregon and the University of California, Los Angeles to analyze malleable factors impacting content-course access and achievement for secondary EL students. We are collaborating with four states in this work, and as in our ODE/OSU partnership, we will be analyzing data for both current and former EL students. At a policy level, colleagues and I are involved in conversations about how data collection and reporting at the federal level could also incorporate analysis of data for both current and former EL students, including ways this might inform future reauthorizations of the Elementary and Secondary Education Act.

 

---

Dr. Karen Thompson is an Associate Professor at the College of Education at Oregon State University. Her research focuses on how curriculum and instruction, teacher education, and policy interact to shape the classroom experiences of K-12 multilingual students.

 

Written by Helyn Kim (Helyn.Kim@ed.gov), Program Officer, English Learner Program, National Center for Education Research.

Online Training for the 2019 NHES Early Childhood Program Participation Survey Data and Parent and Family Involvement in Education Survey Data

The NCES National Household Education Survey (NHES) program administered two national surveys in 2019—the Early Childhood Program Participation (ECPP) survey and the Parent and Family Involvement in Education (PFI) survey. The ECPP survey collects information on young children’s care and education, including the use of home-based care with both relatives and nonrelatives and center-based care and education. The survey also examines how well these care arrangements cover parents’ work hours, as well as the costs and location of care, the process of selecting care, and factors that make it difficult to find care. The PFI survey collects information on a range of issues related to how families connect to schools, including information on family involvement with schools, school choice, homeschooling, virtual education, and homework practices.

NCES released data from the 2019 NHES administration on January 28, 2021. For each of the two surveys, this release includes the following:

  • Public-use data files, in ASCII, CSV, SAS, SPSS, Stata, and R
  • Restricted-use data files (in formats listed above and with codebook)
  • Public-Use Data File Codebook
  • Data File User’s Manual (for both public-use and restricted-use files)

That’s a lot of information! How should you use it? We suggest you start by viewing the NHES Distance Learning Dataset Training modules. The modules provide a high-level overview of the NHES program and the data it collects. They also include important considerations to ensure that your analysis takes into account the NHES’s complex sample design (such as applying sampling weights and estimating standard errors).
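To make the weighting point concrete, here is a minimal sketch of the general pattern for surveys like the NHES: a final sample weight for point estimates and a set of jackknife replicate weights for standard errors. The data and all column names below are invented for illustration; consult the Data File User’s Manual for the actual NHES variable names and the variance-estimation procedure it prescribes.

```python
import numpy as np
import pandas as pd

# Toy stand-in for a public-use survey file: one analysis variable, a final
# sample weight, and jackknife replicate weights. Column names here are
# illustrative only, not actual NHES variable names.
rng = np.random.default_rng(1)
n, n_reps = 1000, 80
df = pd.DataFrame({
    "homeschooled": rng.integers(0, 2, n),      # hypothetical 0/1 indicator
    "weight": rng.uniform(50, 150, n),          # hypothetical final weight
})
rep_cols = [f"repwt{i}" for i in range(1, n_reps + 1)]
for c in rep_cols:
    # Replicate weights are perturbed versions of the final weight.
    df[c] = df["weight"] * rng.uniform(0.8, 1.2, n)

def weighted_pct(d, weight_col):
    """Weighted percentage of the 0/1 indicator."""
    return 100 * np.average(d["homeschooled"], weights=d[weight_col])

# Point estimate uses the final weight...
estimate = weighted_pct(df, "weight")

# ...and the standard error comes from the spread of the replicate
# estimates around it (a JK2-style sum of squared deviations; the actual
# formula for a given survey is specified in its user's manual).
reps = np.array([weighted_pct(df, c) for c in rep_cols])
se = np.sqrt(np.sum((reps - estimate) ** 2))

print(f"Estimate: {estimate:.1f}% (SE {se:.2f})")
```

The key design point is that ignoring the weights (or computing naive simple-random-sample standard errors) will generally bias both the estimate and its reported precision, which is exactly what the training modules warn about.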

You should first view the five general NHES modules, which were developed for the 2012 NHES data. These modules are:

  • Introduction to the NHES
  • Getting Started with the NHES Data
  • Data Collected Through the NHES
  • NHES Sample Design, Weights, Variance, and Missing Data
  • Considerations for Analysis of NHES Data

A sixth module explains key changes in the 2019 ECPP and PFI surveys compared to their respective 2012 surveys:

  • Introduction to the 2019 NHES Data Collection

The sixth module also provides links to the 2019 ECPP and PFI data, restricted-use licensing information, and other helpful resources.

Now you are ready to go! If you have any questions, please contact us at NHES@ed.gov.

By Lisa Hudson, NCES