Inside IES Research

Notes from NCER & NCSER

Spotlight on FY 2023 Early Career Grant Awardees: Word-Level Reading Disabilities

NCSER is excited to share the work of our three new Early Career Development and Mentoring Grants Program principal investigators (PIs). The aim of this grant program is to support early career scholars in their academic career trajectories as they pursue research in special education. Through a series of interview blogs, each PI will share their research interests, advice for other early career scholars, and desired impact within the field of special education.

The first scholar we are spotlighting is Kelly Williams, assistant professor in communication sciences and special education at the University of Georgia (formerly at Indiana University). Dr. Williams received a grant to develop an intervention to support reading and spelling outcomes for adolescents with word-level reading disabilities (WLRD).

How did you become interested in this area of research?         

Headshot of Dr. Kelly Williams

I originally became interested in research on WLRD through my experience as a high school special education teacher in rural Georgia where I taught English literature and composition to students with mild to moderate disabilities. Most of my students had difficulty reading and spelling words accurately and automatically, which significantly impacted their performance both in and out of school. In school, my students struggled to complete grade-level coursework, which, in turn, affected their ability to graduate with a regular high school diploma. Outside of school, my students had difficulty with tasks such as completing job applications that required extensive amounts of reading. Although I was well prepared to provide classroom accommodations and modifications for my students, I found that I lacked the knowledge and skills to provide intensive interventions that would help improve basic reading and spelling skills. These experiences ultimately led me to pursue my doctorate in special education with an emphasis on learning disabilities.

What advice do you have for other early career researchers?

I think it is important for early career researchers to collaborate with various stakeholders throughout the entire research process. Although many of my ideas stem from my own experiences as a teacher, I have found that listening to various perspectives has helped me identify problems, brainstorm potential solutions, and design practical interventions that will improve outcomes for students with disabilities. Sustaining effective interventions requires us to think about how we can involve students, teachers, administrators, parents/caregivers, schools, and other community members in research.

What broader impact are you hoping to achieve with your research?

We know low reading achievement is associated with numerous negative outcomes across domains (social, emotional, behavioral, academic, economic). My hope is that this project will provide secondary teachers with a feasible and practical intervention to improve reading outcomes for older students with WLRD, which, in turn, may help prevent or ameliorate the effects of these negative consequences. Ultimately, I envision that this intervention could be used independently or as part of a multi-component reading intervention for secondary students with WLRD.

How will this intervention be distinct from other reading and spelling interventions?

There are two ways that this intervention is distinct from other word reading and spelling interventions. First, this intervention will embed spelling instruction within word reading, which is not currently happening in research or practice for secondary students with WLRD. Many existing programs teach spelling in isolation or through rote memorization, despite a large body of research demonstrating a connection between spelling and word reading. Second, the proposed intervention will emphasize a flexible approach to multisyllabic word reading instead of teaching formal syllable division rules. The goal of this approach is to reduce cognitive load, thereby improving the ability to accurately and automatically read and spell words.

Thank you, Kelly Williams, for your thoughtful insights and commitment to improving reading and spelling among students with word-level reading disabilities. NCSER looks forward to following your work as you progress in developing this intervention.

This blog was produced by Emilia Wenzel, NCSER intern and graduate student at the University of Chicago. Katie Taylor (Katherine.Taylor@ed.gov) is the program officer for NCSER’s Early Career Development and Mentoring program.

What We Are Learning from Research Using NAEP Mathematics Response Process Data

Three students (two using tablets, one using a laptop) sitting at a library table

The National Assessment of Educational Progress (NAEP) is the largest nationally representative and ongoing assessment of subject knowledge among students in public and private schools in the United States. On the 2017 eighth grade mathematics assessment, 38% of students without disabilities scored at the NAEP Proficient level or above while 25% scored below the NAEP Basic level. However, for students with disabilities, math achievement levels were much worse. Only about 9% of students with disabilities scored at the NAEP Proficient level or above whereas 69% scored below the NAEP Basic level. In response to this gap, in 2021, the National Center for Special Education Research (NCSER) released a funding opportunity to coincide with the release of the 2017 Grade 8 NAEP Mathematics response process data. NCSER intended to support research that explores how learners with disabilities interact with the NAEP digital assessment to better support these learners in test-taking environments and determine whether and how that information could be used to inform instructional practices. There is much to learn from research on NAEP process data for understanding test-taking behaviors and achievement of learners with disabilities. Below we showcase the latest findings from currently funded research and encourage more investigators to conduct research with newly released process data.

Since 2017, administrations of NAEP have captured a variety of response process data, including keystrokes as learners progress through the assessment, how learners use the available tools (such as the calculator), and how accommodations (for example, text-to-speech or more time to complete the assessment) affect performance. Besides score data, NAEP datasets also include survey data from learners, teachers, and schools, and information on test item characteristics and student demographics (including disability). Together, these data provide a unique opportunity for researchers to conduct an in-depth investigation of the test-taking behavior and the mathematics competencies of learners with disabilities compared to their peers without disabilities.  
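To make the structure of such merged analyses concrete, here is a minimal Python sketch of comparing median response times across groups. The record layout, field names, and values below are invented for illustration and do not reflect the actual NAEP file formats.

```python
import statistics
from collections import defaultdict

# Hypothetical, simplified records in the spirit of NAEP process data:
# (student_id, disability_status, item_id, response_time_seconds).
records = [
    ("s1", "IEP",  "item_03",  95.0),
    ("s1", "IEP",  "item_07", 160.0),
    ("s2", "none", "item_03",  60.0),
    ("s2", "none", "item_07",  88.0),
    ("s3", "IEP",  "item_03", 120.0),
    ("s3", "IEP",  "item_07", 140.0),
    ("s4", "none", "item_03",  72.0),
    ("s4", "none", "item_07",  95.0),
]

# Group response times by (item, disability status) and compare medians,
# one simple way to contrast test-taking behavior across groups.
times = defaultdict(list)
for _, status, item, seconds in records:
    times[(item, status)].append(seconds)

for (item, status), secs in sorted(times.items()):
    print(f"{item} [{status}]: median {statistics.median(secs):.0f}s "
          f"over {len(secs)} responses")
```

Real analyses would, of course, merge these timing data with scores, accommodations, and survey responses before modeling.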

In July 2021, IES awarded two grants to conduct research using NAEP process data. The results of these projects are expected to improve the future development and administration of digital learning assessments, identify needed enhancements to mathematics instruction, and highlight areas where further research is needed. Although these projects are ongoing, we would like to highlight findings from one of them, awarded to SRI International and led by principal investigator Xin Wei, entitled Analysis of NAEP Mathematics Process, Outcome, and Survey Data to Understand Test-Taking Behavior and Mathematics Performance of Learners with Disabilities.

The findings from this study, recently published in Autism, are an example of the power of process data to shed new light on learners with disabilities. Focusing on autistic students, Xin Wei and her team analyzed data from 15 items on the NAEP math assessment, including students' response times in seconds, their scores on the items (with partial credit included), and survey data related to their enjoyment, interest, and persistence in math. They also analyzed the content of each item, using Flesch Reading Ease scores to measure each item's reading difficulty. Finally, they rated each item on the complexity of any social context it contained, as prior research has shown that such contexts can be more challenging for autistic students. They conducted statistical analyses comparing autistic students with extended-time accommodations, autistic students without accommodations, and general education peers. The researchers looked not only for areas of weakness but also for areas of strength: previous studies have demonstrated that autistic people frequently excel at abstract spatial reasoning and calculation tasks, relying more on visual-mental representations than verbal ones.
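The Flesch Reading Ease formula itself is simple to sketch: 206.835 − 1.015 × (words per sentence) − 84.6 × (syllables per word), with higher scores indicating easier text. The syllable counter below is a naive heuristic (published analyses typically use a validated tool), so treat this as an illustration of the formula rather than the research team's actual procedure.

```python
import re

def count_syllables(word: str) -> int:
    """Rough heuristic: count runs of consecutive vowels, with a
    common adjustment for a silent trailing 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

print(flesch_reading_ease("The cat sat on the mat."))
```

Short, monosyllabic sentences score high (easy), while long words and long sentences drive the score down, which is why linguistically dense word problems register as harder to read.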

The findings showed that, in comparison to their general education peers, unaccommodated autistic students scored higher and solved math problems involving the identification of figures more quickly. Unaccommodated autistic students were also faster than their general education peers at solving the following types of math items: comparing measures using unit conversions, mentally rotating a triangle, interpreting linear equations, and constructing data analysis plots. Although autistic students who used the extended-time accommodation performed lower than the other two groups overall, they had a higher accuracy rate on items involving identifying figures and calculating the diameter of a circle. Both groups of autistic students seemed to perform worse on word problems. The researchers concluded that linguistic complexity could be one reason autistic students struggle with math word problems; however, students also struggled with two word problems that were not linguistically complex but were rated as having substantial social context complexity. The researchers also examined the student survey data on math enjoyment and found that autistic students reported more enjoyment working with shapes and figures and less enjoyment solving equations.

The researchers recommend incorporating meta-cognitive and explicit schema instruction during mathematics instruction to aid autistic students in understanding real-life math word problems. They also recommend that assessment developers consider simplifying the language and social context of math word problems to make the assessment more equitable, fair, and accessible for autistic students. Because the autistic student population is particularly heterogeneous, more research is required to better understand how to improve instructional strategies for them.

IES plans to release the same type of process data from the 2017 Grade 4 NAEP Mathematics assessment at the end of this summer. We encourage researchers to request these process data to conduct research on the test-taking behavior and performance of students with disabilities at the elementary school level. For a source of funding for this work, consider applying to the current Special Education Research Grants competition.

This blog was authored by Sarah Brasiel (Sarah.Brasiel@ed.gov), program officer at NCSER, and Juliette Gudknecht, summer data science intern at IES and graduate student at Teachers College, Columbia University. IES encourages special education researchers to use NAEP response process data for research under the Exploration project type within our standard Special Education Research Grants Program funding opportunity.   

Adult Foundational Skills Research: Reflections on PIAAC and Data on U.S. Adult Skills

In this blog, NCER program officer Dr. Meredith Larson interviews Dr. Holly Xie from NCES about the Program for the International Assessment of Adult Competencies (PIAAC), an OECD-developed international survey of adult skills in literacy, numeracy, and digital problem solving administered at least once a decade. PIAAC also collects information on adult activities (such as skill use at home or work, civic participation, etc.), demographics (such as level of education, race), and other factors (such as health outcomes). To date, NCER has funded three research grants (here, here, and here) and one training grant that relied on PIAAC data.

NCES has led the U.S. efforts in administering PIAAC and has been sharing results for over a decade. Cycle I of PIAAC (PIAAC I) included three waves of data collection in the United States, with the first data released in 2013. From PIAAC I, we learned a wealth of information about the skills of U.S. adults. For example, the 2017 wave of data collection found that the percentages of U.S. adults performing at the lowest levels were 19 percent in literacy, 29 percent in numeracy, and 24 percent in digital problem solving. As we look forward to learning from PIAAC II, Dr. Xie reflects on the products from PIAAC I and possibilities for PIAAC II (to be released in 2024).

What is your role at NCES and with PIAAC specifically?

I am the PIAAC national program manager and oversee all aspects of the PIAAC study in the United States, including development and design, data collection, analysis and reporting, and dissemination/outreach. I also represent the United States at PIAAC international meetings.

What is something you’re particularly excited about having produced during PIAAC I?

I am most excited about the U.S. PIAAC Skills Map. The Skills Map provides information on adult skills at the state and county levels. Users can explore adult skills in literacy and numeracy in their state or county and get estimates of literacy or numeracy proficiency overall and by age and education levels. Or they can compare a county to a state, a state to the nation, or compare counties (or states) to each other. The map also has demographic and socioeconomic data from the American Community Survey (ACS) to provide context for the state or county estimates. This YouTube video demonstrates what the map can do.


We also have other PIAAC web products and publications such as national and international reports, Data Points, and PIAAC publications that provide invaluable information on U.S. adult skills and interrelationships of those skills to other social, economic, and demographic factors.

Do you have examples of how information from PIAAC I has been used?

PIAAC data cover results at the national, state, and county levels, and as such, they can be useful for policymakers or decision makers who would like to know where things stand in terms of the skills of their adult population and where they need to allocate resources at these different levels of the system. In other words, PIAAC data can be useful for drafting targeted policies and programs that will benefit their population and constituencies.

For instance, at the national level, former Vice President Biden used information from PIAAC I in his report Ready to Work for the June 2014 reauthorization of the Workforce Innovation and Opportunity Act, known as WIOA. PIAAC was also cited in the discussion of extending the Second Chance Pell experiment as identified in the 2019 report titled Prisoners’ Eligibility for Pell Grants: Issues for Congress.

The Digital Equity Act of 2021 also leveraged the PIAAC. This legislation identifies particular populations that determine the funding formula. The quick guide to these populations uses PIAAC to estimate one of these populations: Individuals with a language barrier, including individuals who are English learners and have low levels of literacy.

Local governments have also used PIAAC products. For example, the Houston Mayor’s Office for Adult Literacy in collaboration with the Barbara Bush Foundation used the PIAAC Skills Map data to inform the Adult Literacy Blueprint.

And the adult education advocacy group, ProLiteracy, also used the PIAAC and the Skills Map to develop a toolkit for local program adult education and adult literacy program advocacy.

When will the results of PIAAC II be available, and how does this cycle differ from PIAAC I?

PIAAC II data collection began in 2022, and results, to be released in December 2024, will include information on the literacy, numeracy, and adaptive problem-solving skills of adults in the United States. The numeracy assessment now includes a measure of “numeracy components,” which focuses on number sense, smaller/bigger number values, measurement, etc. This information will help us learn more about the skills of adults who have very low numeracy skills. The adaptive problem-solving component is a new PIAAC module that will measure the ability to achieve one’s goals in a dynamic situation in which a method for reaching a solution is not directly available.

PIAAC II will also include, for the first time, questions about financial literacy in the background questionnaire, using items on managing money and tracking spending and income, savings methods, and budgeting. These additional questions will allow people to explore relationships between foundational skills, financial literacy, and other constructs in PIAAC.

What types of research could you imagine stemming from the PIAAC II?

One of the most distinctive features of PIAAC (both PIAAC I and II) is the direct assessment of literacy, numeracy, and problem-solving skills, information that no other large-scale assessment of adults provides. Thirty-one countries, including the United States, participated in PIAAC II (2022/23), so researchers will be able to compare adult skills internationally and also study trends between PIAAC I and PIAAC II.

It’s worth noting that the data collection took place while we were still experiencing the effects of the COVID-19 pandemic. This may provide researchers opportunities to explore how the pandemic is related to adults’ skills, health, employment, training, and education status.

Where can the public access data from PIAAC?

Researchers can find information about the available data from the national U.S. PIAAC 2017 Household, PIAAC 2012/14 Household, and PIAAC 2014 Prison datasets, and international and trend datasets on the NCES Data Files page. PIAAC restricted-use data files contain more detailed information, such as continuous age and earnings variables, that can be used for more in-depth analysis. Accessing the restricted-use data requires a restricted-use license from NCES.

NCES also has an easy-to-use online analysis tool: the International Data Explorer (IDE). The IDE allows users to work directly with the PIAAC data and produce their own analyses, tables, regressions, and charts. An IDE tutorial video provides comprehensive, step-by-step instructions on how to use this tool. It contains detailed information about the content and capabilities of the PIAAC IDE, as well as how the PIAAC data are organized in the tool.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer for postsecondary and adult education, NCER.

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.


When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, to help them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education’s National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.


We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, toward helping other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest per-pupil spending and color-coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.


Note. This figure provides two pieces of information on schools in Elgin Area School District U-46: each school's per-pupil spending in dollars, and the share of its students categorized as economically disadvantaged, shown from 0 to 100% using colors from green to red. The schools are lined up from lowest to highest per-pupil spending. Lined up this way, schools with more economically disadvantaged students show no pattern, falling across the full spectrum from low- to high-spending schools. The figure makes it easy to see that there is little correlation between a school's per-pupil spending and the share of economically disadvantaged students it serves.


Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. In doing so, users could then do things like compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing with lower-performing schools what those standout schools are doing can open the door for peer learning toward improving schooling.

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.


Note. This figure provides three pieces of information on schools in Elgin Area School District U-46. It shows the spending per pupil for each school in dollars, the share of the students in each school who are categorized as economically disadvantaged from 0 to 100% using the colors green to red, and the achievement level of each school based on a composite of its students’ math and reading scores. Schools are placed into 1 of 4 categories on the figure, and a translation statement is put in each category to make clear what each category represents. These four translation statements are: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers but get lower student outcomes, 3) spend more dollars than peers but get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements were found to make it easier for users to understand the data presented in the figure.
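The quadrant translation statements amount to a small classification rule. The sketch below shows one way to express it; the school names, numbers, and median-based cut points are hypothetical assumptions for illustration, not SSOS's actual method.

```python
from statistics import median

# Hypothetical schools: name -> (per-pupil spending, outcome composite).
schools = {
    "Oak":   (10_200, 0.62),
    "Maple": (13_900, 0.48),
    "Cedar": (11_500, 0.71),
    "Birch": (14_600, 0.83),
}

# Peer medians serve as the dividing lines between quadrants.
spend_mid = median(s for s, _ in schools.values())
outcome_mid = median(o for _, o in schools.values())

def quadrant(spending: float, outcome: float) -> str:
    """Translate a school's position into one of the four
    quadrant-style statements."""
    spend_label = ("fewer dollars than peers" if spending <= spend_mid
                   else "more dollars than peers")
    outcome_label = ("higher student outcomes" if outcome > outcome_mid
                     else "lower student outcomes")
    return f"spends {spend_label} but gets {outcome_label}"

for name, (s, o) in schools.items():
    print(f"{name}: {quadrant(s, o)}")
```

Labeling each point with its statement, rather than leaving readers to infer position from the axes, is what made the scatterplot digestible in the focus groups.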


Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early stage budget workshops held in the spring. That way the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings to make them accessible for end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond the “how much” into “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES celebrated its 20th anniversary this past year, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director, Mark Schneider, and a welcome from the Secretary of Education, Miguel Cardona. Director Schneider spoke about the importance of timeliness of research and translation of evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, are quick turnaround, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including the NCER Commissioner’s Welcome; the NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.