IES Blog

Institute of Education Sciences

IES Makes Three New Awards to Accelerate Breakthroughs in the Education Field

Through the Transformative Research in the Education Sciences Grants program (ALN 84.305T), IES invests in innovative research that has the potential to make dramatic advances toward solving seemingly intractable problems and challenges in the education field, as well as to accelerate the pace of conducting education research to facilitate major breakthroughs. In the most recent FY 2024 competition for this program, IES invited applications from partnerships between researchers, product developers, and education agencies to propose transformative solutions to major education problems that leverage advances in technology combined with research insights from the learning sciences.

IES is thrilled to announce that three grants have been awarded in the FY 2024 competition. Building on 20 years of IES research funding to lay the groundwork for advances, these three projects focus on exploring potentially transformative uses of generative artificial intelligence (AI) to deliver solutions that can scale in the education marketplace if they demonstrate positive impacts on education outcomes. The three grants are:

Active Learning at Scale (Active L@S): Transforming Teaching and Learning via Large-Scale Learning Science and Generative AI

Awardee: Arizona State University (ASU; PI: Danielle McNamara)

The project team aims to address postsecondary learners’ need for flexible, on-the-go access to course materials and high-quality, just-in-time generative learning activities. The solution will be a mobile technology that delivers interactive, research-informed, and engaging learning activities created on the fly and customized to any course content with large language models (LLMs). The project team will leverage two digital learning platforms from the SEERNet network—Terracotta and ASU Learning@Scale—to conduct research and will include over 100,000 diverse students at ASU, with replication studies taking place at Indiana University (IU). IES funding has supported a large portion of the research used to identify the generative learning activities the team will integrate into the system—note-taking, self-explanation, summarization, and question answering (also known as retrieval practice). The ASU team includes in-house technology developers and researchers, and they are partnering with researchers at IU and developers at INFLO and Clevent AI Technology LLC. The ASU and IU teams will have the educator perspective represented on their teams, as these universities provide postsecondary education to large and diverse student populations.

Talking Math: Improving Math Performance and Engagement Through AI-Enabled Conversational Tutoring

Awardee: Worcester Polytechnic Institute (PI: Neil Heffernan)

The project team aims to provide a comprehensive strategy to address persistent achievement gaps in math by supporting students during their out-of-school time. The team will combine an evidence-based learning system with advances in generative AI to develop a conversational AI tutor (CAIT, pronounced “Kate”) to support independent math practice for middle school students who struggle with math and who otherwise may not have access to after-school tutoring. CAIT will be integrated into ASSISTments, a freely available, evidence-based online math platform with widely used homework assignments from open education resources (OER). This solution aims to dramatically improve engagement and math learning during independent math problem-solving time. The team will conduct research throughout the product development process to ensure that CAIT is effective in supporting math problem solving and is engaging and supportive for all students. ASSISTments has been used by over 1 million students and 30,000 teachers, and IES has supported its development and efficacy since 2003. The project team includes researchers and developers at Worcester Polytechnic Institute and the ASSISTments Foundation, researchers from WestEd, educator representation from Greater Commonwealth Virtual School, and a teacher design team.

Scenario-Based Assessment in the Age of Generative AI: Making Space in the Education Market for an Alternative Assessment Paradigm

Awardee: University of Memphis (PI: John Sabatini)

Educators face many challenges building high-quality assessments aligned to course content, and traditional assessment practices often lack applicability to real world scenarios. To transform postsecondary education, there needs to be a shift in how knowledge and skills are assessed to better emphasize critical thinking, complex reasoning, and problem solving in practical contexts. Supported in large part by numerous IES-funded projects, including as part of the Reading for Understanding Initiative, the project team has developed a framework for scenario-based assessments (SBAs). SBAs place knowledge and skills into a practical context and provide students with the opportunity to apply their content knowledge and critical thinking skills. The project team will leverage generative AI along with their framework for SBAs to create a system for postsecondary educators to design and administer discipline-specific SBAs with personalized feedback to students, high levels of adaptivity, and rich diagnostic information with little additional instructor effort. The project team includes researchers, developers, and educators at University of Memphis and Georgia State University, researchers and developers at Educational Testing Service (ETS), and developers from multiple small businesses including Capti/Charmtech, MindTrust, Caimber/AMI, and Workbay who will participate as part of a technical advisory group.

We are excited by the transformative potential of these projects and look forward to seeing what these interdisciplinary teams can accomplish together. While we are hopeful the solutions they create will make a big impact on learners across the nation, we will also share lessons learned with the field about how to build interdisciplinary partnerships to conduct transformative research and development.


For questions or to learn more about the Transformative Research in the Education Sciences grant program, please contact Erin Higgins (Erin.Higgins@ed.gov), Program Lead for the Accelerate, Transform, Scale Initiative.

Designing Culturally Responsive and Accessible Assessments for All Adult Learners

Dr. Meredith Larson, program officer for adult education at NCER, interviewed Dr. Javier Suárez-Álvarez, associate professor and associate director at the Center for Educational Assessment, University of Massachusetts Amherst. Dr. Suárez-Álvarez has served as the project director for the Adult Skills Assessment Project: Actionable Assessments for Adult Learners (ASAP) grant and was previously an education policy analyst in France for the Organisation for Economic Co-operation and Development (OECD), where he was the lead author of the PISA report 21st-Century Readers: Developing Literacy Skills in a Digital World. He and the ASAP team are working on an assessment system to meet the needs of adult education learners, educators, and employers that leverages online validated and culturally responsive banks of literacy and numeracy tasks. In this interview, Dr. Suárez-Álvarez discusses the importance of attending to learners’ goals and cultural diversity in assessment.

How would you describe the current context of assessment for adult education, and how does ASAP fit in it?

In general, the adult education field lacks assessments that meet the—sometimes competing—needs and goals of educators and employers and that attend to and embrace learner characteristics, goals, and cultural diversity. There is often a disconnect where different stakeholders want different things from the same assessments. Educators ask for curriculum-aligned assessments, learners want assessments to help them determine whether they have job-related skills for employment or promotion, and employers want to determine whether job candidates are trained in high-demand skills within their industries.

Despite these differing needs and interests, everyone involved needs assessment resources for lower-skilled and culturally diverse learners that are easy to use, affordable or free, and that provide actionable information for progress toward personal or occupational goals. ASAP is one of the first attempts to respond to these needs by developing an assessment system that delivers real-time customizable assessments to measure and improve literacy and numeracy skills. ASAP incorporates socioculturally responsive assessment principles to serve the needs of all learners by embracing the uniqueness of their characteristics. These principles involve ensuring that stakeholders from diverse socioeconomic, cultural, linguistic, racial, and ethnic groups are represented in our test design and development activities.

Why is attending to cultural diversity important to ASAP and assessment, and how are you incorporating this into your work?

U.S. Census projections for 2045 predict a shift in the demographic composition of the population from a White majority to a racially mixed majority. This suggests that we should prepare for cultural shifts and ensure our assessments fully embrace socioculturally responsive assessment practices. Without these practices, assessments limit the ability of adults from varied demographic backgrounds to demonstrate their capabilities adequately. Socioculturally responsive assessments are pivotal for representing the growing diversity in the learner population and for uncovering undetected workforce potential.

In ASAP, we are conducting focus groups, interviews, and listening sessions with learners, educators, and employers to understand their needs. We are also co-designing items in collaboration with key stakeholders and building consensus across adult education, workforce, and policy experts. We are developing use cases to understand hypothetical product users and conducting case studies to establish linkages between instruction and assessment as well as across classroom and workplace settings.

How has your background informed your interest in and contributions to ASAP?

As a teenager growing up in Spain, I saw first-hand the possible negative impact assessments could have when they don’t attend to learner goals and circumstances. When I was 15, my English teacher, based on narrow assessments, told my parents I was incapable of learning English, doubted my academic potential, and suggested I forego higher education for immediate employment. Defying this with the support of other teachers and my family, I pursued my passion. I became proficient in English at the age of 25 when I needed it to be a researcher, and I completed my PhD in psychology (psychometrics) at the age of 28.

Many adult students may have heard similar messages from prior teachers based on assessment results. And even now, many of the assessments the adult education field currently uses for these learners are designed by and for a population that no longer represents most learners. These adult learners may be getting advice or feedback that does not actually reflect their abilities or doesn’t provide useful guidance. Unfortunately, not all students are as lucky as I was. They may not have the support of others to counterbalance narrow assessments, and that shouldn’t be the expectation.

What are your hopes for the future of assessments for this adult population and the programs and employers that support them?

I hope we switch from measuring what we generally know how to measure (such as math and reading knowledge on a multiple-choice test) to measuring what matters to test takers and those using assessment results so that they can all accomplish goals in ways that honor individuals’ circumstances. Knowledge and skills—like the real world—are much more than right and wrong responses on a multiple-choice item. I also hope that as we embrace the latest developments in technology, such as AI, we can use them to deliver more flexible and personalized assessments.

In addition, I hope we stop assuming every learner has the same opportunities to learn or the same goals for their learning and that we start using assessments to empower learners rather than just as a measure of learning. In ASAP, for example, the adult learner will decide the type of test they want to take, when to take it, the context within which the assessment will be framed, and when, where, and to whom the assessment result will be delivered.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), program officer for adult education at NCER.

 

NCES Presentation at National HBCU Week Conference

In NCES’s recently released Strategic Plan, Goal 3 identifies our commitment to foster and leverage beneficial partnerships. To fulfill that goal, NCES participates in multiple conferences and meetings throughout the year. Recently, NCES participated in the National Historically Black Colleges and Universities (HBCU) Week Conference. NCES’s presentation at this conference helped us establish a dialogue with HBCUs and develop partnerships to address critical issues in education.

NCES Commissioner Peggy G. Carr kicked off the presentation with an overview of HBCU data—such as student characteristics, enrollment, and financial aid. Then, NCES experts explored how data from various NCES surveys can help researchers, educators, and policymakers better understand the condition and progress of HBCUs. Read on to learn about these surveys.

 

Integrated Postsecondary Education Data System (IPEDS)

The Integrated Postsecondary Education Data System (IPEDS) is an annual administrative data collection that gathers information from more than 6,000 postsecondary institutions, including 99 degree-granting, Title IV–eligible HBCUs (in the 2021–22 academic year).

The data collected in IPEDS includes information on institutional characteristics and resources; admissions and completions; student enrollment; student financial aid; and human resources (i.e., staff characteristics). These data are disaggregated, offering insights into student and employee demographics by race/ethnicity and gender, students’ age categories, first-time/non-first-time enrollment statuses, and full-time/part-time attendance intensity.

Data from IPEDS can be explored using various data tools—such as Data Explorer, Trend Generator, and College Navigator—that cater to users with varying levels of data knowledge and varying data needs.

 

National Postsecondary Student Aid Study (NPSAS)

The National Postsecondary Student Aid Study (NPSAS) is a nationally representative study that examines the characteristics of students in postsecondary institutions—including HBCUs—with a special focus on how they finance their education. NPSAS collects data on the percentage of HBCU students receiving financial aid and the average amounts received from various sources (i.e., federal, state, and institution) by gender and race/ethnicity.

Conducted every 3 or 4 years, this study combines data from student surveys, student-level school records, and other administrative sources and is designed to describe the federal government’s investment in financing students’ postsecondary education.

Data from NPSAS can be explored using DataLab and PowerStats.

 

National Teacher and Principal Survey (NTPS)

The National Teacher and Principal Survey (NTPS) is the U.S. Department of Education’s primary source of information on K–12 public and private schools from the perspectives of teachers and administrators. NTPS consists of coordinated surveys of schools, principals, and teachers and includes follow-up surveys to study principal and teacher attrition.

Among many other topics, NTPS collects data on the race/ethnicity of teachers and principals. These data—which show that Black teachers and principals make up a relatively small portion of the K–12 workforce—can be used to explore the demographics and experiences of teachers and principals. NTPS provides postsecondary institutions, like HBCUs, a snapshot of the preK–12 experiences of students and staff.

Data from NTPS can be explored using DataLab and PowerStats.

 

National Assessment of Educational Progress (NAEP)

The National Assessment of Educational Progress (NAEP)—also known as the Nation’s Report Card—is the largest nationally representative and continuing assessment of what students in public and private schools in the United States know and are able to do in various subjects.

Main NAEP assesses students in grades 4, 8, and 12 in subjects like reading, mathematics, science, and civics, while NAEP Long-Term Trend assesses 9-, 13-, and 17-year-olds in reading and mathematics.

Among many other topics, NAEP collects data on students by race/ethnicity. These data can help to shed light on students’ experiences, academic performance, and level of preparedness before they enroll in HBCUs.

Data from NAEP can be explored using the NAEP Data Explorer.

 

To explore more HBCU data from these and other NCES surveys—including enrollment trends from 1976 to 2021—check out this annually updated Fast Fact. Be sure to follow NCES on X, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up to date on the latest from NCES.

 

By Megan Barnett, AIR

Innovative Approaches to High Quality Assessment of SEL Skills

In celebration of IES’s 20th anniversary and SEL Day, we are highlighting NCER’s investments in field-initiated research. In this blog, program officer Dr. Emily Doolittle discusses a persistent barrier to supporting social and emotional learning (SEL) in schools—the lack of high quality, reliable, and valid SEL assessments—and the innovative research supported by IES to tackle this challenge.

High quality measurement is critical for education research and practice. Researchers need valid and reliable assessments to answer questions about what works for whom and why. Schools use assessments to guide instruction, determine student response to intervention, and for high-stakes decision-making such as provision of special education services.

For social and emotional learning (SEL), assessment can be particularly challenging due to lack of precision in defining core SEL competencies. One consequence of this imprecision is that measures and intervention targets are often misaligned. SEL assessment also tends to rely on student, teacher, and parent reports despite the lack of agreement among reporters and the potential for biased responding. Through NCER, IES is supporting the development and validation of SEL measures using new technologies and approaches to address some of these challenges. Here are some examples of this innovative measurement work.

  • SELweb is a web-based direct assessment of four specific SEL skills: emotion recognition, social perspective taking, social problem solving, and self-control. It is available for use with elementary school students in grades K-3 and 4-6, with a middle school version currently under development. The SEL Quest Digital Platform will support school-based implementation of SELweb and other SEL assessments with an instrument library and a reporting dashboard for educators.
  • vSchool uses a virtual reality (VR) environment to assess prosocial skills. Students in 4th to 6th grades build their own avatar to interact with other characters in school settings using menu-driven choices for prosocial (helping, encouraging, sharing) and non-prosocial (aggressive, bullying, teasing) behaviors.
  • VESIP (Virtual Environment for Social Information Processing) also uses a VR school environment with customizable avatars to assess 3rd through 7th grade students’ social information processing in both English and Spanish.

Other assessments take a different approach to the challenges of SEL measurement by looking for ways to improve self, teacher, and parent reports.

  • In Project MIDAS, the research team is creating a system to integrate the different information provided by students, teachers, and parents to see if combining these reports will lead to more accurate identification of middle school students with SEL needs.
  • In Project EASS-E, the researchers are creating a teacher-report measure that will incorporate information about a child’s environment (e.g., neighborhood and community context) to better support elementary school students’ needs.

Please check out IES’s search tool to learn more about the wide variety of measurement research we fund to develop and validate high quality assessments for use in education research and practice.


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research

 

Money Matters: Exploring Young Adults’ Financial Literacy and Financial Discussions With Their Parents

Financial literacy is a critical skill for young adults—especially as they begin to enter college or the workforce—that is often needed for partial or full financial independence and increased financial decision making.

The Program for International Student Assessment (PISA)—which is coordinated by the Organisation for Economic Co-operation and Development (OECD)—gives us a unique opportunity to analyze and understand the financial literacy of 15-year-olds in the United States and other education systems around the world. PISA is the only large-scale nationally representative assessment that measures the financial literacy skills of 15-year-olds. The financial literacy domain was administered first in 2012 and then in 2015 and 2018. The 2018 financial literacy cycle assessed approximately 117,000 students, representing about 13.5 million 15-year-olds from 20 education systems. The fourth cycle began in fall 2022 in the United States and is currently being conducted.


How Frequently Do Students Discuss Financial Topics With Their Parents?

In 2018, all education systems that administered the PISA financial literacy assessment also asked students to complete a questionnaire about their experiences with money matters in school and outside of school. In the United States, about 3,500 students out of the total 3,740 U.S. PISA sample completed the questionnaire.

This blog post explores how frequently students reported talking about the following five topics with their parents (or guardians or relatives):

  1. their spending decisions
  2. their savings decisions
  3. the family budget
  4. money for things they want to buy
  5. news related to economics or finance

Students’ answers were grouped into two categories: frequent (“a few times a month” or “once a week or more”) and infrequent (“never or almost never” or “a few times a year”).
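The two-category grouping above can be sketched in a few lines of code. This is an illustrative sketch only: the response wording follows the blog, but the function name and structure are invented for this example, not taken from the actual PISA data-processing pipeline.

```python
# Illustrative only: map the four PISA questionnaire response options
# into the two categories used in this analysis.
FREQUENT = {"a few times a month", "once a week or more"}
INFREQUENT = {"never or almost never", "a few times a year"}

def categorize(response: str) -> str:
    """Return 'frequent' or 'infrequent' for a raw response string."""
    r = response.strip().lower()
    if r in FREQUENT:
        return "frequent"
    if r in INFREQUENT:
        return "infrequent"
    raise ValueError(f"unexpected response: {response!r}")
```

For example, `categorize("Once a week or more")` falls in the frequent category, while `categorize("A few times a year")` falls in the infrequent category.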

We first looked at the degree to which students frequently discussed various financial topics with their parents. In 2018, the frequency of student-parent financial discussions varied by financial topic (figure 1):

  • About one-quarter (24 percent) of U.S. 15-year-old students reported frequently discussing with their parents news related to economics or finance.
  • More than half (53 percent) of U.S. 15-year-old students reported frequently discussing with their parents money for things they wanted to buy.

Figure 1. Bar chart showing the percentage of 15-year-old students who frequently discussed financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things you want to buy, and news related to economics or finance), in 2018


Do male and female students differ in how frequently they discuss financial topics with their parents?

In 2018, higher percentages of female students than of male students frequently discussed with their parents the family budget (35 vs. 32 percent) and money for things they wanted to buy (56 vs. 50 percent). Meanwhile, a lower percentage of female students than of male students frequently discussed with their parents news related to economics or finance (21 vs. 26 percent) (figure 2).


Figure 2. Bar chart showing the percentage of 15-year-old students who frequently discussed financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things you want to buy, and news related to economics or finance) and gender, in 2018


Are Students’ Financial Literacy Scores Related to How Frequently They Discuss Financial Matters With Their Parents?

With a scale from 0–1,000, the PISA financial literacy assessment measures students’ financial knowledge in four content areas:

  1. money and transactions
  2. planning and managing finances
  3. risk and reward
  4. the financial landscape

In 2018, the average score of 15-year-old students ranged from 388 points in Indonesia to 547 points in Estonia. The U.S. average (506 points) was higher than the average in 11 education systems, lower than the average in 4 education systems, and not measurably different from the average in 4 education systems. The U.S. average was also not measurably different from the OECD average.

We also examined the relationship between frequent parent–student financial discussions and students’ financial literacy achievement (figure 3). After taking into account students’ gender, race/ethnicity, immigration status, and socioeconomic status—as well as their school’s poverty and location—the results show that students who reported frequently discussing spending decisions with their parents scored 16 points higher on average than did students who reported infrequently discussing this topic. On the other hand, students who reported frequently discussing news related to economics or finance with their parents scored 18 points lower on average than did students who reported infrequently discussing this topic.  
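The kind of covariate adjustment described above can be illustrated with a small regression sketch. Everything below is hypothetical (simulated data and invented covariates, not the PISA microdata); it simply shows that an "adjusted" score-point difference is the coefficient on a frequent-discussion indicator in an ordinary least squares model that also includes student characteristics.

```python
import numpy as np

# Simulated data for illustration; not the PISA microdata.
rng = np.random.default_rng(0)
n = 500
frequent = rng.integers(0, 2, n)   # 1 = frequently discussed the topic
ses = rng.normal(0, 1, n)          # stand-in for socioeconomic status
female = rng.integers(0, 2, n)     # stand-in for one demographic covariate

# Build scores with a "true" adjusted gap of +16 points, mirroring the
# spending-decisions result reported in the blog.
score = 500 + 16 * frequent + 25 * ses + 2 * female + rng.normal(0, 30, n)

# OLS fit: the coefficient on `frequent` is the adjusted score-point
# difference between frequent and infrequent discussers.
X = np.column_stack([np.ones(n), frequent, ses, female])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
adjusted_gap = beta[1]
```

With real data, the covariate list would also include race/ethnicity, immigration status, and school poverty and location, and the analysis would apply survey weights and design-appropriate standard errors.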


Figure 3. Two-sided horizontal bar chart showing financial literacy score-point differences between students who frequently and infrequently discussed financial topics with their parents, after accounting for student and school characteristics, in 2018


Do Students Think That Young Adults Should Make Their Own Spending Decisions?

We also explored whether students agreed that young people should make their own spending decisions. In 2018, some 63 percent of U.S. 15-year-old students reported they agreed or strongly agreed, while 37 percent reported that they disagreed.

Do male and female students differ in their agreement that young adults should make their own spending decisions?

When comparing the percentage of male versus female students, we found that a lower percentage of female students than of male students agreed or strongly agreed that young people should make their own spending decisions (59 vs. 66 percent). This pattern held even after taking into account students’ gender, race/ethnicity, immigration status, and socioeconomic status as well as school poverty and location.  


Upcoming PISA Data Collections

A deeper understanding of the frequency of parent–student financial conversations, the types of topics discussed, and the relationships between financial topics and financial literacy could help parents and educators foster financial literacy across different student groups in the United States.

PISA began collecting data in 2022 after being postponed 1 year due to the COVID-19 pandemic; 83 education systems are expected to participate. The PISA 2022 Financial Literacy Assessment will include items from earlier years as well as new interactive items. The main PISA results will be released in December 2023, and the PISA financial literacy results will be released in spring/summer 2024.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to receive notifications when these new PISA data are released.

 

By Saki Ikoma, Marissa Hall, and Frank Fonseca, AIR