Inside IES Research

Notes from NCER & NCSER

Experimenting with Science Education to Improve Learner Opportunities and Outcomes

The NAEP science assessment measures students' science knowledge and their ability to engage in scientific inquiry and conduct scientific investigations. According to results from the 2019 NAEP science assessment, only about one-third of grade 4 and grade 8 students, and less than one-quarter of grade 12 students, scored at or above proficient. In addition, the science performance of middle- and low-performing grade 4 students declined from 2015. While IES has a history of investing in high-quality science education research to improve science teaching and learning, these data suggest that much more work is needed.

To that end, during the 2022-23 school year, IES held two Learning Acceleration Challenges designed to incentivize innovations that significantly improve learner outcomes in math and science. Under the Science Prize Challenge, IES sought interventions to significantly improve science outcomes for middle school students with low performance in science. Unfortunately, the judging panel did not recommend any finalists for the Science Prize (more information about the Math Prize results can be found here). IES recognized that this Challenge was an ambitious and rapid effort to improve science achievement. Feedback from potential Science Prize entrants indicated that the rapid cycle for evaluating interventions, along with the lack of resources to implement them, were barriers to entering the competition.

With the knowledge gained from the Science Prize, IES is continuing to design opportunities that encourage transformative, innovative change in science teaching and learning. In our newest opportunity, the National Center for Education Research (NCER) at IES, in partnership with the National Science Foundation (NSF), released a Request for Applications for a National Research and Development Center (R&D Center) on Improving Outcomes in Elementary Science Education. Results from the most recent NAEP science assessment and the lessons learned from the Science Prize suggest that efforts to improve science teaching and learning need to begin early in students' education and that more resources are needed to conduct high-quality science education research. Through this R&D Center, IES and NSF will provide greater resources (a grant award of up to $15 million over 5 years) to tackle persistent challenges in elementary science education, including measuring elementary science learning outcomes and generating evidence of the impact of elementary science interventions on learners' science achievement. In doing so, the new Elementary Science R&D Center will provide national leadership on elementary science education and build capacity for conducting high-quality science education research.


This blog was written by NCER program officer Christina Chhin. For more information about the Elementary Science R&D Center competition, contact NCER program officers Jennifer Schellinger or Christina Chhin, take a look at the 84.305C RFA, and/or attend one of our virtual office hours.

Language Equity Matters: Recognizing the Incredible Potential of Bilingual Learners

This year, Inside IES Research is featuring a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice. In recognition of Hispanic Heritage Month, we interviewed Dr. Aída Walqui, director of the IES-funded National Research & Development Center to Improve Education for Secondary English Learners at WestEd, about her career journey and language equity for minoritized populations.

How have your background and experiences shaped your scholarship and career?

My background has been a tremendous influence. I was born in Lima, Peru, and grew up the first child of a modest, hard-working, politically involved, and well-educated family. From very early on, issues of language, education, and discrimination—and the way in which diverse groups were perceived—have been central in my life.

My father was born in the Peruvian jungle, and he grew up in Lima speaking Spanish. Through family conversations at the dinner table and other experiences, I became aware that Peruvian society was deeply segregated by ethnic and linguistic boundaries. For example, as a little girl, I did not understand why it was good for me to study German in a German school, where my emergent German was viewed as wonderful, and not something that negatively impacted my first language, Spanish. . . while the children in the Highlands, where we vacationed, were admonished for speaking Quechua, their native language. Their native language was considered almost an illness that needed to be eradicated, and their emergent Spanish was derided as imperfect.

Although my parents were not linguists, they explained that the language was just an excuse—the real issues were political, social, and economic control. I realized that the children who spoke Quechua were just as talented. But for them, learning Spanish was mandatory. Society saw it as the only thing to be proud of. My father also helped me understand that language was not just used for purposes of communication, but also to classify or package people—which impedes learning who people are as individuals. And that the experience of education itself had a lot to do with this.

Overall, I have had an immensely rich intellectual life. I owe thanks to my family, my late husband, and colleagues around the world for making it possible for me to live and work in many contexts, including working in Andean intercultural bilingual education, teaching Spanish as a second language for the Peruvian Ministry of Education, teaching at Alisal High School in Salinas, CA for six intense and rewarding years, and living and working in the United Kingdom, Australia, and New Zealand. I’ve noticed the same patterns in all these places. The languages are different, but the patterns are the same: the dismissal of populations that have been minoritized due to language issues, the enormous contributions language-minority populations make in these nations, and the fact that additional languages are assets that help you learn.

I’ve become even more determined upon realizing the incredible potential that people have. As a Latina in the United States, I have focused on developing the incredible resource of Spanish that Latinos have, while also developing English at the same level of proficiency.

Success depends on educators and those who support them envisioning the richness of these people, and by extension the richness they can provide to society. It is only looking at the seeds of time that I can say that change is possible. While sociolinguistic discrimination still exists in Peru, tremendous positive changes have also occurred. In the United States, we have similarly made strides, but still have a long way to go. In education, it is important to follow Gramsci’s old advice: pessimism of the intellect, optimism of the will.

In your area of research, what do you see as the greatest research needs or recommendations to address diversity, equity, and inclusion and to improve the relevance of education research for diverse communities of students and families?

We must coherently put together examples of what is possible. For example, our Center colleagues are working on policy levers such as how to integrate English learner development with subject matter courses to strengthen the education of English learners.

In the classroom, in the past, we have been singularly worried about how well English learners are using language, how to construct grammatical sentences, how to make those sentences correct, and so forth. In reality, the focus needs to be on multiple learning modalities as well as the subject matter, critical understanding, and the ability to express ideas—language—related to the content. That is, multiple forms of learning all matter in the moment, not just one.

We all need to know how to use language well, but we also need to simultaneously learn the content and critical thinking that language brings to life, not just grammatical labels or how you conjugate verbs.

What advice would you give to emerging scholars from underrepresented, minoritized groups that are pursuing a career in education research?

I would say that above all, it is essential for emerging scholars from minoritized groups to know what about education research or development is specifically important to them, and how they intend to contribute to their field, to society, and to the improvement of the groups they represent.

Knowing where your passion resides brings more than just constant direction to scholarly efforts. During difficult moments, it will sustain those efforts. Embrace educational causes you care for, even if they don’t always seem important or popular. Think through them, research them, and communicate them, time and time again, in increasingly more potent ways.

Finally, it is essential to cultivate critical dialogue with colleagues to re-examine ideas, advance proposals, and gain insight into how synergistic efforts can advance the societal educational impact of immensely talented but minoritized groups.


Dr. Aída Walqui directs the National Research and Development Center for Improving the Education of English Learners in Secondary Schools at WestEd, where she started and developed one of its signature programs, the Quality Teaching for English Learners (QTEL) initiative. QTEL focuses on developing the expertise of teachers and educational leaders to support elementary and secondary English Learners’ conceptual, analytic, and language practices in disciplinary subject matter areas. Her main area of interest and research is teacher expertise in multilingual academic contexts and how to promote its growth across the continuum of teacher professional development. In 2016, on the 50th anniversary of the International TESOL Association, Dr. Walqui was selected as one of the 50 most influential researchers of the last 50 years in the field of English language teaching.

This blog post was produced by Helyn Kim (Helyn.Kim@ed.gov), program officer for the English Learner Portfolio at NCER.

Trading the Number 2 Pencil for 2.0 Technology

Although traditional pencil and paper tests provide good information for many purposes, technology presents the opportunity to assess students on tasks that better elicit the real world skills called for by college and career standards. IES supports a number of researchers and developers who are using technology to develop better assessments through grants as well as the Small Business Innovation Research program. 

A screenshot of the GISA assessment intro page

One example of the power of technology to support innovative assessment is the Global, Integrated, Scenario-based Assessment (known as ‘GISA’), developed by John Sabatini and Tenaha O’Reilly at the Educational Testing Service as part of a grant supported by the Reading for Understanding Research Initiative.

Each GISA scenario is structured to resemble a timely, real-world situation. For example, one scenario begins by explaining that the class has been asked to create a website on green schools. The student is assigned the task of working with several students (represented by computer avatars) to create the website. In working through the scenario, the student engages in activities that are scaffolded to support students in summarizing information, completing a graphic organizer, and collaborating to evaluate whether statements are facts or opinions. The scenario provides a measure of each student’s ability to learn from text through assessing his or her knowledge of green schools before and after completing the scenario. This scenario is available on the ETS website along with more information about the principles on which GISA was built.

Karen Douglas, of the National Center for Education Research, recently spoke with Dr. Sabatini and Dr. O’Reilly about the role of technology in creating GISA, what users think of it, and their plans for continuing to develop technology-based assessments.

How did the use of technology contribute to the design of GISA?

Technological delivery creates many opportunities beyond more traditional paper-and-pencil test designs. On the efficiency side of the argument, items and tasks can be delivered over the internet in a standardized way, and there are obvious advantages for automated scoring. However, the real advantage has to do with both control over the test environment and what can be assessed. We can more effectively simulate the digital environments that students use in school, at leisure, and, later, in the workforce. GISA uses scenario-based assessment to deliver items and tasks. During a scenario-based assessment, students are given a plausible reason for reading a collection of thematically related materials. The purpose defines what is important to focus on as students work toward a larger goal. The materials are diverse and may reflect different perspectives and quality of information.

Screenshot of a GISA forum on Green Schools

The student not only needs to understand these materials but also needs to evaluate and integrate them as they solve problems, make decisions, or apply what they learn to new situations. This design is not only more like the activities that occur in school, but also affords the opportunity for engaging students in deeper thinking. GISA also includes simulated students that may support or scaffold the test taker's understanding with good habits of mind such as the use of reading strategies. Items are sequenced to build up test takers’ understanding and to examine what parts of a more complex task students can or cannot do. In this way, the assessment serves as a model for learning while simultaneously assessing reading. Traditionally, the areas of instruction and assessment have not been integrated in a seamless manner.

What evidence do you have that GISA provides useful information about reading skills?

We have a lot more research to conduct, but thus far we have been able to create a new technology-delivered assessment that updates the aspects of reading that are measured and introduces a variety of new features.

Despite the novel interface, items, tasks, and format, students are able to understand what is expected of them. Our analyses indicate that the test properties are good and that students can do a range of tasks that were previously untested in traditional assessments. While students may still be developing their skills on more complex tasks, there is evidence they can do many of the components that feed into them. In this way, the assessment may be more instructionally relevant.

Informally, we have received positive feedback on GISA from both teachers and students. Teachers view the assessment as better matching the types of activities they teach in the classroom, while students seem to enjoy the more realistic purpose for reading, the more relevant materials, and the use of simulated peers. 

What role do you think technology will play in future efforts to create better assessments?

We believe technology will play a greater role in how assessments are designed and delivered. Being able to provide feedback to students and to better match the test to student needs are some areas where future assessments will drive innovation. More interactive formats, such as intelligent tutoring and gaming, will also grow over time. With new forms of technology available, the possibilities for meeting students’ educational needs increase dramatically.

What’s next for GISA?

We are using GISA in two additional grants. In one grant, we leverage the GISA designs for use with adults, a group for which there are few viable assessments. In the other grant we are using GISA to get a better understanding of how background knowledge affects reading comprehension.

For more information about the Reading for Understanding Research Initiative, read this post on the IES blog.

By Karen Douglas, Education Research Analyst, NCER, who oversees the Reading for Understanding Research Initiative

Pinning Down the Use of Research in Education

There are plenty of great ideas to be found on Pinterest: recipes for no-bake, allergen-friendly cookies; tips for taking better photos; and suggestions for great vacation spots in Greece. Lots of teachers use Pinterest as a way to share classroom ideas and engaging lessons. But where do teachers, education leaders and decision makers turn when they need evidence-based instructional practices that may work to help struggling readers, or want to use research to address other educational challenges?

Since 2014, the National Center for Education Research (NCER) has funded two National Research and Development Centers on Knowledge Utilization in an effort to find out. They are well on their way to answering questions about how and why teachers and schools use—or do not use—research in their decision making. They are also exploring ways the research community can increase interest in and actual use of research-based practices.

We are beginning to see the first results of their efforts to answer two key questions. First, are educators and schools using education research in their decision making, and if they aren’t, why not? Second, if educators are not using evidence as part of their work, what can the research community do to make it more likely that they will?

The National Center for Research in Policy and Practice (NCRPP) was awarded to the University of Colorado Boulder and is led by Principal Investigator Bill Penuel (University of Colorado Boulder), and Co-Principal Investigators Derek Briggs (University of Colorado Boulder), Jon Fullerton (Harvard University), Heather Hill (Harvard University), Cynthia Coburn (Northwestern University), and Jim Spillane (Northwestern University).

NCRPP recently released its first technical report, which covers the descriptive results from its nationally representative survey of school and district leaders. Results from the report show that school and district leaders do use research evidence for activities such as designing professional development, expanding their understanding of specific issues, or convincing others to agree with a particular point of view on an education issue. Instrumental uses of research, in which district leaders apply research to guide or inform a specific decision, were the most commonly reported. Overall, school and district leaders were positive about the relevance and value of research for practice. When asked to report what specific piece of research was most useful, school and district leaders named books, policy reports, and peer-reviewed journal articles. You can get more information on the center's website, http://ncrpp.org. They are also very active on Twitter.

The Center for Research Use in Education (CRUE) was awarded to the University of Delaware and is led by Principal Investigator Henry May (University of Delaware) and Co-Principal Investigator Liz Farley-Ripple (University of Delaware). This team is currently drafting its measures of research use, which will include one set of surveys for researchers and another for practitioners. They are especially interested in understanding which factors contribute to deep engagement with research evidence, and how gaps in perceptions and values between researchers and practitioners may be associated with the frequency of deep research use. You can learn more about the work of CRUE on their website, http://www.research4schools.org/, and follow them on Twitter.

While the Centers were tasked with tapping into use of research evidence specifically, both are interested in understanding all sources of evidence that practitioners use, whether it’s from peer-reviewed research articles, the What Works Clearinghouse, a friend at another school, or even Pinterest. There is certainly a wealth of research evidence to support specific instructional practices and programs, and these two Centers will begin to provide answers to questions about how teachers and leaders are using this research.

So, it’s possible that, down the road, Pinterest will become a great place to find both homemade, non-toxic finger paint and evidence-based practices for improving education.

Written by Becky McGill-Wilkinson, NCER Program Officer for the Knowledge Utilization Research and Development Centers

C-SAIL: Studying the Impact of College- and Career-Readiness Standards

The nationwide effort to implement college- and career-ready standards is designed to better prepare students for success after high school, whether that means attending a postsecondary institution, entering the workforce, or some combination of both. But there is little understanding of how these standards have been implemented across the country or of the full impact they are having on student outcomes.

To fill that void, the Institute of Education Sciences (IES) funded a new five-year research center, the Center on Standards, Alignment, Instruction, and Learning (C-SAIL). The center is studying the implementation of college- and career-ready standards and assessing how the standards are related to student outcomes. The center is also developing and testing an intervention that supports standards-aligned instruction.

Andy Porter, of the University of Pennsylvania’s Graduate School of Education, is the director of C-SAIL and recently spoke with James Benson, the IES project officer for the center. Here is an edited version of that conversation.

You have been studying education standards for over 30 years. What motivated you to assemble a team of researchers and state partners to study college- and career-readiness standards?

Standards-based reform is in a new and promising place, with standards that might be rigorous enough to close achievement gaps that advocates have been fighting to narrow for the last 30 years. And with so many states implementing new standards, researchers have an unprecedented opportunity to learn about how standards-based reform is best done. We hypothesize that standards-based reform has had only modest effects thus far largely because those reforms stalled at the classroom door, so a focus of the Center will be how implementation is achieved and supported among teachers.

What are the main projects within the Center, and what are a few of the key questions that they are currently addressing?

We have four main projects. The first, an implementation study, asks, “How are state, district, and school-level educators making sense of the new standards, and what kinds of guidance and support are available to them?” We’re comparing and contrasting implementation approaches in four states—Kentucky, Massachusetts, Ohio, and Texas. In addition to reviewing state policy documents, we’re surveying approximately 280 district administrators, 1,120 principals, and 6,720 teachers across those same four states, giving special attention to the experiences of English language learners and students with disabilities.

The second project is a longitudinal study that asks, “How are college- and career-readiness standards impacting student outcomes across all 50 states?” and “How are English language learners and students with disabilities affected by the new standards?” We’re analyzing data from the National Assessment of Educational Progress (NAEP) and other sources to estimate the effects of college- and career-readiness standards on student achievement, high school completion, and college enrollment. Specifically, we’re examining whether implementing challenging state academic standards led to larger improvements in student outcomes in states with lower prior standards than in states with higher prior standards.
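
To make that comparison concrete, here is a minimal, purely illustrative sketch in Python (using pandas) of the kind of state-level pre/post contrast described above. The file name, column names, and grouping variable are hypothetical placeholders for illustration only; this is not C-SAIL's actual data or analysis code.

```python
# Illustrative sketch only -- hypothetical file and column names, not C-SAIL's
# actual analysis. It asks: did states with lower prior standards gain more on
# NAEP after adopting new standards than states with higher prior standards?
import pandas as pd

# Expected (hypothetical) columns: state, prior_standards ("lower"/"higher"),
# naep_pre (mean scale score before adoption), naep_post (mean score after).
df = pd.read_csv("state_naep_scores.csv")

# Gain in mean NAEP scale score for each state.
df["gain"] = df["naep_post"] - df["naep_pre"]

# Average gain within each group of states.
group_gains = df.groupby("prior_standards")["gain"].mean()
print(group_gains)

# A simple difference-in-gains contrast: a positive value would be consistent
# with larger improvements in states that started with lower prior standards.
contrast = group_gains["lower"] - group_gains["higher"]
print(f"Lower-minus-higher gain difference: {contrast:.2f} scale-score points")
```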

The third project is the Feedback on Alignment and Support for Teachers (FAST) intervention study, in which we are building an original intervention designed to assist teachers in providing instruction aligned to their state’s standards. FAST features real-time, online, personalized feedback for teachers; an off-site coach to assist teachers in understanding and applying aligned materials; and collaborative academic study teams in each school.

The fourth project is a measurement study to determine the extent to which instruction aligns with college- and career-readiness standards. C-SAIL is developing new tools to assess alignment between teachers' instruction and state standards in English language arts and math.

How do you envision working with your partner states in the next few years? How do you plan to communicate with states beyond those partnering with the Center?

We’ve already collaborated with our partner states (Kentucky, Massachusetts, Ohio, and Texas) on our research agenda, and the chief state school officer from each state, plus a designee of their choice, sits on our advisory board. Additionally, we’re currently working with our partner states on our implementation study, and we plan to make our first findings on effective implementation strategies, expected this summer, immediately available to them.

All states, however, will be able to follow our research progress and access our findings in myriad ways, including through our website. Our Fact Center features downloadable information sheets, and the C-SAIL blog offers insights from our researchers and network of experts. We also invite practitioners, policymakers, parents, and teachers to stay up to date on C-SAIL activities by subscribing to our newsletter, following us on Twitter, or liking us on Facebook.

Looking five years into the future, when the Center is finishing its work, what do you hope to understand about college- and career-readiness standards that we do not know now?

Through our implementation study, we will have documented how states are implementing new, challenging state academic standards; how the standards affect teacher instruction; what supports are most valuable for states, districts, and schools; and how the new standards impact English language learners and students with disabilities.

Through our longitudinal study, we will have combined 50-state NAEP data with high school graduation rates and college enrollment data in order to understand how new standards impact student learning and college- and career-readiness.

Through our FAST Intervention, we will have created and made available new tools for teachers to monitor in real-time how well-aligned the content of their enacted curriculum is to their states’ college- and career-readiness standards in ELA and math.

Finally, and not least, we will have led policymakers, practitioners, and researchers in a national discussion of our findings and their implications for realizing the full effects of standards-based reform.