Inside IES Research

Notes from NCER & NCSER

Investing in Next Generation Technologies for Education and Special Education

The Department of Education’s (ED) Small Business Innovation Research (SBIR) program, administered by the Institute of Education Sciences (IES), funds entrepreneurial developers to create the next generation of technology products for students, teachers, and administrators in education and special education. The program, known as ED/IES SBIR, emphasizes an iterative design and development process and pilot research to test the feasibility, usability, and promise of new products to improve outcomes. The program also focuses on planning for commercialization so that the products can reach schools and end-users and be sustained over time.

In recent years, millions of students in tens of thousands of schools around the country have used technologies developed through ED/IES SBIR, including more than a million students and teachers who used products for remote teaching and learning during the COVID-19 pandemic.

ED/IES SBIR Announces 2022 Awards

IES has made ten 2022 Phase I awards of $250,000 each.* During these 8-month projects, teams will develop and refine prototypes of new products and test their usability and initial feasibility. All awardees who complete a Phase I project will be eligible to apply for a Phase II award in 2023.

IES has made nine 2022 Phase II awards, which support further research and development of prototypes of education technology products developed under 2021 ED/IES SBIR Phase I awards. In these Phase II projects, teams will complete product development and conduct pilot studies in schools to demonstrate the usability, feasibility, fidelity of implementation, and promise of the products to improve the intended outcomes.

IES also made one Direct to Phase II award to support the research, development, and evaluation of a new education technology product that readies an existing researcher-developed, evidence-based intervention for use at scale and to plan for commercialization. A Direct to Phase II project is awarded without a prior Phase I award. All Phase II and Direct to Phase II awards are for $1,000,000 over two years. Across the awards, projects address a range of student ages and content areas.

The list of all 2022 awards is posted here. This page will be updated with the two additional Phase I awards after the contracts are finalized.


The 2022 ED/IES SBIR awards highlight three trends that continue to emerge in the field of education technology.

Trend 1: Projects Are Employing Advanced Technologies to Personalize Learning and Generate Insights to Inform Tailored Instruction

About two-thirds of the new projects are developing software components that personalize teaching and learning, whether through artificial intelligence, machine learning, natural language processing, automated speech recognition, or other algorithms. All these projects will include functionalities afforded by modern technology to personalize learning by adjusting content to the level of the individual learner, offering feedback and prompts to scaffold learning as students progress through the systems, and generating real-time, actionable information that educators can use to track and understand student progress and adjust instruction accordingly. For example:

  • Charmtech Labs and Literably are fully developing reading assessments that provide feedback to inform instruction.
  • Sirius Thinking and studio:Sckaal are developing prototypes to formatively assess early grade school students in reading.
  • Sown To Grow and xSEL Labs are fully developing platforms to facilitate student social and emotional assessments and provide insights to educators.
  • Future Engineers is fully developing a platform for judges to provide feedback to students who enter STEM and educational challenges and contests.
  • Querium and 2Sigma School are developing prototypes to support math and computer science learning, respectively.
  • Soterix is fully developing a smart walking cane and app for children with visual impairments to learn to navigate.
  • Alchemie is fully developing a product to provide audio cues to blind or visually impaired students learning science.
  • Star Autism Support is developing a prototype to support practitioners and parents of children with autism spectrum disorder.

Trend 2: Projects Focusing on Experiential and Hands-On Learning

Several new projects are combining hardware and software solutions to engage students through game-based, hands-on, collaborative, or immersive learning:

  • Pocketlab is fully developing a matchbox-sized car with a sensor to collect physical science data as middle school students play.
  • GaiaXus is developing a prototype sensor used for environmental science field experiments.
  • Mind Trust is developing a virtual reality escape room for biology learning.
  • Smart Girls is developing a prototype science game and accompanying real-world hands-on physical activity kits.
  • Indelible Learning is developing a prototype online multiplayer game about the Electoral College.
  • Edify is fully developing a school-based program for students to learn about, create, and play music.

Trend 3: Projects to Advance Research to Practice at Scale

Several new awards will advance existing education research-based practices into new technology products that are ready to be delivered at scale:

  • INSIGHTS is fully developing a new technology-delivered version of an NIH- and IES-supported social and emotional intervention to ready it for use at scale.
  • xSEL Labs and Charmtech Labs (noted above) are building on prior IES-funded, research-based interventions to create scalable products.
  • Scrible is developing an online writing platform in partnership with the National Writing Project, based on prior Department of Education-funded research.

 


*Note: Two additional Phase I awards are forthcoming later in 2022. The contracts for these awards are delayed due to a backlog in the SAM registration process.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Edward Metz (Edward.Metz@ed.gov) is the Program Manager of the ED/IES SBIR program.

Michael Leonard (Michael.Leonard@ed.gov) is the Program Analyst of the ED/IES SBIR program.

 

Does Gifted Education Access Vary by District? A Study in Washington State


States and localities have discretion over gifted programs, but surprisingly little large-scale research compares the education environments of students in gifted programs to those of high-achieving, non-gifted students or investigates how these learning environments vary across districts. In this guest blog, Ben Backes, James Cowan, and Dan Goldhaber discuss their IES-funded exploration study, in which they use administrative and survey data to describe the relationship between gifted participation and access to educational resources across nearly 300 school districts in Washington State.

Gifted Access and Participation in Washington

The underrepresentation of low-income and minority students in gifted programs has attracted attention because identification procedures often include nomination or referral processes that require subjective evaluation of student ability. Nationally, low-income and non-White students are significantly less likely to participate in gifted programs. To better understand who is in these programs in Washington State, we are investigating participation in gifted programs by student race/ethnicity and socioeconomic status in grades 4–12. Consistent with prior studies, we find that, relative to White students, Asian students are more likely to participate in gifted programs, while Black, Hispanic, and free- and reduced-price lunch students are less likely to receive gifted services. Washington districts frequently use universal screening policies, and the Black-White and Hispanic-White gifted gaps disappear once we statistically adjust for prior test scores. We find little association between district coordinators' reported use of modifications for underrepresented minority or low-income students and gifted participation.
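For readers unfamiliar with this kind of adjustment, a minimal sketch of the idea in Python follows; the file, column names, and model are hypothetical illustrations, not the study's actual code or data.

```python
# Hypothetical sketch: comparing raw and achievement-adjusted gaps in
# gifted participation (illustrative only; not the study's analysis).
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: gifted (0/1), race, prior_math, prior_read
df = pd.read_csv("students.csv")

# Unadjusted gaps: participation regressed on race/ethnicity alone
raw = smf.logit("gifted ~ C(race)", data=df).fit()

# Adjusted gaps: add prior test scores; if the race coefficients shrink
# toward zero, the raw gaps are accounted for by prior achievement
adjusted = smf.logit("gifted ~ C(race) + prior_math + prior_read", data=df).fit()

print(raw.params)
print(adjusted.params)
```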

In sum, we find consistent evidence of disparities in access to gifted programs, conditional on student achievement, for low-income students in Washington, but less consistent evidence of disparities by student race/ethnicity. However, we only observe data on student academic aptitude beginning in third grade, and many classification decisions are made before this time. There may be disparities in initial gifted classification decisions for younger students.

Unsurprisingly, participation in gifted programs does affect student learning environments. Gifted students are much more likely to sit in classrooms with other high-achieving students and in more homogeneous classrooms. These differences persist even after limiting the sample to high achievers, and the patterns are most pronounced in elementary school. Gifted students are also taught by more qualified teachers in elementary and middle school, as measured by experience, licensure test scores, and educational attainment, although these differences are very small.

Differences Across Districts and Program Types

We find that although gifted students do tend to take more advanced courses with higher-achieving peers, there is considerable variation in the design of gifted programming across school districts.

  • Although school districts tend to assign gifted students to more advanced academic tracks, these effects are mostly concentrated in large urban and suburban districts. The estimated effects of gifted participation on access to more advanced courses are typically much smaller in the smaller cities and rural areas in the western and eastern parts of the state.
  • Larger, higher-income districts in cities and suburbs operate gifted programs that provide more significant changes in learning environments. Students in these programs are more likely to share classrooms with other gifted students and with high-achieving students, and—in the case of large districts—to sit in smaller classes with more qualified teachers.
  • The structure of gifted programming also influences the types of instructional approaches districts employ. Self-contained gifted programs—where students are assigned to specialized classrooms for most of their instruction—report using a broad array of acceleration strategies. However, about one-third of gifted students participate in programs offered through services in regular classrooms, where independent study, supplemental instruction, and flexible ability grouping appear to be important strategies.
  • Well under half of districts have established gifted curricula for math or ELA. About 20% of gifted students are in districts that report having a districtwide math curriculum, and 25% are in districts that report having a districtwide ELA curriculum. This finding is consistent with another study that surveyed districts in three states.

What’s Next?

There is a growing body of empirical literature providing causal estimates of the effect of gifted participation on student achievement, but it generally relies on administrative data from a single school district. The results from this study of gifted programs across an entire state suggest that district-specific gifted programming effects are likely to vary substantially because the nature of the programs varies substantially across districts. This implies both that we should be cautious about generalizing from district-level studies and that the variation in findings across studies may reflect true variation in program effectiveness. In the next stage of this project, we plan to investigate the extent to which this heterogeneity generates differences in the relationship between gifted participation and student achievement.


Ben Backes is a Senior Economist with CALDER at the American Institutes for Research.

James Cowan is a Senior Researcher with CALDER at the American Institutes for Research.

Dan Goldhaber is the Director of CALDER at the American Institutes for Research and CEDR at the University of Washington.

 

Active-Duty Military Families and School Supports

Virtually every school district in the United States educates a child whose parent or guardian is serving in the Armed Forces. This May, for Military Appreciation Month, we asked Timothy Cavell, University of Arkansas, and Renée Spencer, Boston University, to discuss their IES-funded project on school supports for military-connected students.

What motivated your team to study military-connected students?

We became interested in studying military-connected students through our work on youth mentoring. We saw the potential for school-based mentoring to offer a measured response to the needs of military-connected students, who are generally resilient but who, at times, need extra support. With funding from IES, we developed a system for delivering school-based mentoring anchored by a district-level military student mentoring coordinator who forged home-school-community action teams composed of school staff, military parents, and community leaders. This project heightened our sensitivity to the high mobility that characterizes military-connected families. These students experience six to nine moves during their K-12 years—a mobility rate three times that of non-military children. Our current IES project, the Active-Duty Military Families and School Supports (ADMFSS) study, looks beyond mentoring to explore other kinds of supports that might benefit highly mobile military students and parents. We want to know how school supports might foster school connectedness for military students and parents.

What are your preliminary research findings?

We’re still in the early phases of data analysis and working on manuscripts for publication, but we can share a few things we’ve learned so far. Our findings are based on three waves of parent and student data collected across two separate cohorts of elementary and middle school students (N = 532).

  • Personal connections seem to matter most to military-connected students and parents. Of the many types of school supports we measured, including welcoming practices and social and emotional learning supports, students rated having teachers help new students feel welcome when they first move into the school as most important. Parents rated ongoing communication with the school as most important.
  • School supports likely matter. In preliminary analyses of our data, we’re finding associations between measures of school support and academic and psychosocial functioning. Parents who reported receiving school supports they considered important also reported higher quality parent-teacher relationships, stronger perceptions that schools were welcoming of military families, and less parenting stress compared to parents who reported receiving fewer school supports they considered important. Students who reported receiving school supports they considered important reported feeling more connected to school, higher academic efficacy, higher school engagement, and greater family support than students who reported receiving fewer supports they considered important. Although military-connected parents often noted a preference for not being treated differently from civilian families, they do appreciate school supports geared specifically for military-connected students. Some examples include an orientation, open house, or school tour at the beginning of the school year; lunchtime groups specifically for military-connected students; and access to the military family life counselor.

Based on your preliminary research, what advice would you give schools on how to best support military-connected students?

Most military families seem to weather the stresses and strains of multiple moves, but there are times when these families and students need additional support. The majority of military-connected students attend civilian schools where teachers often lack understanding of and appreciation for military family culture. We learned from our work that military-connected parents greatly appreciate when school staff acknowledge the distinct nature of military family life and “see” their family’s sacrifice. Simply recognizing the distinct challenges and sacrifices these families encounter can go a long way, and small accommodations (for example, not penalizing students for being absent on the day an active-duty parent returns from deployment) are highly valued.  

What has been the most rewarding aspect of this project for you as a PI?

Without a doubt, it’s the level of appreciation expressed by the families who participated in our study. We were surprised that many felt our study was an effort to see the challenges faced by military-connected students, a group often considered the most invisible within a school. It is meaningful to engage in work that touches the lives of families who make important sacrifices to serve our country.

What are the next steps for your research team?

We just received a recommendation for funding from the Department of Defense to develop and conduct an initial evaluation of a digital tool to support the school transitions of military-connected students in the elementary and middle school grades. The tool will capture information about the transitioning military student in a teacher-friendly e-dossier that parents can share with new teachers before the student arrives in their classroom.

We hope this tool will empower military-connected parents to act with greater agency when their family moves and their student makes yet another school transition. Sharing this information with the new school provides military-connected students with just-in-time support and gives receiving teachers just-in-time training about military family life and the needs of their new student.


Renée Spencer is a professor at the Boston University School of Social Work. Her research is rooted in relational perspectives of human development and much of her work focuses on distinguishing factors that facilitate positive and meaningful youth mentoring relationships from those that contribute to mentoring going awry. Dr. Spencer’s research highlights the importance of tailoring mentoring to the specific needs of special populations of youth, such as systems-involved and military-connected youth.

Tim Cavell is a professor in the Department of Psychological Science at the University of Arkansas. His research focuses on the role of parents, teachers, and mentors in selective interventions for children who are highly aggressive or chronically bullied. Dr. Cavell also examines school-based strategies to support elementary school students from military families.

This interview blog is part of a larger IES blog series on diversity, equity, inclusion and accessibility (DEIA) in the education sciences. It was produced by IES program officer Vinita Chhabra (Vinita.Chhabra@ed.gov), parent of military-connected students. For more information about the study, please contact the program officer Katina Stapleton (Katina.Stapleton@ed.gov).

Catalyzing Data Science Education in K-12: Recommendations from a Panel of Experts

Several efforts around the country are re-examining the skills students need to be prepared for the 21st century. Frontier digital technologies such as artificial intelligence, quantum computing, and blockchain carry the potential—and in some cases have already begun—to radically transform the economy and the workplace. Global engagement and national competitiveness will likely rely on skills, deep understanding, and leadership in these areas.

These technologies run on a new type of fuel: data, and very large amounts of it. The “big data” revolution has already changed the way modern business, government, and research are conducted, generating new information and shaping critical decisions at all levels. The volume and complexity of modern data have grown to such a degree that an entire field—data science—has emerged at the interdisciplinary intersection of statistics, computer science, and domain knowledge to meet the needs of these new technologies and the stakeholders employing them. Data science professionals work in a variety of industries, and data now run many of the systems we interact with in our daily lives—whether smart voice assistants on our phones, social media platforms in our personal and civic lives, or Internet of Things infrastructure in our built environment.

Students in grades K-12 also interact with these systems. Despite the vast amount of data that students are informally exposed to, there are currently limited formal opportunities for students to learn how to understand, assess, and work with the data they encounter in a variety of contexts. Data science education in K-12 is not widespread, suggesting that our education system has not invested in building capacity around these new and important skill sets. A review of the NCES 2019 NAEP High School Transcript Study (HSTS) data revealed that only 0.07% of high school graduates took a data science course, and 0.04% took an applied or interdisciplinary data science course in health informatics, business, energy, or another field. Critically, education research informing the design, implementation, and teaching of such programs is similarly limited.

To develop a better understanding of the state of data science education research, on October 26, 2021, NCER convened a Technical Working Group (TWG) panel to provide recommendations to NCER on 1) the goals for K-12 data science education research, 2) how to improve K-12 data science education practice, 3) how to ensure access to and equity in data science education, and 4) what is needed to build an evidence base and research capacity for the new field. The five key recommendations from the panel are summarized in a new report.  

  • Recommendation 1: Articulate the Developmental Pathway—Panelists recommended more research to better articulate K-12 learning pathways for students.
  • Recommendation 2: Assess and Improve Data Science Software—Panelists suggested additional research to assess which data analysis software tools (tinker-based tools, spreadsheets, professional software, or other tools) should be incorporated into instruction and when, in order to be developmentally appropriate and accessible to all learners.
  • Recommendation 3: Build Tools for Measurement and Assessment—Panelists advocated for additional research to develop classroom assessment tools to support teachers and to track student success and progress, and to ensure students may earn transferable credit for their work from K-12 to postsecondary education.
  • Recommendation 4: Integrate Equity into Schooling and Systems—Panelists emphasized the importance of equitable opportunities and access to high-quality data science education for all learners. Data science education research should be conducted with an equity lens that critically examines what is researched and who benefits from the research.
  • Recommendation 5: Improve Implementation—Panelists highlighted several systematic barriers to successfully implementing and scaling data science education policies and practices, including insufficient resources, lack of teacher training, and misalignment in required coursework and credentials between K-12, postsecondary education, and industry. The panel called for research to evaluate different implementation approaches to reduce these barriers and increase the scalability of data science education policies and practices. 

Given the limited evidence base informing data science education at the K-12 level, panelists expressed a sense of urgency for additional research, and for expanded research efforts to quickly build an evidence base to evaluate the promise of, practices for, and best ways to impart data science education. These transformations may carry significant implications for career and technical skills, online social and civic engagement, and global citizenship in the digital sphere.   

Importantly, this report highlights that more research is still needed—and soon. IES looks forward to the field’s ideas for research projects that address what works, for whom, and under which conditions within data science education and will continue to engage the education research community to draw attention to critical research gaps in this area.


Written by Zarek Drozda, 2021-2022 FAS Data Science Education Impact Fellow.

 

Using Cost Analysis to Inform Replicating or Scaling Education Interventions

A key challenge when conducting a cost analysis as part of an efficacy study is producing information that is useful for addressing questions about replicability or scale. When the study is a follow-up conducted many years after implementation, the need to collect data retrospectively introduces additional complexity. As part of a recent follow-up efficacy study, Maya Escueta and Tyler Watts of Teachers College, Columbia University worked with the IES-funded Cost Analysis in Practice (CAP) project team to plan a cost analysis that would meet these challenges. This guest blog describes their process and lessons learned and provides resources for other researchers.

What was the intervention for which you estimated costs retrospectively?

We estimated the costs of a pre-kindergarten intervention, the Chicago School Readiness Project (CSRP), which was implemented in nine Head Start Centers in Chicago, Illinois for two cohorts of students in 2004-5 and 2005-6. CSRP was an early childhood intervention that targeted child self-regulation by attempting to overhaul teacher approaches to behavioral management. The intervention placed licensed mental health clinicians in classrooms, and these clinicians worked closely with teachers to reduce stress and improve the classroom climate. CSRP showed signs of initial efficacy on measures of preschool behavioral and cognitive outcomes, but more recent results from the follow-up study showed mainly null effects for the participants in late adolescence.

The IES research centers require a cost study for efficacy projects, so we faced the distinct challenge of conducting a cost analysis for an intervention nearly 20 years after it was implemented. Our goal was to render the cost estimates useful for education decision-makers today to help them consider whether to replicate or scale such an intervention in their own context.

What did you learn during this process?

When enumerating costs and considering how to implement an intervention in another context or at scale, we learned four distinct lessons.

1. Consider how best to scope the analysis to render the findings both credible and relevant given data limitations.

In our case, because we were conducting the analysis 20 years after the intervention was originally implemented, the limited availability of reliable data—a common challenge in retrospective cost analysis—posed two problems. We had to consider what data we could reasonably obtain and what that would mean for the type of analysis we could credibly conduct. First, because no comprehensive cost analysis was conducted at the time of the intervention’s original implementation (to our knowledge), we could not accurately collect costs for the counterfactual condition. Second, we lacked reliable measures of key outcomes over time, such as grade retention or special education placement, that would be required for a complete cost-benefit analysis. This meant we were limited in both the costs and the effects we could reliably estimate. Due to these data limitations, we could only credibly conduct a cost analysis, rather than a cost-effectiveness analysis or cost-benefit analysis, which generally produce more useful evidence for decisions about replication or scale.
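To make the distinction concrete, the standard formulations are sketched below (our notation, not the authors’): a cost-effectiveness analysis needs incremental costs and effects relative to the counterfactual, and a cost-benefit analysis needs monetized outcomes—exactly the data the team lacked.

```latex
% Notation (ours): C = cost, E = effect, B = monetized benefits;
% subscripts T and C denote the treatment and counterfactual conditions.
\text{Cost analysis:}\quad C_T
\qquad
\text{Cost-effectiveness ratio:}\quad \frac{C_T - C_C}{E_T - E_C}
\qquad
\text{Net benefit:}\quad B - (C_T - C_C)
```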

Because of this limitation, and to provide useful information for decision-makers who are considering implementing similar interventions in their current contexts, we decided to develop a likely present-day implementation scenario informed by the historical information we collected from the original implementation. We’ll expand on how we did this and the decisions we made in the following lessons.

2. Consider how to choose prices to improve comparability and to account for availability of ingredients at scale.

We used national average prices for all ingredients in this cost analysis to make the results more comparable to other cost analyses of similar interventions that also use national average prices. This involved some careful thought about how to price ingredients that were unique to the time or context of the original implementation, specific to the intervention, or in low supply. For example, when identifying prices for personnel, we either used current prices (national average salaries plus fringe benefits) for personnel with equivalent professional experience, or we inflation-adjusted the original consulting fees charged by personnel in highly specialized roles. This approach assumes that personnel who are qualified to serve in specialized roles are available on a wider scale, which may not always be the case.
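As an illustration of the inflation-adjustment step, here is a minimal sketch; the index values and fee are invented for the example rather than taken from the study.

```python
# Hypothetical sketch: scaling a 2005 consulting fee to 2022 dollars with
# a CPI ratio (index values are illustrative assumptions, not study data).
CPI_2005 = 195.3  # assumed base-year consumer price index
CPI_2022 = 292.7  # assumed target-year consumer price index

def to_2022_dollars(fee_2005: float) -> float:
    """Multiply the historical price by the ratio of target to base CPI."""
    return fee_2005 * (CPI_2022 / CPI_2005)

print(round(to_2022_dollars(10_000)))  # a $10,000 fee in 2005 -> ~$14,987
```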

In the original implementation of CSRP, spaces were rented for teacher behavior management workshops, stress reduction workshops, and the initial training of the mental health clinicians. For our cost analysis, we assumed that using available school facilities would be more likely and tenable when implementing CSRP at large scale. Instead of using rental prices, we valued the physical space needed to implement CSRP using amortized construction costs of school facilities (for example, a cafeteria, gym, or classroom). We obtained these from the CAP Project’s Cost of Facilities Calculator.
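The amortization step follows the standard annualization formula used in education cost analysis, which spreads a one-time construction cost over the facility’s useful life while accounting for the time value of money. A minimal sketch with illustrative numbers (not the calculator’s actual figures) follows.

```python
# Hypothetical sketch: annualizing a construction cost (annuity formula).
# The cost, interest rate, and lifespan below are illustrative only.
def annualized_cost(construction_cost: float, rate: float, lifespan_years: int) -> float:
    """Annual cost = total cost * r / (1 - (1 + r)**-n)."""
    return construction_cost * rate / (1 - (1 + rate) ** -lifespan_years)

# e.g., a $2,000,000 classroom wing at a 3% rate over a 30-year lifespan
print(round(annualized_cost(2_000_000, 0.03, 30)))  # ~102,039 per year
```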

3. Consider how to account for ingredients that may not be possible to scale.

Some resources are simply not available in similar quality at large scale. For example, the Principal Investigator (PI) for the original evaluation oversaw the implementation of the intervention, was highly invested in the fidelity of implementation, was willing to dedicate significant time, and created a culture that was supportive of the pre-K instructors to encourage buy-in for the intervention. In such cases, it is worth considering what her equivalent role would be in a non-research setting and how scalable this scenario would be. A potential proxy for the PI in this case may be a school principal or leader, but how much time could this person reasonably dedicate, and how similar would their skillset be?  

4. Consider how implementation might work in institutional contexts required for scale.

Institutional settings might necessarily change when taking an intervention to scale. In larger-scale settings, there may be other ways of implementing the intervention that change the quantities of personnel and other resources required. For example, a pre-K intervention such as CSRP would likely need to be implemented in various types of pre-K sites at larger scale, such as public schools or community-based centers, in addition to Head Start centers. In such cases, the student/teacher ratio may vary across institutional contexts, which has implications for the per-student cost. If delivered with a higher student/teacher ratio than in the original implementation, the intervention may be less costly, but it may also be less impactful. This highlights the importance of the institutional setting in which implementation occurs and how it affects the use and costs of resources.
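A quick arithmetic sketch of that point (the salary and class sizes are invented for illustration): the same fixed personnel cost spread over more students mechanically lowers the per-student cost.

```python
# Hypothetical sketch: per-student cost of one clinician at different
# student/teacher ratios (salary and class sizes are illustrative only).
clinician_cost = 75_000.0  # assumed annual salary plus fringe benefits
for class_size in (15, 20, 25):
    print(f"{class_size} students -> ${clinician_cost / class_size:,.0f} per student")
# 15 -> $5,000; 20 -> $3,750; 25 -> $3,000 per student
```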

How can other researchers get assistance in conducting a cost analysis?

In conducting this analysis, we found the following CAP Project tools to be especially helpful (found on the CAP Resources page and the CAP Project homepage):

  • The Cost of Facilities Calculator: A tool that helps estimate the cost of physical spaces (facilities).
  • Cost Analysis Templates: Semi-automated Excel templates that support cost analysis calculations.
  • CAP Project Help Desk: Real-time guidance from a member of the CAP Project team. You will receive help in troubleshooting challenging issues with experts who can share specific resources. Submit a help desk request by visiting this page.

Maya Escueta is a Postdoctoral Associate in the Center for Child and Family Policy at Duke University where she researches the effects of poverty alleviation policies and parenting interventions on the early childhood home environment.

Tyler Watts is an Assistant Professor in the Department of Human Development at Teachers College, Columbia University. His research focuses on the connections between early childhood education and long-term outcomes.

For questions about the CSRP project, please contact the NCER program officer, Corinne.Alfeld@ed.gov. For questions about the CAP project, contact Allen.Ruby@ed.gov.