Inside IES Research

Notes from NCER & NCSER

Active-Duty Military Families and School Supports

Virtually every school district in the United States educates a child whose parent or guardian is serving in the Armed Forces. This May, for Military Appreciation Month, we asked Timothy Cavell, University of Arkansas, and Renée Spencer, Boston University, to discuss their IES-funded project on school supports for military-connected students.

What motivated your team to study military-connected students?

We got interested in studying military-connected students through our work on youth mentoring. We saw the potential for school-based mentoring to offer a measured response to the needs of military-connected students who are generally resilient but who, at times, need extra support. With funding from IES, we developed a system for delivering school-based mentoring that was anchored by a district-level military student mentoring coordinator who forged home-school-community action teams composed of school staff, military parents, and community leaders. This project heightened our sensitivity to the high mobility that characterizes military-connected families. These students experience 6 to 9 moves during their K-12 years—a mobility rate 3 times that of non-military children. Our current IES project, the Active-Duty Military Families and School Supports (ADMFSS) study, looks beyond mentoring to explore other kinds of supports that might benefit highly mobile military students and parents. We want to know how school supports might foster school connectedness for military students and parents.

What are your preliminary research findings?

We’re still in the early phases of data analysis and working on manuscripts for publication, but we can share a few things we’ve learned so far. Our findings are based on three waves of parent and student data collected across two separate cohorts of elementary and middle school students (N = 532).

  • Personal connections seem to matter most to military-connected students and parents. Of the many types of school supports we measured, including things like welcoming practices and social and emotional learning supports, students rated having teachers help new students feel welcome when they first move into the school as most important. Parents rated ongoing communication with the school as most important.
  • School supports likely matter. In preliminary analyses of our data, we’re finding associations between measures of school support and academic and psychosocial functioning. Parents who reported receiving school supports they considered important also reported higher quality parent-teacher relationships, stronger perceptions that schools were welcoming of military families, and less parenting stress compared to parents who reported receiving fewer school supports they considered important. Students who reported receiving school supports they considered important reported feeling more connected to school, higher academic efficacy, higher school engagement, and greater family support than students who reported receiving fewer supports they considered important. Although military-connected parents often noted a preference for not being treated differently from civilian families, they do appreciate school supports geared specifically for military-connected students. Some examples include an orientation, open house, or school tour at the beginning of the school year; lunchtime groups specifically for military-connected students; and access to the military family life counselor.

Based on your preliminary research, what advice would you give schools on how to best support military-connected students?

Most military families seem to weather the stresses and strains of multiple moves, but there are times when these families and students need additional support. The majority of military-connected students attend civilian schools where teachers often lack understanding of and appreciation for military family culture. We learned from our work that military-connected parents greatly appreciate when school staff acknowledge the distinct nature of military family life and “see” their family’s sacrifice. Simply recognizing the distinct challenges and sacrifices these families encounter can go a long way, and small accommodations (for example, not penalizing students for being absent on the day an active-duty parent returns from deployment) are highly valued.  

What has been the most rewarding aspect of this project for you as a PI?

Without a doubt, it’s the level of appreciation expressed by the families who participated in our study. We were surprised that many felt our study was an effort to see the challenges faced by military-connected students, a group often considered among the most invisible within a school. It is meaningful to engage in work that touches the lives of families who make important sacrifices to serve our country.

What are the next steps for your research team?

We recently received a recommendation for funding from the Department of Defense to develop and conduct an initial evaluation of a digital tool that can be used to support the school transitions of military-connected students in the elementary and middle school grades. This tool will capture information about the transitioning military student and catalogue it in a teacher-friendly e-dossier that parents can share with new teachers before the student arrives in their classroom.

We hope this tool will empower military-connected parents to act with greater agency when their family moves and their student makes yet another school transition. Sharing this information with the new school provides military-connected students with just-in-time support and gives receiving teachers just-in-time training about military family life and the needs of their new student.


Renée Spencer is a professor at the Boston University School of Social Work. Her research is rooted in relational perspectives of human development and much of her work focuses on distinguishing factors that facilitate positive and meaningful youth mentoring relationships from those that contribute to mentoring going awry. Dr. Spencer’s research highlights the importance of tailoring mentoring to the specific needs of special populations of youth, such as systems-involved and military-connected youth.

Tim Cavell is a professor in the Department of Psychological Science at the University of Arkansas. His research focuses on the role of parents, teachers, and mentors in selective interventions for children who are highly aggressive or chronically bullied. Dr. Cavell also examines school-based strategies to support elementary school students from military families.

This interview blog is part of a larger IES blog series on diversity, equity, inclusion and accessibility (DEIA) in the education sciences. It was produced by IES program officer Vinita Chhabra (Vinita.Chhabra@ed.gov), parent of military-connected students. For more information about the study, please contact the program officer Katina Stapleton (Katina.Stapleton@ed.gov).

Catalyzing Data Science Education in K-12: Recommendations from a Panel of Experts

Several efforts around the country are re-examining the skills students need to be prepared for the 21st century. Frontier digital technologies such as artificial intelligence, quantum computing, and blockchain carry the potential—and in some cases have already begun—to radically transform the economy and the workplace. Global engagement and national competitiveness will likely rely on skills, deep understanding, and leadership in these areas.

These technologies run on a new type of fuel: data, and very large amounts of it. The “big data” revolution has already changed the way modern business, government, and research are conducted, generating new information and shaping critical decisions at all levels. The volume and complexity of modern data have evolved to such a degree that an entire field—data science—has emerged to meet the needs of these new technologies and the stakeholders employing them, drawing upon an interdisciplinary intersection of statistics, computer science, and domain knowledge. Data science professionals work in a variety of industries, and data now run many of the systems we interact with in our daily lives—whether smart voice assistants on our phones, social media platforms in our personal and civic lives, or Internet of Things infrastructure in our built environment.

Students in grades K-12 also interact with these systems. Despite the vast amount of data that students are informally exposed to, there are currently few formal learning opportunities for students to understand, assess, and work with the data they encounter in a variety of contexts. Data science education in K-12 is not widespread, suggesting that our education system has not invested in building capacity around these new and important skill sets. A review of the NCES 2019 NAEP High School Transcript Study (HSTS) data revealed that only 0.07% of high school graduates took a data science course, and only 0.04% took an applied or interdisciplinary data science course in health informatics, business, energy, or another field. Critically, education research informing the design, implementation, and teaching of these programs is similarly limited.

To develop a better understanding of the state of data science education research, NCER convened a Technical Working Group (TWG) panel on October 26, 2021, to provide recommendations on 1) the goals for K-12 data science education research, 2) how to improve K-12 data science education practice, 3) how to ensure access to and equity in data science education, and 4) what is needed to build an evidence base and research capacity for the new field. The five key recommendations from the panel are summarized in a new report.

  • Recommendation 1: Articulate the Developmental Pathway—Panelists recommended more research to better articulate K-12 learning pathways for students.
  • Recommendation 2: Assess and Improve Data Science Software—Panelists suggested additional research to assess which data analysis software tools (tinker-based tools, spreadsheets, professional software, or other tools) should be incorporated into instruction and when, in order to be developmentally appropriate and accessible to all learners.
  • Recommendation 3: Build Tools for Measurement and Assessment—Panelists advocated for additional research to develop classroom assessment tools to support teachers and to track student success and progress, and to ensure students may earn transferable credit for their work from K-12 to postsecondary education.
  • Recommendation 4: Integrate Equity into Schooling and Systems—Panelists emphasized the importance of equity in opportunities and access to high quality data science education for all learners. Data science education research should be conducted with an equity lens that critically examines what is researched and for whom the research benefits.
  • Recommendation 5: Improve Implementation—Panelists highlighted several systemic barriers to successfully implementing and scaling data science education policies and practices, including insufficient resources, lack of teacher training, and misalignment in required coursework and credentials between K-12, postsecondary education, and industry. The panel called for research to evaluate different implementation approaches to reduce these barriers and increase the scalability of data science education policies and practices.

Given the limited evidence base informing data science education at the K-12 level, panelists expressed a sense of urgency for expanded research efforts to quickly build an evidence base on the promise of, practices for, and best ways to deliver data science education. These transformations may carry significant implications for career and technical skills, online social and civic engagement, and global citizenship in the digital sphere.

Importantly, this report highlights that more research is still needed—and soon. IES looks forward to the field’s ideas for research projects that address what works, for whom, and under which conditions within data science education and will continue to engage the education research community to draw attention to critical research gaps in this area.


Written by Zarek Drozda, 2021-2022 FAS Data Science Education Impact Fellow.

 

Using Cost Analysis to Inform Replicating or Scaling Education Interventions

A key challenge when conducting cost analysis as part of an efficacy study is producing information that can be useful for addressing questions related to replicability or scale. When the study is a follow-up conducted many years after implementation, the need to collect data retrospectively introduces additional complexities. As part of a recent follow-up efficacy study, Maya Escueta and Tyler Watts of Teachers College, Columbia University worked with the IES-funded Cost Analysis in Practice (CAP) project team to plan a cost analysis that would meet these challenges. This guest blog describes their process and lessons learned and provides resources for other researchers.

What was the intervention for which you estimated costs retrospectively?

We estimated the costs of a pre-kindergarten intervention, the Chicago School Readiness Project (CSRP), which was implemented in nine Head Start Centers in Chicago, Illinois for two cohorts of students in 2004-5 and 2005-6. CSRP was an early childhood intervention that targeted child self-regulation by attempting to overhaul teacher approaches to behavioral management. The intervention placed licensed mental health clinicians in classrooms, and these clinicians worked closely with teachers to reduce stress and improve the classroom climate. CSRP showed signs of initial efficacy on measures of preschool behavioral and cognitive outcomes, but more recent results from the follow-up study showed mainly null effects for the participants in late adolescence.

The IES research centers require a cost study for efficacy projects, so we faced the distinct challenge of conducting a cost analysis for an intervention nearly 20 years after it was implemented. Our goal was to render the cost estimates useful for education decision-makers today to help them consider whether to replicate or scale such an intervention in their own context.

What did you learn during this process?

When enumerating costs and considering how to implement an intervention in another context or at scale, we learned four distinct lessons.

1. Consider how best to scope the analysis to render the findings both credible and relevant given data limitations.

In our case, because we were conducting the analysis 20 years after the intervention was originally implemented, the limited availability of reliable data—a common challenge in retrospective cost analysis—posed two challenges. We had to consider the data we could reasonably obtain and what that would mean for the type of analysis we could credibly conduct. First, because no comprehensive cost analysis was conducted at the time of the intervention’s original implementation (to our knowledge), we could not accurately collect costs for the counterfactual condition. Second, we also lacked reliable measures of key outcomes over time, such as grade retention or special education placement, that would be required for calculating a complete cost-benefit analysis. This meant we were limited in both the costs and the effects we could reliably estimate. Due to these data limitations, we could only credibly conduct a cost analysis, rather than a cost-effectiveness analysis or cost-benefit analysis, which generally produce more useful evidence to aid in decisions about replication or scale.

Because of this limitation, and to provide useful information for decision-makers who are considering implementing similar interventions in their current contexts, we decided to develop a likely present-day implementation scenario informed by the historical information we collected from the original implementation. We’ll expand on how we did this and the decisions we made in the following lessons.

2. Consider how to choose prices to improve comparability and to account for availability of ingredients at scale.

We used national average prices for all ingredients in this cost analysis to make the results more comparable to other cost analyses of similar interventions that also use national average prices. This involved some careful thought about how to price ingredients that were unique to the time or context of the original implementation, specific to the intervention, or in low supply. For example, when identifying prices for personnel, we either used current prices (national average salaries plus fringe benefits) for personnel with equivalent professional experience, or we inflation-adjusted the original consulting fees charged by personnel in highly specialized roles. This approach assumes that personnel who are qualified to serve in specialized roles are available on a wider scale, which may not always be the case.
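As a rough illustration of the inflation adjustment described above, here is a minimal sketch in Python; the fee, index values, and years are hypothetical placeholders, not figures from the CSRP study.

    # Minimal sketch: adjust an original consulting fee to current dollars using a
    # price-index ratio. All values below are hypothetical, for illustration only.
    original_fee_2005 = 150.00    # hourly consulting fee paid in 2005 (hypothetical)
    price_index_2005 = 195.3      # price index for 2005 (hypothetical)
    price_index_2022 = 292.7      # price index for 2022 (hypothetical)

    adjusted_fee = original_fee_2005 * (price_index_2022 / price_index_2005)
    print(f"Inflation-adjusted fee: ${adjusted_fee:.2f} per hour")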

In the original implementation of CSRP, spaces were rented for teacher behavior management workshops, stress reduction workshops, and initial training of the mental health clinicians. For our cost analysis, we assumed that using available school facilities would be more likely and more tenable when implementing CSRP at large scale. Instead of using rental prices, we valued the physical space needed to implement CSRP by using amortized construction costs of school facilities (for example, cafeteria/gym/classroom). We obtained these from the CAP Project’s Cost of Facilities Calculator.
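For readers unfamiliar with amortization, the sketch below shows a standard annualization calculation under hypothetical assumptions (the construction cost, useful life, and interest rate are invented for illustration); in practice, the CAP Project’s Cost of Facilities Calculator does this work for you.

    # Minimal sketch: annualize a facility's construction cost over its useful life
    # using the standard annualization factor r / (1 - (1 + r) ** -n).
    # All values are hypothetical and not drawn from the CSRP analysis.
    construction_cost = 2_000_000   # cost to build the space (hypothetical)
    useful_life_years = 30          # useful life in years, n (hypothetical)
    interest_rate = 0.03            # discount/interest rate, r (hypothetical)

    factor = interest_rate / (1 - (1 + interest_rate) ** -useful_life_years)
    annual_facility_cost = construction_cost * factor
    print(f"Amortized annual cost of the facility: ${annual_facility_cost:,.0f}")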

3. Consider how to account for ingredients that may not be possible to scale.

Some resources are simply not available in similar quality at large scale. For example, the Principal Investigator (PI) for the original evaluation oversaw the implementation of the intervention, was highly invested in the fidelity of implementation, was willing to dedicate significant time, and created a culture that was supportive of the pre-K instructors to encourage buy-in for the intervention. In such cases, it is worth considering what her equivalent role would be in a non-research setting and how scalable this scenario would be. A potential proxy for the PI in this case may be a school principal or leader, but how much time could this person reasonably dedicate, and how similar would their skillset be?  

4. Consider how implementation might work in institutional contexts required for scale.

Institutional settings might necessarily change when taking an intervention to scale. In larger-scale settings, there may be other ways of implementing the intervention that might change the quantities of personnel and other resources required. For example, a pre-K intervention such as CSRP at larger scale may need to be implemented in various types of pre-K sites, such as public schools or community-based centers in addition to Head Start centers. In such cases, the student/teacher ratio may vary across different institutional contexts, which has implications for the per-student cost. If delivered in a manner where the student/teacher ratio is higher than in the original implementation, the intervention may be less costly, but it may also be less impactful. This highlights the importance of the institutional setting in which implementation is occurring and how this might affect the use and costs of resources.
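As a simple, hypothetical illustration of how the student/teacher ratio drives per-student cost (the classroom-level figure below is invented, not an estimate from the study):

    # Minimal sketch: the same classroom-level cost spread across different class
    # sizes yields different per-student costs. Values are hypothetical.
    classroom_level_cost = 30_000   # annual cost per classroom of clinician time, training, etc. (hypothetical)

    for students_per_classroom in (15, 20, 25):
        per_student_cost = classroom_level_cost / students_per_classroom
        print(f"{students_per_classroom} students per classroom -> ${per_student_cost:,.0f} per student")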

How can other researchers get assistance in conducting a cost analysis?

In conducting this analysis, we found the following CAP Project tools to be especially helpful (found on the CAP Resources page and the CAP Project homepage):

  • The Cost of Facilities Calculator: A tool that helps estimate the cost of physical spaces (facilities).
  • Cost Analysis Templates: Semi-automated Excel templates that support cost analysis calculations.
  • CAP Project Help Desk: Real-time guidance from a member of the CAP Project team. You will receive help in troubleshooting challenging issues with experts who can share specific resources. Submit a help desk request through the CAP Project website.

Maya Escueta is a Postdoctoral Associate in the Center for Child and Family Policy at Duke University where she researches the effects of poverty alleviation policies and parenting interventions on the early childhood home environment.

Tyler Watts is an Assistant Professor in the Department of Human Development at Teachers College, Columbia University. His research focuses on the connections between early childhood education and long-term outcomes.

For questions about the CSRP project, please contact the NCER program officer, Corinne.Alfeld@ed.gov. For questions about the CAP project, contact Allen.Ruby@ed.gov.

 

Powering Our Future: How Service-Learning Aligned with Next Generation Science Standards Can Promote Science Learning, Social and Emotional Skills, and Civic Engagement

Each generation faces its own societal challenges. Two prominent issues—the climate crisis and America’s political divide—are heavy burdens for today’s youth. Without explicit focus in schools, it is hard to imagine how children will learn to work across differences and collaborate with others to solve complex environmental problems. Youth are very capable people, and school comes alive when they feel agency and see how their efforts matter in the community. Service-learning can help teachers make instruction feel relevant and teach skills that lead to civic engagement as youth learn to design, implement, and evaluate solutions to problems that are important to them. In this interview blog, the Connect Science project team explains how they developed curriculum and professional development to support teachers to engage their students in service-learning experiences.

Can you tell us about Connect Science and what it looks like in action?

Fueled by an IES Development and Innovation grant, our team developed and evaluated a science-based service-learning approach for the upper-elementary school years. In doing so, we answered a need that teachers and schools face as they strive to create engaging experiences aligned with the Next Generation Science Standards (NGSS).

Connect Science is a 12-week project-based learning unit for upper elementary students. Early on, teachers and students explore topics of energy and natural resources using lessons aligned with the NGSS. Teachers guide student learning on what it means to be an engaged citizen and on the social and collaborative skills needed to take action in the community. To prepare, teachers receive five days of professional development and follow-up coaching. Teachers also receive a Connect Science manual, related books, and science materials.

But what does Connect Science actually look like in action? Imagine fourth graders engaged in a science unit on renewable and non-renewable resources. The students learn about different energy sources and then discuss the pros and cons of each source. They become aware that non-renewable energy resources are rapidly diminishing and will not always be available to generate electricity. Awareness of this problem energizes them to promote energy conservation. Toward that goal, the students decide to educate other students and families at their school about energy use. At the next open house night, they turn their cafeteria into an energy fair where they share important information. For example, one group of students teaches visitors which types of energy sources are used in their state to produce electricity, and another group teaches ways that people can save energy in their homes. Before and after the energy fair, the students administer a pre- and post-survey on energy facts to size up what their visitors learned.

How did the IES grant support the development and pilot testing of Connect Science?

In the first two years of this grant, we developed and tested materials with teachers. In the third year, we conducted a randomized controlled trial of Connect Science involving 41 classrooms, with 20 in Connect Science and 21 in a waitlist comparison group, resulting in a sample of 868 students (423 of whom participated in the intervention).

We found that Connect Science impacted teacher practices and student outcomes. Teachers in the Connect Science group were more effective at engaging in the two NGSS practices that we measured: eliciting and building on prior knowledge and creating opportunities for student critique, explanation, and argument. Further, we saw higher science achievement and more positive energy attitudes and behaviors in the intervention condition than in the control condition. The social skill results hinged on fidelity of implementation: when teachers used more Connect Science practices, students showed improved communication and social competence. As a result of these findings, Connect Science is designated as a Promising Program by the Collaborative for Academic, Social, and Emotional Learning (CASEL).

What are the implications of your findings?

Too few projects integrate academic and social learning in schools. Often, high-quality NGSS materials are developed with little thought about the social skills students need to engage in that instruction. Likewise, social and emotional learning is often taught separately from academic content. Service-learning is a framework that bridges these two areas and allows students to engage in authentic, science-based work. Given our experiences, we have a few recommendations for educators eager to use service-learning.

  • Teach social, emotional, and collaborative skills with intention before launching into group work. In elementary school, children thrive in supportive, caring classrooms, and they respond well to lessons on active listening, respectful communication, and understanding people with different perspectives.
  • Leverage the existing curriculum and build in service-learning experiences. Rather than adding one more new topic, look at existing curricular topics and use service-learning to facilitate deep learning on content areas that are already part of the curriculum.
  • Amplify youth voice. Teachers need to work with students to identify a relevant community problem and generate solutions to that problem. We carefully developed the Connect Science materials to be more teacher-directed toward the beginning of the unit and more student-directed toward the end. This approach was based on both theoretical and empirical work supporting the importance of student autonomy.

 


Sara Rimm-Kaufman is the Commonwealth Professor of Education at the University of Virginia School of Education and Human Development. Her recent book for teachers, SEL from the Start, is based on the Connect Science work.

Eileen Merritt is a Research Scientist in the College of Natural Resources and the Environment at Virginia Tech. Her research and teaching focus on environmental and sustainability education.

Tracy Harkins is the owner of Harkins Consulting, LLC in Maine. Her focus is providing professional development and resources to engage and motivate student learners through service-learning. She will be offering an upcoming Connect Science Institute in Summer 2022.

For questions about this project, please contact Corinne.Alfeld@ed.gov, NCER program officer.

DE21: A Researcher-Practitioner-Policymaker Conference on Dual Enrollment

Dual enrollment improves student college-going and postsecondary success, but practitioners need help in understanding the impact of dual enrollment and in learning strategies associated with effective and equitable implementation. Under the auspices of the IES-funded Evaluation of Career and College Promise (CCP) project, the North Carolina Community College System suggested hosting a conference to build knowledge and capacity in the field about dual enrollment. The Evaluation of CCP is a partnership with the SERVE Center at the University of North Carolina at Greensboro, the North Carolina Department of Public Instruction, the North Carolina Community College System, and the RAND Corporation. In addition to the research goals—which involve looking at the implementation, impact, and cost of North Carolina’s dual enrollment program—the project also has a goal of capacity development for the agencies and for practitioners. As part of meeting this last goal, the project recently hosted a conference on Dual Enrollment: Accelerating Educational Attainment (DE21) with over 1,000 registrants from North Carolina and around the country.

Julie Edmunds, the project’s principal investigator, discusses the DE21 conference.

Why host a conference on dual enrollment?

This was the brainchild of our partners at the North Carolina Community College System. They wanted to create an opportunity where researchers and practitioners could gather and share lessons learned from their respective work. The NC Community College System expected that we would be learning a lot from our project that we would want to share; they also knew that the people in the trenches had many valuable insights to help bridge the gap between research and practice. Because existing research shows that not all groups of students have the same access to dual enrollment, the project team decided collectively that the conference should have a strong focus on equity and should serve as a way to communicate and discuss strategies to support equity.

What happened at the conference?

We had a total of 40 sessions across two full days. There were dynamic keynote speakers, including Karen Stout from Achieving the Dream, and panels that discussed dual enrollment from the policy, research, student, and parent perspectives. Although there was a strong North Carolina focus, there were sessions from other states such as Massachusetts, Texas, Indiana, and Ohio.

Conference presentations were organized into five themes: expanding access and equity, fostering college attainment, ensuring a successful transition to college and careers, preparing students for dual enrollment, and supporting success in dual enrollment courses.

The CCP study team presented findings from our evaluation of North Carolina’s dual enrollment pathways. We looked at individual and school-level factors associated with dual enrollment participation, such as student demographics, school size, locale, percentage of students from underrepresented minority groups, academic achievement, and workforce orientation of students. Student socioeconomic level was not associated with participation in dual enrollment. We also presented preliminary impacts of North Carolina’s three different dual enrollment pathways (college transfer, Career and Technical Education, and Cooperative Innovative High Schools or early colleges). Results from these three pathways showed that CCP participants had better high school outcomes, such as higher high school graduation rates, and were more likely to enroll in postsecondary education. In addition, there were multiple sessions sharing research results from other states.

There were many presentations from practitioners on topics such as rigorous instruction, advising, participation of students with disabilities, creating strong secondary-postsecondary partnerships, and using high school teachers as college instructors. I need to give a huge shoutout to Katie Bao from the NC Community College System, who shepherded us all through the conference planning and implementation process.

What was the impact of the pandemic?

When we originally planned for the conference, we thought it would be in person. After the pandemic hit, we decided (as many other organizations did) to host it virtually. This made the conference much more accessible to a national audience, and we had participants and presenters from around the country.

What if someone missed the conference?

Another benefit of a virtual conference is that we are able to share all the sessions from the meeting. Please visit our site on YouTube to watch the conference sessions.

What comes next?

Our study work continues, and we will share the results in a variety of ways, including through briefs and journal articles. We are also planning to host a second conference in 2023 and expect that it will have a virtual component so that it can continue to be available to a national audience.


Dr. Julie Edmunds is a Program Director at the SERVE Center at the University of North Carolina at Greensboro. In addition to being the PI on the Evaluation of Career and College Promise, she is one of the leading researchers on early college, a model that combines high school and college.