Inside IES Research

Notes from NCER & NCSER

How Remote Data Collection Enhanced One Grantee’s Classroom Research During COVID-19

Under an IES grant, Michigan State University, in collaboration with the Michigan Department of Education, the Michigan Center for Educational Performance and Information, and the University of Michigan, is assessing the implementation, impact, and cost of the Michigan “Read by Grade 3” law, which is intended to improve early literacy outcomes for Michigan students. In this guest blog, Dr. Tanya Wright and Lori Bruner discuss how they were able to quickly pivot to a remote data collection plan when COVID-19 disrupted their initial research plan.

The COVID-19 pandemic began while we were planning a study of early literacy coaching for the 2020-2021 academic year. It soon became abundantly clear that restrictions on in-person research would pose a major hurdle for our research team, which had planned to enter classrooms and record videos of literacy instruction in the fall. We therefore faced a difficult choice: we could pause our study until it became safer to visit classrooms and miss the opportunity to learn about literacy coaching and in-person classroom instruction during the pandemic, or we could quickly pivot to a remote data collection plan.

Our team chose the second option. Multiple technologies are available for remote data collection, and we chose a device known as the Swivl. It pairs a robotic mount for a tablet or smartphone with a 360-degree rotating platform that works in tandem with a handheld or wearable tracker, along with an app that instantly uploads videos to a cloud-based storage system for easy access.

Over the course of the school year, we captured over 100 hours of elementary literacy instruction in 26 classrooms throughout our state. While remote data collection looks and feels very different from visiting a classroom to record video, we learned that it offers many benefits to researchers and educators alike. We also learned a few important lessons along the way.

First, we learned that remote data collection provides greater flexibility for both researchers and educators. In our original study design, we planned to hire data collectors to visit classrooms, which restricted our recruitment of schools to a reasonable driving distance from Michigan State University (MSU). The recording devices, however, allowed us to capture video anywhere, including rural areas of our state that are often excluded from classroom research because of their remote location. Furthermore, we found that the cost of purchasing and shipping equipment to schools was significantly less than the cost of travel and data collectors’ time to visit classrooms. In addition, using devices in place of data collectors allowed us to easily adapt to last-minute schedule changes and offer teachers the option to record video over multiple days to accommodate shifts in instruction due to COVID-19.

Second, we discovered that we could capture more classroom talk than with a typical video camera. After some trial and error, we settled on a device with three external wireless microphones: one for the teacher and two additional microphones to place around the classroom. The extra microphones not only recorded audio beyond what the teacher was saying, but also let us isolate each microphone during data analysis to hear what was happening in specific areas of the classroom (even when the teacher and children were wearing masks). We also purchased an additional wide-angle lens, which clipped over the camera on our tablet and allowed us to capture a wider video angle.

Third, we found remote data collection to be less intrusive than sending a research team into schools. The device is compact and can be placed on any flat surface in the classroom or be mounted on a basic tripod. The teacher has the option to wear the microphone on a lanyard to serve as a hands-free tracker that signals the device to rotate to follow the teacher’s movements automatically. At the end of the lesson, the video uploads to a password-protected storage cloud with one touch of a button, making it easy for teachers to share videos with our research team. We then download the videos to the MSU server and delete them from our cloud account. This set-up allowed us to collect data with minimal disruption, especially when compared to sending a person with a video camera to spend time in the classroom.

As with most remote work this year, we ran into a few unexpected hurdles during our first round of data collection. After gathering feedback from teachers and members of our research team, we were able to make adjustments that led to a better experience during the second round of data collection this spring. We hope the following suggestions might help others who are considering such a device to collect classroom data in the future:

  1. Consider providing teachers with a brief informational video or offering after-school training sessions to help answer questions and address concerns ahead of your data collection period. We initially provided teachers with only a detailed user guide, but we found that this extra support was key to ensuring teachers had a positive experience with the device. You might also consider appointing a member of your research team to serve as a contact person who can answer questions during data collection periods.
  2. Remember that your team members will not be the ones collecting the data, so it is critical to provide teachers with clear directions ahead of time: what exactly do you want them to record? Our team found it helpful to send teachers a two-minute video outlining our goals and then follow up with a printable checklist they could use on the day they recorded instruction.
  3. Finally, we found it beneficial to scan the videos for content at the end of each day. By doing so, we were able to spot a few problems, such as missing audio or a device that stopped rotating during a lesson. While these instances were rare, it was helpful to catch them right away, while teachers still had the device in their schools so that they could record missing parts the next day.
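
As a rough illustration, part of this end-of-day check could also be scripted once the day's recordings are downloaded to a local folder. The sketch below is only an illustration, not a procedure from our study: it assumes FFmpeg's ffprobe is installed, and the folder name, file format, and length threshold are all hypothetical. It flags files that have no audio stream or that are unusually short.

    # Sketch only: flag recordings with no audio stream or an unusually short
    # duration. Assumes FFmpeg's ffprobe is installed and the day's recordings
    # are .mp4 files in a single local folder (both are assumptions).
    import subprocess
    from pathlib import Path

    VIDEO_DIR = Path("downloaded_videos")  # hypothetical local folder
    MIN_MINUTES = 10                       # hypothetical minimum lesson length

    def probe(path, args):
        """Run ffprobe quietly and return its trimmed output."""
        result = subprocess.run(
            ["ffprobe", "-v", "error", *args, str(path)],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    for video in sorted(VIDEO_DIR.glob("*.mp4")):
        # An empty listing of audio streams means the file has no audio track.
        audio_streams = probe(video, ["-select_streams", "a",
                                      "-show_entries", "stream=codec_type",
                                      "-of", "csv=p=0"])
        duration = float(probe(video, ["-show_entries", "format=duration",
                                       "-of", "csv=p=0"]))
        if not audio_streams:
            print(f"{video.name}: no audio stream found")
        if duration < MIN_MINUTES * 60:
            print(f"{video.name}: only {duration / 60:.1f} minutes recorded")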

Although restrictions on in-person research are beginning to lift, we plan to continue using remote data collection for the remaining three years of our project. Conducting classroom research during the COVID-19 pandemic has proven challenging at every turn, but as we adapted to remote video data collection, we were pleased to find unanticipated benefits for our research team and for our study participants.


This blog is part of a series focusing on conducting education research during COVID-19. For other blog posts related to this topic, please see here.

Tanya S. Wright is an Associate Professor of Language and Literacy in the Department of Teacher Education at Michigan State University.

Lori Bruner is a doctoral candidate in the Curriculum, Instruction, and Teacher Education program at Michigan State University.

Cost Analysis in Practice: Resources for Cost Analysis Studies

IES supports rigorous research that can provide scientific evidence on how best to address our nation’s most pressing education needs. As part of the Standards for Excellence in Education Research (SEER) principles, IES-funded researchers are encouraged, and in some cases required, to conduct a cost analysis for their projects, with the goal of supporting education agencies’ decision-making around the adoption of programs, policies, or practices.


The Cost Analysis in Practice (CAP) Project is a 3-year initiative funded by IES to support researchers and practitioners who are planning or conducting a cost analysis of educational programs and practices. This support includes the following freely available resources.

  • Resources developed by the CAP Project
    • Introductory resources on cost analysis, including Standards and Guidelines 1.1, an infographic, a video lecture, and FAQs.
    • Tools for planning your cost analysis, collecting and analyzing cost data, and reporting your results.
    • A Help Desk where you can submit inquiries about conducting a cost analysis and receive a response from a member of the CAP Project Team within two business days.
  • Other resources recommended by the CAP Project
    • Background materials on cost analysis
    • Guidance on carrying out a cost analysis
    • Standards for the Economic Evaluation of Educational and Social Programs
    • Cost analysis software
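
As a rough illustration of the arithmetic these tools support, the sketch below (in Python, with entirely hypothetical ingredient names, prices, and student counts) computes a total cost, a per-student cost, and a simple cost-effectiveness ratio. A real analysis should follow the CAP Project tools and the standards listed above.

    # Illustrative only: hypothetical ingredients and prices for a per-student
    # cost calculation. Real analyses should follow the CAP Project tools and
    # the economic evaluation standards listed above.
    ingredients = {
        "teacher time (120 hours at $45/hour)": 120 * 45.0,
        "literacy coach time (40 hours at $55/hour)": 40 * 55.0,
        "materials and software licenses": 1800.0,
        "training and travel": 2400.0,
    }

    total_cost = sum(ingredients.values())
    students_served = 250  # hypothetical

    cost_per_student = total_cost / students_served
    print(f"Total cost: ${total_cost:,.2f}")
    print(f"Cost per student: ${cost_per_student:,.2f}")

    # If the evaluation also estimates an average effect in standard deviation
    # units, a simple cost-effectiveness ratio divides cost per student by effect.
    effect_size = 0.15  # hypothetical
    print(f"Cost per student per SD of effect: ${cost_per_student / effect_size:,.2f}")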


The CAP Project is also involved in longer-term collaborations with IES-funded evaluation projects to better understand their cost analysis needs. As part of this work, the CAP Project will be producing a set of three blogs to discuss practical details regarding cost studies based on its collaboration with a replication project evaluating an intervention that integrates literacy instruction into the teaching of American history. These blogs will discuss the following:

  • Common cost analysis challenges that researchers encounter and recommendations to address them
  • The development of a timeline resource for planning a cost study
  • Data collection for a cost study


The CAP Project is interested in your feedback on any of its resources and welcomes suggestions for additional resources to support cost analysis. To share feedback, please fill out the suggestion form at the bottom of the Resources web page.

Partnering with Practitioners to Address Mental Health in Rural Communities

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In 2017, Dr. Wendy Reinke, University of Missouri, received IES funding to formalize the Boone County Schools Mental Health Coalition by strengthening the partnership and validating the Early Identification System (EIS) to screen for social, emotional, behavioral, and academic risk among K-12 students in rural schools. Building on these successes, Dr. Reinke now leads the National Center for Rural School Mental Health (NCRSMH), a consortium of researchers leading efforts to advance mental health screening and support in rural communities.

Bennett Lunn, a Truman-Albright Fellow at IES, asked Dr. Reinke about the work of the original partnership and how it has informed her efforts to build new partnerships with other rural schools around the country. Below are her responses.


What was the purpose of the Boone County Schools Mental Health Coalition and what inspired you to do this work?

In 2015, our county passed an ordinance in which a small percentage of our sales tax is set aside to support youth mental health in our community. As a result, the schools had visits from many of the local mental health agencies seeking to set up services in school buildings. The superintendents quickly realized that it would be wise to have a more coordinated effort across school districts. They formed a coalition and partnered with researchers at the University of Missouri to develop a comprehensive model to prevent and intervene in youth mental health problems. The enthusiasm of our school partners and their willingness to consider research evidence to inform the model were so energizing! We were able to build a multi-tiered prevention and intervention framework that uses universal screening data to inform supports. In addition, we were awarded an IES partnership grant to help validate the screener, conduct focus groups and surveys of stakeholders to understand the feasibility and social validity of the model, and determine how fidelity to the model is related to student outcomes. The EIS is now being used in 54 school buildings across six school districts as part of their daily practice.


Were there advantages to operating in a partnership to validate the screener?  

The main benefit of working in partnership with school personnel is that you learn what works under what circumstances from those directly involved in supporting students. We meet every month with the superintendents and other school personnel to ensure that if things are not working, we can find solutions before the problems become too big. We vote on any processes or procedures that members feel need to change. The meeting includes school personnel sharing the types of activities they are doing in their buildings so that others can replicate those best practices, and we meet with students to get their perspectives on what is working. In addition, the university faculty bring calls for external research funding to the group to get ideas for what types of research would be appropriate and beneficial to the group. Schools are constantly changing and encountering new challenges. Being close to those who are working in the buildings allows us to work together to form and implement feasible solutions over time.


What advice do you have for researchers trying to make research useful and accessible to practitioners? 

Be collaborative and authentic. Demonstrate that you are truly there to create meaningful and important changes that will benefit students. Show that your priority is improving outcomes for schools and students, not simply collecting data for a study. These actions are vital to building trust in a partnership. By sharing the process of reviewing data, researchers can show how the research is directly impacting schools, and practitioners have an opportunity to share how their experience relates to the data. A good way to do this is by presenting with practitioners at conferences or collaboratively writing manuscripts for peer-reviewed journals. For example, we wrote a manuscript (currently under review) with one of our school counselor partners describing how he used EIS data in practice. Through collaboration like this, we find that the purpose and process of research become less mysterious, and schools can more easily identify and use practices that are shown to work. In this way, long-term collaboration between partners can ultimately benefit students!


How does the work of the original partnership inform your current work with the National Center for Rural School Mental Health? 

We are bringing what we have learned, both about being effective partners and about improving the model, to the National Center for Rural School Mental Health. For instance, we are developing an intervention hub on our Rural Center website that will allow schools to directly link evidence-based interventions to the data. We learned that having readily available ideas for intervening using the data is an important aspect of success. We have also learned that schools with problem-solving teams can implement the model with higher fidelity, so we are developing training modules that show schools how to use the data in problem-solving teams. We will take the comprehensive model for preventing and intervening in youth mental health problems and use it in rural schools. We will continue to partner with our rural schools to continuously improve the work so that it is feasible, socially valid, and important to rural schools and the youth in those schools.



Dr. Wendy Reinke is an Associate Vice Chancellor for Research at the University of Missouri College of Education. Her research focuses on school-based prevention interventions for children and youth with social, emotional, and behavioral challenges.

Written by Bennett Lunn (Bennett.lunn@ed.gov), Truman-Albright Fellow, National Center for Education Research and National Center for Special Education Research

Highlighting the Science of Learning at the 2021 ED Games Expo

Research on how people learn is critical for informing the design of effective education technology products. To design products that improve student learning, we need to understand how students approach solving problems, the information they need to adopt optimal solution strategies, the skills that underlie success in particular academic domains, the best ways to arrange information on a screen to guide student attention to relevant information, and the best study strategies for optimizing learning and retention. Through its research grants programs, IES has invested in research projects to develop and test education technology products based in the science of learning.


The 2021 ED Games Expo, which takes place virtually from June 1-5, features a number of these products, and you can learn more about them through a mix of live and pre-recorded sessions and videos. Most are ready to demo now. Students and educators can send questions about the research, game-play, or ed tech experience directly to the participating researchers. Here are just a few of the many ways you can interact with IES-funded researchers at the Expo:


  1. The Virtual Learning Lab (VLL) will be hosting a live, virtual session on Friday, June 4th from 4:00-5:30pm Eastern Time to celebrate their 5-year research collaboration to explore precision education in the context of algebra instruction. The VLL developed an AI-powered video recommendation system that personalizes math instruction within Math Nation. The researchers measured student ability and engagement, detected effects of virtual learning environment usage on achievement, and identified characteristics of effective online tutoring. The session will feature short talks and opportunities for Q&A: 
  • A Video Recommendation System for Algebra (Walter Leite, University of Florida)
  • Reinforcement Learning for Enhancing Collaborative Problem Solving (Guojing Zhou, University of Colorado Boulder)
  • Natural Language Processing for VLE Research (Danielle McNamara, Arizona State University)
  • Identifying Pedagogical Conversational Patterns in Online Algebra Learning (Jinnie Shin, University of Florida)
  • Scaling Items and Persons for Obtaining Ability Estimates in VLEs (A. Corinne Huggins-Manley, University of Florida)
  • Measuring Student Ability from Single- and Multiple-Attempt Practice Assessments in VLEs (Ziying Li, University of Florida)
  • Detecting Careless Responding to Assessment Items in VLEs (Sanaz Nazari, University of Florida)
  • Personalization, Content Exposure, and Fairness in Assessment (Daniel Katz, University of California, Santa Barbara)
  • Fair AI in VLEs (Chenglu Li, University of Florida)


  2. The ED Games Expo YouTube Playlist, which will be posted on June 1st on the event page, features 26 products developed with IES grant funding. For ed tech products informed by research on how people learn, check out the products funded through the Cognition and Student Learning topic:
  • Graspable Math allows math teachers to assign interactive algebra tasks and turns equations into tangible objects that middle school and high school students can manipulate to practice and explore. Teachers can follow student work live, step by step.
  • eBravo Boulder Reading Intervention is a self-paced personalized reading comprehension curriculum that teaches secondary students the problem-solving skills good readers use to learn from challenging texts, in this case in the science discipline of ecology. Reading strategies and exercises are guided by well-researched models of reading comprehension, helping students build deep, durable, and reusable knowledge from text. 
  • iSTART and Writing Pal are interventions designed to help middle school students, high school students, and young adults improve their reading and writing skills. Within these interventions, students play games to practice reading comprehension and writing strategies.
  • All You Can Eat, Gwakkamole, and CrushStations are part of a suite of Executive Function skill-building games designed to improve students’ shifting, inhibitory control, and working memory, respectively.


We hope you can join us for this exciting event in June to learn more about and try out all the research-based products ready to be used in our nation’s schools. For more information on the featured resources and online events, please see this blog.


Written by Erin Higgins (Erin.Higgins@ed.gov), Program Officer for the Cognition and Student Learning program, National Center for Education Research

Towards a Better Understanding of Middle-Schoolers’ Argumentation Skills

What is the difference between fact and opinion? How do you find relevant evidence and use it to support a position? Every day, teachers help students practice these skills by fostering critical discussions, a form of argumentation that encourages students to use reasoning to resolve differences of opinion.

In their IES-funded study, Exploring and Assessing the Development of Students' Argumentation Skills, Yi Song and her colleagues are uncovering activities (both teacher-led and technology-supported) that can improve middle school students’ ability to generate oral and written arguments.

The project began in 2019, and the research team is working in classrooms with teachers and students. The researchers have created a series of videos that describe their work. In this series, Dr. Song and her co-PIs, Dr. Ralph Ferretti and Dr. John Sabatini, discuss why the project is important to education, how they will carry out the research, and how educators can apply what they are learning in classrooms.



For questions and more information, contact Meredith Larson (Meredith.Larson@ed.gov), Program Officer, NCER