IES Blog

Institute of Education Sciences

Technology-Facilitated Tutoring Programs to Accelerate Learning

It goes without saying that the challenges COVID-19 created for education remain widespread and may prove long lasting. A recent report confirmed that during the pandemic the move to full-time remote schooling was related to a decrease in student achievement, especially in high-poverty districts. Now, over two years later, school leaders continue to employ strategies to address learning loss, such as intensive tutoring programs. This is because decades of research support the effectiveness of in-person tutoring to accelerate learning, and recent research also shows positive effects of high-dosage virtual tutoring for struggling learners during the pandemic.

While human-to-human interaction will always be central to a quality tutoring experience, technology offers unique functionalities to enrich and extend tutoring. For example, new models of technology-facilitated tutoring programs—

  • Engage students with game-based and multimedia content that adjusts to the individual's level and generates real-time tips to scaffold learning
  • Employ tools such as virtual whiteboards and data visualizations to enrich the virtual workspace for the tutor and student
  • Use dashboards to present real-time, data-driven insights for tutors to track student progress and individualize instruction
  • Provide automated professional development and training opportunities to prepare tutors

Technology also enables schools and community organizations to offer remote tutoring programs at scale—reaching students anywhere and anytime, including after school and during the summer. This, of course, depends on students' access to technology and the availability of qualified tutors.

Four IES-Supported Technology-Based Tutoring Programs

In spring 2020, to help address the crisis in education caused by the pandemic, four teams of IES-funded developers adapted and extended their learning technologies for remote tutoring at scale. The technologies are all research based and offer unique capabilities to strengthen the tutoring experience and reach more students. Each of the programs described below can now deliver tutoring in school or remote settings.

A2i by Learning Ovations. A2i (watch video) is a web-based product for students in kindergarten to grade 3 that continually assesses reading and generates data-driven recommendations to inform instruction. A2i is used in hundreds of schools by tens of thousands of students each year. A2i was originally developed for in-school use through multiple IES and other government awards, and research demonstrates its efficacy in improving student reading.

At the beginning of COVID-19, the Community Literacy Support System was designed to extend A2i for tutoring in students' homes and other non-school locations. The program provides customized lessons and data visualization tools for tutors and parents. In the past two years, the tutoring program has been used by 9,200 students across 5 states, at 23 community sites (such as this one), and in almost 8,000 homes.

ASSISTments by ASSISTments. ASSISTments (watch video) is used by teachers to assign students problems from curricula such as EngageNY, Illustrative Mathematics, and Open Up Resources. Students receive real-time instructional feedback while doing problems online, and teachers receive reports with actionable insights to inform instruction. Initially developed through multiple awards from IES and other sources, ASSISTments was used in 2021-22 by over 5,000 educators and 200,000 students at schools around the country. Research by SRI International demonstrated that students in classrooms that used ASSISTments learned more course content than those in a control group.

At the onset of COVID-19, ASSISTments designed TutorASSIST, a tool that presents data visualizations so tutors can target specific student needs in remote (or in-person) sessions. More than 750 students across Louisiana, Georgia, and Maryland used the tool to support remote tutoring during the pandemic. With a 2021 ED/EIR grant, ASSISTments is further developing its core product and tutoring tool to serve historically underserved students, including a school tutoring pilot in eight schools in partnership with EnCorps Tutors that focuses on optimizing the technology for tutoring.

SAGA Coach by SAGA Education and Simbulus. SAGA Education and Woot Math (watch video) employ interactive and game-based activities to support student math learning, along with a dashboard that generates data-driven insights to promote dialogue between teachers and students about complex topics. The school-based intervention reaches approximately 75,000 students per year. Woot Math was developed through an ED/IES SBIR award and other sources. In 2021, Woot Math was acquired by and integrated into SAGA Education's online math program.

During the pandemic, SAGA Coach was designed to extend SAGA Education and Woot Math for remote tutoring by adding an interactive whiteboard shared by tutors and students, as well as online training materials for tutors. During the 2020-21 school year, 5,500 students used SAGA Coach for high-dosage remote tutoring, and in 2022-23, the Boys and Girls Clubs of America will use SAGA Coach to deliver a remote tutoring program.

Lightning Squad by Sirius Thinking and Success For All. Lightning Squad (watch video) is a multimedia platform where pairs of students in grades 1 to 3 who are struggling readers collaborate to read stories and play games that are presented by the computer, while a tutor provides targeted support. Developed through an ED/IES SBIR award, the product is currently being evaluated through a multiyear efficacy trial and will be used in 50 Baltimore City elementary schools in fall 2022.  

At the onset of COVID-19 in 2020, Lightning Squad was adapted for remote delivery in eight Baltimore City schools serving principally low-income students. In the remote version, tutors use a video platform (for example, Zoom) with pairs of students in their respective homes. Students proceed through the activities of the software and respond verbally while a tutor types responses on the screen in real time for each team member to see. In the 2020-21 school year, 16 Baltimore City schools used Lightning Squad with over 800 students, most for remote tutoring. Research conducted during the project by Success For All (not yet reviewed by the Department) found that students who maintained consistent participation in remote tutoring gained 1.5 years of progress, as measured by their initial and end-of-year placements, double the expected gains in reading during the period of school closures. An additional 1,200 students were served remotely using Lightning Squad in other states during this same period.

Additional Related Resources on Tutoring to Accelerate Learning

  • Lightning Squad and SAGA Coach are part of Proven Tutoring, a coalition of technology-delivered tutoring programs with a mission to help educators learn about and access tutoring programs. These evidence-based programs have the potential to increase the achievement of students performing far below grade level due to COVID school closures or other factors.
  • In June 2021 during the ED Games Expo, IES partnered with AmeriCorps to host a webinar focusing on government and community partner initiatives to support remote tutoring to accelerate student learning during COVID-19.

  • In April 2022, AmeriCorps partnered with ED to produce a webinar on lessons from the field on high-dosage tutoring. AmeriCorps and the Department, along with Johns Hopkins University's Everyone Graduates Center, are partners in the National Partnership for Student Success (NPSS). Launched in July 2022, the NPSS is committed to engaging 250,000 adults as tutors, mentors, and coaches in evidence-based programs designed to accelerate students' recovery from the pandemic.


Stay tuned to @IESResearch for news and updates on research, initiatives, and projects in the area of tutoring to accelerate learning.

Edward Metz is a research scientist and the program manager for the Small Business Innovation Research Program at the U.S. Department of Education's Institute of Education Sciences.

Melissa Moritz was the Afterschool and Summer Learning Fellow in the National Center for Education Evaluation and Regional Assistance (NCEE) at the U.S. Department of Education's Institute of Education Sciences. She currently serves as the Director of Policy for the STEM Next Opportunity Fund.

Please contact Edward.Metz@ed.gov with questions or for more information.

 

Meet NCSER and NCER Summer 2022 Interns

IES is proud to introduce the summer 2022 cohort of interns. These three interns come to us through the U.S. Department of Education's Student Volunteer Trainee Program and are helping the Centers understand and translate the work we do. We asked this year's interns to tell us about themselves, why they are interested in an internship, what they are learning, and a fun fact to share. Here's what they said.

Kaitlynn Fraze is pursuing a PhD in special education and research methods at George Mason University.

Before pursuing my PhD, I taught in a variety of special education teaching positions. I started as a special education teacher at an elementary school serving students with high-incidence disabilities, then transitioned to teaching high school and post-graduate classes for students with severe disabilities and complex medical and communication needs. While teaching, I took master’s degree classes in autism and severe intellectual disabilities. My experiences in the public school system and in academia inspired me to learn more about how I could use research to inform policy and advocacy efforts.

I found my way to NCSER after completing a summer internship with the Department's Office of Special Education Programs (OSEP) in 2021. The experience I gained at OSEP with programs aimed at bridging the research-to-practice gap influenced my drive and direction. While policy and advocacy for inclusion continue to be passions of mine, my graduate focus shifted to reading research for students with moderate to severe disabilities to help improve instruction for those students who were not previously held to the same high educational standards as their peers. I sought out an internship with NCSER because I want to use research to improve practice for ALL students and gain exposure to federal special education grants management.

While interning at NCSER, I learned about the different IES-funded research programs and gained experience writing about impactful researchers and their research programs. The internship expanded my professional network to include even more people who share the same passion for education research for students with disabilities. Paired with the knowledge gained from my PhD program, the internship has strengthened my readiness to enter the field.

Fun Fact:

I love cooking! I enjoy making dinner for my family. The entire process of planning, organizing, and preparing the food is calming and therapeutic for me. I use it as a time to bond with my son, develop his functional life skills, and make huge messes.

Manvi Harde is a rising second-year Jefferson & Echols Scholar, pursuing a bachelor's degree in Global Development Studies and Economics at the University of Virginia.

Growing up in a vibrant and diverse area and attending one of the biggest public schools in Arizona, I had an incredible opportunity to expand my perspective and learn from my peers. I have always loved education in whichever form it manifested, including advocating for career and technical education (CTE) and raising awareness of the education of refugees. Through these various passions, I realized that I had a deep-rooted interest in education policy.

At the University of Virginia, I immersed myself even more in the world of education and diversity by taking classes ranging from Poverty and Education Policy and Race and Ethnic Relations to Macroeconomics, tutoring local refugee children, and interacting with undergraduate and faculty groups to compile data on racial justice and anti-racism education. I was also a fellow for Teach for America this past semester, through which I worked with changemakers and policy educators to tutor children from low-income families throughout the country.

Through my internship at IES, I am challenging myself and delving into the nuances of education policy and research, with an eager hope to enter this field in the future. I strive to apply what I have learned through my work disseminating and translating research for different stakeholders, with the aim of uplifting communities through research and policy and finding bright spots within those areas.

Fun Fact:

Throughout quarantine, my family and I fostered five dogs, each of which has a special place in my heart. I love pets, and though we didn't adopt any of them, it was a wonderful experience to provide love and a home to these dogs for as long as needed.

Nadiyah Williams is a rising senior, majoring in information science at the University of Maryland, College Park.

I have been taking several classes to help me prepare for a profession in either data science or cybersecurity. This summer, I worked as a data science intern at IES, focusing on a project that leveraged data from the Office of Postsecondary Education (OPE) at ED. We used these data to determine which institutions were classified as minority-serving institutions (MSIs) during a particular time period.

While interning this summer, I learned a lot about all the different types of colleges that are eligible to become MSIs and what makes them eligible. The work I did this summer supports IES in determining whether the research Centers are getting applications from or awarding grants to MSIs. This work will continue to be important as IES identifies areas to expand the grant applicant pool.

I am grateful for my internship this summer. IES has taught me so many skills, especially in Excel, while letting me leverage previous coursework in Python and SQL. I hope to use the data-cleaning skills I have learned in my future college courses and my future job.

Fun Fact:

I enjoy traveling and have been to several cool countries such as Ghana and Qatar.

Timing is Everything: Understanding the IPEDS Data Collection and Release Cycle

For more than three decades, the Integrated Postsecondary Education Data System (IPEDS) has collected data from all postsecondary institutions participating in Title IV federal student aid programs, including universities, community colleges, and vocational and technical schools.

Since 2000, the 12 IPEDS survey components occurring in a given collection year have been organized into three seasonal collection periods: Fall, Winter, and Spring.

When the data are collected (the “collection year”) matters most for the professionals who report their data to the National Center for Education Statistics (NCES). IPEDS data users, however, are generally more interested in the year actually reflected in the data (the “data year”). For example, a data user may ask, “What was happening with students, staff, and institutions in 2018–19?”


Note: The collection year refers to the time period in which the IPEDS survey data are collected. The data year refers to the time period reflected in the IPEDS survey data.


For data users, knowing the difference between the collection year and the data year is important for working with and understanding IPEDS data. The collection year typically comes after the data year, as institutions need time to compile the required data and check that they are reporting it accurately. This lag between the period reflected by the data and when the data are reported is typically one academic term or year, depending on the survey component. For example, fall 2021 enrollment data are not reported to NCES until spring 2022 and are not publicly released until fall 2022.
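To make the lag concrete, here is a minimal sketch in Python (ours, not an NCES tool) that encodes the fall enrollment example above. The one-year offsets apply only to this component; other survey components follow different schedules.

```python
# A minimal sketch (not an official NCES tool) of the lag between data year,
# collection, and public release for the IPEDS Fall Enrollment component.
# Offsets follow the example in the text: fall 2021 enrollment is reported
# in spring 2022 and provisionally released in fall 2022.

def fall_enrollment_timeline(data_year: int) -> dict:
    """Expected reporting and release terms for fall enrollment data
    captured in `data_year` (e.g., 2021 for fall 2021 data)."""
    return {
        "data year": f"fall {data_year}",
        "reported to NCES": f"spring {data_year + 1}",
        "provisional release": f"fall {data_year + 1}",
    }

print(fall_enrollment_timeline(2021))
# {'data year': 'fall 2021', 'reported to NCES': 'spring 2022',
#  'provisional release': 'fall 2022'}
```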

After the data are collected by NCES, there is an additional period before public release during which the data undergo various quality and validity checks. About 9 months after each seasonal collection period (Fall, Winter, or Spring) ends, there is a Provisional Data Release, and IPEDS data products (e.g., web tools, data files) are updated with the newly released seasonal data. During this provisional release, institutions may revise their data if they believe the data were reported inaccurately. A Revised/Final Data Release follows the next year and includes any revisions made to the provisional data.

Sound confusing? The data collection and release cycle can be a technical and complex process, and it varies slightly for each of the 12 IPEDS survey components. Luckily, NCES has created a comprehensive resource page that provides information about the IPEDS data collection and release cycles for each survey component as well as key details for data users and data reporters, such as how to account for summer enrollment in the different IPEDS survey components.

Table 1 summarizes the IPEDS 2021–22 data collection and release schedule. Information on the data year and other details about each survey component can also be found on the resource page.


Table 1. IPEDS 2021–22 Data Collection and Release Schedule

Table showing the IPEDS 2021–22 data collection and release schedule


Here are a few examples of how to distinguish the data year from the collection year in different IPEDS data products.

Example 1: IPEDS Trend Generator

Suppose that a data user is interested in how national graduation rates have changed over time. One tool they might use is the IPEDS Trend Generator, a ready-made web tool that allows users to view trends over time in the most frequently asked-about subject areas in postsecondary education. The Graduation Rate chart below displays the data year (shown in green) in the headline and on the x-axis. The “Modify Years” option also allows users to filter by data year. Information about the collection year (shown in gold) can be found in the source notes below the chart.


Image of IPEDS Trend Generator webpage


Example 2: IPEDS Complete Data Files

Imagine that a data user was interested enough in 6-year Graduation Rates that they wanted to run more complex analyses in a statistical program. IPEDS Complete Data Files include all variables for all reporting institutions by survey component and can be downloaded by these users to create their own analytic datasets.

Data users should keep in mind that IPEDS Complete Data Files are organized and released by collection year (shown in gold) rather than data year. Because of this, even though files might share the same collection year, the data years reflected within the files will vary across survey components.


Image of IPEDS Complete Data Files webpage


The examples listed above are just a few of many scenarios in which this distinction between collection year and data year is important for analysis and understanding. Knowing about the IPEDS reporting cycle can be extremely useful when it comes to figuring out how to work with IPEDS data. For more examples and additional details on the IPEDS data collection and release cycles for each survey component, please visit the Timing of IPEDS Data Collection, Coverage, and Release Cycle resource page.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube, follow IPEDS on Twitter, and subscribe to the NCES News Flash to stay up-to-date on all IPEDS data releases.

 

By Katie Hyland and Roman Ruiz, American Institutes for Research

Google Acquires Intellectual Property for IES-Supported Education Technology Products Moby.Read and SkillCheck

On April 1, 2022, Google acquired the intellectual property (IP) rights for Moby.Read and SkillCheck, education technology products developed through IES programs by California-based Analytics Measures, Inc. (AMI). AMI will continue as a small business and is honoring school contracts that use Moby.Read and SkillCheck until 2024.

Moby.Read is a technology-delivered, fully automated oral reading fluency (ORF) assessment that is self-administered by grade school students. As students read a passage aloud into a tablet, speech-recognition and natural language processing software analyzes the read-aloud performance against the text passage and generates an ORF assessment in real time. SkillCheck is a component of Moby.Read that uses natural language processing software to analyze the recordings and produce interactive report pages that rate and illustrate the student's basic reading skills.

 

 

The technologies were developed over two decades with IES funding. Beginning in 2002, AMI designed several early prototypes for ORF assessment as part of the National Assessment of Educational Progress and other national assessments administered by IES's National Center for Education Statistics. In 2016 and 2017, the IES Small Business Innovation Research program (ED/IES SBIR) funded AMI to develop and test Moby.Read for use in schools at scale. With 2020 and 2021 ED/IES SBIR awards, AMI developed SkillCheck as an additional component of Moby.Read to provide educators with activities to inform instruction. AMI conducted research at key points over the 20 years to validate the results of the assessment.

Since their commercial launch in 2019, Moby.Read and SkillCheck have been used for more than 30,000 student assessments in 30 states. Google acquired the Moby.Read and SkillCheck IP with plans to incorporate the tools into Google's suite of education products.

For additional information on the research, development, and commercialization of these technologies, see this Success Story on the ED/IES SBIR website.


Edward Metz is a research scientist and the program manager for the Small Business Innovation Research Program at the U.S. Department of Education's Institute of Education Sciences. Please contact Edward.Metz@ed.gov with questions or for more information.

 

Calculating the Costs of School Internet Access

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

Internet access has become an indispensable element of many education and social programs. However, researchers conducting cost analyses of education programs often don't capture these costs because of a lack of publicly available information on what school districts pay for internet service. EducationSuperHighway, a nonprofit organization, now collects information about internet bandwidth and monthly internet costs for each school district in the United States and publishes it on the Connect K-12 website. While Connect K-12 provides a median cost per Mbps for schools nationwide, its applicability in cost analyses is limited because the per-student cost varies vastly with school district size.

As customers, we often save money by buying groceries in bulk. One of the reasons that larger sizes offer better value is that the ingredient we consume is sometimes only a small part of the total cost of the whole product; the rest of the cost goes into the process that makes the product accessible, such as packaging, transportation, and rent.

The same is true of internet service. Making the internet available in schools requires facilities and equipment including, but not limited to, web servers, Ethernet cables, and Wi-Fi routers. Large school districts, which are often in urban locations, usually pay much less per student than small districts, which are often in rural areas. The costs of infrastructural adaptations must also be considered when new equipment and facilities are required for high-speed internet delivery. Fiber-optic and satellite internet services have high infrastructure costs. Old-fashioned DSL internet uses existing phone lines and thus has less overhead cost, but it is much slower, often making it difficult to meet the Federal Communications Commission's current recommended bandwidth of 1 Mbps per student.

In short, there is no one-size-fits-all price for school internet access. To tackle this challenge, we used the data available on Connect K-12 for districts in each of the 50 U.S. states to calculate some useful metrics for cost analyses. First, we categorized the districts with internet access according to MDR's definitions of small, medium, and large school districts (small: 0-2,499 students; medium: 2,500-9,999 students; large: 10,000+ students). For each category, we calculated the following metrics, which are shown in Table 1:

  1. median cost per student per year
  2. median cost per student per hour

 

Table 1: Internet Access Costs

| District size (# of students) | Median Mbps per student per month | Median cost per Mbps per month | Median cost per student per month | Cost per student per year | Cost per student per hour |
|---|---|---|---|---|---|
| Small (0-2,499) | 1.40 | $1.75 | $2.45 | $29.40 | $0.02 |
| Medium (2,500-9,999) | 0.89 | $0.95 | $0.85 | $10.15 | $0.007 |
| Large (10,000+) | 0.83 | $0.61 | $0.50 | $6.03 | $0.004 |
| National median | 1.23 | $1.36 | $1.67 | $20.07 | $0.014 |

 

Note: Cost per student per hour is computed based on the assumption that schools are open for 1,440 hours (36 weeks) per year (e.g., for a small district, the cost per student per hour is $29.40/1,440 ≈ $0.02). See methods here.
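For readers who want to reproduce this kind of summary from district-level data, here is a minimal sketch (the file and column names are hypothetical, and the methods page linked above describes the exact computation, which may differ slightly from this simple annualization):

```python
import pandas as pd

# Hypothetical file and column names; Connect K-12's actual export may differ.
districts = pd.read_csv("connect_k12_districts.csv")
# expected columns: enrollment, monthly_cost_per_student

def size_category(enrollment: int) -> str:
    """Bucket districts using MDR's size definitions."""
    if enrollment < 2500:
        return "Small"
    if enrollment < 10000:
        return "Medium"
    return "Large"

districts["size"] = districts["enrollment"].map(size_category)

# Median monthly cost per student by size category, annualized (x 12 months)
# and converted to hourly using the 1,440 school hours per year in the note.
summary = (
    districts.groupby("size")["monthly_cost_per_student"]
    .median()
    .to_frame("per_month")
)
summary["per_year"] = summary["per_month"] * 12
summary["per_hour"] = summary["per_year"] / 1440
print(summary.round(3))
```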

 

Here’s an example of how you might determine an appropriate portion of the costs to attribute to a specific program or practice:  

Sunnyvale School is in a school district of 4,000 students. It offers an afterschool program in the library in which 25 students work online with remote math tutors. The program runs for 1.5 hours per day, 4 days per week, for 36 weeks. The internet costs attributable to the program would be:

 

1.5 hours × 4 days × 36 weeks × 25 students × $0.007 = $37.80.
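The same arithmetic generalizes to other programs; a small helper function (a sketch, with the $0.007 rate taken from the medium-district row of Table 1) makes the attribution explicit:

```python
def program_internet_cost(hours_per_day: float, days_per_week: int,
                          weeks: int, students: int,
                          cost_per_student_hour: float) -> float:
    """Attribute district internet costs to a specific program:
    total student-hours of program time x cost per student per hour."""
    return (hours_per_day * days_per_week * weeks
            * students * cost_per_student_hour)

# Sunnyvale example: a medium-sized district, so $0.007 per student per hour.
print(round(program_internet_cost(1.5, 4, 36, 25, 0.007), 2))  # 37.8
```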

 

The cost per student per hour might seem tiny, but it adds up. In New York City Public Schools, for example, the cost per Mbps per month is just $0.13, yet the district pays $26,000 each month for internet. For a single education program or intervention, internet costs may represent only a small fraction of the overall costs and may hardly seem worth estimating next to personnel salaries and fringe benefits. However, a rigorous cost analysis study must identify all the resources needed to implement a program.


Yuan Chang is a research assistant in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.

Anna Kushner is a doctoral student in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.