IES Blog

Institute of Education Sciences

Meet NCSER and NCER Summer 2022 Interns

IES is proud to introduce the summer 2022 cohort of interns. These three interns come to us through the U.S. Department of Education’s Student Volunteer Trainee Program and are helping the Centers translate and understand the work we do. We asked this year’s interns to tell us about themselves, why they are interested in an internship, what they are learning, and a fun fact to share. Here’s what they said.

Kaitlynn Fraze is pursuing a PhD in special education and research methods at George Mason University.

Before pursuing my PhD, I taught in a variety of special education teaching positions. I started as a special education teacher at an elementary school serving students with high-incidence disabilities, then transitioned to teaching high school and post-graduate classes for students with severe disabilities and complex medical and communication needs. While teaching, I took master’s degree classes in autism and severe intellectual disabilities. My experiences in the public school system and in academia inspired me to learn more about how I could use research to inform policy and advocacy efforts.

I found my way to NCSER after completing a summer internship with the Department’s Office of Special Education Programs (OSEP) in 2021. The experience I gained at OSEP with programs aimed at bridging the research-to-practice gap influenced my drive and direction. While policy and advocacy for inclusion continue to be passions of mine, my graduate focus shifted to reading research for students with moderate to severe disabilities to help improve instruction for those students who were not previously held to the same high educational standards as their peers. I sought out an internship with NCSER because I want to use research to improve practice for ALL students and to gain exposure to federal special education grants management.

While interning at NCSER, I learned about the different IES-funded research programs and gained experience writing about impactful researchers and their research programs. The internship expanded my professional network to include even more people who share the same passion for education research for students with disabilities. Paired with the knowledge gained from my PhD program, the internship has strengthened my readiness to enter the field.

Fun Fact:

I love cooking! I enjoy making dinner for my family. The entire process of planning, organizing, and preparing the food is calming and therapeutic for me. I use it as a time to bond with my son, develop his functional life skills, and make huge messes.

Manvi Harde is a rising second-year Jefferson & Echols Scholar, pursuing a bachelor’s degree in Global Development Studies and Economics at the University of Virginia.

Growing up in a vibrant and diverse area and attending one of the biggest public schools in Arizona, I had an incredible opportunity to expand my perspective and learn from my peers. I have always loved education in whichever form it manifested, including advocating for career and technical education (CTE) and raising awareness of the education of refugees. Through these various passions, I realized that I had a deep-rooted interest in education policy.

At the University of Virginia, I immersed myself even more in the world of education and diversity by taking classes ranging from Poverty and Education Policy and Race and Ethnic Relations to Macroeconomics, tutoring local refugee children, and interacting with undergraduate and faculty groups to compile data on racial justice and anti-racism education. I was also a fellow for Teach For America this past semester, through which I worked with changemakers and policy educators to tutor children from low-income families throughout the country.

Through my internship at IES, I am challenging myself and delving into the nuances of education policy and research, with an eager hope to enter this field in the future. Through my work disseminating and translating research for different stakeholders, I strive to apply what I have learned to uplift communities through research and policy and to find bright spots within those areas.

Fun Fact:

Throughout quarantine, my family and I fostered 5 dogs, each of which has a special place in my heart. I love pets, and though we didn’t adopt any of them, it was a wonderful experience to provide love and a home to these dogs for as long as needed.

Nadiyah Williams is a rising senior, majoring in information science at the University of Maryland, College Park.

I have been taking several classes to help me prepare for a profession in either data science or cybersecurity. This summer, I worked as a data science intern at IES, focusing on a project that leveraged data from the Office of Postsecondary Education (OPE) at ED. We used these data to determine which institutions were classified as minority-serving institutions (MSIs) during a particular time period.

While interning this summer, I learned a lot about all the different types of colleges that are eligible to become MSIs and what makes them eligible. The work I did this summer supports IES in determining whether the research Centers are getting applications from or awarding grants to MSIs. This work will continue to be important as IES identifies areas to expand the grant applicant pool.

I am grateful for my internship this summer. IES has taught me so many skills, especially in Excel, while leveraging previous coursework in Python and SQL. I hope to use the skills I have learned while cleaning data in my future college courses and my future job.

Fun Fact:

I enjoy traveling and have been to several cool countries such as Ghana and Qatar.

Timing is Everything: Understanding the IPEDS Data Collection and Release Cycle

For more than three decades, the Integrated Postsecondary Education Data System (IPEDS) has collected data from all postsecondary institutions participating in Title IV federal student aid programs, including universities, community colleges, and vocational and technical schools.

Since 2000, the 12 IPEDS survey components occurring in a given collection year have been organized into three seasonal collection periods: Fall, Winter, and Spring.

When the data are collected (the “collection year”) matters most to the professionals who report their data to the National Center for Education Statistics (NCES). However, IPEDS data users are generally more interested in the year actually reflected in the data (the “data year”). For example, a data user may ask, “What was happening with students, staff, and institutions in 2018–19?”


Key terms: The collection year refers to the time period in which the IPEDS survey data are collected. The data year refers to the time period reflected in the IPEDS survey data.


For data users, knowing the difference between the collection year and the data year is important for working with and understanding IPEDS data. Often, the collection year comes after the data year, as institutions need time to collect the required data and check to make sure they are reporting the data accurately. This lag between the time period reflected by the data and when the data are reported is typically one academic term or year, depending on the survey component. For example, fall 2021 enrollment data are not reported to NCES until spring 2022, and the data would not be publicly released until fall 2022.

After the data are collected by NCES, there is an additional period before public release in which the data undergo various quality and validity checks. About 9 months after each seasonal collection period (Fall, Winter, Spring) ends, there is a Provisional Data Release, and IPEDS data products (e.g., web tools, data files) are updated with the newly released seasonal data. During this provisional release, institutions may revise their data if they believe the data were reported inaccurately. A Revised/Final Data Release then happens the following year and includes any revisions made to the provisional data.

Sound confusing? The data collection and release cycle can be a technical and complex process, and it varies slightly for each of the 12 IPEDS survey components. Luckily, NCES has created a comprehensive resource page that provides information about the IPEDS data collection and release cycles for each survey component as well as key details for data users and data reporters, such as how to account for summer enrollment in the different IPEDS survey components.

Table 1 provides a summary of the IPEDS 2021–22 data collection and release schedule found on the resource page, which also includes the data year and other details about each survey component.


Table 1. IPEDS 2021–22 Data Collection and Release Schedule

[Image: table showing the IPEDS 2021–22 data collection and release schedule]


Here are a few examples of how to distinguish the data year from the collection year in different IPEDS data products.

Example 1: IPEDS Trend Generator

Suppose that a data user is interested in how national graduation rates have changed over time. One tool they might use is the IPEDS Trend Generator, a ready-made web tool that allows users to view trends over time in the postsecondary education subject areas that are most frequently asked about. The Graduation Rate chart below displays the data year (shown in green) in the headline and on the x-axis. The “Modify Years” option also allows users to filter by data year. Information about the collection year (shown in gold) can be found in the source notes below the chart.


[Image: IPEDS Trend Generator webpage]


Example 2: IPEDS Complete Data Files

Imagine that a data user was interested enough in 6-year Graduation Rates that they wanted to run more complex analyses in a statistical program. IPEDS Complete Data Files include all variables for all reporting institutions by survey component and can be downloaded by these users to create their own analytic datasets.

Data users should keep in mind that IPEDS Complete Data Files are organized and released by collection year (shown in gold) rather than data year. Because of this, even though files might share the same collection year, the data years reflected within the files will vary across survey components.


[Image: IPEDS Complete Data Files webpage]
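To make this distinction concrete, here is a minimal Python sketch of how an analyst assembling a dataset from Complete Data Files might track the two years. The survey component names and data-year offsets below are illustrative assumptions only, not documented IPEDS conventions; confirm each component's actual data year on the NCES resource page.

```python
# Minimal sketch: distinguishing collection year from data year when
# working with IPEDS Complete Data Files. The offsets are illustrative
# assumptions only -- confirm each component on the NCES resource page.

# Offset from the collection year (the first year of, e.g., "2021-22")
# to the year the data actually describe.
ASSUMED_DATA_YEAR_OFFSET = {
    "Fall Enrollment": 0,    # e.g., 2021-22 collection -> fall 2021 data
    "Graduation Rates": -1,  # hypothetical: outcomes of an earlier cohort
}

def data_year(collection_year: int, component: str) -> int:
    """Return the assumed data year for a component in a collection year."""
    return collection_year + ASSUMED_DATA_YEAR_OFFSET[component]

# Files from the same 2021-22 collection can describe different periods:
for component in ASSUMED_DATA_YEAR_OFFSET:
    print(component, "->", data_year(2021, component))
```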


The examples listed above are just a few of many scenarios in which this distinction between collection year and data year is important for analysis and understanding. Knowing about the IPEDS reporting cycle can be extremely useful when it comes to figuring out how to work with IPEDS data. For more examples and additional details on the IPEDS data collection and release cycles for each survey component, please visit the Timing of IPEDS Data Collection, Coverage, and Release Cycle resource page.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube, follow IPEDS on Twitter, and subscribe to the NCES News Flash to stay up-to-date on all IPEDS data releases.


By Katie Hyland and Roman Ruiz, American Institutes for Research

Google Acquires Intellectual Property for IES-Supported Education Technology Products Moby.Read and SkillCheck

On April 1, 2022, Google acquired the intellectual property (IP) rights for Moby.Read and SkillCheck, education technology products developed through IES programs by California-based Analytic Measures, Inc. (AMI). AMI will continue to operate as a small business and will honor existing school contracts for Moby.Read and SkillCheck through 2024.

Moby.Read is a technology-delivered, fully automated oral reading fluency (ORF) assessment that grade school students self-administer. As a student reads a passage aloud into a tablet, speech-recognition and natural language processing software analyze the read-aloud performance and generate an ORF assessment in real time. SkillCheck is a component of Moby.Read that employs natural language processing software to analyze recordings and produce interactive report pages that rate and illustrate the student’s basic reading skills.


The technologies were developed over two decades with IES funding. Beginning in 2002, AMI designed several early prototypes for ORF assessment as part of the National Assessment of Educational Progress and other national assessments administered by IES’s National Center for Education Statistics. In 2016 and 2017, the IES Small Business Innovation Research program (ED/IES SBIR) funded AMI to develop and test Moby.Read for use in schools at scale. With 2020 and 2021 ED/IES SBIR awards, AMI developed SkillCheck as an additional component of Moby.Read to provide educators with activities to inform instruction. AMI conducted research at key points over the 20 years to validate the results of the assessment.

Since their commercial launch in 2019, Moby.Read and SkillCheck have been used for more than 30,000 student assessments in 30 states. Google acquired the Moby.Read and SkillCheck IP with plans to incorporate these tools into Google’s suite of education products.

For additional information on the research, development, and commercialization of these technologies, see this Success Story on the ED/IES SBIR website.


Edward Metz is a research scientist and the program manager for the Small Business Innovation Research Program at the U.S. Department of Education’s Institute of Education Sciences. Please contact Edward.Metz@ed.gov with questions or for more information.


Calculating the Costs of School Internet Access

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

Internet access has become an indispensable element of many education and social programs. However, researchers conducting cost analyses of education programs often don’t capture these costs due to a lack of publicly available information on what school districts pay for internet service. EducationSuperHighway, a nonprofit organization, now collects information about the internet bandwidth and monthly internet costs for each school district in the United States and publishes it on the Connect K-12 website. While Connect K-12 provides a median cost per Mbps for schools nationwide, its applicability in cost analyses is limited because per-student costs vary widely with district size.

As customers, we often save money by buying groceries in bulk. One of the reasons that larger sizes offer better value is that the ingredient we consume is sometimes only a small part of the total cost of the whole product; the rest of the cost goes into the process that makes the product accessible, such as packaging, transportation, and rent.

The same is true of internet service. Making the internet available in schools requires facilities and equipment that include, but are not limited to, web servers, Ethernet cables, and Wi-Fi routers. Large school districts, which are often in urban locations, usually pay much less per student than small districts, which are often in rural areas. Costs of infrastructural adaptations also need to be considered when new equipment and facilities are required for high-speed internet delivery. Fiber-optic and satellite internet services have high infrastructure costs. While old-fashioned DSL internet uses existing phone lines and thus has less overhead cost, it is much slower, often making it difficult to meet the Federal Communications Commission’s current recommended bandwidth of 1 Mbps per student.

In short, there is no one-price-for-all when it comes to costs of school internet access. To tackle this challenge, we used the data available on Connect K-12 for districts in each of the 50 U.S. states to calculate some useful metrics for cost analyses. First, we categorized the districts with internet access according to MDR’s definition of small, medium, and large school districts (Small: 0-2,499 students; Medium: 2,500-9,999 students; Large: 10,000+ students). For each category, we calculated the following metrics, which are shown in Table 1:

  1. median cost per student per year
  2. median cost per student per hour


Table 1: Internet Access Costs

District size (# of students) | Median Mbps per student per month | Median cost per Mbps per month | Median cost per student per month | Cost per student per year | Cost per student per hour
Small (0-2,499)               | 1.40                              | $1.75                          | $2.45                             | $29.40                    | $0.02
Medium (2,500-9,999)          | 0.89                              | $0.95                          | $0.85                             | $10.15                    | $0.007
Large (10,000+)               | 0.83                              | $0.61                          | $0.50                             | $6.03                     | $0.004
National median               | 1.23                              | $1.36                          | $1.67                             | $20.07                    | $0.014


Note: Cost per student per hour is computed based on the assumption that schools are open for 1,440 hours per year (40 hours per week for 36 weeks); e.g., for a small district, the cost per student per hour is $29.40/1,440 = $0.02. See methods here.
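For readers who want to reproduce metrics like these from Connect K-12 data, the sketch below shows the basic computation in Python. The DataFrame column names are assumptions for illustration, and the annualization (monthly median × 12, then ÷ 1,440 hours) is an approximation of the approach described above; see the methods link for the exact procedure.

```python
# Hedged sketch: per-student internet cost metrics by district size,
# assuming a pandas DataFrame of Connect K-12 district records with
# "enrollment" and "monthly_cost_per_student" columns (assumed names).
import pandas as pd

def size_category(enrollment: int) -> str:
    """MDR's district size categories."""
    if enrollment <= 2499:
        return "Small"
    if enrollment <= 9999:
        return "Medium"
    return "Large"

def cost_metrics(districts: pd.DataFrame) -> pd.DataFrame:
    """Median per-student costs per year and per school hour, by size."""
    size = districts["enrollment"].map(size_category)
    monthly = districts.groupby(size)["monthly_cost_per_student"].median()
    return pd.DataFrame({
        "cost_per_student_per_year": monthly * 12,
        # 1,440 school hours per year (40 hours/week x 36 weeks)
        "cost_per_student_per_hour": monthly * 12 / 1440,
    })
```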


Here’s an example of how you might determine an appropriate portion of the costs to attribute to a specific program or practice:  

Sunnyvale School is in a school district of 4,000 students. It offers an afterschool program in the library in which 25 students work online with remote math tutors. The program runs for 1.5 hours per day, 4 days per week, for 36 weeks. Because a district of 4,000 students falls into the medium category, internet costs would be:


1.5 hours x 4 days x 36 weeks x 25 students x $0.007 per student per hour = $37.80.
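Generalizing that arithmetic, a cost analyst might wrap the Table 1 medians in a small helper like the following Python sketch (the function and variable names are illustrative):

```python
# Median internet cost per student per school hour, from Table 1.
COST_PER_STUDENT_HOUR = {
    "Small": 0.02,    # 0-2,499 students
    "Medium": 0.007,  # 2,500-9,999 students
    "Large": 0.004,   # 10,000+ students
}

def size_category(enrollment: int) -> str:
    """MDR's district size categories."""
    if enrollment <= 2499:
        return "Small"
    if enrollment <= 9999:
        return "Medium"
    return "Large"

def program_internet_cost(district_enrollment: int, students: int,
                          hours_per_day: float, days_per_week: int,
                          weeks: int) -> float:
    """Internet cost (in dollars) attributable to a program."""
    rate = COST_PER_STUDENT_HOUR[size_category(district_enrollment)]
    return hours_per_day * days_per_week * weeks * students * rate

# Sunnyvale example: 4,000-student district (Medium), 25 students,
# 1.5 hours/day, 4 days/week, 36 weeks.
print(round(program_internet_cost(4000, 25, 1.5, 4, 36), 2))  # 37.8
```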


The cost per student per hour might seem tiny. Take New York City Public Schools, for example: the cost per Mbps per month is $0.13, yet the district pays $26,000 each month for internet. For a single education program or intervention, internet costs may represent only a small fraction of the overall costs and may hardly seem worth estimating in comparison to personnel salaries and fringe benefits. However, it is critical for a rigorous cost analysis to identify all the resources needed to implement a program.


Yuan Chang is a research assistant in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.

Anna Kushner is a doctoral student in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher for the CAP Project.

A Conversation About Educational Inequality With Outstanding Predoctoral Fellow Marissa Thompson

Each year, the Institute of Education Sciences (IES) recognizes an outstanding fellow from its Predoctoral Interdisciplinary Research Training Programs in the Education Sciences for academic accomplishments and contributions to education research. The 2021 awardee, Marissa Thompson, completed her PhD at Stanford University and worked as a postdoctoral fellow with the Education Policy Initiative at the University of Michigan’s Ford School of Public Policy. This summer, she joins Columbia University as an assistant professor of sociology. Her work focuses on the relationship between education and socioeconomic and racial inequality over the life course.

Recently, we caught up with Dr. Thompson and asked her to discuss her research on educational inequality and her experiences as a scholar.

How did you become interested in a career in education research?

For a long time, I thought that I wanted to become an engineering professor. I majored in chemical and biomolecular engineering in college and planned to pursue a doctoral degree in engineering after I graduated. Though I was excited about my undergraduate research projects, I was also passionate about diversity and inclusion in science, technology, engineering, and mathematics (STEM) fields. This led me to spend my free time in college working on programs within the School of Engineering that promoted more equitable access to these majors. At the same time, I began taking some courses outside of the engineering program, which led me to a series of introductory sociology electives and inspired me to think about a career in the social sciences.

My interests in educational inequality stemmed in part from my own experiences and challenges as a Black woman in the sciences, but also from the experiences of my classmates who had to overcome barriers to access these fields. I wanted to have a more direct impact on the policies and programs that help to mitigate racial and socioeconomic inequality in education, which led me to apply for graduate programs in sociology of education.

What inspired you to focus your research on understanding the role of education in shaping inequality?

I began my graduate studies with the goal of focusing more narrowly on access and persistence in STEM fields, but this quickly developed into a broader interest in educational inequality. I was fortunate to work on several projects with advisors and mentors that motivated my interests in educational inequality over the life course—from studying racial and socioeconomic achievement gaps in public school districts across the country to studying how processes of major choice can lead to increased gender segregation across fields. My work seeks to understand how a variety of sources—including structural inequality, policy changes, and individual preferences—are related to disparities in access to quality educational experiences. My goal as a researcher is to understand how patterns of inequality emerge as well as to research the efficacy of policies that might mitigate social inequality. In doing so, I hope to have an impact on reducing educational disparities for future generations.

What do you see as the greatest research needs or recommendations to improve the relevance of education research for diverse communities of students and families?

I think one of the most important ways that we can improve the relevance of education research for diverse communities of students and families is to involve a more diverse group of voices in the research process. This includes creating more opportunities for researchers from different backgrounds who may ask questions that are uniquely informed by their own experiences or the experiences of their communities. In addition, I also believe that, as researchers, we have a responsibility to speak to the communities that are affected by the policies and patterns that we influence.  

What advice would you give to emerging scholars who are pursuing a career in education research?

My first piece of advice would be to find mentors and peers in graduate school who can support you. I have benefitted tremendously from the encouragement of my support system, and I have learned so much from my mentors and peers along the way. I would also encourage students from outside of the traditional social sciences to consider research in education. As an undergraduate engineering major, I was initially afraid to take a leap and change disciplines for graduate school, but in retrospect, I’m so glad that I did. At the time, I worried that my skillset and training in a different discipline would be a disadvantage, but I believe that my interdisciplinary background and unique perspective have helped me to grow my research agenda in ways that would not have been possible otherwise. 


This blog was produced by Bennett Lunn (Bennett.Lunn@ed.gov), Truman-Albright Fellow. It is part of an Inside IES Research blog series showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice.