IES Blog

Institute of Education Sciences

Calculating the Costs of School Internet Access

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

Internet access has become an indispensable element of many education and social programs. However, researchers conducting cost analyses of education programs often fail to capture these costs because of a lack of publicly available information on what school districts pay for internet service. EducationSuperHighway, a nonprofit organization, now collects information about the internet bandwidth and monthly internet costs of each school district in the United States and publishes it on the Connect K-12 website. While Connect K-12 provides a median cost per Mbps for schools nationwide, its applicability in cost analyses is limited because the cost per student varies widely with district size.

As customers, we often save money by buying groceries in bulk. One of the reasons that larger sizes offer better value is that the ingredient we consume is sometimes only a small part of the total cost of the whole product; the rest of the cost goes into the process that makes the product accessible, such as packaging, transportation, and rent.

The same is true of internet service. Making the internet available in schools requires facilities and equipment that include, but are not limited to, web servers, Ethernet cables, and Wi-Fi routers. Large school districts, which are often in urban locations, usually pay much less per student than small districts, which are often in rural areas. The costs of infrastructural adaptations also need to be considered when new equipment and facilities are required to deliver high-speed internet; fiber-optic and satellite internet services have high infrastructure costs. Old-fashioned DSL internet uses existing phone lines and thus has less overhead cost, but it is much slower, often making it difficult to meet the Federal Communications Commission's current recommended bandwidth of 1 Mbps per student.

In short, there is no one-size-fits-all price for school internet access. To tackle this challenge, we used the data available on Connect K-12 for districts in each of the 50 U.S. states to calculate some useful metrics for cost analyses. First, we categorized the districts with internet access according to MDR's definitions of small, medium, and large school districts (small: 0-2,499 students; medium: 2,500-9,999 students; large: 10,000+ students). For each category, we calculated the following metrics, which are shown in Table 1:

  1. median cost per student per year
  2. median cost per student per hour

 

Table 1: Internet Access Costs

| District size (# of students) | Median Mbps per student | Median cost per Mbps per month | Median cost per student per month | Cost per student per year | Cost per student per hour |
|---|---|---|---|---|---|
| Small (0-2,499) | 1.40 | $1.75 | $2.45 | $29.40 | $0.02 |
| Medium (2,500-9,999) | 0.89 | $0.95 | $0.85 | $10.15 | $0.007 |
| Large (10,000+) | 0.83 | $0.61 | $0.50 | $6.03 | $0.004 |
| National median | 1.23 | $1.36 | $1.67 | $20.07 | $0.014 |

 

Note: Cost per student per hour is computed based on the assumption that schools are open for 1,440 hours per year (36 weeks at 40 hours per week). For example, for a small district, the cost per student per hour is $29.40/1,440 ≈ $0.02. See methods here.
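To make the computation behind Table 1 concrete, here is a minimal sketch in Python (using pandas) of how such per-student metrics can be derived from district-level records. The column names and sample values are hypothetical, not the actual Connect K-12 file layout, and the published medians may be computed somewhat differently.

```python
import pandas as pd

# Hypothetical district-level records in the spirit of a Connect K-12
# extract. Column names and values are illustrative only.
districts = pd.DataFrame({
    "enrollment":       [1200, 4000, 15000, 800],
    "bandwidth_mbps":   [1500, 3600, 12000, 1200],   # contracted bandwidth
    "monthly_cost_usd": [2100, 3400,  7300, 2000],   # monthly internet bill
})

# Categorize districts using MDR's size definitions.
bins = [0, 2500, 10000, float("inf")]
labels = ["Small (0-2,499)", "Medium (2,500-9,999)", "Large (10,000+)"]
districts["size"] = pd.cut(districts["enrollment"], bins=bins,
                           labels=labels, right=False)

# Per-district unit costs.
districts["cost_per_student_month"] = (districts["monthly_cost_usd"]
                                       / districts["enrollment"])
districts["cost_per_student_year"] = districts["cost_per_student_month"] * 12
# Assume schools are open 1,440 hours per year (36 weeks x 40 hours).
districts["cost_per_student_hour"] = districts["cost_per_student_year"] / 1440

# Median of the per-district values within each size category.
print(districts.groupby("size", observed=True)[
    ["cost_per_student_month", "cost_per_student_year",
     "cost_per_student_hour"]].median().round(3))
```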

 

Here’s an example of how you might determine an appropriate portion of the costs to attribute to a specific program or practice:  

Sunnyvale School is in a school district of 4,000 students. It offers an afterschool program in the library in which 25 students work online with remote math tutors. The program runs for 1.5 hours per day, 4 days per week, for 36 weeks. Internet costs would be:

 

1.5 hours x 4 days x 36 weeks x 25 students x $0.007 = $37.80.
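If you need to repeat this attribution for several programs, the arithmetic is easy to wrap in a small helper. Below is a minimal sketch; the per-student-per-hour rates come from Table 1, while the function itself is purely illustrative.

```python
# Median cost per student per hour from Table 1, keyed by district size.
RATES = {"small": 0.02, "medium": 0.007, "large": 0.004}

def program_internet_cost(hours_per_day: float, days_per_week: int,
                          weeks: int, students: int,
                          district_size: str) -> float:
    """Attribute a share of district internet costs to one program."""
    student_hours = hours_per_day * days_per_week * weeks * students
    return student_hours * RATES[district_size]

# Sunnyvale example: a 4,000-student district falls in the "medium" band.
print(program_internet_cost(1.5, 4, 36, 25, "medium"))  # 37.8
```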

 

The cost per student per hour might seem tiny, but small unit costs add up. Take New York City Public Schools, for example: the cost per Mbps per month is only $0.13, yet the district pays $26,000 each month for internet. For a single education program or intervention, internet costs may represent only a small fraction of the overall costs and may hardly seem worth estimating in comparison with personnel salaries and fringe benefits. However, a rigorous cost analysis must identify all the resources needed to implement a program.


Yuan Chang is a research assistant in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.

 Anna Kushner is a doctoral student in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher for the CAP Project.

A Conversation About Educational Inequality With Outstanding Predoctoral Fellow Marissa Thompson

Each year, the Institute of Education Sciences (IES) recognizes an outstanding fellow from its Predoctoral Interdisciplinary Research Training Programs in the Education Sciences for academic accomplishments and contributions to education research. The 2021 awardee, Marissa Thompson, completed her PhD at Stanford University and worked as a postdoctoral fellow with the Education Policy Initiative at the University of Michigan's Ford School of Public Policy. This summer, she joins Columbia University as an assistant professor of sociology. Her work focuses on the relationship between education and socioeconomic and racial inequality over the life course.

Recently, we caught up with Dr. Thompson and asked her to discuss her research on educational inequality and her experiences as a scholar.

How did you become interested in a career in education research?

For a long time, I thought that I wanted to become an engineering professor. I majored in chemical and biomolecular engineering in college and planned to pursue a doctoral degree in engineering after I graduated. Though I was excited about my undergraduate research projects, I was also passionate about diversity and inclusion in science, technology, engineering, and mathematics (STEM) fields. This led me to spend my free time in college working on programs within the School of Engineering that promoted more equitable access to these majors. At the same time, I began taking some courses outside of the engineering program, which led me to a series of introductory sociology electives and inspired me to think about a career in the social sciences.

My interests in educational inequality stemmed in part from my own experiences and challenges as a Black woman in the sciences, but also from the experiences of my classmates who had to overcome barriers to access these fields. I wanted to have a more direct impact on the policies and programs that help to mitigate racial and socioeconomic inequality in education, which led me to apply for graduate programs in sociology of education.

What inspired you to focus your research on understanding the role of education in shaping inequality?

I began my graduate studies with the goal of focusing more narrowly on access and persistence in STEM fields, but this quickly developed into a broader interest in educational inequality. I was fortunate to work on several projects with advisors and mentors that motivated my interests in educational inequality over the life course—from studying racial and socioeconomic achievement gaps in public school districts across the country to studying how processes of major choice can lead to increased gender segregation across fields. My work seeks to understand how a variety of sources—including structural inequality, policy changes, and individual preferences—are related to disparities in access to quality educational experiences. My goal as a researcher is to understand how patterns of inequality emerge as well as to research the efficacy of policies that might mitigate social inequality. In doing so, I hope to have an impact on reducing educational disparities for future generations.

What do you see as the greatest research needs or recommendations to improve the relevance of education research for diverse communities of students and families?

I think one of the most important ways that we can improve the relevance of education research for diverse communities of students and families is to involve a more diverse group of voices in the research process. This includes creating more opportunities for researchers from different backgrounds who may ask questions that are uniquely informed by their own experiences or the experiences of their communities. In addition, I also believe that, as researchers, we have a responsibility to speak to the communities that are affected by the policies and patterns that we influence.  

What advice would you give to emerging scholars who are pursuing a career in education research?

My first piece of advice would be to find mentors and peers in graduate school who can support you. I have benefitted tremendously from the encouragement of my support system, and I have learned so much from my mentors and peers along the way. I would also encourage students from outside of the traditional social sciences to consider research in education. As an undergraduate engineering major, I was initially afraid to take a leap and change disciplines for graduate school, but in retrospect, I’m so glad that I did. At the time, I worried that my skillset and training in a different discipline would be a disadvantage, but I believe that my interdisciplinary background and unique perspective have helped me to grow my research agenda in ways that would not have been possible otherwise. 


This blog was produced by Bennett Lunn (Bennett.Lunn@ed.gov), Truman-Albright Fellow. It is part of an Inside IES Research blog series showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice.

Measuring Student Safety: New Data on Bullying Rates at School

Bullying remains a serious issue for students and their families, as well as for policymakers, administrators, and educators. NCES is committed to providing reliable and timely data on bullying to measure the extent of the problem and track progress toward reducing its prevalence. To that end, NCES just released a new set of web tables focusing on bullying rates at school. These tables use data from the School Crime Supplement to the National Crime Victimization Survey, which collects data on bullying by asking a nationally representative sample of students ages 12–18 whether they had been bullied at school. This blog post highlights data from these newly released web tables.

In 2019, about 22 percent of students reported being bullied at school during the school year (figure 1). This percentage was lower compared with a decade ago (2009), when 28 percent of students reported being bullied at school.

Students’ reports of being bullied varied based on student and school characteristics in 2019. For instance, a higher percentage of female students than of male students reported being bullied at school during the school year (25 vs. 19 percent). The percentage of students who reported being bullied at school was higher for students of Two or more races (37 percent) than for White students (25 percent) and Black students (22 percent), which were in turn higher than the percentage of Asian students (13 percent). Higher percentages of 6th-, 7th-, and 8th-graders reported being bullied at school (ranging from 27 to 28 percent), compared with 9th-, 10th-, and 12th-graders (ranging from 16 to 19 percent). A higher percentage of students enrolled in schools in rural areas (28 percent) than in schools in other locales (ranging from 21 to 22 percent) reported being bullied at school.


Figure 1. Percentage of students ages 12–18 who reported being bullied at school during the school year, by selected student and school characteristics: 2019

Horizontal bar chart showing the percentage of students ages 12–18 who reported being bullied at school during the school year in 2019, by selected student characteristics (sex, race/ethnicity, and grade) and school characteristics (locale and control of school)

1 Total includes race categories not separately shown.
2 Race categories exclude persons of Hispanic ethnicity. Data for Pacific Islander and American Indian/Alaska Native students did not meet reporting standards in 2019; therefore, data for these two groups are not shown.
3 Excludes students with missing information about the school characteristic.
NOTE: “At school” includes in the school building, on school property, on a school bus, and going to and from school. Although rounded numbers are displayed, the figures are based on unrounded data.
SOURCE: U.S. Department of Justice, Bureau of Justice Statistics, School Crime Supplement (SCS) to the National Crime Victimization Survey, 2019. See Digest of Education Statistics 2020, table 230.40.


Not all students chose to report the bullying to adults at school. Among students ages 12–18 who reported being bullied at school during the school year in 2019, about 46 percent reported notifying an adult at school about the incident. This percentage was higher for Black students than for White students (61 vs. 47 percent), and both percentages were higher than the percentage for Hispanic students (35 percent).

For more details on these data, see the web tables from “Student Reports of Bullying: Results from the 2019 School Crime Supplement to the National Crime Victimization Survey.” For additional information on this topic, see the Condition of Education indicator Bullying at School and Electronic Bullying. For indicators on other topics related to school crime and safety, select “School Crime and Safety” on the Explore by Indicator Topics page.

 

By Ke Wang, AIR

Congratulations Dr. Roddy Theobald on Winning the 2022 AEFP Early Career Award!

Each year, the Association for Education Finance and Policy (AEFP) recognizes one outstanding early career scholar whose research makes a significant contribution to the field of education finance and policy. In 2022, Dr. Roddy Theobald was the recipient of the Early Career Award from AEFP. Congratulations to Dr. Theobald!

Dr. Theobald is a principal researcher in the Center for Analysis of Longitudinal Data in Education Research (CALDER) at the American Institutes for Research (AIR). CALDER, a collaboration among researchers at AIR and several universities around the United States, uses longitudinal data to explore a wide range of policy-relevant topics in education. Dr. Theobald’s research focuses on the teacher pipeline and its implications for student outcomes. Over the years, he has been involved in multiple IES-funded projects. These projects reflect a clear commitment to improving the teacher workforce and promoting positive outcomes for students. Dr. Theobald became interested in education policy research and studying the teacher workforce as a result of his experience as a 7th grade math teacher in the Oakland Unified School District. He is particularly interested in better understanding teacher shortage areas and what schools and districts can do to address them. 

As principal investigator (PI) on a recently completed researcher-practitioner partnership project, Dr. Theobald and his team worked in partnership with the Massachusetts Department of Elementary and Secondary Education to investigate how the state's pre-service teacher evaluation systems predict later in-service teaching outcomes and student outcomes. Key findings showed that teacher candidates' performance on the Massachusetts Candidate Assessment of Performance, a practice-based assessment of student teaching, was predictive of their in-service summative performance ratings a year later. Similarly, pre-service scores on the Massachusetts Tests for Educator Licensure were positively and significantly related to in-service performance ratings and to value-added measures based on student test scores.

Dr. Theobald is currently the PI of a research grant that examines associations between pre-service teacher experiences (coursework, student teaching placements, and the match between student teaching experiences and early career experiences), special education teacher workforce entry and retention, and student academic outcomes. Using data on graduates of special education teacher education programs in Washington state, he found that the rate of special educator attrition is between 20% and 30%, a figure that includes teachers who left public schools as well as those who moved to general education classrooms. Interestingly, the research team found that while dual endorsement in special and general education is positively associated with retention in the teaching workforce, it is negatively associated with retention in special education classrooms specifically. In terms of factors that promote retention, the team found that better coherence between teacher preparation and early career experiences is associated with greater retention, and that being supervised during student teaching by a cooperating teacher endorsed in special education is associated with a higher likelihood of becoming a special education teacher. The team also found a link between pre-service teacher experiences and student outcomes: students demonstrate larger reading gains when their district and the program from which their teacher graduated emphasized evidence-based literacy decoding practices and when a more experienced cooperating teacher supervised their teacher's student teaching placement.

When we asked Dr. Theobald about the direction in which this line of research is heading, he explained, “immediate next steps in this line of work include looking at the employment outcomes of individuals trained to be special education teachers who never enter public school teaching or leave the teacher workforce, as well as better understanding the paraeducator workforce in public schools. It is also essential to understand how the special educator workforce has changed in response to the COVID pandemic, and we hope to study these changes in the years to come!”

This blog was authored by Kaitlynn Fraze, doctoral student at George Mason University and IES intern, and Katie Taylor (Katherine.Taylor@ed.gov), program officer at the National Center for Special Education Research.

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released rescaled data files for three adult literacy assessments conducted in earlier decades: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Like NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of adult skills in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that those domains were put on the same scale as the corresponding PIAAC domains: updated proficiency estimates were derived using the same statistical models used to create the PIAAC skills proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.[1] These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
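For intuition only, the snippet below sketches the general idea of linking two score scales with a linear transformation estimated from common items. This is a deliberately simplified illustration, not the actual PIAAC procedure, which relies on item response theory models and plausible values; the linking constants here are invented.

```python
import numpy as np

# Toy illustration of rescaling: map scores from an older study onto a
# newer study's scale with a linear transformation. The real PIAAC
# rescaling uses IRT models estimated on the shared test questions;
# the constants below are purely hypothetical.
rng = np.random.default_rng(0)
old_scores = rng.normal(290, 60, size=5_000)  # scores on the old scale

slope, intercept = 0.92, 5.0                  # assumed linking constants
rescaled_scores = slope * old_scores + intercept

print(f"old mean: {old_scores.mean():.0f}, "
      f"rescaled mean: {rescaled_scores.mean():.0f}")
```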

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education), because the larger NALS sample size allows for more detailed breakdowns alongside the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.
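As a quick illustration, the published national point estimates can be charted in a few lines of Python with matplotlib. This is only a sketch of the trend line shown in figure 1; rigorous comparisons require the restricted-use files, whose plausible values and sampling errors are needed for significance testing.

```python
import matplotlib.pyplot as plt

# Published average literacy scores of U.S. adults (ages 16-65),
# national trend line (see figure 1).
points = {"NALS 1992": 269, "PIAAC 2012/14": 272, "PIAAC 2017": 270}

plt.plot(list(points.keys()), list(points.values()), marker="o")
plt.ylabel("Average literacy score")
plt.title("U.S. adult literacy, national trend line")
plt.show()
```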


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).


By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.