IES Blog

Institute of Education Sciences

New International Data Highlight Experiences of Teachers and Principals

The Teaching and Learning International Survey (TALIS) 2018 surveyed lower secondary teachers and principals in the United States and 48 other education systems (in the United States, lower secondary is equivalent to grades 7–9). Following the release of volume 1 of the TALIS 2018 U.S. highlights web report in 2019, volume 2 provides new data comparing teachers’ and principals’ opinions about job satisfaction, stress, salary, and autonomy.

According to the survey results, about a quarter of U.S. lower secondary teachers (26 percent) reported they had “a lot” of stress in their work. This was higher than the average across countries participating in TALIS (16 percent) (figure 1). The U.S. percentage was higher than the percentage in 35 education systems, lower than the percentage in 3 education systems, and not measurably different from the percentage in 10 education systems. Across education systems, the percentage of teachers who reported experiencing “a lot” of stress in their work ranged from 1 percent in the country of Georgia to 38 percent in England (see figure 8T in the TALIS 2018 U.S. highlights web report, volume 2).


Figure 1. Percentage of lower secondary teachers in the United States and across TALIS education systems reporting that they experience “a lot” of stress in their work: 2018


Less than half of U.S. lower secondary teachers (41 percent) “agree” or “strongly agree” that they are satisfied with their salary, which was not measurably different from the TALIS average (39 percent) (figure 2). The U.S. percentage was higher than the percentage in 22 education systems, lower than the percentage in 17 education systems, and not measurably different from the percentage in 9 education systems. The percentage of teachers who “agree” or “strongly agree” that they are satisfied with their salary ranged widely across education systems, from 6 percent in Iceland to 76 percent in Alberta–Canada (see figure 10T in the TALIS 2018 U.S. highlights web report, volume 2).

More than half of U.S. lower secondary principals (56 percent) “agree” or “strongly agree” that they are satisfied with their salary, which was not measurably different from the TALIS average (47 percent) (figure 2). The U.S. percentage was higher than the percentage in 15 education systems, lower than the percentage in 3 education systems, and not measurably different from the percentage in 29 education systems. As with teachers, the percentage of principals who “agree” or “strongly agree” that they are satisfied with their salary ranged widely across education systems, from 17 percent in Italy to 86 percent in Singapore (see figure 10P in the TALIS 2018 U.S. highlights web report, volume 2).


Figure 2. Percentage of lower secondary teachers and principals in the United States and across TALIS education systems who “agree” or “strongly agree” that they are satisfied with the salary they receive for their work: 2018


TALIS is conducted in the United States by the National Center for Education Statistics (NCES) and is sponsored by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of industrialized countries. Further information about TALIS can be found in the technical notes, questionnaires, list of participating OECD and non-OECD countries, and FAQs for the study. In addition, the two volumes of the OECD international reports are available on the OECD TALIS website.

 

By Tom Snyder, AIR; Ebru Erberber, AIR; and Mary Coleman, NCES

New Report Highlights Progress and Challenges in U.S. High School Dropout and Completion Rates

A new NCES report has some good news about overall high school dropout and completion rates, but it also highlights some areas of concern.

Using a broad range of data, the recently released Trends in High School Dropout and Completion Rates in the United States report shows that the educational attainment of young adults has risen in recent decades. The public high school graduation rate is up, and the status dropout rate (the percentage of 16- to 24-year-olds who are not enrolled in school and have not completed high school) is down. Despite these encouraging trends, there are significant disparities in educational attainment among young adults in the United States. The report shines new light on these disparities by analyzing detailed data from the U.S. Census Bureau.

For large population groups, the report provides status dropout rates calculated using annual data from the American Community Survey (ACS), administered by the U.S. Census Bureau. For example, in 2017, some 5.4 percent of 16- to 24-year-olds were not enrolled in school and lacked a high school diploma or equivalent credential.

For smaller population groups, there are not enough ACS respondents during any given year to allow for precise and reliable estimates of the high school status dropout rate. For these demographic subgroups, NCES pools the data from 5 years of the ACS in order to obtain enough respondents to accurately describe patterns in the dropout rate.
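
As a rough illustration of this pooling approach, the short Python sketch below combines 5 single-year ACS person-level extracts into one weighted estimate for a small subgroup. The file names, column names, and subgroup label are hypothetical placeholders; this is not the Census Bureau's or NCES's actual estimation code.

```python
import pandas as pd

# Hypothetical sketch: pool 5 single-year ACS person-level extracts and
# compute one weighted status dropout rate for a small subgroup.
years = range(2013, 2018)  # 2013 through 2017
acs = pd.concat(
    [pd.read_csv(f"acs_persons_{year}.csv") for year in years],
    ignore_index=True,
)

# Keep 16- to 24-year-olds in the subgroup of interest.
pool = acs[acs["age"].between(16, 24) & (acs["subgroup"] == "Burmese")]

# Status dropouts: not enrolled in school and no high school credential.
# "enrolled" and "hs_credential" are assumed to be boolean columns.
dropout = ~pool["enrolled"] & ~pool["hs_credential"]

# Weighted 5-year average rate. Pooling raises the effective sample size,
# which shrinks the sampling error relative to a single-year estimate.
rate = (dropout * pool["person_weight"]).sum() / pool["person_weight"].sum()
print(f"2013-2017 pooled status dropout rate: {rate:.1%}")
```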

For example, while the overall status dropout rate for Asian 16- to 24-year-olds was below the national average in 2017, the rates for specific subgroups of Asian young adults varied widely. Based on 5 years of ACS data, high school status dropout rates among Asian 16- to 24-year-olds ranged from 1.1 percent for individuals of Korean descent to 23.2 percent for individuals of Burmese descent. These rates represent the “average” status dropout rate for the period from 2013 to 2017. They offer greater precision than the 1-year estimates, but the 5-year time span might make them difficult to interpret at first glance. 

 


Figure 1. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by selected Asian subgroups: 2013–2017

‡ Reporting standards not met. Either there are too few cases for a reliable estimate or the coefficient of variation (CV) is 50 percent or greater.
If the estimation procedure were repeated many times, 95 percent of the calculated confidence intervals would contain the true status dropout rate for the population group.
NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations. Asian subgroups exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017.
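
As a rough illustration of the reporting standards described in the figure notes above, the Python sketch below suppresses an estimate when its coefficient of variation reaches 50 percent and otherwise reports it with a 95 percent confidence interval. The numbers are illustrative, not values from the report.

```python
# Sketch of the reporting rules in the figure notes: suppress an estimate
# whose coefficient of variation (CV) is 50 percent or greater; otherwise
# report it with a 95 percent confidence interval.
def summarize(rate: float, standard_error: float) -> str:
    cv = standard_error / rate          # coefficient of variation
    if cv >= 0.50:
        return "‡ Reporting standards not met"
    lower = rate - 1.96 * standard_error
    upper = rate + 1.96 * standard_error
    return f"{rate:.1%} (95% CI: {lower:.1%} to {upper:.1%})"

print(summarize(0.232, 0.015))  # precise enough: reported with its interval
print(summarize(0.020, 0.012))  # CV = 60 percent: suppressed
```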


 

The 5-year ACS data can also be used to describe status dropout rates for smaller geographic areas with more precision than the annual ACS data. For example, the average 2013–2017 status dropout rates ranged from 3.8 percent in Massachusetts to 9.6 percent in Louisiana. The 5-year ACS data allowed us to calculate more accurate status dropout rates for each state and, in many cases, for racial/ethnic subgroups within the state. The complete state-level dropout rates by race/ethnicity are available in the full report.
 


Figure 2. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by state: 2013–2017

NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017. See table 2.3.


 

For more information about high school dropout and completion rates, check out the recently released Trends in High School Dropout and Completion Rates in the United States report. For more information about the 5-year ACS datasets, visit https://www.census.gov/programs-surveys/acs/guidance/estimates.html.

 

By Joel McFarland

Homeschooled Students Participated in Family Learning Activities at Higher Rates Than Enrolled Students in 2016

About 3 percent of the school-age population—around 1.7 million students—was homeschooled in 2016. We know that homeschooled students have different educational experiences than students who are enrolled in public or private schools, and recently released data explore some of those differences.

The Parent and Family Involvement in Education survey of the National Household Education Surveys Program (NHES) provides information on homeschooled and public and private school students based on a nationally representative sample. Parents provide information about their children’s formal education and learning activities outside of school.

The survey asks about six broad types of family learning activities that students experienced in the month prior to the survey. The 2016 results indicate that homeschooled students were more likely than their peers enrolled in public or private schools to participate in five of these six activities.

In 2016, higher percentages of homeschooled students than of students enrolled in public or private schools visited a library; a bookstore; an art gallery, museum, or historical site; and a zoo or aquarium in the month prior to completion of the survey (figure 1). A higher percentage of homeschooled students also attended an event sponsored by a community, religious, or ethnic group with their parents in the month prior to completion of the survey. The one activity for which there was no measurable difference between homeschooled students and students enrolled in public or private schools was going to a play, concert, or other live show.

 


Figure 1. Percentage of 5- to 17-year-old students participating in selected family learning activities in the past month, by homeschool and enrollment status: 2016

 

NOTE: Includes 5- to 17-year-old students in grades or grade equivalents of kindergarten through grade 12. Homeschooled students are school-age children who receive instruction at home instead of at a public or private school either all or most of the time. Excludes students who were enrolled in public or private school more than 25 hours per week and students who were homeschooled only because of temporary illness. Selected activities with the child may have included any member of the household.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2016.


 

The NHES data do not tell us why these differences exist, but factors such as parents’ available time and parenting style may play a role. More research is needed to understand these differences.

A recent report, Homeschooling in the United States: Results from the 2012 and 2016 Parent and Family Involvement Survey (PFI-NHES:2012 and 2016), provides the full complement of data from the NHES about homeschoolers’ experiences in 2016. In addition to family learning activities, the report provides information about the following:
 

  • Homeschooler demographics

  • Reasons for homeschooling

  • Providers of homeschool instruction

  • Amount of time homeschoolers spent attending public schools, private schools, or college

  • Participation in local homeschool group activities

  • Homeschool teaching styles

  • Sources of homeschool curriculum and books

  • Online coursetaking of homeschool students

  • Homeschool subject areas

  • Parent expectations of homeschooled students’ future education
     

For more information on the National Household Education Surveys Program, please go to https://nces.ed.gov/nhes/.

 

By Sarah Grady

New International Comparisons of Reading, Mathematics, and Science Literacy Assessments

The Program for International Student Assessment (PISA) is a study of 15-year-old students’ performance in reading, mathematics, and science literacy that is conducted every 3 years. The PISA 2018 results provide us with a global view of U.S. students’ performance compared with their peers in nearly 80 countries and education systems. In PISA 2018, the major domain was reading literacy, although mathematics and science literacy were also assessed.

In 2018, the U.S. average score of 15-year-olds in reading literacy (505) was higher than the average score of the Organization for Economic Cooperation and Development (OECD) countries (487). Compared with the 76 other education systems with PISA 2018 reading literacy data, including both OECD and non-OECD countries, the U.S. average reading literacy score was lower than in 8 education systems, higher than in 57 education systems, and not measurably different in 11 education systems. The U.S. percentage of top performers in reading was larger than in 63 education systems, smaller than in 2 education systems, and not measurably different in 11 education systems. The average reading literacy score in 2018 (505) was not measurably different from the average score in 2000 (504), the first year PISA was administered. Among the 36 education systems that participated in both years, 10 education systems reported higher average reading literacy scores in 2018 compared with 2000, and 11 education systems reported lower scores.

The U.S. average score of 15-year-olds in mathematics literacy in 2018 (478) was lower than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 mathematics literacy data, the U.S. average mathematics literacy score was lower than in 30 education systems, higher than in 39 education systems, and not measurably different in 8 education systems. The average mathematics literacy score in 2018 (478) was not measurably different from the average score in 2003 (483), the earliest year with comparable data. Among the 36 education systems that participated in both years, 10 systems reported higher mathematics literacy scores in 2018 compared with 2003, 13 education systems reported lower scores, and 13 education systems reported no measurable changes in scores.  

The U.S. average score of 15-year-olds in science literacy (502) was higher than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 science literacy data, the U.S. average science literacy score was lower than in 11 education systems, higher than in 55 education systems, and not measurably different in 11 education systems. The average science literacy score in 2018 (502) was higher than the average score in 2006 (489), the earliest year with comparable data. Among the 52 education systems that participated in both years, 7 education systems reported higher average science literacy scores in 2018 compared with 2006, 22 education systems reported lower scores, and 23 education systems reported no measurable changes in scores.

PISA is conducted in the United States by NCES and is coordinated by OECD, an intergovernmental organization of industrialized countries. Further information about PISA can be found in the technical notes, questionnaires, list of participating OECD and non-OECD countries, released assessment items, and FAQs.

 

By Thomas Snyder

New Study on U.S. Eighth-Grade Students’ Computer Literacy

In the 21st-century global economy, computer literacy and skills are an important part of an education that prepares students to compete in the workplace. The results of a recent assessment show us how U.S. students compare to some of their international peers in the areas of computer information literacy and computational thinking.

In 2018, the United States participated for the first time in the International Computer and Information Literacy Study (ICILS), along with 13 other education systems around the globe. ICILS is a computer-based international assessment of eighth-grade students that measures outcomes in two domains: computer and information literacy (CIL)[1] and computational thinking (CT)[2]. It compares U.S. students’ skills and experiences using technology to those of students in other education systems and provides information on teachers’ experiences, school resources, and other factors that may influence students’ CIL and CT skills.

ICILS is sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and is conducted in the United States by the National Center for Education Statistics (NCES).

The newly released U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report provides information on how U.S. students performed on the assessment compared with students in other education systems and describes students’ and teachers’ experiences with computers.


U.S. Students’ Performance

In 2018, U.S. eighth-grade students’ average score in CIL was higher than the average of participating education systems[3] (figure 1), while the U.S. average score in CT was not measurably different from the average of participating education systems.

 


Figure 1. Average computer and information literacy (CIL) scores of eighth-grade students, by education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and is not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

⁵ Data collected at the beginning of the school year.

NOTE: The ICILS computer and information literacy (CIL) scale ranges from 100 to 700. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average CIL scores, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

Given the importance of students’ home environments in developing CIL and CT skills (Fraillon et al. 2019), students were asked how many computers (desktop or laptop) they had at home. In the United States, eighth-grade students with two or more computers at home performed better in both CIL and CT than their U.S. peers with fewer computers (figure 2). This pattern was also observed in all participating countries and education systems.

 


Figure 2. Average computational thinking (CT) scores of eighth-grade students, by student-reported number of computers at home and education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and is not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

NOTE: The ICILS computational thinking (CT) scale ranges from 100 to 700. The number of computers at home includes desktop and laptop computers. Students with fewer than two computers include students reporting having “none” or “one” computer. Students with two or more computers include students reporting having “two” or “three or more” computers. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average scores of students with two or more computers at home, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

U.S. Students’ Technology Experiences

Among U.S. eighth-grade students, 72 percent reported using the Internet to do research in 2018, and 56 percent reported completing worksheets or exercises using information and communications technology (ICT)[4] every school day or at least once a week. Both of these percentages were higher than the respective ICILS averages (figure 3). The learning activities least frequently reported by U.S. eighth-grade students were using coding software to complete assignments (15 percent) and making video or audio productions (13 percent).

 


Figure 3. Percentage of eighth-grade students who reported using information and communications technology (ICT) every school day or at least once a week, by activity: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and is not included in the international average.

NOTE: The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Activities are ordered by the percentages of U.S. students reporting using information and communications technology (ICT) for the activities, from largest to smallest.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

Browse the full U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report to learn more about how U.S. students compare with their international peers in their computer literacy skills and experiences.

 

By Yan Wang, AIR, and Linda Hamilton, NCES

 

[1] CIL refers to “an individual's ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society” (Fraillon et al. 2019).

[2] CT refers to “an individual’s ability to recognize aspects of real-world problems which are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer” (Fraillon et al. 2019). CT was an optional component in 2018. Nine out of 14 ICILS countries participated in CT in 2018.

[3] U.S. results are not included in the ICILS international average because the U.S. school-level response rate of 77 percent was below the international requirement for a participation rate of 85 percent.

[4] Information and communications technology (ICT) can refer to desktop computers, notebook or laptop computers, netbook computers, tablet devices, or smartphones (except when being used for talking and texting).

 

Reference

Fraillon, J., Ainley, J., Schulz, W., Duckworth, D., and Friedman, T. (2019). IEA International Computer and Information Literacy Study 2018: Assessment Framework. Cham, Switzerland: Springer. Retrieved October 7, 2019, from https://link.springer.com/book/10.1007%2F978-3-030-19389-8.