IES Blog

Institute of Education Sciences

Teaching with Technology: U.S. Teachers’ Perceptions and Use of Digital Technology in an International Context

The coronavirus pandemic forced teachers across the world to immediately transition instruction to a virtual setting in early 2020. To understand U.S. teachers’ level of preparedness for this shift in an international context, this blog examines recent international data from U.S. teachers’ responses to questions on the following topics:

  • Their perceptions of information and communications technologies (ICT) resources
  • Their use of ICT for instruction prior to the pandemic

In general, the results suggest that U.S. teachers are better resourced with ICT than their international peers and use ICT when teaching at school with similar frequency.

 

Teachers’ perceptions of ICT resources at their school

The quantity and quality of ICT resources available in school systems prior to the coronavirus pandemic may affect teachers’ access to such resources for instruction while classrooms operate virtually. The United States participated in the 2018 International Computer and Information Literacy Study (ICILS), which asked nationally representative samples of eighth-grade teachers in 14 education systems about their ICT resources.

The results from this study show that 86 percent of eighth-grade teachers both in the United States and across ICILS 2018 education systems “strongly agreed” or “agreed” that ICT is considered a priority for use in teaching (figure 1). Compared with the ICILS 2018 averages,[1] higher percentages of U.S. eighth-grade teachers “strongly agreed” or “agreed” with various statements about the use of ICT.

While 86 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “ICT is considered a priority for use in teaching,” only 61 percent “strongly agreed” or “agreed” that “there is sufficient opportunity for me to develop expertise in ICT” (figure 1). Additionally, 62 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “there is enough time to prepare lessons that incorporate ICT.” These disparities may have had an impact on teacher capacity during the sudden shift to 100 percent online learning as a result of the coronavirus pandemic, which would be a good topic for future research and analyses.  


Figure 1. Percentage of eighth-grade teachers who reported that they “strongly agree” or “agree” with statements about using ICT in teaching at school, by statement: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
² National Defined Population covers 90 to 95 percent of National Target Population.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Statements are ordered by the percentages of U.S. teachers reporting “strongly agree” or “agree” from largest to smallest.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 17 from U.S. Results from the 2018 ICILS Web Report.


Teachers’ perceptions of the use of ICT for instruction

Teachers’ views on the role of ICT in virtual instruction during the coronavirus pandemic are not yet clear. However, in 2018, when instruction was conducted in physical classrooms, most U.S. eighth-grade teachers participating in ICILS expressed positive perceptions about “using ICT in teaching and learning at school,” as did many teachers internationally.

Among eighth-grade teachers in the United States, 95 percent agreed that ICT “enables students to access better sources of information,” 92 percent agreed that ICT “helps students develop greater interest in learning,” and 92 percent agreed that ICT “helps students work at a level appropriate to their learning needs.” On average across other education systems participating in ICILS, at least 85 percent of teachers agreed with each of these statements (Fraillon et al. 2019).

Seventy-five percent of U.S. eighth-grade teachers in 2018 agreed that ICT “improves academic performance of students,” which was higher than the ICILS international average of 71 percent. The percentages of teachers who agreed with this statement varied across education systems, from three-quarters or more of teachers in Chile, Denmark, Kazakhstan, and Portugal to less than half of teachers in Finland and North Rhine-Westphalia (Germany).

 

Frequency of teachers’ use of ICT

Teachers’ reported use of ICT for instruction in physical classroom settings may provide insight into their level of experience as they transition to virtual settings during the coronavirus pandemic.

In 2018, half of U.S. eighth-grade teachers reported “using ICT at school when teaching” every day, which was not significantly different from the ICILS average of 48 percent. However, the U.S. percentage was lower than the percentages of teachers in Moscow (76 percent), Denmark (72 percent), and Finland (57 percent) (figure 2).


Figure 2. Percentage of eighth-grade teachers who reported using ICT at school every day when teaching, by education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.
¹ Met guidelines for sample participation rates only after replacement schools were included.
² National Defined Population covers 90 to 95 percent of National Target Population.
³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
⁴ Data collected at the beginning of the school year.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their percentages of teachers reporting using ICT at school when teaching from largest to smallest. Italics indicate the benchmarking participants.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 15 from U.S. Results from the 2018 ICILS Web Report.


For more information on teachers and technology, check out NCES’s ICILS 2018 website, the international ICILS website, and the earlier NCES blog “New Study on U.S. Eighth-Grade Students’ Computer Literacy.”

 

By Amy Rathbun, AIR, and Stephen Provasnik, NCES

 


[1] The ICILS average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. The United States did not meet the guidelines for a sample participation rate of 85 percent, so it is not included in the international average.

 

Reference

Fraillon, J., Ainley, J., Schulz, W., Friedman, T., and Duckworth, D. (2019). Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report. Amsterdam: International Association for the Evaluation of Educational Achievement.

Understanding School Lunch Eligibility in the Common Core of Data

Every year in the Common Core of Data (CCD), NCES releases data on the number of students eligible for the National School Lunch Program (NSLP), a U.S. Department of Agriculture (USDA) meal program that provides nutritionally balanced low-cost or free meals to children during the school day. The program was established under the National School Lunch Act, signed into law by President Harry Truman in 1946, and currently serves nearly 30 million children.

This post highlights substantial changes to the NSLP and related changes in CCD reporting and provides guidance on how to use the NSLP data.

Free or Reduced-Price Lunch vs. Direct Certification

Historically, student eligibility for free or reduced-price lunch (FRPL) was determined through individual students submitting school meals application forms within school districts. In 1986, the USDA introduced a direct certification option to reduce participation barriers in the school meal program. Under direct certification, any child belonging to a household that participates in the Supplemental Nutrition Assistance Program (SNAP), Temporary Assistance for Needy Families (TANF), the Food Distribution Program on Indian Reservations (FDPIR), or (in some states) Medicaid—as well as any child who is migrant, homeless, in foster care, or in Head Start—is categorically eligible to receive free meals at school.

The NSLP data included in CCD releases consist of school-level FRPL and direct certification eligibility counts for all public schools with students enrolled. These point-in-time counts are taken on or around October 1 of each school year and reported by the states based on the following guidance:

  • FRPL-Eligible Students
    • Free lunch students: those eligible to participate in the Free Lunch Program (i.e., those with family incomes below 130 percent of the poverty level or who are directly certified)
    • Reduced-price lunch students: those eligible to participate in the Reduced-Price Lunch Program (i.e., those with family incomes between 130 and 185 percent of the poverty level)
    • Free and reduced-price lunch students: the total of free lunch students and reduced-price lunch students
  • Direct Certification
    • The number of students reported to the USDA, on the FNS-742 form, as categorically eligible to receive free meals. Students are categorically eligible to receive free meals if they belong to a household receiving the selected federal benefits noted above or are migrant, homeless, in foster care, or in Head Start.

The count of students eligible for free lunch includes students directly certified plus any students who qualified for free lunch by completing a school meals application. As such, the number of students reported as directly certified should always be less than or equal to the number of free lunch students.
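As a minimal sketch of how these definitions fit together, the Python snippet below classifies a hypothetical student into the categories above and checks the count relationship just described. The function and field names are illustrative only and are not CCD or FNS variable names; the income thresholds simply follow the phrasing above rather than the full federal eligibility rules.

```python
# Illustrative sketch only; names are hypothetical, not CCD or FNS variable names.

def classify_lunch_eligibility(income_pct_of_poverty: float, directly_certified: bool) -> str:
    """Assign an NSLP category using the rules summarized above.
    Boundary handling is simplified to match the blog's phrasing."""
    if directly_certified or income_pct_of_poverty < 130:
        return "free"
    if income_pct_of_poverty <= 185:
        return "reduced-price"
    return "not FRPL-eligible"


def counts_are_consistent(free_lunch_count: int, direct_cert_count: int) -> bool:
    """Directly certified students are included in the free lunch count,
    so the direct certification count should never exceed it."""
    return direct_cert_count <= free_lunch_count


# Example: a directly certified student counts as free lunch regardless of income.
assert classify_lunch_eligibility(150, directly_certified=True) == "free"
assert counts_are_consistent(free_lunch_count=200, direct_cert_count=180)
```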

Note that changes in SNAP (both legislated eligibility requirements and temporary changes such as national disasters) can have implications for reported NSLP eligibility as well.

The Healthy, Hunger-Free Kids Act of 2010

In 2010, the Healthy, Hunger-Free Kids Act (HHFKA) established national nutrition standards for food served and sold in schools and made changes to the NSLP to increase food access. These changes also impacted the NSLP data published through CCD:

  • While direct certification had been an option since 1986, HHFKA mandated that states directly certify NSLP eligibility for at least 95 percent of SNAP participants. With the mandated use of direct certification, several states stopped reporting FRPL eligibility entirely. 
  • HHFKA introduced the Community Eligibility Provision (CEP) to expand access to free meals to all students in low-income areas. Because every student in a CEP school is provided a free lunch, these schools no longer count students who qualify for reduced-price lunch and may report all students as eligible for free lunch, regardless of economic status.

Guidance for Data Users

The NSLP eligibility data published through CCD are often used by researchers as a proxy measure for the number of students living in poverty. However, these data have limitations that researchers should consider before using them.

The NSLP data published through CCD have changed over time. CCD published only FRPL counts through SY 2015–16. Starting in SY 2016–17, states can report FRPL eligibility counts, direct certification counts, or both for each school, and CCD publishes these counts as reported by the states.[1]

When creating state and national estimates (including tables in the Digest of Education Statistics), NCES uses FRPL counts when they are available. If FRPL data are not available, direct certification data are used as a proxy. For this type of analysis, NCES includes all schools for which student enrollment data and either FRPL or direct certification counts were reported. States that reported only direct certification are footnoted. NCES recommends that data users be mindful of these reporting differences when analyzing or drawing conclusions from these data.
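The sketch below illustrates one way a data user might apply that FRPL-first, direct-certification-fallback rule to a school-level file. It is not NCES production code; the column names (enrollment, frpl_count, direct_cert_count) are hypothetical stand-ins for whatever fields appear in the actual CCD files.

```python
# Illustrative sketch of the FRPL-first, direct-certification-fallback rule;
# column names are hypothetical and do not match the actual CCD file layout.
import pandas as pd


def nslp_proxy_counts(schools: pd.DataFrame) -> pd.DataFrame:
    """Keep schools that report enrollment and at least one NSLP measure,
    then use the FRPL count when available and the direct certification
    count otherwise, flagging schools where the fallback was used."""
    kept = schools.dropna(subset=["enrollment"])
    kept = kept[kept["frpl_count"].notna() | kept["direct_cert_count"].notna()]
    return kept.assign(
        nslp_count=kept["frpl_count"].fillna(kept["direct_cert_count"]),
        used_direct_cert=kept["frpl_count"].isna(),  # footnote these schools
    )


# Example with made-up data:
schools = pd.DataFrame({
    "school_id": [1, 2, 3],
    "enrollment": [500, 300, 450],
    "frpl_count": [200, None, 150],
    "direct_cert_count": [180, 120, None],
})
print(nslp_proxy_counts(schools)[["school_id", "nslp_count", "used_direct_cert"]])
```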

The NSLP data meet a variety of critical analysis needs to help policy makers, researchers, and the public target resources and answer policy questions. CCD is the only source of nationwide school-level NSLP data. Explore NSLP data as well as all of the other CCD data elements available either by using the CCD data query tool or by downloading data files directly.

 

By Beth Sinclair, AEM, and Chen-Su Chen, NCES

 


[1] In SY 2018–19, states reported FRPL counts for 95 percent of schools. Five states/jurisdictions reported solely the number of direct certification students (Delaware, the District of Columbia, Massachusetts, Tennessee, and American Samoa). The remaining states/jurisdictions were split: about half reported solely the number of FRPL students for each school and the other half reported both FRPL and direct certification for each school (or FRPL for some schools and direct certification for others).

NASA Launches a Rocket to Mars with a Rover (Perseverance) and Helicopter (Ingenuity) On Board to Explore

Editor’s Note: This Inside IES Blog is cross-posted on Homeroom, the official blog of the U.S. Department of Education.

 

On July 30, 2020, NASA launched a rocket from Cape Canaveral Air Force Station in Florida on a journey to Mars. The rocket is carrying a rover named Perseverance and a helicopter named Ingenuity, both of which will land inside Mars's Jezero Crater on February 18, 2021. While on Mars, Perseverance and Ingenuity will collect the first Martian soil and rock samples for future return to Earth, search for signs of extinct or extant life, characterize the planet’s climate and geology, and pave the way for human exploration of the Red Planet with the help of new technologies and scientific instruments.

Perseverance and Ingenuity were named by students through a national Kindergarten to Grade 12 student competition run by NASA in partnership with Future Engineers and Battelle Education.

The student whose entry won the prize to name the rover is Alexander Mather, a seventh grader from Lake Braddock Secondary School in Burke, Virginia. Alexander submitted the name Perseverance and included the following in his essay:

“Curiosity. Insight. Spirit. Opportunity. If you think about it, all of these names of past Mars rovers are qualities we possess as humans. We are always curious, and seek opportunity. We have the spirit and insight to explore the Moon, Mars, and beyond. But, if rovers are to be the qualities of us as a race, we missed the most important thing. Perseverance.”

 

NASA revealed the winning rover name during a program on March 5.

 

The student whose entry won the prize to name the helicopter is Vaneeza Rupani, a junior at Tuscaloosa County High School in Northport, Alabama. Vaneeza submitted the name Ingenuity and included the following in her essay:

"The ingenuity and brilliance of people working hard to overcome the challenges of interplanetary travel are what allow us all to experience the wonders of space exploration. Ingenuity is what allows people to accomplish amazing things, and it allows us to expand our horizons to the edges of the universe."

 

NASA also released a video trailer featuring the naming of the Mars helicopter.

 

About the “Name the Rover” Contest

Not only did the contest help NASA pick a new name for the rover, it also engaged U.S. students in the engineering and scientific work that makes Mars exploration possible, stimulated interest in science, technology, engineering, and mathematics (STEM), and inspired the next generation of STEM leaders.

After NASA launched the competition in August 2019, students from 50 states, U.S. territories, and military bases submitted over 28,000 essays. More than 4,500 volunteer judges narrowed the pool to 155 semifinalists. From these, NASA chose nine finalists—Clarity, Courage, Endurance, Fortitude, Ingenuity, Perseverance, Promise, Tenacity, and Vision—and opened a public poll in which anyone could vote. After considering these poll results, NASA officials chose the two names.

To manage the competition, NASA used a web-based platform developed by Burbank, California-based Future Engineers.  This platform was created with the support of a 2017 award from the U.S. Department of Education and Institute of Education Sciences’ Small Business Innovation Research program (ED/IES SBIR).  Future Engineers built this platform to be an online hub for classrooms and educators to access free, project-based STEM activities and to provide a portal where students submit and compete in different kinds of maker and innovation challenges across the country. The Mars 2020 “Name the Rover” contest was the first naming challenge issued on the platform. We look forward to more student challenges to come!


Edward Metz (Edward.Metz@ed.gov) is a research scientist at the Institute of Education Sciences in the U.S. Department of Education.

Bob Collom is an integration lead in the Mars Exploration Program at NASA Headquarters.


About ED/IES SBIR

The U.S. Department of Education’s Small Business Innovation Research program, administered by the Institute of Education Sciences (IES), funds projects to develop education technology products designed to support students, teachers, or administrators in general or special education. The program emphasizes rigorous and relevant research to inform iterative development and to evaluate whether fully developed products show promise for leading to the intended outcomes. The program also focuses on commercialization once the award period ends so that products can reach students and teachers and be sustained over time. ED/IES SBIR-supported products are currently used by millions of students in thousands of schools around the country.

About NASA’s Mars Exploration Program (MEP)

NASA’s Mars Exploration Program (MEP) in the Planetary Science Division is a science-driven program that seeks to understand whether Mars was, is, or can be, a habitable world. To find out, we need to understand how geologic, climatic, and other processes have worked to shape Mars and its environment over time, as well as how they interact today. To that end, all of our future missions will be driven by rigorous scientific questions that will continuously evolve as we make new discoveries. MEP continues to explore Mars and to provide a continuous flow of scientific information and discovery through a carefully selected series of robotic orbiters, landers and mobile laboratories interconnected by a high-bandwidth Mars/Earth communications network. The Mars 2020 Project at NASA’s Jet Propulsion Laboratory manages rover development for NASA’s Science Mission Directorate. NASA’s Launch Services Program, based at the agency’s Kennedy Space Center in Florida, is responsible for launch management.

 

Teachley’s Game Apps for Mathematics: From Research to Practice at Scale in Education

With a 2010 IES research grant, researchers at Teachers College, Columbia University conducted basic research and created prototype software programs for children in mathematics. In 2011, three members of the research team launched a startup and submitted a successful proposal to IES’s Small Business Innovation Research program. With awards in 2012 and 2013, the developers created a suite of math game apps that support fact fluency and promote math strategy development. The apps all connect with a teacher dashboard that provides in-depth reports in real time and supports differentiation in math instruction. In 2014, Teachley’s Addimal Adventure won an Apple Design Award as one of the 12 best apps of the year. Since their commercial launch in 2014, Teachley apps have been downloaded 1.5 million times, and the Teachley suite of products is currently used in 2,000 schools across all 50 states.

Interview with Kara Carpenter, co-founder of Teachley

 

 

You and your two co-founders were all classroom teachers before meeting at Teachers College as graduate students in 2010. What led to your decision to go to graduate school to earn PhDs as researchers?

While teaching 2nd grade, I had the opportunity to receive professional development focused on elementary math content, and I became fascinated with how children develop their mathematical thinking. Years later, when I was getting a master’s in curriculum & teaching at Teachers College, I pursued a work study opportunity with Professor Herb Ginsburg, who focuses on early childhood math thinking. At the time in 2009, my cofounder Rachael Labrecque was already working with Professor Ginsburg, and the three of us submitted an application to IES to develop math software for young learners. That fall, I went back to classroom teaching, but when the application was funded in 2010, I decided to take the leap and accept a research fellowship to pursue a PhD. My other co-founder, Dana Pagar, joined our research team that fall, and the three of us decided to start Teachley in 2012 to bring all the great research on how kids learn math into marketable products.

 

Tell us about the research projects that you were involved with in graduate school.

We worked on a project developing math software for grades pre-K to 3, called MathemAntics. We developed dozens of activities and conducted small learning studies along the way. In the third year, we conducted a randomized controlled trial (RCT) with approximately 400 students in grades pre-K to 2. Each of our dissertations involved different elements of the project. Mine focused on teaching and detecting kids’ single-digit addition strategies. Dana’s focused on continuous versus discrete blocks, while Rachael studied teachers’ preparedness to integrate technology into their classrooms.

 

How did you come up with the idea to develop apps that would be used in schools on a wide scale?

Originally, we were looking for a company that might want to take these research findings and turn them into commercial products. We were meeting with various business leaders, and one of them turned to us and said, “You should do this. You should start a company to bring your ideas to market.” That’s the push we needed to think of ourselves as potential startup founders.

 

How did you find out about the SBIR program at the US Department of Education’s Institute of Education Sciences? How important was the first SBIR award for launching Teachley?

Once we decided to start Teachley, we knew that SBIR would be a great resource for us. The MathemAntics project had actually started out as an NIH SBIR Phase I with a different company. That first ED/IES SBIR award is the reason that Teachley became a company. Without that funding, we would not have been able to prove ourselves capable of bringing a product to market. Institutional investors aren’t taking those kinds of risks, and angel investment is too tied into social networks and who you know.

 

Was Teachers College supportive of its graduate students starting a small business and getting an award to develop apps? Did anyone at the university offer advice or guidance on how to operate a small business?

Leaving the university was tricky because we had research fellowships when we started the company. However, the Teachers College president at the time, Susan Fuhrman, and the provost, Tom James, were supportive of our startup. We speak and participate in various discussions and events at Teachers College, which keeps us connected to the university and the research.

 

How does Teachley ensure that research is integrated into your development and validation process?

Before developing any new product idea, we look to the research to see what’s already been learned about the topic, especially as it relates to struggling learners. During the early stages of development, we rely on close observations of students as they use pencil/paper mockups and early software builds. As a team, we closely review videos of students working through problems, looking to find better, more intuitive ways to support students’ thinking. Once we have a functional prototype, we use more formal evaluative techniques to determine our impact on student learning.

 

What models have you used to commercialize Teachley on a widespread basis?

We have tried out many different revenue models. Initially, we tried publishing the games for free and charging schools for the formative assessment data. However, we soon found that bundling the games and data together into a single subscription worked better for schools. With our latest game, Market Bay, we are trying a new model where educators create a free account, and parents subscribe to have access at home. Schools that subscribe to Teachley get home access to Market Bay and our other games for all of their families.

 

Have you raised funds from venture capitalists? Why or why not?

Not yet. Raising money from venture capitalists can put you on a succeed-or-fail-fast treadmill that isn’t always a great fit for the education market. Many investors are looking for a 70x return within just a few years or they abandon ship. Developing great educational software takes time for both the iterative design process and the research to prove your effectiveness. We are just now at the stage where raising venture capital may soon make sense because we have enough content to scale our school/district sales.

 

When COVID-19 emerged and schools closed, you made your apps freely available to teachers and their students, and 15,000 teachers and students were able to access your products. What was that experience like?

Teachers are looking for digital products that will deeply engage students and support true learning. We’re a great fit. However, schools across the country are suffering budget shortfalls at the same time as they need to spend more to ensure they meet safety standards. We’re working with schools and teachers to find alternative ways to fund our program, from parent organizations to Donors Choose to corporate partnerships.

 

None of you had had formal business training prior to founding Teachley. Do you have advice for those who are interested in starting an entrepreneurial small business to develop education technology that can be used in schools?

My advice would be to know your users and implementation deeply. If you don’t have a background in teaching, spend time volunteering in schools. Become a close observer of children and their thinking, so you can create products that support and bring out children’s genius.

 

 ____________________________________________________________________________

Kara Carpenter is cofounder of Teachley (@teachley), an edtech startup focused on promoting deep math thinking and learning. Kara has over 10 years of teaching experience, was a National Board Certified Teacher, and holds a PhD in Cognition and Learning from Teachers College, Columbia University. Her dissertation research went on to become the Apple Design Award-winning app Addimal Adventure.

This interview was produced by Ed Metz (Edward.Metz@ed.gov) of the Institute of Education Sciences. This post is the sixth in an ongoing series of blog posts examining moving from university research to practice at scale in education.

 

 

Why Do Parents Choose Schools for Their Children?

Have you ever wondered why parents choose a specific school for their child? New data from the Parent and Family Involvement (PFI) Survey of the National Household Education Surveys (NHES) program allow us to identify the factors that parents of K–12 students rate as “very important” when choosing a school. In the 2018–19 school year, 36 percent of students had parents who indicated that they had considered multiple schools for their child. Among these students, 79 percent had parents who indicated that the quality of teachers, principals, or other school staff was very important (figure 1). Other factors that a majority of students’ parents indicated as being very important include safety (including student discipline) (71 percent) and curriculum focus or unique academic programs (e.g., language immersion, STEM focus) (59 percent).


Figure 1. Among K–12 students whose parents considered multiple schools, percentage whose parents indicated that selected factors were “very important” when choosing child’s school, by school type: 2018–19

SOURCE: Hanson, R., and Pugliese, C. (2020). Parent and Family Involvement in Education: 2019 (NCES 2020-076). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


Although parents of students attending different types of schools (i.e., public assigned schools, public chosen schools, private religious schools, or private nonreligious schools) rated most factors for choosing a school similarly, some differences were observed. For example, higher percentages of students in private nonreligious schools than of students in all other kinds of schools had parents who indicated that the following factors were very important when choosing a school:

  • Quality of teachers, principals, or other school staff (92 percent) (figure 1)
  • Curriculum focus or unique academic programs (74 percent) (figure 1)
  • Number of students in class (58 percent) (figure 2)

In addition, a higher percentage of students in private nonreligious schools (42 percent) than of students in public schools (30 percent for public assigned schools and 31 percent for public chosen schools) had parents who indicated that student body characteristics were very important when choosing a school (figure 2). Conversely, a lower percentage of students in private nonreligious schools (14 percent) than of students in any other school type (ranging from 22 to 29 percent) had parents who rated cost as very important.


Figure 2. Among K–12 students whose parents considered multiple schools, percentage whose parents indicated that selected factors were “very important” when choosing child’s school, by school type: 2018–19

SOURCE: Hanson, R., and Pugliese, C. (2020). Parent and Family Involvement in Education: 2019 (NCES 2020-076). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


Thirty percent of students in public assigned schools had parents who reported that they had considered other schools for their child. What did parents of students in public assigned schools value more than other parents (figure 3)?

  • Extracurricular options (including before- and after-school programs): 31 percent of parents of students in public assigned schools indicated that this factor was very important, compared with 25 percent in public chosen schools and 24 percent in private religious schools.
  • Special facilities (e.g., gymnasium, planetarium, library): 26 percent of parents of students in public assigned schools indicated that this factor was very important, compared with 20 percent in public chosen schools and 15 percent in private religious schools.
  • Quality or availability of special education (including services for students with disabilities): 25 percent of parents of students in public assigned schools indicated that this factor was very important, compared with 13 percent in private religious schools and 17 percent in private nonreligious schools.

Figure 3. Among K–12 students whose parents considered multiple schools, percentage whose parents indicated that selected factors were “very important” when choosing child’s school, by school type: 2018–19

SOURCE: Hanson, R., and Pugliese, C. (2020). Parent and Family Involvement in Education: 2019 (NCES 2020-076). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


On the other hand, a lower percentage of students in public assigned schools had parents who indicated that the quality of teachers, principals, or other school staff was very important (77 percent) than did students in any other type of school (82 percent of students in public chosen schools, 84 percent of students in private religious schools, and 92 percent of students in private nonreligious schools) (figure 1).

Only 38 percent of students in private religious schools had parents who indicated that the religious orientation of the school was very important when choosing a school (figure 4). Likewise, only a quarter of students overall had parents who indicated that convenience of location was very important when choosing a school.


Figure 4. Among K–12 students whose parents considered multiple schools, percentage whose parents indicated that selected factors were “very important” when choosing child’s school, by school type: 2018–19

SOURCE: Hanson, R., and Pugliese, C. (2020). Parent and Family Involvement in Education: 2019 (NCES 2020-076). U.S. Department of Education. Washington, DC: National Center for Education Statistics.


More details about the characteristics and factors that play a role in school choice, as well as additional statistics on family involvement in schools, are available in the recent NCES release Parent and Family Involvement in Education: 2019.

 

By Sarah Grady, NCES