IES Blog

Institute of Education Sciences

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released the rescaled data files for three adult literacy assessments conducted several decades earlier: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Like the NALS file, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of adult skills in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that updated proficiency estimates were derived using the same statistical models used to create the PIAAC skills proficiencies, which puts the earlier results on the PIAAC scale. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
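For readers who want intuition for what putting two assessments on a common scale involves, here is a minimal sketch of mean/sigma linear linking through common items, written in Python. It is a deliberate simplification for illustration only: the actual NCES rescaling re-estimated proficiencies with the full PIAAC item response theory (IRT) models and plausible values, and every number below is hypothetical.

```python
# A toy illustration of linking two score scales through common items.
# This is NOT the NCES procedure: the real rescaling re-estimated
# proficiencies with the same IRT models used for PIAAC. All numbers
# below are hypothetical.
import statistics

# Hypothetical difficulty estimates for the SAME test questions,
# calibrated separately on an old scale (e.g., NALS) and the new
# (PIAAC) scale.
old_scale = [210.0, 245.0, 270.0, 301.0, 333.0]
new_scale = [215.0, 248.0, 275.0, 305.0, 340.0]

# Mean/sigma linking constants for the transformation new = A * old + B.
A = statistics.stdev(new_scale) / statistics.stdev(old_scale)
B = statistics.mean(new_scale) - A * statistics.mean(old_scale)

def to_new_scale(old_score: float) -> float:
    """Place a score from the old scale onto the new scale."""
    return A * old_score + B

print(f"A = {A:.3f}, B = {B:.2f}")
print(f"Old-scale score 269 maps to {to_new_scale(269):.1f} on the new scale")
```

Because the two assessments share enough common items, the relationship between their scales can be estimated and applied to every score; the real rescaling achieves the same goal with far richer statistical models.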

What Can These Different Files Be Used For?

While mixing the national and international trend lines is not recommended, each set of files has distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education), because the larger NALS sample allows for more detailed breakdowns when used along with the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used, together with the U.S. PIAAC international trend point in 2012/2014, for international trend analyses across six countries: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The average literacy score of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from either the 1992 or the 2012/2014 score.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.


International trend estimates

The average literacy score of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.
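For readers wondering how statements like “not significantly different” are reached, the sketch below shows the standard z-test for the difference between two independent survey estimates, in Python. The point estimates (273 for IALS 1994 and 272 for PIAAC 2012/2014) come from the text above, but the standard errors are hypothetical placeholders: real trend analyses must compute standard errors from the plausible values and replicate weights supplied with the data files.

```python
# A minimal sketch of testing whether two assessment averages differ.
# The standard errors below are HYPOTHETICAL; real analyses derive them
# from plausible values and replicate weights, which this sketch skips.
import math
from statistics import NormalDist

def difference_test(est1: float, se1: float,
                    est2: float, se2: float) -> tuple[float, float]:
    """Return the z statistic and two-sided p-value for the difference
    between two independent estimates."""
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)  # SE of the difference
    z = (est1 - est2) / se_diff
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# IALS 1994 (273) vs. PIAAC 2012/2014 (272); SEs of 1.1 and 1.0 are made up.
z, p = difference_test(273, 1.1, 272, 1.0)
print(f"z = {z:.2f}, p = {p:.3f}")  # p > .05, so not significantly different
```

With the placeholder standard errors, a one-point gap is well within sampling error, which mirrors the pattern described in the trend results above.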


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).


By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.

Investing in Next Generation Technologies for Education and Special Education

The Department of Education’s (ED) Small Business Innovation Research (SBIR) program, administered by the Institute of Education Sciences (IES), funds entrepreneurial developers to create the next generation of technology products for students, teachers, and administrators in education and special education. The program, known as ED/IES SBIR, emphasizes an iterative design and development process and pilot research to test the feasibility, usability, and promise of new products to improve outcomes. The program also focuses on planning for commercialization so that the products can reach schools and end-users and be sustained over time.

In recent years, millions of students in tens of thousands of schools around the country have used technologies developed through ED/IES SBIR, including more than a million students and teachers who used products for remote teaching and learning during the COVID-19 pandemic.

ED/IES SBIR Announces 2022 Awards

IES has made ten 2022 Phase I awards of $250,000 each.* During these 8-month projects, teams will develop and refine prototypes of new products and test their usability and initial feasibility. All awardees who complete a Phase I project will be eligible to apply for a Phase II award in 2023.

IES has made nine 2022 Phase II awards, which support further research and development of education technology prototypes created under 2021 ED/IES SBIR Phase I awards. In these Phase II projects, teams will complete product development and conduct pilot studies in schools to demonstrate the usability, feasibility, fidelity of implementation, and promise of the products to improve the intended outcomes.

IES also made one Direct to Phase II award to support the research, development, and evaluation of a new education technology product that readies an existing researcher-developed, evidence-based intervention for use at scale, along with planning for commercialization. The Direct to Phase II award is made without a prior Phase I award. All Phase II and Direct to Phase II awards are for $1,000,000 over two years. Across all awards, projects address a range of student ages and content areas.

The list of all 2022 awards is posted here. This page will be updated with the two additional Phase I awards after the contracts are finalized.


The 2022 ED/IES SBIR awards highlight three trends that continue to emerge in the field of education technology.

Trend 1: Projects Are Employing Advanced Technologies to Personalize Learning and Generate Insights to Inform Tailored Instruction

About two-thirds of the new projects are developing software components that personalize teaching and learning, whether through artificial intelligence, machine learning, natural language processing, automated speech recognition, or algorithms. All of these projects will include functionality afforded by modern technology to personalize learning: adjusting content to the level of the individual learner, offering feedback and prompts to scaffold learning as students progress through the systems, and generating real-time, actionable information that helps educators track student progress and adjust instruction accordingly. For example:

  • Charmtech Labs and Literably are fully developing reading assessments that provide feedback to inform instruction.
  • Sirius Thinking and studio:Sckaal are developing prototypes to formatively assess early grade school students in reading.
  • Sown To Grow and xSEL Labs are fully developing platforms to facilitate student social and emotional assessments and provide insights to educators.
  • Future Engineers is fully developing a platform for judges to provide feedback to students who enter STEM and educational challenges and contests.
  • Querium and 2Sigma School are developing prototypes to support math and computer science learning, respectively.
  • Soterix is fully developing a smart walking cane and app for children with visual impairments to learn to navigate.
  • Alchemie is fully developing a product to provide audio cues to blind or visually impaired students learning science.
  • Star Autism Support is developing a prototype to support practitioners and parents of children with autism spectrum disorder.

Trend 2: Projects Focusing on Experiential and Hands-On Learning

Several new projects are combining hardware and software solutions to engage students through pedagogies employing game-based, hands-on, collaborative, or immersive learning:

  • Pocketlab is fully developing a matchbox-sized car with a sensor to collect physical science data as middle school students play.
  • GaiaXus is developing a prototype sensor used for environmental science field experiments.
  • Mind Trust is developing a virtual reality escape room for biology learning.
  • Smart Girls is developing a prototype science game and accompanying real-world hands-on physical activity kits.
  • Indelible Learning is developing a prototype online multi-player game about the Electoral College.
  • Edify is fully developing a school-based program for students to learn about, create, and play music.

Trend 3: Projects to Advance Research to Practice at Scale

Several new awards will advance existing education research-based practices into new technology products that are ready to be delivered at scale:

  • INSIGHTS is fully developing a technology-delivered version of an NIH- and IES-supported social and emotional intervention to ready it for use at scale.
  • xSEL Labs and Charmtech Labs (noted above) are building on prior IES-funded research-based interventions to create scalable products.
  • Scrible is developing an online writing platform in partnership with the National Writing Project, based on prior Department of Education-funded research.



*Note: Two additional 2022 Phase I awards are forthcoming in 2022. The contracts for these awards are delayed due to a backlog in the SAM registration process.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Edward Metz (Edward.Metz@ed.gov) is the Program Manager of the ED/IES SBIR program.

Michael Leonard (Michael.Leonard@ed.gov) is the Program Analyst of the ED/IES SBIR program.


Improving Academic Achievement through Instruction in Self-Regulated Strategy Development: The Science Behind the Practice

Self-Regulated Strategy Development (SRSD) is an evidence-based instructional approach characterized by active, discussion-based, scaffolded, and explicit learning of knowledge of the writing process; general and genre-specific knowledge; academic vocabulary; and validated strategies for teaching reading and writing. IES has supported multiple research studies on SRSD for students with learning disabilities in K-12 and postsecondary general education settings. SRSD is used in as many as 10,000 classrooms across the United States and in 12 other countries. In this interview blog, we spoke with Dr. Karen Harris, the developer of SRSD, to learn more about this effective instructional strategy, the IES research behind it, and next steps for further scaling of SRSD so that more students can benefit.

What led you to develop the Self-Regulated Strategy Development model?

Photo of Karen Harris

I began developing what became the SRSD model of instruction in the 1980s, based on my experiences tutoring and teaching. No one theory could address all of what I needed to do as a teacher, or all that my students needed as learners. SRSD instruction pulls together what has been learned from research across theories of learning and teaching. It is a multicomponent instructional model that addresses affective, behavioral, and cognitive aspects of learning. Further, SRSD instruction is intended to take place in inclusive classrooms, is discourse-driven, integrates social-emotional supports, and involves learning in whole class and group settings with peer collaboration. SRSD research started in writing because Steve Graham (my husband and colleague) was deeply interested in writing, and we co-designed the initial studies. Today, SRSD instruction research exists across a wide variety of areas, such as reading comprehension, mathematical problem solving, fractions, social studies, and science.

What are some of the key findings about this instructional strategy?

SRSD has been recognized by the What Works Clearinghouse (WWC) as an evidence-based practice with consistently positive effects on writing outcomes. A 2013 meta-analysis of SRSD for writing found that SRSD was effective across different research teams, different methodologies, differing genres of writing (such as narrative or persuasive), and students with diverse needs, including students with learning disabilities and emotional and behavioral disorders. Effect sizes in SRSD research are typically large, exceeding .85 in meta-analyses and commonly ranging from 1.0 to 2.55 across writing and affective outcome measures.

Over the years, IES has supported a number of studies on SRSD, which has led to some key findings that have practical implications for instruction from elementary school through college.

Do you know how many teachers use SRSD in their classrooms?

It is hard to be sure how prevalent SRSD instruction is in practice, but there are two groups dedicated to scaling up SRSD in schools—thinkSRSD and SRSD Online—both of which I voluntarily advise. Together, they have reached over 300,000 students and their teachers in the United States. In addition, I am following or in touch with researchers or teachers in 12 countries across Europe, North America, Australia, Africa, Asia, and the Middle East.

What’s next for research on SRSD?  

Many students have difficulty writing by the time they get to upper elementary school. Currently, there is an ongoing development project that is adapting and testing SRSD for children in the lower elementary grades to support their oral language skills, transcription, and writing strategy skills. The research team is in the process of conducting a small-scale randomized controlled study and will have findings soon.

Beyond this study, there are many future directions for SRSD research, including further work in different genres of writing, different grades, and involving families in the SRSD process. More work is also needed on how to integrate SRSD strategies into instruction across content areas, such as social studies or science. Despite the evidence base for and interest in SRSD, a major challenge is scaling it up in schools. We and other researchers have identified numerous barriers to this goal. We also need research on working with administrators, schools, and teachers to use writing more effectively as a tool for self-expression, self-advocacy, and social and political engagement. Writing can also be an important and effective means of addressing issues of equity and identity, and little SRSD research has been done in these areas.

Dr. Karen Harris is Regents Professor and the Mary Emily Warner Professor at Arizona State University’s Mary Lou Fulton Teachers College. Her current research focuses on refining a web-based intelligent tutor to augment SRSD instruction with elementary students in persuasive writing, integrating SRSD with reading to learn and writing to inform, developing a Universal Design for Learning Science Notebook, and developing practice-based professional development for SRSD.

This blog was produced by Julianne Kasper, Virtual Student Federal Service intern at IES and graduate student in education policy & leadership at American University.

Access an NCES Presentation on ECLS Reading Data From the IES Reading Summit

NCES staff presented information on reading data from the Early Childhood Longitudinal Studies (ECLS) Program at the June 2021 Institute of Education Sciences (IES)/Council of the Great City Schools (CGCS) Reading Summit. The ECLS data cover a wide range of reading-related topics, such as children’s reading knowledge and skills, home literacy activities, and teachers’ instructional practices. The presentation included a brief overview of three ECLS program studies and the reading-related data collected by each. In addition, the presentation included a discussion of the resources available to either see what research has been conducted with the data or explore the data independently. As the focus of the presentation was on data available to the public for secondary analysis, its target audience was researchers and others with a data science focus.

Access the Reading Summit presentation—Reading Data Available from the Early Childhood Longitudinal Studies (ECLS)—and handout below to learn more about ECLS reading data.

Be sure to also check out this blog post to learn more about the work highlighted at the IES Reading Summit.


By Jill Carlivati McCarroll, NCES

Culturally Responsive Language and Literacy Enrichment for Native American Children

As part of our recognition of Native American Heritage Month, we asked Diane Loeb to discuss her IES-funded research on culturally responsive language and literacy enrichment for Native American children.

Development of language and exposure to early literacy are critical to a child’s academic success. Speaking and listening skills are necessary to navigate learning at every level of school. According to NCES, American Indian/Alaska Native populations have the highest percentage of students who receive services under the Individuals with Disabilities Education Act. There continues to be a significant need for Native American speech-language pathologists and audiologists, culturally sensitive assessment tools, and intervention approaches.

In 2006, I had the privilege to work with ten Native American college students who were recruited to the University of Kansas for the speech-language pathology and audiology master’s program. The students were from tribes across the country and varied greatly in their undergraduate preparation and world experiences. One thing that they had in common was that they wanted to make a difference in the lives of others—in particular, those who needed help with their speech, language, hearing skills, and related difficulties. As a result of working with these amazing students, I learned about their families, their customs, and their dreams. I also became painfully aware of the historical trauma Native Americans experience as a result of genocide, colonialism, and racism. In the twentieth century, Native Americans were sent to boarding schools and deprived of their language, culture, and families.

As the students advanced in their academic studies and clinical work, it became clear to me that there were very few resources for identifying and intervening with language delay and language disorder in this population. Under- and over-identification for special education services were both real possibilities, given our limited understanding of Native American history, levels of family assimilation, and inter-tribal differences. Although there were a handful of articles related to conducting assessments, very few studies addressed culturally sensitive and responsive intervention, in which children’s cultural values and beliefs, experiences, and ways of learning guide the assessment and intervention. The lack of culturally responsive tools for Native Americans propelled me to write an IES grant proposal for culturally authentic intervention designed to be meaningful, sensitive, and respectful of Native American culture.

As a result of the IES grant we received, we developed a culturally based language and vocabulary intervention for Native American kindergarten children at risk for speech and language impairment, as well as a training program for teachers and speech-language pathologists. Language and literacy lessons were based on positive stories about Native Americans in storybooks, and storytelling was taught through shared reading. Native American adults from the school we were working with examined our materials to ensure that our activities were in line with the values and beliefs of the participating children. Pilot testing suggested that students made gains in literacy and language skills following intervention.

My colleague, Grace McConnell, and I recently published an in-depth analysis of the narratives produced by the children in our initial studies. We found distinct trends in narrative structure and evaluative comments depending on student age and whether there were visual supports. What we found highlights the importance of culturally responsive language and literacy interventions for Native American children. There remains a great need for these interventions. From my work, I have learned several important lessons that may be useful to current and future researchers. The three most salient to me are

  • Include members of the tribe with whom you are working as part of the process of developing assessments and interventions for children who are Native American. This helps to ensure that your assessments and interventions are culturally sensitive.
  • Develop authentic materials that are culturally relevant, sensitive, and meaningful. We found several books with positive cultural lessons, such as respecting the earth, working together, and harmony with others and nature.
  • Remember that tribes can differ substantially from one another and that families may differ regarding cultural values and beliefs within a given tribe. When we designed literacy and language units around Native American storybooks, they often were related to specific tribes (such as Navajo or Apache). This gave us the opportunity to discuss different tribes in various parts of the country and for the children to learn about and compare their own customs and beliefs with another tribe. Students also learned about different family practices within their own tribe by sharing their family experiences with other children.

Following my work with Native American students and children, I pursued grant and research opportunities focused on the development of children born preterm of all races/ethnicities. I am working with neonatologists and nurses on studies to improve the developmental outcomes of children born preterm. Approximately 25% of children born preterm are later diagnosed with language delay or language disorder. I am currently designing NICU interventions to facilitate language, cognitive, motor, and social interaction skills that support academic success. A future goal is to focus my intervention work with Native American infants born preterm and their families. Providing facilitation of language and literacy early in development for these at-risk infants may be key for their later academic success.

Diane Loeb (Diane_Loeb@Baylor.edu) is the Martin Family Endowed Chair of Communication Sciences and Disorders and Department Chair at the Robbins College of Health and Human Sciences at Baylor University in Waco, Texas. She is a first-generation college graduate. This research was conducted while she was an Associate Professor at the University of Kansas in Lawrence, KS.

This guest blog was produced by Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council, and Amy Sussman (Amy.Sussman@ed.gov), NCSER Program Officer.