IES Blog

Institute of Education Sciences

Adult Ed Grantee Spotlight: Aydin Durgunoglu and Research for Adult English Learners

As part of the IES 20th Anniversary, NCER is reflecting on the past, present, and future of adult education research. In this blog, Dr. Aydin Durgunoglu, Distinguished Global Professor Emeritus at the University of Minnesota-Duluth, reflects on how her life and training have influenced her work. Dr. Durgunoglu is the principal investigator on Content-Integrated Language Instruction for Adults with Technology Support, one of the six research projects that comprise the CREATE Adult Skills Research Network. As part of this network, Dr. Durgunoglu and her team are focusing on the needs of adult English learners and on U.S. history and civics education, such as what might be taught as part of Integrated English Literacy and Civics Education programs. Hers is the first grant NCER has funded that focuses on this area for this population.

Please describe your IES project.

My colleagues and I are developing a curriculum called CILIA-T (Content-Integrated Language Instruction for Adults with Technology Support). We are embedding English instruction into U.S. History and Civics content and providing technology supports for both students and teachers as part of the curriculum. Our goal is to provide a complete and integrated resource that can be used by teachers with varying levels of experience in English as a Second Language (ESL), civics/U.S. history and citizenship classes.

What motivates you to do this work?

Two of my motivations are my background as an English learner and immigrant and my training as a cognitive psychologist.

I started learning English when I was 12 in an immersion-based approach. I recall some of the struggles I had, such as not realizing that “you may sit down” was a full sentence in English because, in Turkish, the single word oturun conveys the same meaning. This realization, along with both my Turkish and American experiences, helped me see the importance of culture, language, and instruction.

As a cognitive psychologist by training, I am interested in learning, memory, knowledge acquisition, and—most of all—language. One of my research areas has been how literacy develops across different languages—how it may progress differently in Spanish, Turkish, English, Hmong, etc. and whether it involves general cognitive processes that are language independent.

These experiences and interests have long influenced my work. For example, my colleagues and I collaborated on literacy projects for the Mother Child Education Foundation (MOCEF), based in Turkey, that have evolved and currently include a focus on women’s citizenship and empowerment. These projects were grounded in my theoretical work on literacy development in Turkish. In the United States, I have conducted studies with adults and children on how what they know in their home languages can help them learn English (cross-language transfer). All of these experiences led me to our work as the CILIA-T team.

How are you leveraging your experiences to build CILIA-T?

In addition to my theoretical and applied experiences in adult education, this project is benefiting from the contributions of a group of dedicated adult educators. These colleagues are teaching ESL, citizenship, history, and civics classes. We are collaborating on writing a curriculum that teachers like themselves would like to use. Based on our experiences and findings from the field, we have identified the components that we feel are key for CILIA-T. Three of the main components are

  1. Multimodal input: Contrary to how I started learning English, providing linguistic input in several different modalities is helpful. Technology provides many opportunities to realize this goal. Learners can interact with and produce content in many forms. For example, they can create and share academic vocabulary sets and review them like a game. Technology can also facilitate deep conceptual understanding of academic topics. For example, learners can share and discuss not only texts but also audio and video for a deeper analysis and application of civics and history topics.
  2. Build on first languages (L1): Adults already have a well-developed language system (or systems, if they know multiple languages). They use clues from their L1 to understand how English operates. Therefore, we can provide opportunities to bring that existing linguistic knowledge to the forefront and to compare and contrast the languages explicitly. One clear way to leverage L1 is to integrate oral language and help bridge what the adults can do orally with what they aim to do in reading and writing.
  3. Academic vocabulary: Individuals with limited or interrupted schooling tend to have lower levels of academic vocabulary in their first language and thus likely lower levels in English. A language learner may be quite fluent in using English in their everyday interactions, but that does not mean they have a strong academic vocabulary across different domains, such as health, math, science, civics, and finance. CILIA-T covers academic and discipline-specific vocabulary in a purposeful way. Academic vocabulary is closely related to the conceptual understanding of a phenomenon. Therefore, just learning word definitions is not enough. The vocabulary has to be contextualized within a conceptual understanding. For example, executive branch does not mean much by itself unless it is situated within an understanding of the overall governmental system. Similarly, the definition of the word mortgage may be forgotten quickly if the learner is not familiar with the loan and repayment system in the United States. Luckily, adults have a lot of background knowledge to facilitate such conceptual scaffolding, but that is for another blog.

What value do you hope CILIA-T might bring to the students, teachers, and communities?

We believe that when all individuals, but especially newcomers, understand the systems, practices, historical contexts, and language(s) of their society, they can become more active participants in their communities and can work toward accomplishing their life goals more effectively. We hope that CILIA-T provides adult learners, and the educators and programs that support them, with a tool to facilitate this growth.


This blog was produced by Dr. Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer at NCER.

Supporting Strategic Writers: The Use of Strategy Instruction and Genre Pedagogy in the Basic Writing Classroom

NCER student volunteer, Rachael Higham, has long been interested in writing instruction. She currently works as a remedial language tutor for high school students with disabilities, and she began her graduate studies with a focus on postsecondary writing instruction. To learn more about the current science and research on writing, Rachael interviewed Dr. Charles MacArthur about his research-based postsecondary writing curriculum, Supporting Strategic Writers (SSW), which he and his team developed and evaluated through IES grants. The goal of SSW is to foster metacognitive self-evaluation through the use of strategic learning and genre-based pedagogy to help improve writing skills and self-confidence.


Take a minute to answer this question: Do you remember how you were taught to write a paper in high school or in college?

Maybe you remember the five-paragraph essay, MLA formatting, or the RACE strategy, but were you ever taught specific strategies for planning and evaluating your papers?

While I was interviewing Dr. MacArthur about his recently completed IES project, he posed a similar question to me. He asked me how I navigated writing in college and if a teacher had ever explicitly taught me how to write. I realized that while I had some explicit teaching in text structure in high school, by the time I reached college, I relied heavily on feedback to inform my future writing. The idea that students learn from revising is a common view in writing education. However, this view does not always consider students who struggle with writing and who may need more explicit instruction, even in college.

As a teacher of high school students with learning disabilities, I often find that by the time many of my students reach my classroom, they feel defeated by the writing process. Writing has become a source of fear and dread for them. My goal with each student is to find and develop strategies that bolster their writing skills and change writing from something that seems unattainable to something that they can do independently. I was excited to talk to Dr. MacArthur and learn more about the research that he and his team have been doing. Below are his responses to the questions I posed.

What are the key components of the SSW curriculum?

The emphasis of SSW is to enable students to take control of their own learning through rhetorical analysis of genre. To do that, students are taught explicit strategies and cognitive procedures based on what good writers do. This is reinforced with metacognitive strategies that help students become aware of why they are using specific writing strategies and procedures and recognize how and when to transfer them to other classes. SSW places emphasis on genre-based strategies not only in the text but also in the planning and evaluation phases.

The heart of strategy instruction in SSW is the “think-aloud,” which is when instructors share, in real time, the thoughts that they are experiencing as they’re writing or editing a text to show how they are figuring things out. Instructors need to show—not just explain—how to write. What we writing instructors are teaching is invisible, so the think-aloud makes the process visible to students. It also lets students see that writing is hard even for their teacher. Teachers can get stuck and need to work through it based on the strategies that are being taught.

What is the number one thing that you would tell a developmental or first-year writing teacher?

Teaching strategies to students on planning and evaluating their work helps improve writing. There have been hundreds of studies from K-12 (see these meta-analyses as examples 1, 2, 3) that show how strategy instruction works to improve writing. This experimental study of SSW adds to that literature and shows that strategic instruction with genre pedagogy can work in the postsecondary developmental writing environment.

What type of future research would you like to see done with the SSW curriculum?

There is a wealth of valuable research that could be done in the future. Future research could delve into how to build on the developmental course’s gains in subsequent courses. For example, it would be interesting to look at the transition between developmental writing courses and first-year composition in terms of pedagogical integration.

Another area of transfer is between composition courses and disciplinary writing in postsecondary settings. For example, how could postsecondary institutions improve writing across the curriculum? How could strategy instruction similar to SSW work in this setting?

Additionally, strategy instruction started in special education, but it was found to be useful throughout the entire K-12 population. Similarly, SSW was found to be successful in developmental writing classrooms. It would be great to see the effects of SSW in first-year composition classes.

You can find publications from this project and the earlier SSW project in ERIC here and here, respectively. The What Works Clearinghouse also reviewed an earlier evaluation of SSW here.


This blog was written by Rachael Higham, a graduate intern through the Virtual Student Federal Service Internships program, and facilitated by Dr. Meredith Larson (Meredith.Larson@ed.gov), a research analyst and program officer at NCER.


From the NCER Commissioner: How IES’ Investment in Literacy Is Changing Education

A cornerstone challenge in education is that too many learners in our nation can’t read well enough to succeed in their future education and employment. In addition, a disproportionate number of individuals with low literacy skills are members of underrepresented groups. Since IES’ founding in 2002, we have devoted millions of dollars to addressing this challenge, seeking to generate high-quality evidence about literacy practices that work for learners across our nation. Today, we can see how this 20-year investment focused on improving literacy has generated interventions and assessments that are transforming practices at scale and meeting the needs of learners and educators by incorporating evidence-based practices into the materials they use daily.

Since IES is an applied research agency, its mission is to provide scientific evidence on which to ground education practice and policy and to share this information in formats that are useful and accessible to educators, parents, policymakers, researchers, and the public. IES and its four centers work together to collect data on the current state of education; identify solutions and innovations through research, grant programs, and competitions; and evaluate the success of investments in order to identify solutions worthy of scaling across the nation’s education system.

The education research community is often accused of generating findings and products that sit in an attic corner unused. We aim to disrupt this perception and make it clear that our grantees’ knowledge and tools are both useful and used. Here I want to share a few examples to showcase how American tax dollars are transforming how millions of learners are learning to read.

IES Technologies and Google Classroom

In April 2022, we were excited to learn that Google had acquired the intellectual property rights for Moby.Read and SkillCheck, education technology products developed through IES programs by California-based Analytic Measures, Inc. (AMI). Google Classroom is advertised as educators’ “all-in-one place for teaching and learning,” and many tools and apps are integrated into the system, including the IES-developed and IES-evaluated product ASSISTments, which provides student feedback assistance and assessment data to teachers.

Moby.Read and SkillCheck are technology solutions created to give teachers a more efficient way to assess their students’ reading skills and to provide students with individualized feedback. These technologies were developed over two decades with IES funding, a process that included prototype development starting in 2002, followed by ED/IES Small Business Innovation Research (SBIR) funding to test Moby.Read in 2016 and 2017 and SkillCheck in 2020 and 2021, with validation research conducted all along the way.

Since their commercial launch in 2019, Moby.Read and SkillCheck have been used for more than 30,000 student assessments in 30 states.

IES Literacy Innovations and Scholastic

In September, Scholastic announced that the A2i (Assessment to Instruction) system—a system for literacy screening, progress monitoring and assessment, and instructional planning designed for classrooms and community organizations—and the Learning Ovations team that had developed and evaluated A2i would become part of its education solutions group. A2i provides educators with a system that enables them to deliver individualized reading instruction. IES has invested in developing and evaluating this system since 2003, generating evidence of its effectiveness in improving young learners’ reading skills and comprehension. In 2020, we interviewed the creators of this system, who told the story of how their evidence-based system was prepared to scale. The system will continue to evolve so that it can serve all learners in our nation: IES is currently supporting the expansion of this system and its assessments for use with English learners.

A2i will help enhance Scholastic’s literacy platform, which integrates literacy screening, progress monitoring and assessment, instructional planning, and professional learning with their books and e-books, print- and technology-based learning programs, and other products that support children’s learning and literacy. With this acquisition, the IES-supported A2i system will have the opportunity to reach the 115,000 schools in the Scholastic community, potentially helping 3.8 million educators, 54 million students, and 78 million parents/caregivers in the United States.

Improving Literacy Outcomes Through Assessment

Teaching students how to read depends upon knowing what learners do and do not know. The acquisition of Moby.Read and SkillCheck highlights the recognition of that need by Google but is only one example of the IES commitment to developing and validating literacy assessments. While the two examples described above have the potential to touch many millions of learners, we have also invested in many other literacy assessments that are being widely used.

For example, since 2014, more than 2.5 million 3rd to 12th grade learners have been evaluated through a reading diagnostic system developed with IES funding: the Florida Assessments for Instruction in Reading Aligned to Florida Standards. Another diagnostic tool for 3rd to 12th graders, available nationally via the Educational Testing Service (ETS), is Capti Assess with ETS® ReadBasix™. This diagnostic assessment system was developed and validated with funding from both NCER and ED/IES SBIR.

Educators in more than 13,000 U.S. schools rely on myIGDIs (currently distributed via Renaissance Learning) to evaluate the needs of their preschool learners. These individual growth and development indicators (IGDIs) are brief, easy-to-use measures of early language and literacy designed for use with preschool children. The development and validation of these measures have been (and are being) supported by multiple IES projects. Their current work seeks to expand the IGDIs for use with young Spanish-speaking and Hmong-speaking learners.

Scaling Evidence-Based Innovations to Accelerate Literacy Learning After COVID

Launched with funding from the American Rescue Plan, the Leveraging Evidence to Accelerate Recovery Nationwide Network (the LEARN Network) is adapting and preparing to scale existing, evidence-based products to support students whose learning was affected by the COVID-19 pandemic. IES has made four awards to product teams and one to a scaling lead, and these five teams will establish the LEARN Network together.

In addition to the LEARN Network generating solutions to the nation’s most pressing COVID-19 recovery challenges within the education sector, IES expects that the combined efforts of this network will lead to the establishment of best practices for the field on how to prepare to scale evidence-based products effectively.

Three of the four product teams are focused on preparing to scale literacy products developed and tested with prior IES funding. These innovations are designed for students in grades K–3 (Targeted Reading Instruction), fourth and fifth grades (Peer-Assisted Learning Strategies), and middle school (Strategic Adolescent Reading Intervention). The projects will work with students and teachers in elementary schools in Florida and North Carolina, in fourth grade classrooms in the Rio Grande Valley in Texas, and in urban middle schools in the District of Columbia.

As I reflect on 20 years of investment in rigorous and relevant literacy research, I am hopeful. Our investment is transforming what we know and improving how that knowledge is being translated to ensure that every learner in our nation can read at or above grade level.

With our newest investment in supporting the systematic scaling of evidence-based practices, I believe that our educators and learners will have access to tools that support their needs for the next 20 years and beyond.

Elizabeth Albro (elizabeth.albro@ed.gov) is the commissioner of the National Center for Education Research.

Program for the International Assessment of Adult Competencies (PIAAC) 2022–23 Data Collection Begins

Last month, the National Center for Education Statistics (NCES) kicked off a major survey of adults (ages 16–74) across the nation to learn about their literacy skills, education, and work experience. Information collected through this survey—officially known as Cycle 2 of the Program for the International Assessment of Adult Competencies (PIAAC) in the United States—is used by local, state, and national organizations, government entities, and researchers to learn about adult skills at the state and local levels (explore these data in the PIAAC Skills Map, shown below).


Image of PIAAC Skills Map on state and county indicators of adult literacy and numeracy


Specifically, these data are used to support educational and training initiatives organized by local and state programs. For example, the Houston Mayor’s Office for Adult Literacy has used the PIAAC Skills Map data in developing the Adult Literacy Blueprint, a comprehensive plan for coordinated citywide change to address the systemic crisis of low literacy and numeracy in the city. In addition, the Kentucky Career and Technical College System developed a comprehensive data-driven app for workforce pipeline planning using the county-level PIAAC Skills Map data as one of the education pipeline indicators.

This is not the first time NCES has administered PIAAC. NCES collected PIAAC data three times between 2011 and 2017, when the first cycle of this international study was administered in 39 countries. Developed by the Organization for Economic Cooperation and Development (OECD), PIAAC measures fundamental cognitive and workplace skills needed for individuals to participate in society and for economies to prosper. Among these fundamental skills are literacy, numeracy, and digital problem-solving. Data from the first cycle of PIAAC (2011–17) provided insights into the relationships between adult skills and various economic, social, and health outcomes—both across the United States as a whole and for specific populations of interest (e.g., adults who are women, immigrants, older, employed, parents, or incarcerated). The OECD and NCES have published extensively using these data.

The current cycle (Cycle 2) of PIAAC will resemble the first cycle in that interviewers will visit people’s homes to ask if they are willing to answer a background questionnaire and take a self-administered test of their skills. However, unlike the first cycle, when respondents could respond to the survey on paper or on a laptop, this cycle will be conducted entirely on a tablet. PIAAC is completely voluntary, but each respondent is specifically selected to provide invaluable information that will help us learn about the state of adult skills in the country (participants can also receive an incentive payment for completing the survey).

PIAAC’s background questionnaire includes questions about an individual’s demographics, family, education, employment, skill use, and (new in Cycle 2 and unique to the United States) financial literacy. The PIAAC test or “direct assessment” measures literacy, numeracy, and (new in Cycle 2) adaptive problem-solving skills of adults.1

Each sampled person’s response is not only kept confidential but also “anonymized” before the data are released (so that no one can ever definitively identify an individual from personal characteristics in the datafile).

The international report and data for PIAAC Cycle 2 are scheduled to be released by the OECD in December 2024.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on PIAAC report and data releases and resources.


By Saida Mamedova, AIR, Stephen Provasnik, NCES, and Holly Xie, NCES


[1] Data are collected from adults ages 16–74 in the United States and ages 16–65 in the other countries.

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released the rescaled data files for three adult literacy assessments conducted several decades earlier: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Similar to NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of skills of adults in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 



What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that the domains were put on the same scale as the PIAAC domains: updated proficiency estimates were derived using the same statistical models that were used to create the PIAAC skill proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
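To give a concrete feel for what putting two assessments on a common scale involves, here is a minimal, hypothetical sketch (in Python) of mean/sigma linking, a classic textbook scale-linking method. This is not the NCES procedure, which re-estimated proficiencies with the full PIAAC item response theory models; every number below is invented for illustration.

```python
# Toy sketch of mean/sigma linking: NOT the actual NCES/PIAAC rescaling,
# which re-estimated proficiencies with PIAAC's IRT models.
# All values below are invented for illustration.
import statistics

# Difficulty estimates for the SAME test questions, calibrated separately
# in a hypothetical earlier study and in PIAAC.
old_difficulties = [-1.2, -0.4, 0.1, 0.8, 1.5]
piaac_difficulties = [-0.9, -0.2, 0.4, 1.0, 1.9]

# Mean/sigma linking: a linear transform from the old scale to the PIAAC
# scale that matches the mean and spread of the common-item difficulties.
a = statistics.stdev(piaac_difficulties) / statistics.stdev(old_difficulties)
b = statistics.mean(piaac_difficulties) - a * statistics.mean(old_difficulties)

def to_piaac_scale(old_value: float) -> float:
    """Re-express a value from the old study's scale in the PIAAC metric."""
    return a * old_value + b

print(round(to_piaac_scale(0.5), 2))  # a hypothetical old-scale value of 0.5
```

The key idea carries over to the real rescaling: because PIAAC shares items with NALS, IALS, and ALL, results from the earlier studies can be re-expressed in PIAAC’s metric rather than compared across incompatible scales.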

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education) because the larger sample size of NALS allows for more detailed breakdowns along with the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.
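
As a back-of-the-envelope illustration of how such trend differences are typically flagged as significant, below is a minimal sketch of a two-sided z-test on published point estimates. The standard errors here are invented placeholders; the actual NCES analyses estimate variance from each study’s plausible values and replicate weights.

```python
# Sketch of testing whether two assessment averages differ significantly.
# Point estimates are from the text above; the standard errors are INVENTED
# placeholders (real analyses use plausible values and replicate weights).
import math

nals_1992 = (269.0, 1.0)    # (literacy estimate, hypothetical SE)
piaac_2014 = (272.0, 1.0)   # (literacy estimate, hypothetical SE)

diff = piaac_2014[0] - nals_1992[0]
se_diff = math.sqrt(nals_1992[1] ** 2 + piaac_2014[1] ** 2)
z = diff / se_diff

# |z| > 1.96 corresponds to significance at p < .05, two-sided.
print(f"difference = {diff:.1f}, z = {z:.2f}, significant: {abs(z) > 1.96}")
```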


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).


By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.