IES Blog

Institute of Education Sciences

From the NCER Commissioner: How IES’ Investment in Literacy Is Changing Education

A central challenge in education is that too many learners in our nation cannot read well enough to succeed in their future education and employment. In addition, a disproportionate number of individuals with low literacy skills are members of underrepresented groups. Since IES’ founding in 2002, we have devoted millions of dollars to addressing this challenge, seeking to generate high-quality evidence about literacy practices that work for learners across our nation. Today, we can see how this 20-year investment in literacy has generated interventions and assessments that are transforming practice at scale and meeting the needs of learners and educators by incorporating evidence-based practices into the materials they use daily.

Since IES is an applied research agency, its mission is to provide scientific evidence on which to ground education practice and policy and to share this information in formats that are useful and accessible to educators, parents, policymakers, researchers, and the public. IES and its four centers work together to collect data on the current state of education; identify solutions and innovations through research, grant programs, and competitions; and evaluate the success of investments in order to identify solutions worthy of scaling across the nation’s education system.

The education research community is often accused of generating findings and products that sit in an attic corner unused. We aim to disrupt this perception and make it clear that our grantees’ knowledge and tools are both useful and used. Here I want to share a few examples to showcase how American tax dollars are transforming how millions of learners are learning to read.

IES Technologies and Google Classroom

In April 2022, we were excited to learn that Google had acquired the intellectual property rights for Moby.Read and SkillCheck, education technology products developed through IES programs by California-based Analytic Measures, Inc. (AMI). Google Classroom is advertised as an educators’ “all-in-one place for teaching and learning,” and many tools and apps are integrated into the system, including the IES-developed and IES-evaluated product, ASSISTments, which provides student feedback assistance and assessment data to teachers.

Moby.Read and SkillCheck are technology solutions created to give teachers a more efficient way to assess their students’ reading skills and provide individualized feedback. These technologies were developed over two decades with IES funding, a process that included prototype development starting in 2002, followed by ED/IES Small Business Innovation Research (SBIR) funding to test Moby.Read in 2016 and 2017 and SkillCheck in 2020 and 2021, with validation research conducted all along the way.

Since their commercial launch in 2019, Moby.Read and SkillCheck have been used for more than 30,000 student assessments in 30 states.

IES Literacy Innovations and Scholastic

In September, Scholastic announced that the A2i (Assessment to Instruction) system—a system for literacy screening, progress monitoring and assessment, and instructional planning designed for classrooms and community organizations—and the Learning Ovations team that had developed and evaluated A2i would become part of its education solutions group. A2i provides educators with a system that enables them to deliver individualized reading instruction. IES has invested in developing and evaluating this system since 2003, generating evidence of its effectiveness in improving young learners’ reading skills and comprehension. In 2020, we interviewed the creators of this system, who told the story of how their evidence-based system was prepared to scale. The system will continue to evolve so that it can serve all learners in our nation: IES is currently supporting the expansion of this system and its assessments for use with English learners.

A2i will help enhance Scholastic’s literacy platform, which integrates literacy screening, progress monitoring and assessment, instructional planning, and professional learning with their books and e-books, print- and technology-based learning programs, and other products that support children’s learning and literacy. With this acquisition, the IES-supported A2i system will have the opportunity to reach the 115,000 schools in the Scholastic community, potentially helping 3.8 million educators, 54 million students, and 78 million parents/caregivers in the United States.

Improving Literacy Outcomes Through Assessment

Teaching students how to read depends upon knowing what learners do and do not know. The acquisition of Moby.Read and SkillCheck highlights Google’s recognition of that need, but it is only one example of the IES commitment to developing and validating literacy assessments. While the two examples described above have the potential to touch many millions of learners, we have also invested in many other literacy assessments that are being widely used.

For example, since 2014, more than 2.5 million 3rd to 12th grade learners have been evaluated through a reading diagnostic system developed with IES funding: the Florida Assessments for Instruction in Reading Aligned to Florida Standards. Another diagnostic tool for 3rd to 12th graders, available nationally via the Educational Testing Service (ETS), is Capti Assess with ETS® ReadBasix™. This diagnostic assessment system was developed and validated with funding from both NCER and ED/IES SBIR.

Educators in more than 13,000 U.S. schools rely on myIGDIs (currently distributed via Renaissance Learning) to evaluate the needs of their preschool learners. These individual growth and development indicators (IGDIs) are brief, easy-to-use measures of early language and literacy designed for use with preschool children. The development and validation of these measures have been (and are being) supported by multiple IES projects. Current work seeks to expand the IGDIs for use with young Spanish-speaking and Hmong-speaking learners.

Scaling Evidence-Based Innovations to Accelerate Literacy Learning After COVID

Launched with funding from the American Rescue Plan, the Leveraging Evidence to Accelerate Recovery Nationwide Network (the LEARN Network) is adapting and preparing to scale existing, evidence-based products to support students whose learning was affected by the COVID-19 pandemic. IES has made four awards to product teams and one to a scaling lead, and together these five teams will establish the LEARN Network.

Beyond generating solutions to the nation’s most pressing COVID-19 recovery challenges in the education sector, IES expects that the combined efforts of the LEARN Network will lead to best practices for the field on how to prepare to scale evidence-based products effectively.

Three of the four product teams are focused on preparing to scale literacy products developed and tested with prior IES funding. These innovations are designed for students in grades K–3 (Targeted Reading Instruction), fourth and fifth grades (Peer-Assisted Learning Strategies), and middle school (Strategic Adolescent Reading Intervention). The projects will work with students and teachers in elementary schools in Florida and North Carolina, in fourth grade classrooms in the Rio Grande Valley in Texas, and in urban middle schools in the District of Columbia.

As I reflect on 20 years of investment in rigorous and relevant literacy research, I am hopeful. Our investment is transforming what we know and improving how that knowledge is being translated to ensure that every learner in our nation can read at or above grade level.

With our newest investment in supporting the systematic scaling of evidence-based practices, I believe that our educators and learners will have access to tools that support their needs for the next 20 years and beyond.

Elizabeth Albro (elizabeth.albro@ed.gov) is the commissioner of the National Center for Education Research.

Program for the International Assessment of Adult Competencies (PIAAC) 2022–23 Data Collection Begins

Last month, the National Center for Education Statistics (NCES) kicked off a major survey of adults (ages 16–74) across the nation to learn about their literacy skills, education, and work experience. Information collected through this survey—officially known as Cycle 2 of the Program for the International Assessment of Adult Competencies (PIAAC) in the United States—is used by local, state, and national organizations, government entities, and researchers to learn about adult skills at the state and local levels (explore these data in the PIAAC Skills Map, shown below).


Image of PIAAC Skills Map on state and county indicators of adult literacy and numeracy


Specifically, these data are used to support educational and training initiatives organized by local and state programs. For example, the Houston Mayor’s Office for Adult Literacy has used the PIAAC Skills Map data in developing the Adult Literacy Blueprint, a comprehensive plan for coordinated citywide change to address the systemic crisis of low literacy and numeracy in the city. In addition, the Kentucky Career and Technical College System developed a comprehensive data-driven app for workforce pipeline planning using the county-level PIAAC Skills Map data as one of the education pipeline indicators.

This is not the first time NCES has administered PIAAC. NCES collected PIAAC data three times between 2011 and 2017, when the first cycle of this international study was administered in 39 countries. Developed by the Organization for Economic Cooperation and Development (OECD), PIAAC measures fundamental cognitive and workplace skills needed for individuals to participate in society and for economies to prosper. Among these fundamental skills are literacy, numeracy, and digital problem-solving. Data from the first cycle of PIAAC (2011–17) provided insights into the relationships between adult skills and various economic, social, and health outcomes—both across the United States as a whole and for specific populations of interest (e.g., adults who are women, immigrants, older, employed, parents, or incarcerated). The OECD and NCES have published extensively using these data.

The current cycle (Cycle 2) of PIAAC will resemble the first cycle in that interviewers will visit people’s homes to ask if they are willing to answer a background questionnaire and take a self-administered test of their skills. However, unlike the first cycle, when respondents could respond to the survey on paper or on a laptop, this cycle will be conducted entirely on a tablet. PIAAC is completely voluntary, but each respondent is specifically selected to provide invaluable information that will help us learn about the state of adult skills in the country (participants can also receive an incentive payment for completing the survey).

PIAAC’s background questionnaire includes questions about an individual’s demographics, family, education, employment, skill use, and (new in Cycle 2 and unique to the United States) financial literacy. The PIAAC test or “direct assessment” measures literacy, numeracy, and (new in Cycle 2) adaptive problem-solving skills of adults.1

Each sampled person’s response is not only kept confidential but also “anonymized” before the data are released (so that no one can ever definitively identify an individual from personal characteristics in the datafile).

The international report and data for PIAAC Cycle 2 are scheduled to be released by the OECD in December 2024.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on PIAAC report and data releases and resources.

 

By Saida Mamedova, AIR, Stephen Provasnik, NCES, and Holly Xie, NCES


[1] Data are collected from adults ages 16–74 in the United States and ages 16–65 in the other countries.

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released the rescaled data files for three adult literacy assessments conducted several decades earlier: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends on adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Similar to NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of skills of adults in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that proficiency estimates from those studies were placed on the same scale as the PIAAC domains, using the same statistical models used to create the PIAAC skills proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
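The intuition behind rescaling can be conveyed with a simplified sketch. The actual PIAAC rescaling relies on item response theory models and plausible values, which are far more involved than the mean-sigma linear linking shown below, and all of the numbers here are invented; but the core idea of using common items to map an older scale onto a newer one is the same.

```python
# Simplified illustration of scale linking (NOT the actual PIAAC methodology,
# which uses IRT-based calibration with plausible values).
# Mean-sigma linking: use items administered in both studies to find a linear
# transformation that places old-scale values onto the new (PIAAC) scale.

# Hypothetical difficulty estimates for the SAME items under each calibration
old_item_difficulties = [-1.2, -0.4, 0.3, 1.1]   # older study's scale
new_item_difficulties = [-1.0, -0.1, 0.7, 1.6]   # PIAAC's scale

def mean_sigma_link(old, new):
    """Return slope a and intercept b so that new ≈ a * old + b."""
    def mean(xs):
        return sum(xs) / len(xs)
    def sd(xs):
        m = mean(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    a = sd(new) / sd(old)
    b = mean(new) - a * mean(old)
    return a, b

a, b = mean_sigma_link(old_item_difficulties, new_item_difficulties)

# Any value on the old scale can now be expressed on the new scale:
rescaled = a * 0.5 + b
```

Once every study’s estimates live on one scale, differences across time points become meaningful, which is what makes the trend analyses below possible.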

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education), because the larger sample size of NALS allows for more detailed breakdowns alongside the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.
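The “significantly different (p < .05)” comparisons reported in these figures come down to comparing two estimates relative to their standard errors. As a hedged illustration: the scores below are the published estimates, but the standard errors are hypothetical placeholders; actual NCES comparisons use standard errors derived from each study’s replicate weights and plausible values.

```python
# Hedged sketch of the comparison behind "significantly different (p < .05)".
# The scores are the published estimates (NALS 1992 = 269, PIAAC 2012/2014 = 272),
# but the standard errors below are HYPOTHETICAL placeholders; real NCES
# comparisons use standard errors from replicate weights and plausible values.

def significantly_different(est1, se1, est2, se2, z_crit=1.96):
    """Two-sample z-test at alpha = .05 for independent survey estimates."""
    z = (est2 - est1) / (se1 ** 2 + se2 ** 2) ** 0.5
    return abs(z) > z_crit

# A 3-point gain is detectable if the estimates are precise enough...
gain_detected = significantly_different(269, 1.0, 272, 1.0)        # True
# ...but not if the (hypothetical) standard errors are larger:
gain_detected_noisy = significantly_different(269, 1.5, 272, 1.5)  # False
```

This is why significance depends on the precision of the estimates, not just the size of the score difference.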


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).


By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.

Investing in Next Generation Technologies for Education and Special Education

The Department of Education’s (ED) Small Business Innovation Research (SBIR) program, administered by the Institute of Education Sciences (IES), funds entrepreneurial developers to create the next generation of technology products for students, teachers, and administrators in education and special education. The program, known as ED/IES SBIR, emphasizes an iterative design and development process and pilot research to test the feasibility, usability, and promise of new products to improve outcomes. The program also focuses on planning for commercialization so that the products can reach schools and end-users and be sustained over time.

In recent years, millions of students in tens of thousands of schools around the country have used technologies developed through ED/IES SBIR, including many students and teachers who relied on these products for remote teaching and learning during the COVID-19 pandemic.

ED/IES SBIR Announces 2022 Awards

IES has made ten 2022 Phase I awards of $250,000 each.* During these 8-month projects, teams will develop and refine prototypes of new products and test their usability and initial feasibility. All awardees who complete a Phase I project will be eligible to apply for a Phase II award in 2023.

IES has made nine 2022 Phase II awards, which support further research and development of prototypes of education technology products developed under 2021 ED/IES SBIR Phase I awards. In these Phase II projects, teams will complete product development and conduct pilot studies in schools to demonstrate the products’ usability and feasibility, fidelity of implementation, and promise for improving the intended outcomes.

IES also made one Direct to Phase II award to support the research, development, and evaluation of a new education technology product that readies an existing researcher-developed, evidence-based intervention for use at scale, and to plan for commercialization. The Direct to Phase II project is awarded without a prior Phase I award. All Phase II and Direct to Phase II awards are for $1,000,000 over two years. Across all awards, projects address different ages of students and content areas.

The list of all 2022 awards is posted here. This page will be updated with the two additional Phase I awards after the contracts are finalized.


The 2022 ED/IES SBIR awards highlight three trends that continue to emerge in the field of education technology.

Trend 1: Projects Are Employing Advanced Technologies to Personalize Learning and Generate Insights to Inform Tailored Instruction

About two-thirds of the new projects are developing software components that personalize teaching and learning, whether through artificial intelligence, machine learning, natural language processing, automated speech recognition, or algorithms. All these projects will include functionalities afforded by modern technology to personalize learning by adjusting content to the level of the individual learner, offer feedback and prompts to scaffold learning as students progress through the systems, and generate real-time actionable information for educators to track and understand student progress and adjust instruction accordingly. For example:

  • Charmtech Labs and Literably are fully developing reading assessments that provide feedback to inform instruction.
  • Sirius Thinking and studio:Sckaal are developing prototypes to formatively assess early grade school students in reading.
  • Sown To Grow and xSEL Labs are fully developing platforms to facilitate student social and emotional assessments and provide insights to educators.
  • Future Engineers is fully developing a platform for judges to provide feedback to students who enter STEM and educational challenges and contests.
  • Querium and 2Sigma School are developing prototypes to support math and computer science learning respectively.
  • Soterix is fully developing a smart walking cane and app for children with visual impairments to learn to navigate.
  • Alchemie is fully developing a product to provide audio cues to blind or visually impaired students learning science.
  • Star Autism Support is developing a prototype to support practitioners and parents of children with autism spectrum disorder.

Trend 2: Projects Focusing on Experiential and Hands-On Learning
Several new projects are combining hardware and software solutions to engage students through pedagogies employing game-based, hands-on, collaborative, or immersive learning:

  • Pocketlab is fully developing a matchbox-sized car with a sensor to collect physical science data as middle school students play.
  • GaiaXus is developing a prototype sensor used for environmental science field experiments.
  • Mind Trust is developing a virtual reality escape room for biology learning.
  • Smart Girls is developing a prototype science game and accompanying real-world hands-on physical activity kits.
  • Indelible Learning is developing a prototype online multiplayer game about the Electoral College.
  • Edify is fully developing a school-based program for students to learn about, create, and play music.

Trend 3: Projects to Advance Research to Practice at Scale

Several new awards will advance existing education research-based practices into new technology products that are ready to be delivered at scale:

  • INSIGHTS is fully developing a new technology-delivered version to ready an NIH- and IES-supported social and emotional intervention for use at scale.
  • xSEL Labs and Charmtech Labs (noted above) are building on prior IES-funded research-based interventions to create scalable products.
  • Scrible is developing an online writing platform in partnership with the National Writing Project, based on prior Department of Education-funded research.

 


*Note: Two additional 2022 Phase I awards are forthcoming in 2022. The contracts for these awards are delayed due to a backlog in the SAM registration process.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Edward Metz (Edward.Metz@ed.gov) is the Program Manager of the ED/IES SBIR program.

Michael Leonard (Michael.Leonard@ed.gov) is the Program Analyst of the ED/IES SBIR program.

 

Improving Academic Achievement through Instruction in Self-Regulated Strategy Development: The Science Behind the Practice

Self-Regulated Strategy Development (SRSD) is an evidence-based instructional approach characterized by active, discussion-based, scaffolded, and explicit learning of knowledge of the writing process; general and genre-specific knowledge; academic vocabulary; and validated strategies for teaching reading and writing. IES has supported multiple research studies on SRSD for students with learning disabilities in K-12 and postsecondary general education settings. SRSD is used in as many as 10,000 classrooms across the United States and in 12 other countries. In this interview blog, we spoke with Dr. Karen Harris, the developer of SRSD, to learn more about this effective instructional strategy, the IES research behind it, and next steps for further scaling of SRSD so that more students can benefit.

What led you to develop the Self-Regulated Strategy Development model?

Photo of Karen Harris

I began developing what became the SRSD model of instruction in the 1980s, based on my experiences tutoring and teaching. No one theory could address all of what I needed to do as a teacher, or all that my students needed as learners. SRSD instruction pulls together what has been learned from research across theories of learning and teaching. It is a multicomponent instructional model that addresses affective, behavioral, and cognitive aspects of learning. Further, SRSD instruction is intended to take place in inclusive classrooms, is discourse-driven, integrates social-emotional supports, and involves learning in whole class and group settings with peer collaboration. SRSD research started in writing because Steve Graham (my husband and colleague) was deeply interested in writing, and we co-designed the initial studies. Today, SRSD instruction research exists across a wide variety of areas, such as reading comprehension, mathematical problem solving, fractions, social studies, and science.

What are some of the key findings about this instructional strategy?

SRSD has been recognized by the What Works Clearinghouse (WWC) as an evidence-based practice with consistently positive effects on writing outcomes. A 2013 meta-analysis of SRSD for writing found that SRSD was effective across different research teams, different methodologies, differing genres of writing (such as narrative or persuasive), and students with diverse needs, including students with learning disabilities and emotional and behavioral disorders. Effect sizes in SRSD research are typically large, exceeding .85 in meta-analyses and commonly ranging from 1.0 to 2.55 across writing and affective outcome measures.
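For readers unfamiliar with the metric, effect sizes like the .85 cited here are standardized mean differences (Cohen’s d or a close variant): the difference between treatment and comparison group means divided by a pooled standard deviation. A minimal sketch with invented numbers, not data from any SRSD study:

```python
# Minimal illustration of Cohen's d, the effect-size metric behind figures
# like ".85" or "1.0 to 2.55" in the meta-analyses cited above.
# All numbers here are invented for illustration, not from SRSD research.

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_var = ((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / pooled_var ** 0.5

# Hypothetical writing-quality scores on a 0-10 rubric:
d = cohens_d(mean_t=6.8, sd_t=1.5, n_t=50, mean_c=5.3, sd_c=1.5, n_c=50)
# d = 1.0 here: the treatment group scores one pooled SD above the comparison group
```

An effect of 1.0 means the average treated student outscores roughly 84 percent of the comparison group, which is why values in the 1.0 to 2.55 range are considered large.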

Over the years, IES has supported a number of studies on SRSD, which has led to some key findings that have practical implications for instruction from elementary school through college.

Do you know how many teachers use SRSD in their classrooms?

It is hard to be sure how prevalent SRSD instruction is in practice, but there are two groups dedicated to scaling up SRSD in schools—thinkSRSD and SRSD Online—both of which I voluntarily advise. Together, they have reached over 300,000 students and their teachers in the United States. In addition, I am following or in touch with researchers or teachers in 12 countries across Europe, North America, Australia, Africa, Asia, and the Middle East.

What’s next for research on SRSD?  

Many students have difficulty writing by the time they get to upper elementary school. Currently, there is an ongoing development project that is adapting and testing SRSD for children in the lower elementary grades to support their oral language skills, transcription, and writing strategy skills. The research team is in the process of conducting a small-scale randomized controlled study and will have findings soon.

Beyond this study, there are many future directions for SRSD research, including further work in different genres of writing, different grades, and involving families in the SRSD process. More work on how to integrate SRSD strategies into instruction across content areas, such as social studies or science, is also needed. Despite the evidence base for and interest in SRSD, a major challenge is scaling up SRSD in schools. We and other researchers have identified numerous barriers to this goal. We also need research on working with administrators, schools, and teachers to use writing more effectively as a tool for self-expression, self-advocacy, and social and political engagement. Writing can also be an important and effective means of addressing issues of equity and identity, and little SRSD research has been done in these areas.

Dr. Karen Harris is Regents Professor and the Mary Emily Warner Professor at Arizona State University’s Mary Lou Fulton Teachers College. Her current research focuses on refining a web-based intelligent tutor to augment SRSD instruction with elementary students in persuasive writing, integrating SRSD with reading to learn and writing to inform, developing a Universal Design for Learning Science Notebook, and developing practice-based professional development for SRSD.

This blog was produced by Julianne Kasper, Virtual Student Federal Service intern at IES and graduate student in education policy & leadership at American University.