IES Blog

Institute of Education Sciences

Pennsylvania Student Proficiency Rates Rebound Partially from COVID-19-Related Declines

Given the magnitude of the disruption the COVID-19 pandemic caused to education practices, there has been considerable interest in understanding how the pandemic may have affected student proficiency. In this guest blog, Stephen Lipscomb, Duncan Chaplin, Alma Vigil, and Hena Matthias of Mathematica discuss their IES-funded grant project, conducted in partnership with the Pennsylvania Department of Education (PDE), which examines the pandemic’s impacts in Pennsylvania.

The onset of the COVID-19 pandemic in spring 2020 brought a host of changes to K–12 education and instruction in Pennsylvania. Many local education agencies (LEAs) adopted remote learning and hybrid schedules as their primary mode of educating students, while others maintained in-person learning. Statewide assessments, which were suspended in spring 2020, resumed in 2021 with low participation rates, particularly among students with lower performance before the pandemic. Furthermore, test administration dates varied from spring 2021 to fall 2021. Pennsylvania statewide assessment data suggest that student proficiency rates rebounded partially in 2022, though they remained below pre-pandemic levels. In grades 5–8, there was a marked increase in proficiency in English language arts (ELA) and a slightly smaller increase in math compared to the 2021 proficiency rates predicted in recent research. Despite these gains, returning student proficiency rates to pre-pandemic levels will require additional effort.

The Pennsylvania Department of Education (PDE) has been committed to providing LEAs with the resources and support necessary to help students regain pre-pandemic academic proficiency rates. To learn more about how those rates may have changed in association with the pandemic, PDE and Mathematica partnered to explore trends in proficiency data for students in grades 5–8. Given the low and nonrepresentative participation in the 2021 statewide assessments, as well as the differences in when LEAs administered them, we developed a predictive model of statewide proficiency rates for spring 2021 to produce predicted rates that are more comparable to those from previous and future years. The results revealed that steep declines in proficiency likely occurred between 2019 and 2021 (see Figure 1 below). By spring 2022, proficiency rates in grades 5–8 had regained 6 percentage points of their 10-percentage-point drop in ELA and nearly 5 percentage points of their 13-percentage-point drop in math. Taken together, these results suggest that although the pandemic may initially have been associated with declines in students’ academic proficiency, student proficiency might move back toward pre-pandemic levels over time.

 

Figure 1. Actual and predicted proficiency rates in grades 5–8 in Pennsylvania, 2015–2022

The figure shows actual proficiency rates from the Pennsylvania System of School Assessment, averaged across grades 5–8, unless marked by an open or closed circle. Notes: An open circle indicates the statewide assessment was cancelled; a closed circle indicates a predicted proficiency rate.

Source: Data from 2015–2019 and 2022 are from the Pennsylvania Department of Education. The 2021 data are predicted proficiency rates from Lipscomb et al. (2022a). The figure originally appeared in Lipscomb et al. (2022b).  

 

The next steps for this project will include a strong focus on disseminating our findings. For example, we will develop a research brief that describes the role of remote learning in shaping academic outcomes beyond proficiency rates, as well as community health outcomes, during the pandemic. The findings will help PDE and LEAs refine strategies for supporting vulnerable students and help state policymakers and educators learn from the COVID-19 pandemic, specifically how it might have affected student outcomes and educational inequities.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner of the Policy and Systems Division, NCER.

What do Geometry Projects and Pie Have in Common?

On March 14, many students across the country will be paying homage to one of the most important irrational numbers – Pi (or π), the mathematical constant that represents the ratio between a circle’s circumference and its diameter. Here at IES, we are celebrating Pi Day in two ways – one, by highlighting two projects that are helping students better understand and apply pi and other geometry concepts, and two, by daydreaming about the other kind of pie (more on that below).
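As a quick aside for the computationally inclined, that circumference-to-diameter ratio can be estimated without ever typing the digits of π, using Archimedes’ classic trick of inscribing regular polygons in a circle and doubling their number of sides. The short Python sketch below is purely illustrative and is not part of either project described in this post:

```python
import math

# Archimedes' approach: inscribe a regular polygon in a circle and keep
# doubling its number of sides. Perimeter / diameter approaches pi.
n, side = 6, 1.0  # regular hexagon inscribed in a circle of radius 1
for _ in range(10):
    # Chord-halving identity: side length after doubling the side count
    n, side = 2 * n, math.sqrt(2 - math.sqrt(4 - side ** 2))

estimate = n * side / 2  # perimeter (n * side) over diameter (2)
print(round(estimate, 4))  # -> 3.1416
```

After ten doublings the polygon has 6,144 sides, and the perimeter-to-diameter ratio already matches π to four decimal places.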

Highlighting Two IES-Funded Geometry Projects

Julie Booth (Temple University) and colleagues are developing and testing an intervention, GeometryByExample, to improve student learning of high school geometry. GeometryByExample provides strategically designed, worked-example-based assignments of varied geometric content for students to complete in class in place of more typical practice assignments. Instead of solving all of the practice problems themselves, students study correct or incorrect examples of solutions to half of the problems and respond to prompts asking them to write explanations of why those procedures are correct or incorrect.

Candace Walkington and colleagues are exploring how the interaction between collaboration and multisensory experiences affects geometric reasoning. They are using augmented reality (AR) technology to explore different modalities for math learning, such as a hologram, a set of physical manipulatives, a dynamic geometry system (DGS) on a tablet, or a piece of paper. Each of these modalities has different affordances, including the degree to which it can represent dynamic transformations, represent objects and operations in three dimensions, support joint attention, and provide situational feedback. The researchers have developed an experimental platform that uses AR and situates experimental tasks in an engaging narrative story. The overarching research questions they are exploring are (1) How does shared AR impact student understanding of geometric principles? (2) How are these effects mediated by gesture, language, and actions? And (3) how are these effects moderated by student and task characteristics?

The Other Kind of Pi(e)

Baking pies with the pi symbol is another fun way to celebrate the day, so in the spirit of that, we asked NCER and NCSER staff about their favorite pie flavors. Below is a pie graph with the results – not much consensus on flavor, but it’s clear we all love pie. Happy Pi Day!


This chart shows NCER and NCSER staff pie preferences in percentages: Apple 19%, Cherry 19%, Chocolate 12%, Key Lime 12%, Peach 6%, Pecan 19%, and Rhubarb 13%.

 


Written by Christina Chhin (Christina.Chhin@ed.gov) and Erin Higgins (Erin.Higgins@ed.gov), NCER Program Officers.

Innovative Approaches to High Quality Assessment of SEL Skills

In celebration of IES’s 20th anniversary and SEL Day, we are highlighting NCER’s investments in field-initiated research. In this blog, program officer Dr. Emily Doolittle discusses a persistent barrier to supporting social and emotional learning (SEL) in schools—the lack of high quality, reliable, and valid SEL assessments—and the innovative research supported by IES to tackle this challenge.

High quality measurement is critical for education research and practice. Researchers need valid and reliable assessments to answer questions about what works for whom and why. Schools use assessments to guide instruction, determine student response to intervention, and for high-stakes decision-making such as provision of special education services.

For social and emotional learning (SEL), assessment can be particularly challenging due to lack of precision in defining core SEL competencies. One consequence of this imprecision is that measures and intervention targets are often misaligned. SEL assessment also tends to rely on student, teacher, and parent reports despite the lack of agreement among reporters and the potential for biased responding. Through NCER, IES is supporting the development and validation of SEL measures using new technologies and approaches to address some of these challenges. Here are some examples of this innovative measurement work.

  • SELweb is a web-based direct assessment of four specific SEL skills: emotion recognition, social perspective taking, social problem solving, and self-control. It is available for use with elementary school students in grades K-3 and 4-6, with a middle school version currently under development. The SEL Quest Digital Platform will support school-based implementation of SELweb and other SEL assessments with an instrument library and a reporting dashboard for educators.
  • vSchool uses a virtual reality (VR) environment to assess prosocial skills. Students in 4th to 6th grades build their own avatar to interact with other characters in school settings using menu-driven choices for prosocial (helping, encouraging, sharing) and non-prosocial (aggressive, bullying, teasing) behaviors.
  • VESIP (Virtual Environment for Social Information Processing) also uses a VR school environment with customizable avatars to assess 3rd through 7th grade students’ social information processing in both English and Spanish.

Other assessments take a different approach to the challenges of SEL measurement by looking for ways to improve self, teacher, and parent reports.

  • In Project MIDAS, the research team is creating a system to integrate the different information provided by students, teachers, and parents to see if combining these reports will lead to more accurate identification of middle school students with SEL needs.
  • In Project EASS-E, the researchers are creating a teacher-report measure that will incorporate information about a child’s environment (e.g., neighborhood and community context) to better support elementary school students’ needs.

Please check out IES’s search tool to learn more about the wide variety of measurement research we fund to develop and validate high quality assessments for use in education research and practice.


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research

 

Educational Diagnostician Promotes Knowledge of IES-Supported Research on Measurement and Interventions for Learning Disabilities

This week, Texas celebrates Educational Diagnosticians’ Week. In recognition, NCSER highlights the important work that one Texas-based educational diagnostician, Mahnaz (Nazzie) Pater-Rov, has been doing to disseminate information from IES researchers to practitioners on improving reading outcomes.

Nazzie conducts assessments of students who have been referred for testing within multi-tiered systems of support (MTSS) to determine whether they have a learning disability (LD) and makes recommendations for intervention and instruction to improve their literacy and achieve their Individualized Education Program goals. Working in this field requires an understanding of district and school policies and research-based evidence on identifying students with disabilities. To stay current, Nazzie has immersed herself in the research by reading many of the resources IES provides through the What Works Clearinghouse and IES-funded grants so that she can use valid measures and recommend evidence-based interventions. After 16 years in the profession, Nazzie has realized that she is not alone and wants to help other diagnosticians understand the latest developments in LD identification and intervention. Nazzie uses a social media audio application called Clubhouse to share what she is learning, including hosting researchers for chats about their current work on related topics. Nazzie’s chat room, called ED. DIAGNOSTICIANS, has over 900 members, mostly educational diagnosticians. Some of her speakers have been IES-funded researchers.

 

Date | Title | Researcher (Link to IES Grants)
1/13/2023 | Are Subtypes of Dyslexia Real? | Jack Fletcher, University of Houston
6/17/2022 | Efforts to Reduce Academic Risk at the Meadows Center | Sarah Powell, University of Texas at Austin
6/3/2022 | Bringing the Dyslexia Definition into Focus | Jeremy Miciak, University of Houston
5/27/2022 | Pinpointing Needs with Academic Screeners | Nathan Clemens, University of Texas at Austin
3/4/2022 | Using EasyCBMs in our Evaluation Reports | Julie Alonzo, University of Oregon

 

We asked Nazzie to share some of her top concerns and recommendations for research.

Responses have been edited for brevity and clarity.

What stimulated your desire to bring about changes not only in your school but across the state?

When Texas removed its cap on the number of students that could be identified as in need of special education, and districts changed procedures for identifying need, we started to experience a “tsunami” of referrals. Now we are creating a whole population of children identified with LDs without also simultaneously looking at ways to improve our system of policies, procedures, and instruction to ensure we meet the needs of all students through preventative practices.

How has the role of education diagnostician changed since the reauthorization of IDEA (2004)?

Prior to the reauthorization of IDEA, we would compare a student’s IQ with their academic performance. If there was a discrepancy, they were identified as LD. Many states now use a pattern of strengths and weaknesses (PSW) for identification, which is based on multiple measures of cognitive processes.

In Texas, there is also an increased demand for specialized, evidence-based instruction now that we better understand how to identify students as LD and parents are seeing the need for identification and services for their children. However, this has led to a doubling of the LD identification rate in many districts. This, in turn, is increasing our caseloads and burning us out!

Some experts in the field advocate for using a tiered systems approach, such as MTSS, to identify when a student is not responding to instruction or intervention rather than relying only on the PSW approach. However, the challenge is that there are not enough evidence-based interventions in place across the tiers within MTSS for this identification process to work. In other words, can students appropriately be identified as not responding to instruction when evidence-based interventions are not being used? By not making these types of evidence-based interventions accessible at younger ages to general education students within MTSS, I worry that we are just helping kids tread water when we could have helped them learn to swim earlier.

What are your recommendations for systemic reform?

We need to find a better way to weave intervention implementation into teachers’ everyday practice so it is not viewed as “extra work.” Tiered models are general education approaches to intervention, but it is important for special education teachers and educational diagnosticians to also be involved. My worry is that diagnosticians, including myself, are helping to enable deficit thinking by reinforcing the idea that the child’s performance is more a result of their inherited traits rather than a result of instruction when, instead, we could focus our energy on finding better ways to provide instruction. Without well-developed tiered models, I worry that we end up working so hard because what we are doing is not working.

Are there specific training needs you feel exist for education diagnosticians?

Many new diagnosticians are trained on tools or methods that are outdated and no longer relevant to current evidence-based testing recommendations. This is a problem because instructional decisions can only be as good as the data on which they are based. We need training programs that enable us to guide school staff in selecting the appropriate assessments for specific needs. If diagnosticians were trained in data-based individualization or curriculum-based measures for instructional design rather than just how to dissect performance on subtests of cognitive processing (the PSW approach), they could be helping to drive instruction to improve student outcomes. The focus of an assessment for an LD should be not on a static test but on learning, which is a moving target that cannot be measured in one day.

What feedback do you have for education funding agencies?

Implementing a system of academic interventions is challenging, especially after COVID-19, where social-emotional concerns and teacher shortages remain a top priority in many schools. Funding agencies should consider encouraging more research on policies and processes for the adoption of evidence-based interventions. Diagnosticians can be important partners in the effort.

This blog was authored by Sarah Brasiel (Sarah.Brasiel@ed.gov), program officer at NCSER. IES encourages special education researchers to partner with practitioners and submit to our research grant funding opportunities.

Adult Education and Foundational Skills Grantee Spotlight: Dr. Daphne Greenberg’s Advice for New Researchers

As part of the IES 20th Anniversary, NCER is reflecting on the past, present, and future of adult education research. In this blog, Dr. Daphne Greenberg, Distinguished University Professor at Georgia State University, reflects on her connection to adult education and gives advice to researchers interested in this field. Dr. Greenberg has served as the principal investigator (PI) on three NCER grants, including the Center for the Study of Adult Literacy, and is co-PI on three current NCER grants (1, 2, 3). She helped found the Georgia Adult Literacy Advocacy group and the Literacy Alliance of Metro Atlanta, has tutored adult learners, and has engaged in public awareness efforts about adult literacy, including giving a TEDx talk: Do we care about us? Daphne Greenberg at TEDxPeachtree.

What got you interested in adult education research?

During graduate school, I was a volunteer tutor for a woman who grew up in a southern sharecropper family, did not attend school, and was reading at the first-grade level. Her stories helped me understand why learning was important to her. For example, her sister routinely stole money from her bank account because she couldn’t balance her checkbook.

I began wondering whether adults and children reading at the same level had similar strengths and weaknesses and whether the same word-reading components were equally important for them. I later published an article that became a classic in adult literacy research about this.

Over the years, I have grown to admire adult learners for their unique stories and challenges, and I am deeply impressed with their “grit” and determination even when faced with difficulties. When I watch a class of native-born adults reading below the 8th grade level, I am inspired by them and yet deeply conflicted about our K-12 system and how many students aren’t getting what they need from it.

How does your personal experience influence your perspective?

I think my childhood and family planted the seeds. My grandfather ran a grocery store but had only a third-grade education. My parents were immigrants who worked hard to navigate a new culture and language, and I struggled with reading in English and English vocabulary growing up. As a result, I understand how people hide and compensate for academic weaknesses.

Also, my brother has profound autism. As a child, I insisted that I could teach him many skills, and I did. This taught me patience and the joy one feels when even the smallest gain is made.

As an adult, I mess up idioms, use Hebraic sentence structure, and need help with editing. I also have a visual condition that causes me to miss letters when I read and write. These difficulties help me relate to the struggles of adult learners. I often feel deep embarrassment when I make mistakes. But I am very fortunate that I have colleagues who celebrate my strengths and honor my weaknesses. Not all of our adult learners are as fortunate.

What should researchers new to adult education know about the system?

Adult education serves students with significant needs and numerous goals—from preparing for employment or postsecondary education to acquiring skills needed to pass the citizenship exam or helping their children with homework. But the adult education system has less public funding than K-12 or postsecondary systems.

Many of the educators are part-time or volunteers and may not have a background in teaching, or at least in teaching adults. There simply isn’t the same level of professional development opportunities, technological and print instructional resources, infrastructure, or supporting evidence-based research that other education systems have.

What should researchers know about adult learners?

As a starting point, here are three things that I think researchers should know about adult learners:

  • What it means to live in poverty. For example, I once worked with a researcher who, when told that adult learners wouldn’t have access to the internet, replied “That’s not an issue. They can take their laptops to a Starbucks to access the Internet.”
  • That adult learners are motivated. The fact that they have inconsistent attendance does not mean that they are not motivated. It means that they have difficult lives, and if we were in their shoes, we would also have difficulty attending on a regular basis.
  • That adult learners’ oral vocabulary often matches their reading vocabulary. If you want adult learners to understand something, such as an informed consent form, realize that their oral vocabulary often is very similar to their reading grade equivalency, and consider simplifying complex vocabulary and syntax.

What specific advice do you have about conducting research with adult learners?

Testing always takes longer to complete than anticipated. I never pull students out from classes for testing because their class time is so precious. So they have to be available after or before class to participate in research, and this can be problematic. We often need to reschedule an assessment because public transportation is late, a job shift suddenly changes, or a family member is sick.

Finding enough of particular types of students is difficult because sites often attract different demographics. For example, one site may have primarily 16- and 17-year-olds, another site may have mostly non-native speakers, and another site may have either lower- or higher-skilled adult learners.

Having a “clean” comparison group at the same site is challenging because of intervention “leakage” to nonintervention teachers. Adult education teachers are often so hungry for resources that they may try to peek into classrooms while an intervention is in progress, get access to materials, or otherwise learn about the intervention. Their desire for anything that might help students makes contamination a concern.

What areas of adult education could use more research?

I think that policymakers and practitioners would benefit from many areas of research, but two come to mind.

  • How to measure outcomes and demonstrate “return”: Many funding agencies require “grade level” growth, but it can take years for skills to consolidate and manifest as grade level change. In the meantime, adults may have found a job, gotten promoted, feel more comfortable interacting with their children’s schools, voted for the first time, etc. Are we measuring the right things in the right way? Are we measuring the things that matter to students, programs, and society? Should life improvements have equal or even more weight than growth on standardized tests? After how much time should we expect to see the life improvements (months, years, decades)?
  • How to create useful self-paced materials for adults who need to “stop-out”: Due to the complexities of our learners’ lives, many have to “stop-out” for a period before resuming class attendance. These adults would benefit from resources that they could use on their own, at their own pace during this time. What is the best practice for delivery of these types of resources? Does this “best practice” depend on the adult’s ability level? Does it depend on the content area? 

Any final words for researchers new to adult education?

I extend a warm welcome to anyone interested in research with adult learners. You will discover that many adult learners are eager to participate in research studies. They feel good helping researchers with their work and are hopeful that their time will help them or be of help to future learners. I highly recommend that you collaborate with researchers and/or practitioners who are familiar with the adult education context to help smooth the bumps you will inevitably experience.


This blog was produced by Dr. Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer at NCER.