Inside IES Research

Notes from NCER & NCSER

The Importance of Partnering with Practitioners in English Learner Research

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In FY 2014, Dr. Karen Thompson was awarded a grant for The Oregon English Learner Alliance: A Partnership to Explore Factors Associated with Variation in Outcomes for Current and Former English Learners in Oregon to determine best practices to support academic achievement among current and former English learners. Dr. Thompson and her colleagues wrote a guest blog post describing the work that the partnership undertook to better understand and improve the performance of English learners in Oregon. For this blog, we interviewed Dr. Thompson (three years after the end of the grant) to get her perspective on the partnership, the outcomes of the work, and where things currently stand.

 

What was the purpose of your research and what led you to do this work?

When I came to Oregon from California in 2012, there was growing momentum in the state to better understand and meet the needs of the state’s multilingual student population, particularly students classified as English learners (ELs). The state had developed an ambitious EL strategic plan, which included a variety of goals and action steps, such as identifying model programs and sharing best practices. I noticed that Oregon did not have publicly available information about the state’s former EL students. In prior work, other researchers and I had demonstrated that analyzing data only about students currently classified as English learners without also analyzing data about former EL students can provide incomplete and misleading information. Therefore, for Oregon to realize its goals and truly understand which programs and practices were most effectively educating its multilingual students, the state needed to make changes to its data systems. This was the seed that led to the Oregon Department of Education/Oregon State University English Language Learner Partnership. Our first goal was to simply determine how many former EL students there were in the state. Then, once the state had created a flag to identify former EL students, we were able to conduct a wide range of analyses to better understand opportunities and outcomes for both current and former EL students in ways that have informed state reporting practices and policy decisions.
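The key data-system change Dr. Thompson describes is adding a flag that distinguishes former EL students from current and never-EL students, so that outcomes can be reported for all three groups. A minimal sketch of that idea, assuming hypothetical field names and simplified per-year records (not Oregon's actual data system):

```python
# Illustrative sketch (not Oregon's actual data system): deriving an
# EL-status flag from longitudinal records so analyses can report
# current, former, and never-EL groups separately. Field names and the
# record layout are hypothetical.

def el_status(history):
    """Classify a student from per-year EL flags (True = classified as an
    English learner that year), ordered oldest to newest."""
    if not history:
        return "never_el"
    if history[-1]:
        return "current_el"        # still classified as an EL
    if any(history):
        return "former_el"         # was an EL, has since exited services
    return "never_el"

students = {
    "A": [True, True, False],      # exited EL services -> former EL
    "B": [True, True, True],       # still receiving services -> current EL
    "C": [False, False, False],    # never classified -> never EL
}
groups = {sid: el_status(hist) for sid, hist in students.items()}
# groups -> {"A": "former_el", "B": "current_el", "C": "never_el"}
```

The point of the blog's argument is visible here: an analysis restricted to `current_el` students would drop student A, whose outcomes after exiting services are exactly what the partnership wanted the state to be able to see.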

 

How does this research differ from other work in the field? Why do you think partnerships with practitioners were necessary to carry out the work?

When we began our partnership, collecting and analyzing information about both current and former EL students was not common. Happily, more and more researchers and education agencies have now adopted these approaches, and we think our partnership has helped play a role in this important and illuminating shift.  

It was crucial to conduct this work via partnerships between researchers and practitioners. Practitioner partners had deep knowledge of the state’s current data systems, along with knowledge about which reporting and analysis practices could shift to incorporate new information about current and former EL students. Research partners had the bandwidth to conduct additional analyses and to lead external dissemination efforts. Our regular partnership meetings enabled our work to evolve in response to new needs. 

 

What do you think was the most important outcome of your work and why?

I think the most important outcome of our work is that educators across Oregon now have information about both their current and former English learner students and can use this data to inform policy and practice decisions. Other analyses we conducted have also informed state actions. For example, our analysis of how long it takes Oregon EL students to develop English proficiency and exit EL services informed the state’s EL progress indicator under the Every Student Succeeds Act.

 

What are the future directions for this work?

Our IES-funded partnership led to funding from the Spencer Foundation to do further research about EL students with disabilities in Oregon, which has impacted practices in the state. In addition, I am excited to be one of the collaborators in the new IES-funded National Research and Development Center to Improve Education for Secondary English Learners (PI: Aída Walquí, WestEd). As part of the Center’s research, I am working with colleagues at the University of Oregon and the University of California, Los Angeles to analyze malleable factors impacting content-course access and achievement for secondary EL students. We are collaborating with four states in this work, and as in our ODE/OSU partnership, we will be analyzing data for both current and former EL students. At a policy level, colleagues and I are involved in conversations about how data collection and reporting at the federal level could also incorporate analysis of data for both current and former EL students, including ways this might inform future reauthorizations of the Elementary and Secondary Education Act.

 

---

Dr. Karen Thompson is an Associate Professor in the College of Education at Oregon State University. Her research focuses on how curriculum and instruction, teacher education, and policy interact to shape the classroom experiences of K-12 multilingual students.

 

Written by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for the English Learners Program, National Center for Education Research.

Catching Up with Former NCSER Fellows: Experiences and Advice for Early Career Researchers

Since 2008, the National Center for Special Education Research (NCSER) has supported postdoctoral training programs to prepare fellows in conducting early intervention and special education research that addresses issues that are important to infants, toddlers, children, and youth with or at risk for disabilities, their families, practitioners, and policymakers. As part of our Spotlight on IES Training Programs series, we reached out to a few former NCSER fellows who are now principal investigators (PIs) on IES grants to ask about their current research projects, how the NCSER fellowship prepared them for those projects, roadblocks they faced in applying for research funding, and advice for early career researchers interested in applying for IES funding. Below is what they had to say.

Angel Fettig, University of Washington

My NCSER postdoctoral position at Frank Porter Graham Child Development Institute at the University of North Carolina, Chapel Hill provided the opportunities and resources to prepare me to be the researcher I am today. Through my postdoctoral position, I had the opportunity to work on multiple NCSER-funded projects and gained a solid understanding of the day-to-day activities of large research grants. I also received resources and support to attend trainings and hone my research skills. Most importantly, I was surrounded by a community of researchers and mentors who are committed to promoting the use of rigorous research methodologies to build on evidence-based practices. Since the completion of my postdoctoral position, I have engaged in continuous learning around innovative research methodologies and apply them in my research grant applications. My current research, including the NCSER project I lead, focuses on equipping educators and parents with evidence-based practices to support young children’s social and emotional development and reduce challenging behaviors. I strongly believe that social emotional development is critical in ensuring the success of young children with or at risk for disabilities as they enter schools, and adults who interact with them play a crucial role in fostering this development. My advice for early career researchers is to find good mentors and colleagues who are interested in similar topics, craft an idea that addresses current needs, design a study with rigorous and innovative research methodologies, and then just apply for funding! You can’t score a goal if you don’t take a shot!

Paulo Graziano, Florida International University

My NCSER postdoctoral position at Florida International University provided me with specialized training in evidence-based assessments and interventions for children with disruptive behavior disorders. In combination with my background in developmental psychopathology, this training allowed me to find gaps in the research on how to best prepare preschoolers with disruptive behavior disorders for school entry, which led me to apply for additional IES grants. The NCSER project that I was awarded in 2012 entailed iteratively developing and testing a summer treatment program targeting pre-kindergarteners with disruptive behavior. As part of the project, we learned which curriculum, length, and level of parental involvement was needed to optimize children's academic, behavioral, and social-emotional growth during kindergarten. I was fortunate enough to get this award while still finishing up my postdoctoral fellowship, which was tremendously helpful in obtaining a faculty position and continuing my work at the same institution. One roadblock I faced applying for funding was obtaining permission from my university to apply for a grant as the PI while still a postdoc and responding to reviewers who thought that a postdoc should not be a PI. However, I overcame both roadblocks with the support of my postdoc mentor. This initial IES grant and my NCSER postdoc training were essential for launching my career and establishing a translational line of research that integrates developmental and neuroscience research to inform the treatment of disruptive behavior disorders. This integrated line of research has also allowed me to successfully receive funding from other agencies including the National Institutes of Health. I would highly encourage early career researchers to develop solid relationships with their community's school system. 
Forming a partnership is critical to submitting a project for funding that will not only be implemented with high fidelity but will also be well received and maintained or adopted by stakeholders once the grant ends.

Dwight Irvin, University of Kansas

My NCSER postdoctoral fellowship at Juniper Gardens Children’s Project at the University of Kansas focused on response to intervention in early childhood. With support and guidance from my mentors, Charles Greenwood and Judith Carta, I was afforded an opportunity to assist on multiple IES projects that allowed me to engage in planning, problem-solving, technology design/development, and statistical analysis. Importantly, I learned how an idea becomes a proposal, then a funded grant, and is then implemented to meet the proposed deliverables. During my postdoc, I formulated my own line of research and collected pilot data for future proposal development. It’s these experiences that I feel were most beneficial in preparing me for my current work and research. In our current NCSER project, we aim to validate a tool, the Classroom Code for Interactive Recording of Children's Learning Environments (CIRCLE) (Version 2.0), to assist preschool teachers in adjusting their instruction for young children at risk of not being ready for kindergarten. CIRCLE is a digital, live classroom observation system that assesses teacher and child behavior within multiple learning contexts. Our goal is to learn under what conditions and for whom intentional instruction is effectively promoting children’s literacy engagement and school readiness outcomes. Applying for research funding is always a formidable task. A big challenge is simply being an early career investigator who lacks the reputation that convinces reviewers the work is feasible and worth funding. Another is learning how to write a proposal that is free of fatal flaws and not viewed as too “ambitious.” My advice for early career researchers is to surround yourself with colleagues who value mentoring and have a history of funding. Find a way to involve yourself in developing a proposal, even if it is not your own work, and find a role on it, even if it is not as an investigator. 
It is best not to expect success on an initial proposal submission; rather, look at getting a panel review as a win. Lastly, find ways to collect meaningful pilot data to incorporate into a proposal as evidence that the work is worth the investment.

This blog was written by Alice Bravo, virtual intern for IES and doctoral candidate in special education at the University of Washington, and Katie Taylor, program officer for NCSER’s postdoctoral training program.

National Research & Development Center Launches Website to Provide Research Evidence and Actionable Information for Improving Education Outcomes for Secondary English Learners

Many English Learners (ELs) in secondary school settings are identified as long-term ELs—students who have been enrolled in U.S. schools for six or more years without making significant progress in English—and are at risk for dropping out of high school. These students face unique challenges and barriers in accessing education opportunities, which has resulted in persistent differences in academic outcomes between ELs and non-ELs, as well as negative consequences that reach far beyond school.  

 

About the Center

The IES-funded National Research & Development Center to Improve Education for Secondary English Learners has identified two specific challenges that ELs in secondary school face as they simultaneously develop English proficiency and subject-matter knowledge: 1) barriers to enrollment in challenging courses, and 2) scarcity of quality learning opportunities. The Center is taking a multi-pronged research approach to improve outcomes for ELs in secondary school settings by:

  • Identifying and describing the systemic barriers that prevent secondary ELs from successfully accessing the general curriculum
  • Developing and testing innovative curricular materials that strengthen the learning opportunities and experiences of both teachers and ELs as they engage in disciplinary practices

 

New Website Launched

The Center has launched a new website that provides information about their work and resources for researchers, practitioners, policymakers, and other education stakeholders to address current challenges and needs facing ELs in secondary school settings. Visit https://www.elrdcenter.wested.org/ for information ranging from how teachers and school and district leaders can support adolescent ELs in distance learning to modules that can be used for teacher preparation or professional development sessions to develop expertise in working with adolescent ELs.

For more information about IES’s investment in improving opportunities and achievement for English learners in secondary school settings, please see here.


Written by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for English Learners Program, National Center for Education Research.

 

Building a Reading Comprehension Measure for Postsecondary Students

Assessments of both U.S. adults and 12th-grade students indicate that millions of learners may have significant reading skill gaps. Because these students may lack the fundamental reading and comprehension skills needed to thrive in college, postsecondary institutions need valid reading measures that accurately determine the source of student difficulties.

An IES-funded research team is developing and validating such a measure: Multiple-choice Online Causal Comprehension Assessment for Postsecondary Students (MOCCA-College). MOCCA-College aims to assess the reading comprehension abilities of postsecondary students and distinguish between common comprehension difficulties. This information could help students, faculty, and programs better determine who might need what type of additional reading instruction.

The current version of MOCCA-College is still being validated, but it already contains components that may interest postsecondary institutions, faculty, and students. For example, it suggests classroom interventions based on a student’s results and allows for different user roles, such as student, faculty member, or administrator. 

Results from pilot work indicate that MOCCA-College can reliably distinguish between postsecondary readers with strong comprehension skills and those who may need to build these skills. MOCCA-College uses both narrative and expository texts to determine student performance. The results indicate that both types of passages measure a single dimension of ability, though narrative passages may more easily and accurately discriminate between those who have good comprehension skills and those who do not.

This finding is in keeping with meta-analysis work that finds a similar pattern for narrative and expository items. Narrative passages appear to consistently measure inferential comprehension more accurately than expository passages for both younger and older readers. This holds even after matching texts for readability and demands on background knowledge.

As the researchers continue to validate MOCCA-College, we will learn more about the needs of postsecondary readers, as well as how to identify and address those needs.

 


The research and articles referenced above are supported through NCER grant R305A180417: Multiple-choice Online Causal Comprehension Assessment for Postsecondary Students (MOCCA-College).

Dr. Meredith Larson, program officer for postsecondary and adult education, wrote this blog. Contact her at Meredith.Larson@ed.gov for additional information about MOCCA-College and postsecondary teaching and learning research.

 

Addressing COVID-19’s Disruption of Student Assessment

Under an IES grant, the RAND Corporation, in collaboration with NWEA, is developing strategies for schools and districts to address the impacts of COVID-19 disruptions on student assessment programs. The goal is to provide empirical evidence of the strengths and limitations of strategies for making decisions in the absence of assessment data. Jonathan Schweig, Andrew McEachin, and Megan Kuhfeld describe early findings from surveys and structured interviews regarding key concerns of districts and schools. 

 

As a first step, we surveyed assessment and research coordinators from 23 school districts (from a sample of 100 districts) and completed follow-up interviews with seven of them. Topics included each district’s re-entry scenario, the planning activities the district was not able to perform this year due to coronavirus-based disruptions to spring 2020 assessments, and the strategies the district was employing to support instructional planning in the absence of assessment data. While the research is preliminary and the sample of respondents is not nationally representative, the survey and interview responses identified two key concerns arising from the lack of spring 2020 assessment data, which has made it challenging to examine student or school status and change over time, especially as COVID-19 has differential impacts on student subgroups:

 

  • Making course placement decisions. Administrators typically rely on spring assessment scores—often in conjunction with other assessment information, course grades, and teacher recommendations—to make determinations for course placements, such as who should enroll in accelerated or advanced mathematics classes. 
  • Evaluating programs or district-wide initiatives. Many districts monitor the success of these programs internally by looking at year-to-year change or growth for schools or subgroups of interest. 

 

How are school systems responding to these challenges? Not surprisingly, the responses vary depending on local contexts and resources. Where online assessments were not feasible in spring 2020, some school districts used older testing data to make course recommendations, either from the winter or from the previous school year. Some districts relaxed typical practice and provided more autonomy to individual schools, relying on school staff to exercise local judgment around course placements and using metrics like grades and teacher recommendations. Other districts reported projecting student scores based on student assessment histories. Relatedly, some districts were already prepared for this situation because they had recently experienced difficulties with adopting an online assessment system and had to address similar problems caused by large numbers of missing or invalid tests.
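One of the strategies districts reported is projecting a student's score from that student's assessment history. The blog does not describe the districts' actual methods; a minimal illustrative sketch is a simple least-squares trend over prior scores, with all names and values hypothetical:

```python
# Illustrative sketch only (not any district's actual method): project a
# student's next assessment score from prior scores by fitting an
# ordinary least-squares linear trend of score on term index.

from statistics import mean

def project_score(history):
    """Project the next score from (term_index, score) pairs, ordered by term."""
    xs = [term for term, _ in history]
    ys = [score for _, score in history]
    x_bar, y_bar = mean(xs), mean(ys)
    denom = sum((x - x_bar) ** 2 for x in xs)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) / denom
    intercept = y_bar - slope * x_bar
    next_term = max(xs) + 1
    return intercept + slope * next_term

# Hypothetical example: fall/winter/spring scores from the prior year
history = [(1, 205.0), (2, 210.0), (3, 214.0)]
projected = project_score(history)  # trend-based estimate for term 4
```

Even under this simple model, the districts' stated concerns apply: a projection inherits any bias in which students have complete assessment histories, which is one reason the researchers flag representativeness as a validity issue.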

 

School districts also raised concerns about whether assessments administered during the 2020-21 school year would be valid and comparable so that they could be used in student placement and program evaluation decisions. These concerns included the following:

  • Several respondents raised concerns about the trustworthiness of remote assessment data collected this fall and the extent to which results could be interpreted as valid indicators of student achievement or understanding.
  • Particularly for districts that started the 2020-21 school year remotely, respondents were concerned about student engagement and motivation and the possibility of students rushing assessments, running into technological or internet barriers, or seeking assistance from guardians or other resources. 
  • Respondents raised questions about the extent to which available assessment scores are representative of school or district performance as a whole. Given that vulnerable students (for example, students with disabilities, students experiencing homelessness) may be the least likely to have access to remote instruction and assessments, it is likely that the students who are not assessed this year are different from students who are able to be assessed.
  • Other respondents noted that they encountered resistance from parents around fall assessment because those parents prioritized student well-being (for example, safety, sense of community, and social and emotional well-being) over academics. This perspective resonates with recent findings from a nationally representative sample of teachers and school leaders drawn from RAND’s American Educator Panel (AEP).

 

In the next phase of the work, the research team plans to:

  • Conduct a series of simulation and empirical studies regarding the most common strategies that the district respondents indicated they were using to make course placement decisions and to evaluate programs or district-wide initiatives.
  • Provide a framework to help guide local research on the intended (and unintended) consequences for school and school system decision making when standardized test scores are not available.

 

We welcome individuals to reach out to RAND with additional recommendations or considerations. We are also interested in hearing how districts are approaching course placement, accountability, and program evaluation across the country. Connect with the research team via email at jschweig@rand.org.

 


Jonathan Schweig is a social scientist at the nonprofit, nonpartisan RAND Corporation.

Andrew McEachin is a senior policy researcher at the nonprofit, nonpartisan RAND Corporation.

Megan Kuhfeld is a researcher at NWEA.