Inside IES Research

Notes from NCER & NCSER

Experts Discuss the Use of Mixed Methods in Education Research

By Corinne Alfeld and Meredith Larson, NCER Program Officers

Since IES was founded more than a dozen years ago, it has built a reputation for funding rigorous research to measure the causal effects of education policies and programs. While this commitment remains solid, we also recognize the value of well-designed qualitative research that deepens understanding of program implementation and other educational processes and that generates new questions or hypotheses for study. In this blog post, we highlight outcomes from a recent meeting we hosted on the use of mixed methods – that is, studies that combine qualitative and quantitative methods – and share some of the ways in which our grantees and other researchers incorporate mixed methods into their research.

On May 29, 2015, 10 researchers with experience designing and conducting mixed methods research met with staff from the two IES research centers in a technical working group (TWG) meeting. The TWG members shared their experiences carrying out mixed methods projects and discussed what types of technical assistance and resources we could provide to support the integration of high-quality mixed methods into education research. There was consensus among the TWG members that qualitative data is valuable, enriches quantitative data, and provides insight that cannot be gained from quantitative research alone. Participants described how mixed methods are currently used in education research, proposed potential NCER and NCSER guidance and training activities to support the use of high-quality mixed methods, and offered suggestions for researchers and the field. Below are just a few examples that were shared during the meeting:

  • Dr. Carolyn Heinrich and colleagues used a longitudinal mixed methods study design to evaluate the efficacy of supplemental education services provided to low-income students under No Child Left Behind. One of the critical findings of the study was that there was substantial variation across school districts in what activities were included in an hour of supplemental instruction, including (in some cases) many non-instructional activities. This was revealed as the team examined the interview data describing what activities lay behind the shared metric of an hour of instructional time. Having that level of information provided the team with critical insights as they examined the site-by-site variation in the efficacy of supplemental education services. Dr. Heinrich emphasized the need for flexibility in research design because the factors affecting the impact of an intervention are not always apparent in the design phase. In addition, she reminded the group that while statistical models provide an average impact score, there is valuable information in the range of observed impacts, and that this variability is often best understood with information collected using in-depth field research approaches.
  • Dr. Mario Small used mixed methods research to examine social networks in childcare centers in New York City. Using observational methods, he discovered that variations in the level of networking among mothers depended on the individual child care center, not the neighborhood. He hypothesized that child care centers that had the strictest rules around pick-up and drop-off, as well as more opportunities for parent involvement (such as field trips), would have the strongest social networks. In such settings, parents tend to be at the child care center at the same time and, thus, have more interaction with each other. Dr. Small tested this hypothesis by analyzing survey and social network data and found that parents who developed a social network through their child care center had higher well-being than those who did not. He concluded from this experience that without the initial observations, he would not have known that something small, like pick-up and drop-off policies, could have a big effect on behavior.
  • Dr. Jill Hamm described a difficult lesson learned about mixed methods “after the fact” in her study, which was funded through our National Research Center on Rural Education Support. In planning to launch an intervention to be delivered to sixth-grade teachers to help adolescents adjust to middle school, she and her colleagues worked with their school partners to plan for possible challenges in implementation. However, because some of the qualitative data collected in these conversations were not part of the original research study – and, thus, not approved by her Institutional Review Board – the important information they gathered could not be officially reported in publications of the study’s findings. Dr. Hamm encouraged researchers to plan to use qualitative methods to complement quantitative findings at the proposal stage to maximize the information that can be collected and integrated during the course of the project.
  • In a study conducted by Dr. Tom Weisner and his colleagues, researchers conducted interviews with families of children with disabilities to determine the level of “hassle” they faced on a daily basis and their perceptions of the sustainability of their families’ routines. Findings from these interviews were just as good at predicting family well-being as parental reports of coping or stress on questionnaires. The findings from the analysis of both the qualitative and quantitative data collected for this study enhanced researchers’ understanding of the impact of a child’s disability on family life more than either method could have alone. Dr. Weisner observed that the ultimate rationale for mixed methods research should be to gather information that could not have been revealed without such an approach. Because “the world is not linear, additive, or decontextualized,” he suggested that the default option should always be to use mixed methods where feasible, and that researchers should be required to provide a rationale for why they had not done so.

Curious to learn more about what was discussed? Additional information is available in the meeting summary.

Comments? Questions? Please email us at IESResearch@ed.gov.

An Intern's Perspective on the National Center for Education Research

By Brittney Fraumeni, NCER Intern

Photo of Brittney Fraumeni

 

Each year, the Institute of Education Sciences’ two research centers offer unpaid internships for undergraduate or graduate students interested in learning about the research grant making process and contributing to the work of the centers.  Internships are coordinated through the U.S. Department of Education’s student volunteer office and are available throughout the year.  For application information, please see the ED Student Volunteer Unpaid Internship Program.  

This summer, Brittney Fraumeni, a doctoral student in Psychological Science at DePaul University, interned with the National Center for Education Research (NCER). At the end of her internship, Brittney reflected on her summer with NCER.

What brought you to the internship?

As I headed into the final months of my third year of graduate school, I began to really question what I wanted to do with my degree when I was finished. My PhD program emphasizes training for an academic position, but I had doubts about whether that was the best fit for me. So when the opportunity to be a summer intern at NCER presented itself, I seized it, hoping for a learning experience that would help shape my view of my future career.

How did you hear about the internship?

I briefly worked as a freelance researcher on a U.S. Department of Education grant, which was the first time it occurred to me that the government had a research department. I easily found the IES website, and after some exploring on the site, discovered that they had internships available. I applied ASAP.

Why did you want to do the internship?

I really wanted an opportunity to see what a non-academic position could be like. As I mentioned, most of my graduate training has revolved around preparing for a career in academia, and so I had no idea what else was out there.

What were your days like at the internship?

The internship schedule was really flexible, and I was allowed to choose my own days and hours. Additionally, I was in charge of my own time management throughout the day. At the beginning of my six-week stay, I was given the main projects I would be working on, so every day that I came into the office after that, I mostly knew what I would be working on. I had three big projects, so I usually just circulated through tasks for those, and every once in a while a small project would head my way that I would add to my schedule.

What was beneficial about the internship?

The internship really helped confirm what I was already thinking at the beginning of the summer: I’d like to get a job in a non-academic field upon completing my doctorate. But, more than that, the internship gave me the chance to work with like-minded individuals who were open to letting me pick their brains and who shared contact information for people in the education research field. Overall, it was a great learning and networking opportunity.

What did you learn from the internship?

More than just learning more about education research, I learned new skills. Before the internship, I only knew of social media from a personal standpoint. But, as more companies branch out to different social media outlets to promote their work, it’s important to know how to have a professional and effective social media presence. Working on the social media team at IES really boosted my social media skills.

What did you learn about IES/ED from doing the internship?

Before applying for the internship, I thought ED was really only a department focused on policy; I wasn’t even aware that the department was involved in research! However, through actually working here, I learned that not only is there a research department, but there are also many more departments than I could have imagined. IES itself is broken down into multiple branches that each have a different research focus. By working with the people here and having the opportunity to sit in on different meetings, I was able to learn what each department does and the special role each plays in promoting education research. Furthermore, I learned that not everybody took the same path to get here; IES is made up of employees with all different backgrounds, which makes for a fun and diverse environment to work in.

How did the internship reshape your thinking about education research?

I used to think education research was a relatively small area. Now, after having hands-on experience with writing up award summaries, I know that there are many people interested in education research and pursuing it. It never occurred to me how many different companies (not just schools!) had an interest in developing interventions for education purposes. It is so inspiring to now know just how many people out there are trying to promote the best outcomes for students, from pre-K to college.


Questions? Comments? Please send them to IESResearch@ed.gov. 

The ‘Not So Simple’ View of Reading

By Karen Douglas, NCER Program Officer

 

Improving students’ capacity to understand what they read in all subject areas is a primary focus of educators and policymakers. Educators and researchers have focused on interventions to improve reading for decades, and a great deal of attention has been given to improving word-level skills (such as phonemic awareness and decoding). In part, this focus can be traced to the ‘Simple View of Reading,’ a theoretical framework developed by Gough and Tunmer almost 30 years ago.

The Simple View states that readers need to both understand language and decode the symbols on the page in order to comprehend written text. The influential role of decoding on reading outcomes has been well studied, and many interventions have been developed that show good results in improving these skills for many students. But improvement in decoding skills, while necessary, has not generally been sufficient to improve reading comprehension.
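In its original formulation, the Simple View is often summarized as a multiplicative equation: Reading Comprehension = Decoding × Language Comprehension. Because the two components multiply rather than add, a serious weakness in either one keeps comprehension low no matter how strong the other is.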

In recent years, researchers have begun exploring the other part of the equation -- language. Most often, researchers use vocabulary knowledge as a proxy for language skills, and a great deal of research is focused on improving vocabulary skills. Efforts to improve vocabulary generally show that students learn the new words they are taught, but generalized effects on vocabulary knowledge and reading comprehension are elusive. It seems likely that in addition to understanding the meanings of individual words, students also need to know how words are constructed (morphology), how they are used in text (syntax and grammar), and how to make inferences from text in order to make sense of the wide variety of materials they must read.

The Reading for Understanding Research Initiative (RfU), funded in 2010 by IES, is addressing a broader conception of language in trying to improve reading comprehension. RfU provided funding for six research teams to study the basic processes that undergird reading comprehension, develop and test new curricula and instructional programs to improve it, and develop new assessments to provide a better measure of students’ capacity to read in authentic scenarios. Collectively, RfU researchers are studying the development of reading for understanding from prekindergarten through high school with the goal of creating new knowledge about what matters at each developmental stage in order for students to finish high school with sufficient reading skills for college and career. Each of these six teams has incorporated attention to aspects of language beyond vocabulary knowledge and several teams have published results that provide evidence of the potential of improved language skills for building reading comprehension. Abstracts for studies and publications to date can be found on the IES website.

In a recent article in Educational Psychology Review, my co-author Elizabeth Albro and I describe the purpose of the RfU Research Initiative, the goals of the six teams funded under the initiative, and progress made through 2014. As the work of the RfU Research Initiative comes to completion, the RfU researchers are positioned to make important contributions to what we know about the development of reading for understanding and how we can best improve it for all students. Expanded knowledge about the language skills that support reading for understanding and how to improve them will be a key component of this contribution. Stay tuned to Inside IES Research to learn more about what the teams are finding.

 

The Month in Review: July 2015

By Liz Albro, NCER Associate Commissioner of Teaching and Learning

Summer Conference Season

Many IES-funded researchers have been sharing the findings of their studies at academic conferences this past month.  Want to learn more? Lists of presentations describing IES-funded research at the Society for Text & Discourse and Society for the Scientific Study of Reading annual meetings are available on our conferences page.

A Busy Month for IES Research in the News

Have you visited our IES Research in the News page lately? It’s a great way to learn more about IES-funded research. Not only can you read about the new awards that have been recently made, but you can also learn about findings from recent studies. We do our best to keep up, but if we’re missing something, send us a note at IESResearch@ed.gov.

More Recognition for ED/IES SBIR Products

ED/IES SBIR-supported games by Triad Interactive Media (PlatinuMath) and Electric Funstuff won Gold at the Serious Play Conference. And ED/IES SBIR awardee Fluidity Software won first place in the “Best Performing Office Add-On” category for its FluidMath app, which teachers and students use to create dynamic math and physics formulas.

Summer Research Training Institute on Cluster-Randomized Trials in Education Sciences

Congratulations to the 29 participants who completed the ninth Summer Research Training Institute on cluster-randomized trials (CRTs) in education sciences!

The purpose of this training is to prepare current education researchers to plan, design, conduct, and interpret cluster-randomized trials. A tenth Institute will be held in summer 2016, so be sure to follow us on Twitter or subscribe to the IES Newsflash to get application information as soon as it is available. 

Please send any questions or comments to IESResearch@ed.gov.

Congratulations to Dr. Donald Compton and Colleagues at Vanderbilt University for Winning the Albert J. Harris Award!

By Sammi Plourde, NCSER Intern; Kristen Rhoads, NCSER Program Officer; and Becky McGill-Wilkinson, NCER Program Officer

IES-funded research by Dr. Compton and his colleagues was recently awarded the International Literacy Association’s (ILA) Albert J. Harris Award! ILA is an advocacy organization that publishes current research on literacy and provides resources for practitioners, students, and leaders involved in facilitating literacy development across the world. The Albert J. Harris Award is given annually to a recently published journal article or monograph that contributes to a better understanding of the prevention or measurement of learning disabilities or reading disabilities.

Picture of a teacher reading a book to four children

The winning article by Jennifer K. Gilbert, Donald L. Compton, Douglas Fuchs, Lynn S. Fuchs, Bobette Bouton, Laura A. Barquero, and Eunsoo Cho, entitled “Efficacy of a First-Grade Responsiveness-to-Intervention Prevention Model for Struggling Readers,” features findings from an NCSER-funded measurement study focused on identifying and intervening with struggling readers as early as first grade. The article describes the effects of intensive intervention within a multi-tiered prevention model. Struggling readers who were randomly assigned to receive an intensive, small-group intervention had better reading gains than students who received classroom instruction as usual. However, some students continued to struggle despite receiving the intensive intervention. Those students were then randomly assigned to receive the intensive intervention in a one-on-one format or to continue in a small-group format. Results indicated no differences in performance between the two formats. The researchers also found that more than half of the students who participated in the intervention failed to achieve average reading scores by the end of third grade. These findings suggest that students with persistent reading problems need intervention that begins as early as possible and spans multiple years. They also suggest that instruction for these students should be tailored to meet individual needs.

Dr. Compton and his colleagues are continuing this research with IES. They were funded by NCER to conduct a follow-up study to identify characteristics of children who begin elementary school with typical reading development but are later identified as having a reading disability. This work will provide information on how to guide instruction for students who have these characteristics.

Congratulations to Dr. Compton and his colleagues for making such an important contribution to identifying, preventing, and treating reading disabilities!


Questions? Comments? Please send them to IESResearch@ed.gov.