IES Blog

Institute of Education Sciences

The PI Meeting in 140 Characters

By Wendy Wei, Program Assistant, National Center for Education Research

How can practitioners and policymakers apply education research to their everyday work if they never hear about it or do not understand it? Communicating and disseminating research findings plays an integral role in promoting the education sciences and advancing the field.

That is why we made communication and dissemination a major theme at the IES Principal Investigators’ Meeting held earlier this month (December 10-11). The two-day meeting in Washington, D.C., featured five sessions that focused on communications – ranging from data visualization techniques to effective dissemination strategies to hearing journalists’ perspectives on how to share scientific results with the general public.

There was a lot of talk about social media during the meeting and plenty of tweeting about the presentations. We used the Twitter hashtag #IESPIMtg to foster an ongoing conversation for meeting attendees and to share findings that emerged from sessions. Any tweet that included #IESPIMtg was automatically pooled together, generating a live Twitter feed that was on display in the lobby throughout the meeting.

You can see all of the #IESPIMtg tweets online, but here are some highlights:

"There is a tremendous sense of urgency to bridge the gap between research and practice..." --John B King #IESPIMtg

— Leah Wisdom (@lifelnglearner) December 10, 2015

.@StanfordEd's Sean Reardon: Good partnership work can lead to new knowledge, change policy+practice, improve data quality #IESPIMtg

— Bill Penuel (@bpenuel) December 11, 2015

#IESPIMtg Practitioner partners play a critical role in making sense of data and analyses in RPPs.

— Jennifer Russell (@Jenn_L_Russell) December 10, 2015

And we can get a little bit meta now…communicating about how to communicate:

Hirsh-Pasek & Golinkoff urges researchers to create "'edible science' that is accessible, digestible and usable." #IESPIMtg

— Tomoko Wakabayashi (@twakabayashi264) December 10, 2015

Awesome presentation on #DataVisualization by @jschwabish: Show the data, reduce the clutter, stop distracting attention. #IESPIMtg

— Rudy Ruiz (@RudyRuiz_BMore) December 10, 2015

.@KavithaCardoza Explaining your research--Don't think of it as "dumbing down." Think of it as simplifying. #IESPIMtg

— Dana Tofig (@dtofig) December 11, 2015

And, of course, what's Twitter without a little fun? When we tweeted this picture...

The poster session is going strong. Principal investigators present findings from #iesfunded research. #IESPIMtg

— IES Research (@IESResearch) December 10, 2015

...Chris Magnuson, Director of Innovation for Live It, Learn It, posted this reply: 

@IESResearch careful...photo looks like it was taken on Death Star! May the force be with all grantees! #SBIR #IES

— Chris Magnuson (@cromagnuson) December 10, 2015

The National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER) are committed to actively communicating the exciting findings of NCER- and NCSER-funded work and engaging the general public with that research. Over the past few years, we have been active on Twitter (you can follow us @IESResearch), and this past year, we launched our blog (the very one you are reading!). These two platforms have provided us with an outlet to share research findings, provide updates about events and deadlines, and connect with audiences we otherwise might not reach.

For those of you who could not make the PI meeting, videos will be posted on the conference website in about a month. So stay tuned!

We hope you’ll continue the conversation started at the PI meeting by following us on Twitter at @IESResearch or sharing your thoughts with us at IESResearch@ed.gov.

 

IES Honors Statistician Nathan VanHoudnos as Outstanding Predoctoral Fellow

By Phill Gagne and Katina Stapleton, NCER Program Officers

Each year, IES recognizes an outstanding fellow from its Predoctoral Interdisciplinary Research Training Programs in the Education Sciences for academic accomplishments and contributions to education research. The 2014 winner, Dr. Nathan VanHoudnos, completed his Ph.D. at Carnegie Mellon University and wrote his dissertation on the efficacy of the Hedges correction for unmodeled clustering. Nathan is currently a postdoctoral fellow at Northwestern University. In this blog, Nathan provides insights on becoming an education researcher and on research study design.

How did you become interested in education research?

I was born into it. Before he retired, my father was the Director of Research for the Illinois Education Association. Additionally, my grandparents on my mother's side were both teachers. 

 

As a statistician, how do you explain the relevance of your research to education practitioners and policy-makers?

I appeal to the crucial role biostatisticians play in the progress of medical research. Doctors and medical researchers are able to devote their entire intellectual capacity towards the development of new treatments, while biostatisticians are able to think deeply about both how to test these treatments empirically and how to combine the results of many such studies into actionable recommendations for practitioners and policymakers. I aim to be the education sciences analogue of a biostatistician: someone whose career success is judged on (i) the technical merits of the new methodology I have developed and (ii) the usefulness of that methodology to the field.

Your research on the Hedges correction suggests that many education researchers mis-specify their analyses for clustered designs. What advice would you give researchers on selecting the right analyses for clustered designs? 

My advice is to focus on the design of the study. If the design is wrong, then the analysis that matches the design will fail, and it is likely that no re-analysis of the collected data will be able to recover from the initial mistake. For example, a common design error is randomizing teachers to experimental conditions, but then assuming that how the school registrar assigned students to classes was equivalent to the experimenter randomizing students to classes. This assumption is false. Registrar-based student assignment is a kind of group-based, or clustered, random assignment. If this error is not caught at the design stage, the study will necessarily be underpowered because the sample size calculations will be off. If the error is not caught at the publication stage, the hypothesis test for the treatment effect will be anti-conservative, i.e., even if the treatment effect is truly zero, the test statistic is still likely to be (incorrectly!) statistically significant. The error will, however, be caught if the What Works Clearinghouse decides to review the study. But applying the Hedges correction will not fix the design problem. The corrected test statistic will, at best, have low power, just like a re-analysis of the data would. At worst, the corrected test statistic can have nearly zero power. There is no escape from a design error.
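To make the anti-conservativeness concrete, here is a minimal simulation sketch (ours, not from the interview; the cluster sizes, intraclass correlation, and replication count are illustrative assumptions). It generates data with a true treatment effect of zero under classroom-level random assignment, then analyzes the students as if they had been individually randomized. The naive test rejects far more often than the nominal 5 percent rate.

```python
# Illustrative simulation: clustered assignment analyzed as if students were
# individually randomized. All parameter values are assumptions for the example.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def naive_rejects(n_classes=20, class_size=25, icc=0.2, alpha=0.05):
    # Whole classrooms are assigned to treatment or control; true effect = 0.
    treat = np.repeat(np.arange(n_classes) % 2, class_size)
    # Classroom-level random effects induce the intraclass correlation (ICC).
    class_effect = rng.normal(0.0, np.sqrt(icc), n_classes)
    student_noise = rng.normal(0.0, np.sqrt(1.0 - icc), n_classes * class_size)
    y = np.repeat(class_effect, class_size) + student_noise
    # Naive analysis: student-level t-test that ignores the clustering.
    _, p = stats.ttest_ind(y[treat == 1], y[treat == 0])
    return p < alpha

type_i_error = np.mean([naive_rejects() for _ in range(2000)])
print(f"Empirical Type I error of the naive test: {type_i_error:.2f} (nominal 0.05)")
```

A cluster-aware analysis (for example, aggregating to classroom means or fitting a multilevel model) keeps the rejection rate near 5 percent; corrections applied after the fact restore a roughly correct error rate but only with the reduced power the interview describes.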


To give a bit of further, perhaps self-serving advice, I would also suggest engaging your local statistician as a collaborator. People like me are always looking to get involved in substantively interesting projects, especially if we can get involved at the planning stage of the project. Additionally, this division of labor is often better for everyone: the statistician gets to focus on interesting methodological challenges and the education researcher gets to focus on the substantive portion of the research. 

How has being an IES predoc and now an IES postdoc helped your development as a researcher?

This is a bit like the joke where one fish asks another "How is the water today?" The other fish responds "What's water?" 

I came to Carnegie Mellon for the joint Ph.D. in Statistics and Public Policy, in part, because the IES predoc program there, the Program for Interdisciplinary Education Research (PIER), would both fund and train me to become an education researcher. The PIER program shaped my entire graduate career. David Klahr (PIER Director) gave me grounding in the education sciences. Brian Junker (PIER Steering Committee) taught me how to be both methodologically rigorous and yet still accessible to applied researchers. Sharon Carver (PIER co-Director), who runs the CMU lab school, built in a formal reflection process for the "Field-Based Experience" portion of our PIER training. That essay was, perhaps, the most cathartic thing I have ever written, in that it helped set me on my career path as a statistician who aims to focus on education research. Joel Greenhouse (affiliated PIER faculty), who is himself a biostatistician, chaired my thesis committee. It was his example that refined the direction of my career: I wish to be the education sciences analogue of a biostatistician.

The IES postdoc program at Northwestern University, where I am advised by Larry Hedges, has been very different. Postdoctoral training is necessarily quite different from graduate school. One thread is common, however: the methodology I develop must be useful to applied education researchers. Larry is, as one might suppose, quite good at focusing my attention on where I need to make technical improvements to my work, but also on how I might better communicate my technical results and make them accessible to applied researchers. After only a year at Northwestern, I have grown considerably in both my technical and communication skills.

What career advice would you give to young researchers?

Pick good mentors and heed their advice. To the extent that I am successful, I credit the advice and training of my mentors at Carnegie Mellon and Northwestern. 


Comments? Questions? Please write to us at IESResearch@ed.gov.

Experts Discuss the Use of Mixed Methods in Education Research

By Corinne Alfeld and Meredith Larson, NCER Program Officers

Since IES was founded more than a dozen years ago, it has built a reputation for funding rigorous research to measure the causal effects of education policies and programs. While this commitment remains solid, we also recognize the value of well-designed qualitative research that deepens understanding of program implementation and other educational processes and that generates new questions or hypotheses for study. In this blog post, we highlight the outcomes of a recent meeting we hosted on the use of mixed methods – that is, studies that combine qualitative and quantitative methods – and share some of the ways in which our grantees and other researchers incorporate mixed methods into their research.

On May 29, 2015, 10 researchers with experience designing and conducting mixed methods research met with staff from the two IES research centers in a technical working group (TWG) meeting. The TWG members shared their experiences carrying out mixed methods projects and discussed what types of technical assistance and resources we could provide to support the integration of high-quality mixed methods into education research. There was consensus among the TWG members that qualitative data is valuable, enriches quantitative data, and provides insight that cannot be gained from quantitative research alone. Participants described how mixed methods are currently used in education research, proposed potential NCER and NCSER guidance and training activities to support the use of high-quality mixed methods, and offered suggestions for researchers and the field. Below are just a few examples that were shared during the meeting:

  • Dr. Carolyn Heinrich and colleagues used a longitudinal mixed method study design to evaluate the efficacy of supplemental education services provided to low-income students under No Child Left Behind. One of the critical findings of the study was that there was substantial variation across school districts in what activities were included in an hour of supplemental instruction, including (in some cases) many non-instructional activities. This was revealed as the team examined the interview data describing what activities lay behind the shared metric of an hour of instructional time. Having that level of information provided the team with critical insights as they examined the site-by-site variation in efficacy of supplemental education services. Dr. Heinrich emphasized the need for flexibility in research design because the factors affecting the impact of an intervention are not always apparent in the design phase. In addition, she reminded the group that while statistical models provide an average impact score, there is valuable information included in the range of observed impacts, and that this variability is often best understood with information collected using in-depth field research approaches.
  • Dr. Mario Small used mixed methods research to examine social networks in childcare centers in New York City. Using observational methods, he discovered that variations in the level of networking among mothers depended on the individual child care center, not the neighborhood. He hypothesized that child care centers that had the strictest rules around pick-up and drop-off, as well as more opportunities for parent involvement (such as field trips), would have the strongest social networks. In such settings, parents tend to be at the child care center at the same time and, thus, have more interaction with each other. Dr. Small tested the hypotheses using analysis of survey and social network data and found that those who developed a social network through their child care center had higher well-being than those who did not. He concluded from this experience that without the initial observations, he would not have known that something small, like pick-up and drop-off policies, could have a big effect on behavior.
  • Dr. Jill Hamm described a difficult lesson learned about mixed methods “after the fact” in her study, which was funded through our National Research Center on Rural Education Support. In planning to launch an intervention to be delivered to sixth-grade teachers to help adolescents adjust to middle school, she and her colleagues worked with their school partners to plan for possible challenges in implementation. However, because some of the qualitative data collected in these conversations were not part of the original research study – and, thus, not approved by her Institutional Review Board – the important information they gathered could not be officially reported in publications of the study’s findings. Dr. Hamm encouraged researchers to plan to use qualitative methods to complement quantitative findings at the proposal stage to maximize the information that can be collected and integrated during the course of the project.
  • In a study conducted by Dr. Tom Weisner and his colleagues, researchers conducted interviews with families of children with disabilities to determine the level of “hassle” they faced on a daily basis and their perceptions of sustainability of their family’s routines. Findings from these interviews were just as good at predicting family well-being as parental reports of coping or stress on questionnaires. The findings from the analysis of both the qualitative and quantitative data collected for this study enhanced researchers’ understanding of the impact of a child’s disability on family life more than either method could have alone. Dr. Weisner observed that the ultimate rationale of mixed methods research should be to gather information that could not have been revealed without such an approach. Because “the world is not linear, additive, or decontextualized,” he suggested that the default option should always be to use mixed methods and that researchers should be required to provide a rationale for why they had not done so, where feasible.

Curious to learn more about what was discussed? Additional information is available in the meeting summary.

Comments? Questions? Please email us at IESResearch@ed.gov.