IES Blog

Institute of Education Sciences

Inside IES Special Interview Series: From University Research to Practice at Scale in Education

Over two decades, the National Center for Education Research and the National Center for Special Education Research at IES have built a knowledge base to inform and improve education practice. This work has also spurred the development of evidence-based tools, technological products, training guides, instructional approaches, and assessments. 

While some IES-supported interventions are used on a wide scale (hundreds of schools or more), we acknowledge that a “research to practice gap” hinders the uptake of more evidence-based interventions in education.  The gap refers to the space between the initial research and development in university laboratories and pilot evaluations in schools, and everything else that is needed for the interventions to be adopted as a regular practice outside of a research evaluation.

For many academic researchers, advancing beyond the initial stage of R&D and pilot evaluations is complex and often requires additional time, financing, and specialized expertise and support. For example, interventions often need further R&D to ready them for scale—whether to ensure that implementation is turnkey and feasible without researcher assistance, that interventions work consistently across divergent settings and populations, or that technology systems can process huge amounts of data across numerous sites simultaneously. Advancing from research to practice may also entail commercialization planning to address issues such as intellectual property, licensing, sales, and marketing, to facilitate dissemination of interventions from a university to the education marketplace, and to sustain them over time by generating revenue or securing other means of support.

Special Inside IES Research Interview Series

This winter and spring, Inside IES Research is publishing a series of interviews with the teams of researchers, developers, and partners who successfully advanced IES-funded education research from the university laboratory to practice in schools at scale. Collectively, the interviews illustrate a variety of models and approaches for scaling evidence-based interventions and for disseminating and sustaining the interventions over time.

Each interview will address a similar set of questions:

  • Was it part of the original plan to develop an intervention that could one day be used at scale in schools?
  • Describe the initial research and development that occurred. 
  • What role did the university play in facilitating the research to practice process? 
  • What other individuals or organizations provided support during the process?
  • Beyond the original R&D process through IES or ED grants, what additional R&D was needed to ready the intervention for larger scale use?
  • What model was used for dissemination and sustainability?
  • What advice would you provide to researchers who are looking to move their research from the lab to market? What steps should they take? What resources should they look for?

Check this page regularly to read new interviews.

We hope you enjoy the series.

This series is produced by Edward Metz of the Institute of Education Sciences.

Diversify Education Sciences? Yes, We Can!

In this blog post, Stephen Raudenbush discusses the University of Chicago’s successful efforts to diversify its IES-funded predoctoral training program. This post is the first in a series exploring issues of diversity, inclusion, and equity in the education sciences.

In 2015, the Committee on Education at the University of Chicago launched a national campaign to recruit a talented and diverse group of pre-doctoral fellows. With funding from the Institute of Education Sciences, we sought to train a new generation of social scientists from across the disciplines to bring rigorous methods of social science to bear on questions related to the improvement of education.

We’d had previous success in pre-doctoral training with IES support. Our fellows had a great track record conducting research and getting good jobs, but we were deeply unsatisfied that only 3 of 35 of those fellows were members of under-represented minority groups. This didn’t make sense, particularly in Chicago—a city where 90% of the public-school students are African American or Hispanic—and where our aim was to build a strong research-practice partnership.

Our campaign was quite successful. We now have a terrific team of 23 PhD fellows, including 9 who are African American or Hispanic. All are making excellent progress toward degrees in disciplines as varied as Comparative Human Development, Economics, Political Science, Psychology, Public Policy, Social Services Administration, and Sociology. We’re writing to share the five key strategies that underpinned our approach to improving student diversity in the education sciences.

Create a compelling intellectual argument for choosing education sciences. We invited prospective students to join us in an interdisciplinary research project focused on overcoming educational inequality. We organized the training around one question: “How can we improve the contribution of schooling to skills required for the labor market success of urban youth?” We reasoned that many of the most talented minority and non-minority scholars are deeply committed to answering this broad question and would be motivated to come to Chicago to study it. A plus for us is the University’s longstanding engagement with public schools in Chicago.

Hire a coordinator dedicated to recruitment. Faculty were totally committed to the recruitment goal, but they were too busy teaching, mentoring, doing research, and serving on departmental committees to oversee a major student recruitment campaign. So we hired a dedicated recruitment director to coordinate with prospective students and faculty and carry out many of the administrative tasks associated with recruitment. The recruitment director assigned every prospective student to a faculty member with kindred interests and followed up to see that faculty colleagues connected with these students.

Reach out to social networks that include diverse students. Within the university, we worked closely with officers at the University of Chicago who are focused on recruiting diverse students. Our faculty made use of personal connections, and we pooled information about people we know at Historically Black Colleges and Universities, Minority-Serving Institutions, and liberal arts colleges. We made it easy for students to express interest and connect with faculty and staff through our website. We also found that organizations such as the American Educational Research Association and the National Equity Project were happy to spread the word about our campaign.

Maximize faculty contact with prospective fellows – well before applications are due. Our faculty were heroes in following up with every promising prospective fellow. We think it’s key to make a phone call before admissions decisions are made and to encourage potentially interested and promising persons to apply. In this way, every person who is admitted will already have a history of communication with a faculty member. Continued communication builds trust and the sense of belonging that encourages young people to join the project. Having a diverse faculty helps; however, every faculty member, regardless of race, ethnicity, or gender, pitched in, and this united effort clearly paid off.

Build a welcoming culture. We encouraged all admitted students to visit before deciding what university to attend. We mobilized University funds to support travel and lodging. We encouraged the prospective fellows to meet each other and to meet our current doctoral students during these visits. The key is to convey to each student a true sense of belonging. We created many opportunities for small group discussions and social engagement to foster colleagueship and promote respect for diversity of perspectives. Our fellows run our weekly Education Workshop, which often showcases the work of minority scholars. Making this happen for our first cohort helped recruit our second cohort.

Learning from CTE Research Partnerships: Building a Collaborative Data Culture in South Dakota

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Austin Estes, Senior Policy Associate at Advance CTE, and Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The second interview was with Laura Scheibe of the South Dakota Department of Education and Marc Brodersen of REL Central at Marzano Research. [Note: this interview has been edited for length; you can find the full interview transcript here].

Could you both talk about the project(s) that you have worked on and your research questions? How did the relationship start, and who approached whom?

Marc – When we were doing needs sensing with the states in our region, particularly with South Dakota, CTE emerged as a pretty high priority area. We needed to determine what the research questions were, what questions we could actually address, and what data were available that could be used in those research projects. So, this work started off as a technical assistance project where we were working with South Dakota pretty closely and getting all of the relevant players around a table and going through and mapping their data. And it was quite a long process.

Laura – There’s huge support in South Dakota behind CTE, but there wasn’t state-level evidence behind why CTE is such a good thing for students. So, the value that Marzano provided to the project in helping us walk through “this is the data that can help you, this is the process that we are going to go through to help you get to the answer” has been incredibly helpful and not something that we, as a pretty small department of education, could ever have undertaken on our own.

Can you talk about what research questions you ultimately came to and where you are in the process of answering those?

Marc – We have three main questions: 1) What is the impact of being a CTE concentrator on high school graduation, two- and five-year postsecondary enrollment, and completion status? 2) What is the impact of being a CTE concentrator on two-year and five-year employment and quarterly wage status? 3) How do the two-year and five-year outcomes vary by the various CTE Career Clusters®?

Connecting education to workforce data is really difficult, and we’re talking about collecting data over a 5- to 10-year span for an individual student. Many state data systems don’t go back that far, or data systems have changed, so it’s difficult trying to identify one data system that has 10 or more years of data for an individual student. We’re making it work, but it takes some time and some finagling. We haven’t even begun to analyze the data so, unfortunately, we can’t talk about any preliminary findings.

What were some of the early roadblocks in building this relationship and starting to examine and compile some of the data?

Laura – One of the roadblocks was just getting everyone around the table and bought into the idea. We’re a fairly small state, so it wasn’t hard to reach out to my counterparts at the other agencies who would need to be involved, but this project was, and continues to be, something that is on top of the day-to-day work that we do. It’s not driven by any specific policy initiative but rather by everybody around the table acknowledging and recognizing that “yeah, this would be really useful for us.” But, in that sense, it’s hard to secure everyone’s commitment to the time it has taken, and still takes, to pull this off, and to make sure that we’ve got the right people in the room as well. We’ve involved not just the Board of Regents but the technical college system and the people with the workforce data.

Marc – Having somebody at the policy level, the data level and the leadership level in the room at the same time is almost essential, particularly when you’re at the brainstorming phase. You can have the leadership that’s going to say “yes, this is important, and I want you to devote time to this,” and then the data person is saying “well, that data just doesn’t exist,” and the policy person may not know about that piece. And having all three of those perspectives at the same time can save a lot of time and effort.

How do you plan to use this research project to further policy in South Dakota?

Laura – First and foremost, this particular project is demonstrating the value that CTE has for secondary students. This project pre-dated Perkins V [the Strengthening Career and Technical Education for the 21st Century Act], but as we’re moving into implementation of Perkins V full force in the coming calendar year, with the new requirements that Perkins places on states -- and therefore on schools -- to be an approved program, we’re seeing school districts question if it’s really worth it. This project is really coming in at a good time where we will hopefully have some data where we can say, “yes, CTE is worth it.” Being able to message that is hugely valuable from the perspective of a CTE Director in a state where almost every single public school district runs an approved program. Now that we’ve got Perkins V and the [comprehensive local] needs assessment, it will be just one more bit of evidence for schools to be able to examine whether they’re providing the best opportunities for our kids.

What advice would you give to other researcher/ State Director partners for conducting CTE research or establishing similar partnerships?

Marc – From my perspective, as far as establishing a partnership, I think face-to-face interactions are invaluable. It takes a while to build trust and establish a positive working relationship.

Laura – My advice to State Directors would be to really plan for it and make it a priority. Make it part of the day-to-day work; otherwise, I think the thread can get lost. I would also say getting that higher-level buy-in is really important. It’s important to make sure that you’ve got that policy-level partner to keep things moving along. The benefits will be there in the end; it just has to be woven into the day-to-day of what you’re doing in order to make it all come together.

Going through this process has helped me form partnerships with my colleagues in other agencies even more strongly than I had before. Just the exercise of having gone through all of that and understanding their work and their data and everything they do, having them understand my role and my constraints better, has just made us a more effective CTE/ workforce team in our state. As we move forward with Perkins V and WIOA [Workforce Innovation and Opportunity Act] state plans and all of this other stuff coming, it just benefits us and enables us to work more effectively and work faster now that we have those strong relationships. They were there before, but they’re definitely stronger now as a result of this project.

Marc – One of the things I thought was really neat was getting all these folks together and thinking deeply about data. It might not be the most exciting topic for a lot of folks, but going through the process gave everyone a better understanding of what they can do and how they might be able to work with others. And I think in the day-to-day, not everyone spends that much time thinking at the data and variable level. But doing that will increase everyone’s capacity to be able to do this kind of work moving forward.

One other thing to add just as a side note. Throughout this process, we also collaborated with Nancy Copa at the Common Education Data Standards (CEDS) when we were doing the data mapping piece. We did not officially map the South Dakota data to that, but we used the CEDS as kind of a template to provide us with a common dictionary to have these conversations across departments. And that was really useful. In fact, all of us – the different departments in South Dakota and the CEDS folks –co-presented at the last STATS-DC conference, which I thought was a very positive experience.

The full transcript of this interview can be accessed on Advance CTE’s website. Other blog posts in this series can be viewed here.

Building the Evidence Base for BEST in CLASS – Teacher Training to Support Young Learners with the Most Challenging Classroom Behavior

Classroom teachers of young children face a seemingly never-ending challenge – how to manage disruptive behavior while simultaneously teaching effectively and supporting the needs of every student in the classroom. Researchers at Virginia Commonwealth University and the University of Florida have received five IES research grants over the past decade – three through the National Center for Special Education Research (NCSER) and two from the National Center for Education Research (NCER) – to develop and test a model of training and professional development, including coaching, for early childhood and early elementary school teachers on how best to support children who engage in disruptive and otherwise challenging classroom behaviors.


With their first IES grant in 2008, Drs. Maureen Conroy and Kevin Sutherland developed the original BEST in CLASS model for early childhood teachers. BEST in CLASS - PK aims to increase the quantity and quality of specific instructional practices with young children (ages 3-5 years old) who engage in high rates of challenging behaviors, with the ultimate goal of preventing and reducing problem behavior. Professional development consists of a six-hour workshop that uses didactic and interactive learning activities supported by video examples and practice opportunities. Following the workshop, teachers receive a training manual and 14 weeks of practice-based coaching in the classroom.

The results of this promising development work led to a 2011 IES Efficacy study to test the impact of BEST in CLASS - PK on teacher practices and child outcomes. Based on positive findings from that Efficacy study, the team was awarded two additional Development and Innovation grants – one in 2016 to develop a web-based version of BEST in CLASS – PK to increase accessibility and scalability and another in 2015 to adapt BEST in CLASS – PK for early elementary school classrooms (BEST in CLASS – Elementary). Drs. Sutherland and Conroy are currently in the second year of an Efficacy study to test the impact of BEST in CLASS - Elementary to determine if the positive effects of BEST in CLASS in preschool settings are replicated in early elementary classrooms.

Written by Emily Doolittle, NCER Team Lead for Social Behavioral Research, and Jacquelyn Buckley, NCSER Team Lead for Disability Research

Learning from CTE Research Partnerships: How Michigan Built Trust with Researchers to Better Understand State Data

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Austin Estes, Senior Policy Associate at Advance CTE, and Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The first interview was with Jill Kroll of the Michigan Department of Education and Dan Kreisman of Georgia State University (and Director of CTEx). [Note: this interview has been edited for length; you can find the full interview transcript here].

 

Jill Kroll, Michigan Department of Education
Dan Kreisman, Georgia State University

 

The first question we have is about the projects that you work on together: what were some of the research questions you came up with, and how did you come to settle on those research questions?

Jill – I first connected with Dan and with Brian Jacob at University of Michigan when I saw Brian present to our P-20 council about some research that he was doing connecting the wage record data for five community colleges. I was like “Gee, is there any way you can do something similar with the statewide secondary student data?” And he said it was possible. So I worked within our department procedures to find out how we could go about establishing a relationship that would allow this opportunity.

Dan – That led to a whole bunch of other discussions of things that we thought were interesting. So, to say that there is a set of research questions is not the way I view our relationship. We talk with folks in Jill’s office regularly to hear what questions are pressing for them, and then we try to help facilitate answering those and then see where those lead us. I think one of the important things is we try to think about where there are policy levers, so we want to say “If we answer this question, how can the state or the districts use that information to further their mission of providing CTE programming to students in Michigan?”

Jill – I’ve been really happy with the extent to which Dan and the research team have consistently focused on the “so what?” Rather than focusing on vague research questions of interest only to other researchers, they have emphasized their interest in doing research that has practical application, that can be used by educators in the field.

Could you share an example of how you’ve been able to use some of this evidence and research to change policy, or at least to shape your understanding on some decisions that you’re making at the state level?

Jill – When we were starting to work on our Perkins V [the Strengthening Career and Technical Education for the 21st Century Act] state plan, we had a short time to determine what we wanted to consider for our secondary indicator of program quality. Because Brian, Dan, and their students had been working with this data for so many years, they had the capacity to very quickly do the matching and come up with an approximation for us about what postsecondary credit attainment would look like, and what strengths and weaknesses they saw in the data. It would have been really difficult for our office, or even multiple state agencies, to have been able to work that quickly and give it the critical analysis that they did.

The other thing they did when we were making the decision for that indicator is look at the data that we had for work-based learning and tell us what could be done with it. What came out of that was that the data was not in any form that could be analyzed (text and PDFs). This was really revealing to our State Director Brian Pyles, and it led him to set a policy that we are going to build a consistent way of collecting data on work-based learning. So that is another piece where it influenced practice and policy. One of the most exciting and valuable things that I find about the partnership is that Dan and the other researchers have a lot more capacity to analyze the data in a way that we just don’t have the time to do. Sometimes we don’t have the expertise, and sometimes we just don’t look at the data in the same way.

Dan – And there’s a flip side that without their input, we often are looking at data and can’t make heads or tails of something. And we can get on the phone or write an email to someone over there and say “Hey, we’re seeing this thing. Can you tell me what that means?” And they will come back with “Oh, the system changed” or “There was this one policy,” and “Here’s what you have to do to make it fit everything else.” And this happens all the time. We would be completely lost without this open channel that we have to their office.

I think it’s important not to dismiss the power of good descriptive work. Lots of times, the questions that states are grappling with can often be illuminated with some really careful and good descriptive work. You can say, “This is what we’re seeing, this is the big picture,” if you step back for a minute, and that information lots of times has been as valuable as the stuff we try to do that is more causally oriented in our research.

Jill – I agree, and I want to follow up on the whole issue of how important trust is. I cannot emphasize enough how important it is to me that Dan and the other researchers come to us with those questions, that they check in with us. That’s absolutely critical. Anyone who works with any kind of data knows that it’s just so complex. If you link tables wrong, or misunderstand a data field, you can come to a completely wrong decision. So that communication and that interaction and trust are key to accurate outcomes.

As you’re both looking ahead, what’s next on the agenda? What are some of the research questions and priorities you have for this partnership?

Dan – Number one is tracking students into the labor market. That’s our biggest and most outstanding question. And the degree to which CTE programs are preparing students for college and the labor market and careers. In terms of other projects, one of the things we’re interested in is technical assessments. We’re also part of a consortium of several states – that’s the CTEx group. We meet annually together, and that allows us to harmonize things across states to see how trends are similar, how enrollment rates work, all sorts of different questions across multiple states.

Jill – One of the things we’re talking about right now is that we don’t have, in an accessible form, data on access to a particular program. We know that career centers serve certain districts, but if someone asked, “If student A is going to Central High School, what programs do they have access to?” we don’t have a good way of answering that at the moment. We’ve had a couple of discussions about how we can work together to build basically a dataset that clarifies that. That would be mutually beneficial and would take resources from both in order to do something like that.

Thinking back on this partnership, is there any advice you would give to other State Directors or CTE researchers?

Dan – Building a strong relationship is the first thing you have to do. And part of that is spending time face to face talking about questions, moving around ideas, looking at data together. We had the benefit of a long windup period. We spent at least a year just talking about questions and putting together data before we even started doing any analyses. We also had buy-in from Jill’s office up and down the line from folks who were doing the research to people who were in policymaking roles. And without all of that, none of this would even have been possible.

And the second part is to not downplay the value of just providing good information. A lot of us on the research side don’t realize how little time folks in the state offices have to take a step back and say, “What’s going on with our data? Let’s look at the big picture.” And one of the things we can provide them is just that big picture, handed to them in a digestible way. Doing that early on is a really good first step toward building that trust, because they see the value of what you can do. And then you can start to get into more difficult or longer-term questions.

Jill – The first advice I would give is: Do it! Partner with researchers. I can’t say enough positive about it. The second is: Follow department procedures and be transparent with department leadership. You know that windup might be really, really slow while you go through the channels that you need to in your department to do things by the book, but I think it pays off in the long run.

My third one is: Be transparent and open with school districts. Share what you’re doing and invite their input. Anybody who works with state data would probably know, you’re always a little hesitant about what the public would think about this use of data. The way that Dan and the postdocs and graduate students have openly shared the work that they’ve done with our CTE administrators has really helped, in that I have not gotten any doubt from districts.

The full transcript can be accessed in Advance CTE’s Learning that Works Resource Center. Other blog posts in this series can be viewed here.