Inside IES Research

Notes from NCER & NCSER

DE21: A Researcher-Practitioner-Policymaker Conference on Dual Enrollment

Dual enrollment improves students’ college going and postsecondary success, but practitioners need help understanding the impact of dual enrollment and learning the strategies associated with effective and equitable implementation. Under the auspices of the IES-funded Evaluation of Career and College Promise (CCP) project, the North Carolina Community College System suggested hosting a conference to build knowledge and capacity in the field about dual enrollment. The Evaluation of CCP is a partnership among the SERVE Center at the University of North Carolina at Greensboro, the North Carolina Department of Public Instruction, the North Carolina Community College System, and the RAND Corporation. In addition to its research goals, which involve examining the implementation, impact, and cost of North Carolina’s dual enrollment program, the project also aims to build capacity for the partner agencies and for practitioners. As part of meeting this last goal, the project recently hosted a conference, Dual Enrollment: Accelerating Educational Attainment (DE21), with over 1,000 registrants from North Carolina and around the country.

Julie Edmunds, the project’s principal investigator, discusses the DE21 conference.

Why host a conference on dual enrollment?

This was the brainchild of our partners at the North Carolina Community College System. They wanted to create an opportunity for researchers and practitioners to gather and share lessons learned from their respective work. The NC Community College System expected that we would be learning a lot from our project that we would want to share; they also knew that the people in the trenches had many valuable insights that could help bridge the gap between research and practice. Because existing research shows that not all groups of students have the same access to dual enrollment, the project team decided collectively that the conference should have a strong focus on equity and should serve as a way to communicate and discuss strategies that support it.

What happened at the conference?

We had a total of 40 sessions across two full days. There were dynamic keynote speakers, including Karen Stout from Achieving the Dream, and panels that discussed dual enrollment from the policy, research, student, and parent perspectives. Although there was a strong North Carolina focus, there were also sessions from other states, including Massachusetts, Texas, Indiana, and Ohio.

Conference presentations were organized into five themes: expanding access and equity, fostering college attainment, ensuring a successful transition to college and careers, preparing students for dual enrollment, and supporting success in dual enrollment courses.

The CCP study team presented findings from our evaluation of North Carolina’s dual enrollment pathways. We looked at individual- and school-level factors associated with dual enrollment participation, such as student demographics, school size, locale, percentage of students from underrepresented minority groups, academic achievement, and students’ workforce orientation. Student socioeconomic level did not affect participation in dual enrollment. We also presented preliminary impacts of North Carolina’s three dual enrollment pathways (college transfer, Career and Technical Education, and Cooperative Innovative High Schools or early colleges). Results from all three pathways showed that CCP participants had better high school outcomes, such as higher high school graduation rates, and were more likely to enroll in postsecondary education. In addition, there were multiple sessions sharing research results from other states.

There were many presentations from practitioners on topics such as rigorous instruction, advising, participation of students with disabilities, creating strong secondary-postsecondary partnerships, and using high school teachers as college instructors. I need to give a huge shoutout to Katie Bao from the NC Community College System, who shepherded us all through the conference planning and implementation process.

What was the impact of the pandemic?

When we originally planned for the conference, we thought it would be in person. After the pandemic hit, we decided (as many other organizations did) to host it virtually. This made the conference much more accessible to a national audience, and we had participants and presenters from around the country.

What if someone missed the conference?

Another benefit of a virtual conference is that we are able to share all the sessions from the meeting. Please visit our site on YouTube to watch the conference sessions.

What comes next?

Our study work continues, and we will share the results in a variety of ways, including through briefs and journal articles. We are also planning to host a second conference in 2023 and expect that it will have a virtual component so that it can continue to be available to a national audience.


Dr. Julie Edmunds is a Program Director at the SERVE Center at the University of North Carolina at Greensboro. In addition to being the PI on the Evaluation of Career and College Promise, she is one of the leading researchers on early college, a model that combines high school and college.

Student-Led Action Research as a School Climate Intervention and Core Content Pedagogy

Improving the social and emotional climate of schools has become a growing priority for educators and policymakers in the past decade. The prevailing strategies for improving school climate include social and emotional learning, positive behavioral supports, and trauma-informed approaches. Many of these strategies foreground the importance of giving students a voice in intervention, as students have unique expertise about their own social and emotional environments.

Parallel to this trend has been a push toward student-centered pedagogical approaches in high schools that are responsive to students’ cultural backgrounds and that promote skills aligned with the demands of the modern workplace, such as critical thinking, problem-solving, and collaboration. Culturally responsive and restorative teaching and problem- and project-based learning are prominent movements. In this guest blog, Dr. Adam Voight at Cleveland State University discusses an ongoing IES-funded Development and Innovation project in Cleveland, Ohio, that aims to develop and document the feasibility of a school-based youth participatory action research intervention.

 

Our project is exploring how youth participatory action research (YPAR) may help realize two objectives: school climate improvement and culturally restorative, engaged learning. YPAR involves young people leading a cycle of problem identification, data collection and analysis, and evidence-informed action. It has long been used in out-of-school and extracurricular settings to promote youth development and effect social change. We are field testing its potential to fit within more formal school spaces.

Project HighKEY

The engine for our project, which we call Project HighKEY (High-school Knowledge and Education through YPAR), is a design team composed of high school teachers and students, district officials, and university researchers. It grew out of the Cleveland Alliance for Education Research, a research-practice partnership between the Cleveland Metropolitan School District, Cleveland State University, and the American Institutes for Research. The design team meets monthly to discuss YPAR theory, examine its fit with high school curricula and standards, and plan YPAR field tests in schools. We have created a crosswalk between the documented competencies students derive from YPAR and Ohio’s high school standards in English language arts (ELA), mathematics, science, and social studies. For example, one state ELA standard is “Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence”; through YPAR, students collect and analyze survey and interview data and use their findings to advocate for change related to their chosen topic. A state math standard is “Interpret the slope and the intercept of a linear model in the context of data,” and this process can be applied to survey data students collect through YPAR, making an otherwise abstract activity more meaningful to students.
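
To make the math example concrete, here is a minimal Python sketch of what that standard asks students to do: fit a linear model to survey data and interpret its slope and intercept in context. The data and variable names are hypothetical, invented purely for illustration; this is not code or data from Project HighKEY.

    # Hypothetical YPAR-style survey data: weekly hours of extracurricular
    # participation (x) and self-reported school belonging on a 1-10 scale (y).
    import numpy as np

    hours = np.array([0, 1, 2, 3, 5, 6, 8, 10], dtype=float)
    belonging = np.array([4, 5, 5, 6, 7, 7, 8, 9], dtype=float)

    # Least-squares fit of the linear model: belonging = slope * hours + intercept
    slope, intercept = np.polyfit(hours, belonging, 1)

    print(f"slope = {slope:.2f}")          # predicted change in belonging per added hour
    print(f"intercept = {intercept:.2f}")  # predicted belonging at zero hours

    # Interpreting in context, as the standard asks: the intercept is the
    # model's prediction for students who report no extracurricular hours,
    # and the slope is the predicted gain for each additional weekly hour.

Students could produce the same fit with a spreadsheet trendline; the value of the exercise is the interpretation step, which ties an abstract slope and intercept to a question the students chose themselves.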

Assessing the Effectiveness of YPAR

Remaining open-minded about the various ways in which YPAR may or may not fit different high school courses, we are currently testing its implementation in a pre-calculus course, a government course, an English course, and a life-skills course. For example, a math teacher on our design team has built her statistics unit around YPAR. Students in three sections of the course have worked in groups of two or three to identify an issue and create a survey that is being administered to the broader student body. The issues they chose include a lack of extracurricular activities, poor school culture, and unhealthy breakfast and lunch options. After the winter holiday, their survey data will serve as the basis for learning about representing data with plots, distributions, measures of center, frequencies, and correlation. Our theory is that students will be more engaged when using their own data, on topics of their choosing, toward the goal of making real change. Across all of our project schools, we are monitoring administrative data, student and teacher survey data, and interview data to assess the feasibility, usability, and student and school outcomes of YPAR.

Impact of COVID-19 and How We Adapted

We received notification of our grant award in March 2020, the same week that COVID-19 shut down K-12 schools across the nation. When our project formally began in July 2020, our partner schools were planning for a wholly remote school year, and we pivoted to hold design team meetings virtually and loosen expectations for teacher implementation. Despite these challenges, several successful YPAR projects during that first year—all of which were conducted entirely remotely—taught all of us much about how YPAR can happen in online spaces. This school year, students and staff are back to in-person learning, but, in addition to the ongoing pandemic, the crushing teacher shortage has forced us to continue to adapt. Whereas we once planned our design team meeting during the school day, we now meet after school due to a lack of substitute teachers, and we use creative technology to allow for mixed virtual and in-person attendance. Our leadership team is also spending a great deal of time in classrooms with teachers to assist those implementing for the first time. Our goal is to create a resource that teachers anywhere can use to incorporate YPAR into their courses. The product will be strengthened by the lessons we have learned from doing this work during these extraordinary times and the resulting considerations for how to deal with obstacles to implementation.


Adam Voight is the Director of the Center for Urban Education at Cleveland State University.

For questions about this grant, please contact Corinne Alfeld, NCER Program Officer, at Corinne.Alfeld@ed.gov.

Working to Understand the Policy Process in the Development of Michigan’s Read by Grade Three Law

In recent decades, there has been an emphasis on quantitative, causal research in education policy. These methods are best suited to answering questions about the effects of a policy and whether it achieved its intended outcomes. While the question “Did it work?” remains critical, there is also a need for research that asks, “Why did it work? For whom? In what contexts?” To answer these types of questions, researchers must incorporate rigorous qualitative methods into their quantitative studies. Education research organizations such as the Association for Education Finance and Policy have explicitly invited proposals using qualitative and mixed methodologies in an effort to elevate research addressing a range of critical education policy questions. Funding organizations, including IES, encourage applicants to incorporate qualitative methods into their research. In this guest blog, Amy Cummings, Craig De Voto, and Katharine Strunk discuss how they are using qualitative methods in their evaluation of a state education policy.

 

In our IES-funded study, we use qualitative, survey, and administrative data to understand the implementation and impact of Michigan’s early literacy law, the Read by Grade Three Law. Like policies passed in 19 other states, the Read by Grade Three Law aims to improve K-3 students’ literacy skills and mandates retention for those who do not meet a predetermined benchmark on the state’s third-grade English language arts assessment. Although the retention component of these policies remains controversial, similar laws are under consideration in several other states, including Alaska, Kentucky, and New Mexico. Below are some of the ways we have integrated qualitative methods into our evaluation study to better understand the policy process behind the Read by Grade Three Law.

Collecting qualitative sources helped us understand how the policy came to be, which in turn helped us structure our data collection for examining the law’s implementation and subsequent effects. In our first working paper stemming from this study, we interviewed 24 state-level stakeholders (policymakers, state department of education officials, early literacy leaders) involved in the development of the law and coded state policy documents related to early literacy to assess the similarity between Michigan’s policy and those of other states. Understanding the various components of the law and how they ended up in the policy ensured that we asked educators about their perceptions and implementation of these components in the surveys that are also part of our evaluation. For example, because our interviews made clear how controversial the inclusion of the retention component was during the law’s development, we included survey questions to assess educators’ perceptions and intended implementation of this component. The interviews also confirmed the importance of our plan to use state administrative retention and assessment data to evaluate the effect of retention on student literacy outcomes.

To trace the Read by Grade Three Law’s conception, development, and passage, we analyzed these qualitative data using two theories of the policy process: the Multiple Streams Framework (MSF) and policy transfer. MSF posits that policy issues emerge on government agendas through three streams: problem, policy, and political. When these streams join, a policy window opens, during which there is a greater opportunity to pass legislation. Policy transfer, meanwhile, highlights how policies enacted in one place are often used in the development of policies in another.

We found that events in the problem and political streams created conditions ripe for the passage of an early literacy policy in Michigan:

  • A national sentiment around improving early literacy, including a retention-based third-grade literacy policy model that had been deemed successful in Florida
  • A pressing problem, evidenced by the state’s consistently below-average fourth-grade reading scores on the National Assessment of Educational Progress
  • A court case addressing persistently low test scores in a Detroit-area district
  • Previous attempts by the state to improve early literacy

As a result of these events, policy entrepreneurs—those willing to invest resources to get their preferred policy passed—took advantage of political conditions in the state and worked with policymakers to advance a retention-based third-grade literacy policy model. The figure below illustrates interviewee accounts of the Read by Grade Three Law’s development. Our policy document analysis further reveals that Michigan’s and Florida’s policies are very similar, diverging on only nine of the 50 elements we coded.

[Figure: interviewee accounts of the development of the Read by Grade Three Law]

Although this study focuses on the development and passage of Michigan’s early literacy law, our findings highlight both practical and theoretical elements of the policy process that can be useful to researchers and policymakers. We show how particular conditions, coupled with the efforts of policy entrepreneurs, spurred Michigan’s consideration of such a policy. It is conceivable that many state education policies beyond early literacy have taken shape under similar circumstances: a national sentiment combined with influential brokers outside government. In this way, our mixed-methods study provides a practical model of how these elements can come together to enact policy change more broadly.

From a theoretical standpoint, this research also extends our understanding of the policy process by showing that MSF and the theory of policy transfer can work together. We learned that policy entrepreneurs can play a vital role in transferring policy from one place to another by capitalizing on conditions in a target location and coming with a specific policy proposal at the ready.

There is, of course, more to be learned about the intersection of different theories of the policy process, as well as about how external organizations, as opposed to individuals, operate as policy entrepreneurs. As the number of education advocacy organizations continues to grow and these groups become increasingly active in shaping policy, this will be an exciting avenue for researchers to continue to explore.

This study is just one example of how qualitative research can be used in education policy research and shows how engaging in such work can be both practically and theoretically valuable. The most comprehensive evaluations will use different methodologies in concert with one another to understand education policies, because ultimately, how policies are conceptualized and developed has important implications for their effectiveness.


Amy Cummings is an education policy PhD student and graduate research assistant at the Education Policy Innovation Collaborative (EPIC) at Michigan State University (MSU).

Craig De Voto is a visiting research assistant professor in the Learning Sciences Research Institute at the University of Illinois at Chicago and an EPIC affiliated researcher.

Katharine O. Strunk is the faculty director of EPIC, the Clifford E. Erickson Distinguished Chair in Education, and a professor of education policy and, by courtesy, economics at MSU.

How Remote Data Collection Enhanced One Grantee’s Classroom Research During COVID-19

Under an IES grant, Michigan State University, in collaboration with the Michigan Department of Education, the Michigan Center for Educational Performance and Information, and the University of Michigan, is assessing the implementation, impact, and cost of Michigan’s “Read by Grade 3” law, which is intended to improve early literacy outcomes for Michigan students. In this guest blog, Dr. Tanya Wright and Lori Bruner discuss how they quickly pivoted to a remote data collection plan when COVID-19 disrupted their initial research plan.

The COVID-19 pandemic began while we were planning a study of early literacy coaching for the 2020-2021 academic year. It soon became abundantly clear that restrictions on in-person research would pose a major hurdle for our research team, because we had planned to enter classrooms and record videos of literacy instruction in the fall. We faced a difficult choice: we could pause the study until it became safer to visit classrooms, missing the opportunity to learn about literacy coaching and in-person classroom instruction during the pandemic, or we could quickly pivot to a remote data collection plan.

Our team chose the second option. Several technologies are available for remote data collection. We chose a device known as the Swivl: a robotic mount that holds a tablet or smartphone to record video, sits on a 360-degree rotating platform that works in tandem with a handheld or wearable tracker, and pairs with an app that instantly uploads videos to cloud-based storage for easy access.

Over the course of the school year, we captured over 100 hours of elementary literacy instruction in 26 classrooms throughout our state. While remote data collection looks and feels very different from visiting a classroom to record video, we learned that it offers many benefits to researchers and educators alike. We also learned a few important lessons along the way.

First, we learned that remote data collection provides greater flexibility for both researchers and educators. In our original study design, we planned to hire data collectors to visit classrooms, which restricted our recruitment to schools within a reasonable driving distance of Michigan State University (MSU). Recording devices, however, allow us to capture video anywhere, including rural areas of our state that are often excluded from classroom research because of their remote locations. Furthermore, we found that purchasing and shipping equipment to schools costs significantly less than paying for travel and for people’s time to visit classrooms. In addition, using devices in place of data collectors allowed us to adapt easily to last-minute schedule changes and to offer teachers the option of recording video over multiple days to accommodate shifts in instruction due to COVID-19.

Second, we discovered that we could capture more classroom talk than with a typical video camera. After some trial and error, we settled on a device with three external wireless microphones: one for the teacher and two additional microphones placed around the classroom. Not only did the extra microphones record audio beyond what the teacher was saying, but we could also isolate each microphone during data analysis to hear what was happening in specific areas of the classroom (even when the teacher and children were wearing masks). We also purchased an additional wide-angle lens, which clipped over the camera on our tablet and allowed us to capture a wider view of the classroom.

Third, we found remote data collection to be less intrusive than sending a research team into schools. The device is compact and can be placed on any flat surface in the classroom or mounted on a basic tripod. The teacher can wear the microphone on a lanyard, where it doubles as a hands-free tracker that signals the device to rotate and follow the teacher’s movements automatically. At the end of the lesson, the video uploads to a password-protected storage cloud at the touch of a button, making it easy for teachers to share videos with our research team. We then download the videos to the MSU server and delete them from our cloud account. This setup allowed us to collect data with minimal disruption, especially compared with sending a person with a video camera to spend time in the classroom.

As with most remote work this year, we ran into a few unexpected hurdles during our first round of data collection. After gathering feedback from teachers and members of our research team, we were able to make adjustments that led to a better experience during the second round of data collection this spring. We hope the following suggestions might help others who are considering such a device to collect classroom data in the future:

  1. Consider providing teachers with a brief informational video or offering after-school training sessions to answer questions and address concerns ahead of your data collection period. We initially provided teachers with only a detailed user guide, but we found that this extra support was key to ensuring teachers had a positive experience with the device. You might also consider appointing a member of your research team to serve as a point of contact for questions throughout data collection.
  2. Remember that your research team members will not be the ones collecting the data, so it is critical to give teachers clear directions ahead of time: what exactly do you want them to record? Our team found it helpful to send teachers a brief two-minute video outlining our goals and then to follow up with a printable checklist they could use on the day they recorded instruction.
  3. Finally, we found it beneficial to scan the videos for content at the end of each day. By doing so, we were able to spot a few problems, such as missing audio or a device that stopped rotating during a lesson. While these instances were rare, it was helpful to catch them right away, while teachers still had the device in their schools, so that they could record any missing parts the next day. A simple automated version of this check is sketched below.
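
For teams that want to automate that end-of-day scan, here is a minimal Python sketch that flags recordings with no audio stream. It assumes the day’s videos are MP4 files in a single local folder and that the ffprobe command-line tool (part of FFmpeg) is installed; the folder path is hypothetical, and this is an illustration of the idea rather than our project’s actual tooling.

    # Minimal sketch: flag the day's recordings that lack an audio stream.
    # Assumes ffprobe (part of FFmpeg) is installed and on the PATH; the
    # folder path below is hypothetical.
    import subprocess
    from pathlib import Path

    def has_audio(video_path: Path) -> bool:
        """Return True if ffprobe reports at least one audio stream."""
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "a",
             "-show_entries", "stream=codec_type", "-of", "csv=p=0",
             str(video_path)],
            capture_output=True, text=True,
        )
        # ffprobe prints one "audio" line per audio stream it finds.
        return "audio" in result.stdout

    for video in sorted(Path("recordings/day-01").glob("*.mp4")):
        status = "OK" if has_audio(video) else "MISSING AUDIO: ask teacher to re-record"
        print(f"{video.name}: {status}")

A check like this does not replace watching the videos, but it catches the most common failure (a microphone that never recorded) quickly enough to request a re-recording while the device is still in the school.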

Although restrictions on in-person research are beginning to lift, we plan to continue using remote data collection for the remaining three years of our project. Conducting classroom research during the COVID-19 pandemic has proven challenging at every turn, but as we adapted to remote video data collection, we were pleased to find unanticipated benefits for our research team and our study participants.


This blog is part of a series focusing on conducting education research during COVID-19. For other blog posts related to this topic, please see here.

Tanya S. Wright is an Associate Professor of Language and Literacy in the Department of Teacher Education at Michigan State University.

Lori Bruner is a doctoral candidate in the Curriculum, Instruction, and Teacher Education program at Michigan State University.

Cost Analysis in Practice: Resources for Cost Analysis Studies

IES supports rigorous research that can provide scientific evidence on how best to address our nation’s most pressing education needs. As part of the Standards for Excellence in Education Research (SEER) principles, IES-funded researchers are encouraged, and in some cases required, to conduct a cost analysis for their projects, with the goal of supporting education agencies’ decision-making around the adoption of programs, policies, or practices.

 

The Cost Analysis in Practice (CAP) Project is a 3-year initiative funded by IES to support researchers and practitioners who are planning or conducting a cost analysis of educational programs and practices. This support includes the following freely available resources.

  • Resources developed by the CAP Project
    • Introductory resources on cost analysis, including Standards and Guidelines 1.1, an infographic, a video lecture, and FAQs.
    • Tools for planning your cost analysis, collecting and analyzing cost data, and reporting your results.
    • A Help Desk where you can submit inquiries about conducting a cost analysis and receive a response from a member of the CAP Project Team within two business days.
  • Other resources recommended by the CAP Project
    • Background materials on cost analysis
    • Guidance on carrying out a cost analysis
    • Standards for the Economic Evaluation of Educational and Social Programs
    • Cost analysis software

 

The CAP Project is also involved in longer-term collaborations with IES-funded evaluation projects to better understand their cost analysis needs. As part of this work, the CAP Project will be producing a set of three blogs to discuss practical details regarding cost studies based on its collaboration with a replication project evaluating an intervention that integrates literacy instruction into the teaching of American history. These blogs will discuss the following:

  • Common cost analysis challenges that researchers encounter and recommendations to address them
  • The development of a timeline resource for planning a cost study
  • Data collection for a cost study

 

The CAP Project welcomes your feedback on any of its resources, as well as suggestions for additional resources to support cost analysis. If you have feedback, please fill out the suggestion form at the bottom of the Resources web page.