Inside IES Research

Notes from NCER & NCSER

IES Announces Forthcoming Funding Opportunity For the R&D of an “ROI Tool” to Inform Students’ Postsecondary Education and Career Decision Making


Overview

On or about February 15, 2019, the Small Business Innovation Research Program at the U.S. Department of Education's Institute of Education Sciences (ED/IES SBIR) anticipates releasing a Special Topic Solicitation (#91990019R0016) in Postsecondary Education. The solicitation will be announced through an IES Newsflash and will be posted here. It will request Phase I proposals for awards of up to $200,000 for 8 months to develop a prototype of an “ROI tool.” The tool will be designed to measure the costs versus benefits (the return on investment) of different postsecondary education and training programs to help students make well-informed choices about options to pursue after they complete high school.

Applicants must be for-profit businesses with 500 or fewer employees that are U.S. owned and operated. Applicants may partner with entities or organizations working on related initiatives in the field of postsecondary education, or may subcontract to non-profit researchers or individuals with specialized expertise as needed. The due date for proposal submissions will likely be on or about April 15, 2019, with awards made in mid-June and projects beginning shortly thereafter. All Phase I awardees will be eligible to apply for a Phase II award in 2020 of $900,000 for full-scale development and research to test and validate the ROI tool.

Background

While many websites provide ways for students to explore colleges or careers and occupations of interest (such as College Scorecard and CareerOneStop), there is currently no tool that helps students understand the costs and benefits of individual postsecondary programs in an integrated, customizable, and user-friendly manner. An ROI tool would likely combine information on an individual program's tuition and fees, time needed to complete, and expected earnings. Because these characteristics can vary significantly across programs and institutions, a single estimated measure of ROI would allow students to more easily compare postsecondary program options. If such a tool helps students make better choices, it could lead to improved program completion rates, higher levels of employment and earnings, less education-related debt, and more satisfaction with their selected education and career paths.
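
As a rough illustration of the kind of calculation such a tool might perform, the sketch below combines a program's tuition and fees, time to completion, and expected earnings into a single ROI estimate. The field names, sample values, and the simplified formula are illustrative assumptions only, not a specification from the forthcoming solicitation.

```python
from dataclasses import dataclass

@dataclass
class ProgramInfo:
    """Hypothetical program-level fields an ROI tool might integrate."""
    name: str
    annual_tuition_and_fees: float   # dollars per year
    years_to_complete: float         # expected time to credential
    expected_annual_earnings: float  # typical earnings after completion


def estimate_roi(program: ProgramInfo,
                 baseline_earnings: float = 25_000.0,
                 horizon_years: float = 10.0) -> float:
    """Simplified ROI: net earnings gain over a fixed horizon, relative to total cost.

    Total cost here is tuition and fees plus earnings forgone while enrolled.
    This formula is an assumption for demonstration; a real tool would need to
    handle discounting, completion probability, financial aid, and debt.
    """
    total_cost = program.years_to_complete * (
        program.annual_tuition_and_fees + baseline_earnings
    )
    earnings_gain = (program.expected_annual_earnings - baseline_earnings) * horizon_years
    return (earnings_gain - total_cost) / total_cost if total_cost > 0 else float("inf")


# Compare two hypothetical programs side by side.
programs = [
    ProgramInfo("Certificate in Welding Technology", 6_000, 1.0, 42_000),
    ProgramInfo("Associate Degree in Nursing", 8_000, 2.0, 60_000),
]
for p in sorted(programs, key=estimate_roi, reverse=True):
    print(f"{p.name}: estimated ROI = {estimate_roi(p):.2f}")
```

A production tool would draw these inputs from integrated data sources rather than hard-coded values and would present the comparison through a student-facing interface.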

The ED/IES SBIR Special Topic intends to fund up to five (5) Phase I projects to (a) develop and research a prototype of an ROI tool, and (b) conduct planning and concept testing for a fully developed ROI tool that provides a user-friendly experience for students. The prototype of the ROI tool developed in Phase I shall integrate with one or more existing technology systems, data sets, data standards, or resources (such as CareerOneStop or College Scorecard), and add new data elements provided by an end-user.  After a successful Phase I project, it is anticipated that small businesses that win Phase II awards will complete the full-scale development of the ROI tool that was started in Phase I, including developing an interface to improve the experience of students using the ROI tool.

Because data for ROI at the program level may be available only from some states, regions, or sets of institutions at this time, it is expected that the scope of the ROI tool developed in Phases I and II would be limited and would not attempt to calculate ROI for every program and institution in the country. Applicants must propose a project scope that appropriately reflects the datasets to be integrated within the new ROI tool and the amount of funding and time allotted for development and research under the Phase I and Phase II awards. Small businesses that are interested in this solicitation must have expertise with related efforts in the field to enhance student choices by linking education and workforce information.

Potential applicants may submit questions to ED's contracting specialist at Kevin.Wade@ed.gov. All questions and responses will be posted publicly, as Amendments to the Solicitation, on the same website where the solicitation is posted.

IES is Expanding the Evidence Base for Career and Technical Education (CTE)

February is Career and Technical Education (CTE) Month, so let's look at what is going on in CTE training and research. Formerly known as “vocational education,” CTE generally comprises instruction in the academic, technical, and employability skills and knowledge required to enter and succeed in specific occupations. CTE can introduce high school students to different career paths and help them build marketable skills or even earn credentials. At the college level, CTE offers an entry point for new and returning students as they gain knowledge and skills in certain occupational fields.

Many policymakers consider CTE to be a key aspect of “college and career readiness.” In 2017, 49 states enacted 241 CTE policies, and 42 states enacted an additional 146 CTE policies in 2018. However, CTE practice and policy are far ahead of research, particularly research that can more definitively link CTE to specific outcomes and impacts. Over the past few years, IES has made some important strides in this area.

Following a Technical Working Group meeting on the future of CTE research, IES partnered with the Office of Career, Technical, and Adult Education to launch a new research network called “Expanding the Evidence Base for Career and Technical Education (CTE).”

The CTE research network is funded through a five-year grant and will be led by the American Institutes for Research (AIR), with partners at Vanderbilt University, Jobs for the Future, and the Association for Career and Technical Education (ACTE). Currently, three IES research projects have joined the new network, and we hope to eventually include up to six. All of these projects will look at the causal impact of CTE on student outcomes.

Throughout the five-year grant period, the Network Lead will bring together project teams and help to provide vision and support to the research projects as a whole. The Network Lead will also conduct research, provide CTE research training activities, and work to disseminate the Network’s research products so that they can reach the widest possible stakeholder audience. 

The first meeting of the Network members occurred on January 8, 2019 in Washington, DC, where members began to set the priorities for collaborative work across projects. Network members agreed that it is critical for all research projects to provide detailed data broken out by CTE field and by student subgroups, including students with disabilities.

One key priority of the Network is to develop a working definition of CTE for research purposes (i.e., how to define a CTE student and how to measure CTE participation). A related priority is to identify or develop appropriate measures of CTE participation and outcomes that network members, as well as other CTE researchers, can use. Over the course of the grant, network members will have the opportunity to collaborate on a variety of activities.

We will be reporting on the Network’s progress periodically on this blog, but readers are also encouraged to visit the CTE Research Network website, housed by AIR.


Blog post by Corinne Alfeld, program officer in the IES National Center for Education Research (NCER)

For more information about the CTE research network, contact corinne.alfeld@ed.gov. Corinne is also the program officer for NCER’s CTE research topic, which will be accepting grant applications for all types of CTE research later in 2019! (Note that funded studies designed to measure the causal impact of CTE programs or policies may be eligible to join the CTE research network in future years). Sign up for the IES Newsflash to be notified when the NCER Requests for Applications are released.

New Reports and Resources Around ELs and STEM

In recent months, several federal reports and resources have been released related to English learners (ELs) and science, technology, engineering, and mathematics (STEM) education.

First, the Office of English Language Acquisition (OELA) released its third “data story” about ELs in U.S. schools. This story, which builds on two previously released stories about the characteristics and educational experiences of ELs, focuses specifically on ELs' NAEP performance and high school graduation rates. Through interactive infographics (many of which are built on data from the National Center for Education Statistics), the story shows that higher percentages of ELs are proficient in math than in reading, but that nearly half of all states experienced declines in the number of ELs who scored proficient in math between 2009 and 2017. The story also shows that graduation rates for ELs improved by 10 percentage points between 2010-11 and 2015-16 (from 57 percent to 67 percent) but still fell well below the rate for non-ELs (84 percent). While interesting and informative, the data story also underscores the need for research and development to produce better resources and information to support EL learning.

In that vein, the National Academies of Sciences, Engineering, and Medicine released English Learners in STEM Subjects: Transforming Classrooms, Schools, and Lives. This report examines what we know about ELs’ learning, teaching, and assessment in STEM subjects and provides guidance on how to improve STEM learning outcomes for these students. It reflects the consensus of a committee of EL experts that was chaired by NCER and NCSER grantee Dr. David Francis and included past grantees Dr. Okhee Lee and Dr. Mary Schleppegrell alongside a dozen other experts in EL education, STEM education, and teaching. One of the report’s central conclusions is that ELs develop proficiency in both STEM subjects and language when their classroom teachers provide them with opportunities for meaningful interaction and actively support both content and language learning. Given that many STEM teachers do not receive preparation to teach in this way, the report provides several recommendations to improve pre-service and in-service training. It also includes recommendations for how developers and publishers might produce better instructional materials and assessments to help both teachers and EL students. 

Efforts of both types – instructional preparation and development of new materials – may be further supported by two new toolkits released by the Office of Educational Technology. The toolkits are designed for educators and developers, and each is organized around five guiding principles to help the targeted group approach education technology with ELs' unique needs in mind. The principles for developers emphasize the importance of thinking ahead about EL needs for those who wish to make products for this population. Meanwhile, the educator principles center on issues of awareness, encouraging teachers to learn more about the features, platforms, and resources available for ELs in the world of education technology. The principles also complement one another – for example, developers are encouraged to offer instruction-focused professional development, and educators are encouraged to seek out the same.

Taken together, these resources provide a snapshot of ELs' mathematics achievement, a summary of research evidence about learning and instruction for ELs in STEM, and a set of principles to guide instruction and development efforts in the technology space moving forward. They also make a clear case for continued investment in R&D efforts to support STEM learning for both EL students and their teachers. Since 2010, the National Center for Education Research has invested nearly $20 million across 13 research and researcher-practitioner partnership grants focused on STEM learning and ELs. Several such grants are coming to a close in the 2019 fiscal year; watch this space for future blog posts about the products and findings from these projects.


CAPR: Answers to Pressing Questions in Developmental Education

Since 2014, IES has funded the Center for the Analysis of Postsecondary Readiness (CAPR) to answer questions about the rapidly evolving landscape of developmental education at community colleges and open-access four-year institutions. CAPR is providing new insights into how colleges are reforming developmental education and how their reforms are impacting student outcomes through three major studies:

  • A survey and interviews about developmental education practices and reform initiatives
  • An evaluation of the use of multiple measures for assessing college readiness
  • An evaluation of math pathways

Preliminary results from these studies indicate that some reforms help more students finish their developmental requirements and go on to do well in college-level math and English.

National Study of Developmental Education Policies and Practices

CAPR has documented widespread reform in developmental education at two- and four-year colleges through a national survey and interviews on developmental education practices and reforms. Early results from the survey show that colleges are moving away from relying solely on standardized tests for placing students into developmental courses. Colleges are also using new approaches to delivering developmental education, including shortening developmental sequences by compressing or combining courses, using technology to deliver self-paced instruction, and placing developmental students into college-level courses with extra supports, an approach often called corequisite remediation.

Developmental Math Instructional Methods in Public Two-Year Colleges (Percentages of Colleges Implementing Specific Reform Strategies)

Notes: Percentages among two-year public colleges that reported offering developmental courses. Colleges were counted as using an instructional method if they used it in at least two course sections. Categories are not mutually exclusive.

Evaluation of Developmental Math Pathways and Student Outcomes

CAPR has teamed up with the Charles A. Dana Center at the University of Texas at Austin to evaluate the Dana Center Mathematics Pathways (DCMP) curriculum at four community colleges in Texas. The math pathways model tailors math courses to particular majors, with a statistics pathway for social science majors, a quantitative reasoning pathway for humanities majors, and an algebra-to-calculus pathway for STEM majors. DCMP originally compressed developmental math into one semester, though now the Dana Center is recommending corequisite models. Instructors seek to engage students by delving deeply into math concepts, focusing on real-world problems, and having students work together to develop solutions.

Interim results show that larger percentages of students assigned to DCMP (versus the traditional developmental sequence) enrolled in and passed developmental math. More of the DCMP students also took and passed college-level math, fulfilling an important graduation requirement. After three semesters, 25 percent of program group students passed a college-level math course, compared with 17 percent of students assigned to traditional remediation.

Evaluation of Alternative Placement Systems and Student Outcomes (aka Multiple Measures)

CAPR is also studying the impact of using a combination of measures—such as high school GPA, years out of high school, and placement test scores—to predict whether students belong in developmental or college-level courses. Early results from the multiple measures study show that, in English and to a lesser extent in math, the multiple measures algorithms placed more students into college-level courses, and more students passed those courses (compared to students placed with a single test score).
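
CAPR's actual placement algorithms are estimated from colleges' historical data and are not described here; purely as a sketch of the general multiple-measures idea, the example below combines high school GPA, years out of high school, and a placement test score into one composite and compares it to a cutoff. The weights, scales, and cutoff are made-up values for illustration.

```python
def place_student(hs_gpa: float, years_since_hs: float, test_score: float,
                  cutoff: float = 0.0) -> str:
    """Toy multiple-measures placement rule (weights and cutoff are assumed).

    A real system would estimate these weights from historical data, for
    example with a logistic regression predicting success in the
    college-level course.
    """
    composite = (
        1.5 * (hs_gpa - 2.5)        # high school GPA, centered at an assumed 2.5
        - 0.1 * years_since_hs      # older high school records weigh slightly less
        + 0.02 * (test_score - 50)  # placement test assumed to be on a 0-100 scale
    )
    return "college-level" if composite >= cutoff else "developmental"


print(place_student(hs_gpa=3.2, years_since_hs=1, test_score=55))  # college-level
print(place_student(hs_gpa=2.1, years_since_hs=6, test_score=40))  # developmental
```

In practice, the weights would typically come from a predictive model fit to each college's own records, which is what allows a composite of several measures to place students more accurately than a single test score.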


College-Level English Course Placement, Enrollment, and Completion in CAPR’s Multiple Measures Study (Percentages Compared Across Placement Conditions)


College-Level Math Course Placement and Completion in CAPR’s Multiple Measures Study

Looking Ahead to the Future of Developmental Education

These early results from CAPR’s evaluations of multiple measures and math pathways suggest that those reforms are likely to be important pieces of future developmental education systems. CAPR will release final results from its three studies in 2019 and 2020.

Guest blog by Nikki Edgecombe and Alexander Mayer

Nikki Edgecombe is the principal investigator of the Center for the Analysis of Postsecondary Readiness, an IES-funded center led by the Community College Research Center (CCRC) and MDRC, and a senior research scientist at CCRC. Alexander Mayer is the co-principal investigator of CAPR and deputy director of postsecondary education at MDRC.

Companion Guidelines on Replication and Reproducibility in Education Research

Just over five years ago, the Institute of Education Sciences (IES) and the National Science Foundation (NSF) released the Common Guidelines for Education Research and Development. The Guidelines described the expected purposes, justifications, and contributions of various types of research aimed at improving our knowledge of interventions and strategies for improving teaching and learning. Since 2013, there has been increased attention to replication and reproducibility studies and their role in building the evidence base. In response to this interest and the importance of this work, the two organizations jointly issued the new Companion Guidelines on Replication and Reproducibility in Education Research to supplement the Common Guidelines for Education Research and Development. The companion document provides guidance on the steps researchers can take to promote corroboration, ensure the integrity of research, and extend the evidence base.

The Companion Guidelines identify principles to help education stakeholders design and report reproducibility and replication studies. These principles are consistent with and draw from guidelines provided by scientific and professional organizations and advisory committees, and they have emerged in consultation with the field (e.g., Dettmer, Taylor, and Chhin, 2017; Subcommittee on Replicability and Science, 2015). The principles address three main areas: (1) replication and reproducibility at the proposal stage, (2) promoting transparency and openness in designing studies, and (3) considerations in the reporting of results.

Although the importance of reproducibility and replication studies for advancing scientific knowledge has been widely acknowledged, there are several challenges for researchers in our field, including actual or perceived disincentives (e.g., publication bias; reputation and career advancement norms; emphases on novel, potentially transformative lines of inquiry), implementation difficulties (especially for direct replications), and complexities of interpreting results (e.g., lack of consensus on what it means to “replicate” findings, low statistical power for replications). Grant funding agencies such as IES and NSF as well as education researchers have a role to play in addressing these challenges, promoting reproducibility and replication studies, and ultimately moving the field forward.

Why focus on replication and reproducibility?

The original Common Guidelines document did not substantively address issues pertaining to replication and reproducibility of research.  Given the interest in and importance of this work, IES and NSF are providing additional clarity to the field in terms of common definitions and principles around replication and reproducibility.

Who is the audience for the Companion Guidelines on Replication and Reproducibility? 

The primary audience for this document is education researchers; however, education research funding agencies and reviewers of grant applications are additional audiences for this document.

How should this document be used by researchers intending to apply for grants to conduct a reproducibility or replication study?

This document is meant to highlight the importance of replication and reproducibility studies and to offer guidelines to education stakeholders for thinking about and promoting reproducibility and replication in education research. It does not supersede the guidance provided in the requests for applications provided by IES and NSF. 

What are the guiding principles for proposing replication and reproducibility studies?

The overarching principles at the grant proposal stage are as follows:

  1. Clarify how reproducibility or replication studies would build on prior studies and contribute to the knowledge base.
  2. Clearly specify any variations from prior studies and the rationale for such variations.
  3. Ensure objectivity (e.g., by conducting an independent investigation, or by putting safeguards in place if the original investigator(s) is involved).

In addition to these principles, the document also lays out principles for promoting transparency, open science, and reporting results.

Read the full Companion Guidelines here.