Title: Center for the Analysis of Postsecondary Readiness
Principal Investigator: Edgecombe, Nicole
Awardee: Teachers College, Columbia University
Program: Education Research and Development Centers
Award Period: 5 years (7/1/2014–6/30/2019)
Award Amount: $9,989,803
Type: Multiple Goals
Award Number: R305C140007
Co-Principal Investigator: Mayer, Alex (MDRC)
Long-Term Follow-Up Award: 3 years (FY 2020–FY 2022), $2,000,003
Project Website: http://postsecondaryreadiness.org/
Purpose: During the past 40 years, the United States has made major advances in expanding access to postsecondary education, but many students arrive at college without the requisite English and math skills to perform college-level work. Community colleges and other open-access institutions have traditionally responded by placing such students into sequential developmental (or remedial) education courses with outdated curricula and instructional practices. Unfortunately, research indicates that students placed into such courses make slow progress in improving their skills and rarely earn college degrees.
The Center for the Analysis of Postsecondary Readiness conducted research to document current practices in developmental English and math education across the United States, identify innovative practices in assessment and instruction, evaluate the efficacy of practices that show promise for improving student outcomes, and assess the scalability of these models. In addition to its focused program of research, the Center engaged in leadership and outreach activities that convened policymakers, practitioners, and researchers interested in improving developmental education and assisted efforts by states, colleges, and universities to bring effective models to scale.
Research Projects: The Center's focused program of research has three major components: one descriptive study and two evaluation studies. The descriptive study aims to give policymakers better information on the developmental education practices currently in use. The evaluation studies aim to determine whether specific innovations in instructional practices or assessments improve student outcomes.
Descriptive Study
The descriptive study was built around a nationally representative survey of open-access two-year and nonselective four-year colleges, with a total required sample of 1,690 institutions. The researchers collected detailed information on how widely colleges and states have adopted, or are adopting, specific reforms (e.g., changes in teaching methods), as well as on comprehensive reforms that affect colleges' entire set of developmental education offerings or their connection with high school and college-level programs. As part of the descriptive study, the researchers conducted qualitative interviews with 40 institutional personnel and 40 state-level leaders.
Assessment Evaluation – Evaluation of Alternative Placement Systems and Student Outcomes
In the alternative placement system evaluation, carried out in partnership with the State University of New York (SUNY), the Center conducted a randomized controlled trial (RCT) to evaluate a course assignment method in which colleges used multiple measures to predict student performance in college-level math and English courses. In addition to placement test scores, these predictive measures included high school GPA and high school course-taking patterns. The sample, drawn from the seven two-year SUNY colleges participating in the study, includes more than 13,000 students (approximately one quarter of all eligible students). In the initial trial, Center researchers found that students placed using the multiple measures method were more likely than students placed using the business-as-usual method to enroll in and complete (with a grade of C or higher) a college-level math or English course during their first semester. Qualitative data from site visits documented how colleges implemented the multiple measures method. As part of the evaluation, the researchers also collected cost data and conducted cost and cost-effectiveness analyses.
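The idea behind multiple measures placement can be sketched in a few lines of code. The weights, cutoff, and function names below are hypothetical illustrations, not SUNY's actual model; in practice, such weights are estimated from historical student data (for example, with a regression predicting the probability of earning a C or higher in the college-level course).

```python
# Illustrative sketch of a multiple-measures placement rule.
# All weights and the cutoff are made up for demonstration only.

def placement_index(test_score, hs_gpa, took_precalculus):
    """Combine several measures into a single placement index.

    test_score: placement test score scaled to 0-100
    hs_gpa: high school GPA on a 0-4.0 scale
    took_precalculus: course-taking pattern indicator (completed precalculus)
    """
    return (
        0.4 * (test_score / 100)        # the test score alone is business as usual
        + 0.5 * (hs_gpa / 4.0)          # HS GPA is often a strong predictor
        + 0.1 * (1 if took_precalculus else 0)
    )

def place(test_score, hs_gpa, took_precalculus, cutoff=0.55):
    """Return the assigned course level for a student."""
    index = placement_index(test_score, hs_gpa, took_precalculus)
    return "college-level" if index >= cutoff else "developmental"

# A student with a weak test score but a strong high school record can
# place into the college-level course under multiple measures, even though
# the test score alone would have placed them into developmental education.
print(place(test_score=45, hs_gpa=3.6, took_precalculus=True))
```

The point of the sketch is that a single low test score no longer determines placement on its own; the other measures can offset it in either direction.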
The SUNY multiple measures follow-up study extends the assessment evaluation for an additional 3 years, lengthening the window in which researchers observe student outcomes from 2 to 5 years. During the follow-up study, researchers will track outcomes for the full sample of 13,400 students for 4 to 5 years from the time they were randomly assigned between fall 2016 and fall 2017. The extended time horizon will allow the researchers to assess students' course completion beyond their first credit-bearing math and English courses, as well as their credit attainment and degree completion. The researchers will also conduct an additional cost-effectiveness analysis using updated cost and student outcome data from the colleges.
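The core arithmetic of a cost-effectiveness analysis is straightforward: divide the incremental per-student cost of the intervention by its incremental effect on the outcome of interest. The figures below are invented for illustration and do not reproduce the Center's actual cost data or impact estimates.

```python
# Illustrative cost-effectiveness calculation with made-up numbers.

def cost_effectiveness_ratio(cost_treatment, cost_control,
                             outcome_treatment, outcome_control):
    """Incremental cost per additional unit of outcome.

    The outcome here could be, for example, the share of students
    completing a college-level math or English course with a C or higher.
    """
    incremental_cost = cost_treatment - cost_control          # extra dollars per student
    incremental_effect = outcome_treatment - outcome_control  # extra completions per student
    return incremental_cost / incremental_effect

# Hypothetical: the new placement system costs $50 more per student and
# raises the first-semester completion rate from 30% to 38%.
ratio = cost_effectiveness_ratio(250, 200, 0.38, 0.30)
print(round(ratio))  # dollars per additional course completion
```

A smaller ratio means the intervention buys each additional completion more cheaply, which is what lets researchers compare reforms with different costs and different impacts on a common scale.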
Instruction Study – Evaluation of Developmental Math Pathways and Student Outcomes
In the instruction evaluation, the researchers conducted an RCT to test the effectiveness of the Dana Center Math Pathways (DCMP) project, an innovative math reform developed by the Charles A. Dana Center at the University of Texas at Austin. DCMP uses a student-centered, activities-based pedagogy to better engage students. It contrasts with the traditional approach by accelerating progression through remediation, providing alternative curricula based on students' academic and career goals, and aligning developmental content more closely with college-level programs of study. Under DCMP, developmental math students with appropriate career and academic goals were assigned to one of two pathways (Statistics or Quantitative Reasoning) depending on their intended major. Students assigned to DCMP took an accelerated one-semester developmental math course, followed by a college-level statistics or quantitative reasoning course.
Four Texas community colleges participated in the evaluation, which included an implementation study to assess implementation fidelity and the program-control contrast. The research sample includes 1,461 students who were assessed as needing one or two levels of developmental math. Students who met the study criteria and agreed to participate were randomly assigned either to a treatment group offered the year-long DCMP program (a one-semester developmental course followed by a one-semester college-level course) or to a control group offered their colleges' traditional developmental and college-level math sequence. Students in the treatment group were assigned to either the Statistics or the Quantitative Reasoning pathway based on their expected major.
The initial analysis found that students assigned to DCMP were more likely than their control group counterparts to complete their developmental math sequence and become college-ready during their first three semesters after entering the study. Students in DCMP were also more likely to have passed a college-level math course and earned slightly more math credits than control group students as of their third semester in college. Exploratory analyses indicate that the impacts of DCMP were greater for part-time students and students assessed as needing multiple developmental courses. Instruction in DCMP courses contrasted strongly with colleges’ standard developmental course offerings and college-level algebra courses.
In the DCMP follow-up study, Center researchers will track the original sample of 1,411 students who enrolled in the study between fall 2015 and fall 2016 for a total of five years. This time horizon will facilitate comparisons across the DCMP and control group on a host of postsecondary outcomes including year-to-year persistence, credit attainment, completion of a degree or certificate, and transfer to a 4-year institution.
Leadership and Dissemination Activities:
The Center has hosted several policy forums and annual meetings of Center researchers, institutional and state representatives, and affiliated researchers to present findings and review progress. It held a large national conference in fall 2019 to showcase the Center's work and to identify promising areas of research and development for the field. The Center works with a range of stakeholders to assist their efforts to improve developmental education. This work addresses feasible policy design, the improvement or adaptation of promising models, barriers to implementation, and progress assessment.
The Center disseminates its work, as well as the work of affiliated researchers and practitioners, through various media including an engaging website with blogs, infographics, and answers to FAQs. The Center produces and distributes a variety of publications on the Center’s research, including reports, working papers, and peer-reviewed journal articles. To reach a broader audience, the Center also obtains media coverage for important findings in both education trade journals and popular news outlets.
Key Personnel: Thomas Bailey (Teachers College, Columbia University), Lashawn Richburg-Hayes (MDRC), Elisabeth Barnett (Teachers College, Columbia University), Clive Belfield (CUNY), Eric Bettinger (Stanford University), Angela Boatman (Vanderbilt University), Nicole Edgecombe (Teachers College, Columbia University), Robert Ivry (MDRC), Shanna Jaggars (Teachers College, Columbia University), Michal Kurlaender (University of California, Davis), Susanna Loeb (Stanford University), Alexander Mayer (MDRC), Lisa Rothman (Teachers College, Columbia University), Elizabeth Zachry Rutschow (MDRC), Judith Scott-Clayton (Teachers College, Columbia University), Doug Slater (Teachers College, Columbia University), Michael Weiss (MDRC)
Publications:
Moore, J.L., Allen, J.M., and Camara, W.J. (2017). Empirically Based College- and Career-Readiness Cut Scores and Performance Standards. In M.N. Gaertner, K.L. McClarty, and K. Mattern (Eds.), Preparing Students for College and Careers: Theory, Measurement, and Educational Practice. New York: Routledge.
Reddy, V., and Barnett, E. (2017). College Placement Strategies: Evolving Considerations and Practices. In M.N. Gaertner, K.L. McClarty, and K. Mattern (Eds.), Preparing Students for College and Careers: Theory, Measurement, and Educational Practice. (pp. 100–111). New York: Routledge.
Natow, R. S., Reddy, V., and Grant, M. (2017). How and Why Higher Education Institutions Use Technology in Developmental Education Programming. A CAPR Working Paper. Community College Research Center, Teachers College, Columbia University.