IES Grant

Title: Efficacy of Viridis Learning Tool: A Technology-Based Approach to Advising and Job Matching
Center: NCER
Year: 2018
Principal Investigator: Karam, Rita
Awardee: RAND Corporation
Program: Postsecondary and Adult Education
Award Period: 5 years (09/01/2018 - 08/31/2023)
Award Amount: $3,299,866
Type: Efficacy and Replication
Award Number: R305A180377
Description:

Co-Principal Investigator: Goldman, Charles A.

Purpose: The purpose of this project is to assess the efficacy of the Viridis tool, a multi-platform software application designed to facilitate credential completion and subsequent job placement for community college students enrolled in one-year occupational certificate programs. Although certificate completion is associated with significant increases in wages and job quality, many students exit community college without completing a credential of any kind. Unfinished occupational credentials result in failed investments for students and colleges as well as unfilled jobs in high-demand occupations. In addition, community college students often lack relationships with family members, peers, or college advisors who can help them navigate their program of study. The Viridis tool aims to meet students' need for relevant guidance by providing a direct digital link to their advisors and by substantially increasing advisors' access to real-time information about students' academic progress.

Project Activities: The research team will test the Viridis tool during two years of implementation at ten community colleges using a randomized controlled trial (RCT). During year 1, the research team will pilot the tool at three colleges for one semester. During years 2 and 3, the project team will initiate two waves of evaluation. During this period, the research team will work with the colleges and the developer to ensure full implementation of the tool and its features and will survey students and advisors regarding their engagement with the tool. During years 4 and 5, the research team will track academic progress and job placement for students in the study, analyze the impacts of the tool on these outcomes, and communicate their findings to college administrators, policymakers, and the research field.

Products: Researchers will generate causal evidence about the impacts of the Viridis Learning Tool on community college students' completion of occupational certificate programs and subsequent job placement. The research team will also communicate its findings through peer-reviewed publications, conference presentations, and briefs.

Structured Abstract

Setting: This study will take place at ten community colleges located in California, Hawaii, and Texas.

Sample: The sample includes 20,000 community college students enrolled in 50 one-year occupational certificate programs evenly distributed across 10 participating community colleges. 

Intervention: Viridis Learning Tool is a technology-based planning, advising, and job matching tool. It has four key features. Key feature 1 draws on college and regional data to display how the student's chosen pathway fits into a ladder of training opportunities that can increase the student's workplace value and salary. Key feature 2 draws on college administrative data to monitor each student's academic progress in real-time and synthesizes these outcomes into a Skills Passport reflecting the student's proficiencies and employability. Key feature 3 translates progress measures into an early alert system that allows advisors to identify students who are failing to complete work or required courses, underperforming, or omitting key courses needed for completing their pathway. Key feature 4 draws on job data from local employers to match students to jobs based on their acquired skills and completed credentials.
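
The early alert feature (key feature 3) amounts to a set of rules that flag students whose progress data indicate risk. The sketch below illustrates that idea in Python; the field names, thresholds, and alert wording are hypothetical and are not the Viridis tool's actual logic.

```python
from dataclasses import dataclass, field

@dataclass
class StudentProgress:
    """Hypothetical progress record; field names are illustrative, not Viridis's schema."""
    student_id: str
    gpa: float
    completed_courses: set[str] = field(default_factory=set)
    overdue_assignments: int = 0

def early_alerts(student: StudentProgress, required_courses: set[str],
                 gpa_floor: float = 2.0, overdue_limit: int = 3) -> list[str]:
    """Return illustrative alert flags an advisor dashboard might surface."""
    alerts = []
    if student.overdue_assignments > overdue_limit:
        alerts.append("falling behind on coursework")
    if student.gpa < gpa_floor:
        alerts.append("underperforming")
    missing = required_courses - student.completed_courses
    if missing:
        alerts.append(f"missing required courses: {', '.join(sorted(missing))}")
    return alerts

# Example: a student who is behind and has not taken two required pathway courses.
student = StudentProgress("S001", gpa=1.8, completed_courses={"WELD-101"}, overdue_assignments=4)
print(early_alerts(student, required_courses={"WELD-101", "WELD-110", "SAFE-100"}))
```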

Research Design and Methods: The team will use a randomized block design to assign equal numbers of students to the treatment and control conditions within each of the participating certificate programs. Randomization will occur over a two-year period. Researchers will then track the cohorts for two years for certificate completion and for another six months after certificate completion for employment outcomes. Because randomization will take place within programs and students in the treatment and control groups will share the same educational environment, the design supports clear interpretation of observed treatment effects and will yield unbiased intent-to-treat estimates of the impact of offering the tool on proximate academic outcomes, eventual academic outcomes, and job placement.
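
The following is a minimal sketch of within-program (block) random assignment, assuming even block sizes and hypothetical student identifiers; the study team's actual randomization procedure may differ.

```python
import random

def block_randomize(student_ids, seed=None):
    """Randomly assign students within one block (certificate program) to
    treatment and control in equal numbers, as in a randomized block design.
    Illustrative only; not the study team's actual procedure."""
    rng = random.Random(seed)
    ids = list(student_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"treatment": ids[:half], "control": ids[half:]}

# Example: assign a hypothetical program's enrollees; repeat per program (block).
program_roster = [f"student_{i:03d}" for i in range(1, 21)]
assignments = block_randomize(program_roster, seed=42)
print(len(assignments["treatment"]), len(assignments["control"]))
```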

Control Condition: Students in the control condition will not have access to the Viridis Learning Tool.

Key Measures: Developers of the Viridis tool will draw on data provided by students and on colleges' administrative data to construct measures of students' progress and employability. Researchers will collect transcript data from the colleges to construct measures of on-time and near-time degree completion. To construct fidelity measures and to understand how tool usage mediates academic and job outcomes, researchers will collect monthly tool-usage data and conduct annual surveys of students and advisors. Responses from the student and advisor surveys will also inform the research team about the frequency, quality, and content of student-advisor interactions. Advisor surveys and interviews will help researchers understand how advisors implement the tool and how advising practices differ across the treatment and control groups.
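
As a rough illustration of deriving completion indicators from transcript data, the sketch below uses hypothetical column names and assumed 12-month (on-time) and 18-month (near-time) cutoffs; the study's actual operational definitions are not specified here.

```python
import pandas as pd

# Hypothetical transcript extract; columns are illustrative, not the colleges' schema.
transcripts = pd.DataFrame({
    "student_id": ["S001", "S002", "S003"],
    "enrolled": pd.to_datetime(["2019-09-01", "2019-09-01", "2020-01-15"]),
    "completed": pd.to_datetime(["2020-08-15", None, "2021-06-30"]),  # NaT = no certificate yet
})

# Assumed cutoffs for a one-year certificate: on-time within 12 months, near-time within 18.
months_to_complete = (transcripts["completed"] - transcripts["enrolled"]).dt.days / 30.44
transcripts["on_time"] = months_to_complete.le(12)
transcripts["near_time"] = months_to_complete.le(18)
print(transcripts[["student_id", "on_time", "near_time"]])
```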

Data Analytic Strategy: Researchers will employ a multi-level logistic regression model that aligns with the block randomization design. At level 1, they will add measures of student demographic characteristics to improve the precision of impact estimates. At level 2, the research team will enter treatment assignment along with measures of occupational program features. The team will use this model to estimate impacts of the tool on all outcomes of interest. In a separate analysis that limits the sample to students in the treatment group, the research team will assess the extent to which student characteristics and program (or college) support for implementing the tool predict tool usage and student-advisor interactions. Next, the team will explore whether variation in tool usage and/or advisor interactions explains observable differences in program completion or job placement. Finally, they will assess whether impacts of the tool vary according to students' race-ethnicity and family income.
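
As a simplified stand-in for the team's multi-level specification, the sketch below fits a logistic regression of completion on treatment status with program fixed effects and one student covariate, using statsmodels on simulated data; the variable names, generated data, and fixed-effects formulation are assumptions for illustration, not the study's model.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated data standing in for the study's outcomes; purely illustrative.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "program_id": rng.integers(0, 50, n),   # block (certificate program) identifier
    "treated": rng.integers(0, 2, n),       # randomized within program
    "age": rng.normal(24, 5, n),            # example student covariate
})
logit_p = -0.5 + 0.4 * df["treated"] + 0.02 * (df["age"] - 24)
df["completed"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Logistic regression with program fixed effects: a simplified analogue of a
# multi-level logistic model aligned with block randomization.
model = smf.logit("completed ~ treated + age + C(program_id)", data=df).fit(disp=False)
print(model.params["treated"])  # estimated log-odds impact of being offered the tool
```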

