
Are Artificially Intelligent Conversational Chatbots Uniformly Effective in Reducing Summer Melt? Evidence from a Randomized Controlled Trial
Nurshatayeva, Aizat; Page, Lindsay C.; White, Carol C.; Gehlbach, Hunter (2021). Research in Higher Education, v62 n3 p392-402. Retrieved from: https://eric.ed.gov/?id=EJ1294362
- Examining 4,442 students, grade PS (postsecondary)
Single Study Review
Review Details
Reviewed: September 2021
- Single Study Review (findings for Nudging intervention via AI chatbots)
- Randomized Controlled Trial
- Meets WWC standards without reservations because it is a randomized controlled trial with low attrition.
This review may not reflect the full body of research evidence for this intervention.
Findings
Outcome measure | Comparison | Period | Sample | Intervention mean (%) | Comparison mean (%) | Significant? | Improvement index | Evidence tier
---|---|---|---|---|---|---|---|---
Enrolled in any 4-year college | Nudging intervention via AI chatbots vs. Business as usual | 0 Days | Full sample | 95.50 | 96.00 | No | -- |

Supplemental Findings

Outcome measure | Comparison | Period | Sample | Intervention mean (%) | Comparison mean (%) | Significant? | Improvement index | Evidence tier
---|---|---|---|---|---|---|---|---
Enrolled in any 4-year college - 1st gen | Nudging intervention via AI chatbots vs. Business as usual | 0 Days | First-generation college students | 94.70 | 92.00 | No | -- |
Enrolled in ECU - 1st gen | Nudging intervention via AI chatbots vs. Business as usual | 0 Days | First-generation college students | 92.40 | 89.00 | No | -- |
Enrolled in ECU | Nudging intervention via AI chatbots vs. Business as usual | 0 Days | Full sample | 92.30 | 93.00 | No | -- |
Evidence Tier rating based solely on this study. This intervention may achieve a higher tier when combined with the full body of evidence.
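To illustrate why a gap such as 95.50 versus 96.00 percent is reported as not significant, the sketch below runs a pooled two-proportion z-test in Python. The even 2,221/2,221 split of the 4,442 students is an assumption for illustration only; the study's actual group sizes and analytic model may differ.

```python
from math import erf, sqrt

def two_proportion_z_test(p1: float, n1: int, p2: float, n2: int) -> float:
    """Two-sided p-value of a pooled two-proportion z-test."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled enrollment rate
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error of the difference
    z = (p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # 2 * (1 - Phi(|z|))

# Hypothetical even split of the 4,442 students; rates from the full-sample row above.
p_value = two_proportion_z_test(0.955, 2221, 0.960, 2221)
print(round(p_value, 2))  # ~0.41, well above the 0.05 threshold the WWC uses
```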
Sample Characteristics
Characteristics of study sample as reported by study author.
- Female: 58%; Male: 42%
- Urban
- North Carolina
- Race: Asian 3%; Black 15%; Native American 1%; Other or unknown 15%; White 66%
- Ethnicity: Hispanic 8%; Not Hispanic or Latino 92%
Study Details
Setting
The study was conducted at East Carolina University (ECU), a four-year college in Greenville, NC, with students who had applied to and been offered admission at ECU (“ECU-intending students”) but had not yet completed pre-matriculation or enrollment activities. The study aimed to reduce “summer melt,” the phenomenon in which students who have completed the other enrollment requirements by spring of their senior year of high school nonetheless fail to enroll in college the following fall. The researchers named the artificial intelligence (AI) chatbot PeeDee, after the campus mascot. Generally, 84% of students at ECU were in-state students, and over one-third (34%) received Pell grants. ECU is located in rural, coastal North Carolina, a region with lower-income counties compared with the rest of the state.
Study sample
Students in the sample were about 18 years old, and over half were female (58%). Nearly two-thirds (66%) were White, 15% were Black, 3% were Asian, 1% were Native American, and 6% reported being multiracial. Eight percent of the sample were Hispanic, and 18% were first-generation college students. The share of in-state students (87%) was slightly higher than the college-wide average.
Intervention Group
Throughout the summer preceding fall enrollment, intervention group students received text messages from the AI chatbot, PeeDee, with information and reminders about the logistical and administrative tasks they would need to complete in order to successfully matriculate in the fall. PeeDee sent eight categories of messages: introduction (where PeeDee explained its functionality), orientation (reminders to register for orientation sessions, with registration links and deadlines), course registration (reminders and offers of help), housing (information about moving into residence halls and the steps required), financial aid, social involvement (social media and freshman events), academic exploration and enrollment help, and rapport building (such as trivia and encouraging messages). Messages were tailored to students’ needs so that, for example, students who had already applied for financial aid would not receive reminders about financial aid deadlines. Importantly, students could send text messages to PeeDee, such as requests for follow-up information. On average, 26 messages were sent to each student (ranging from 3 to 97), and students sent three messages to PeeDee (ranging from 0 to 52). Students could also opt out of receiving messages; about 6% did so.
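The study does not publish PeeDee's implementation, but the tailoring rule it describes (skipping reminders for tasks a student has already completed, and honoring opt-outs) can be sketched as follows; the data model and field names are hypothetical stand-ins, not the chatbot's actual code.

```python
from dataclasses import dataclass

@dataclass
class Student:
    # Hypothetical fields standing in for the enrollment-task records the
    # chatbot would check before sending a reminder.
    opted_out: bool = False
    applied_for_financial_aid: bool = False
    registered_for_orientation: bool = False
    registered_for_courses: bool = False

def pending_reminders(student: Student) -> list[str]:
    """Return the reminder categories this student would still receive."""
    if student.opted_out:  # about 6% of students opted out entirely
        return []
    reminders = []
    if not student.applied_for_financial_aid:
        reminders.append("financial aid deadlines")
    if not student.registered_for_orientation:
        reminders.append("orientation registration")
    if not student.registered_for_courses:
        reminders.append("course registration")
    return reminders

print(pending_reminders(Student(applied_for_financial_aid=True)))
# ['orientation registration', 'course registration']
```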
Comparison Group
Students in both the intervention and control groups received business-as-usual communications from the college on topics that included orientation, registration, housing, setting up a university email account, academic opportunities on the ECU campus, tuition bill information and reminders, and other information intended to help students with social adjustment, such as campus events and information about Greenville.
Glossary
An indicator of the effect of the intervention, the improvement index can be interpreted as the expected change in percentile rank for an average comparison group student if that student had received the intervention.
For more, please see the WWC Glossary entry for improvement index.
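As a minimal sketch of the arithmetic behind the improvement index, the function below follows the WWC convention of converting a standardized effect size into a percentile-rank change via the standard normal CDF; the effect size used in the example is hypothetical and not taken from this study.

```python
from math import erf, sqrt

def improvement_index(effect_size: float) -> float:
    """Expected percentile-rank change for an average comparison-group
    student had that student received the intervention."""
    percentile = 0.5 * (1 + erf(effect_size / sqrt(2)))  # Phi(effect size)
    return (percentile - 0.5) * 100

# Hypothetical effect size of 0.25 standard deviations -> about +10 percentile points.
print(round(improvement_index(0.25), 1))  # 9.9
```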
An outcome is the knowledge, skills, and attitudes that are attained as a result of an activity. An outcome measure is an instrument, device, or method that provides data on the outcome.
A finding that is included in the effectiveness rating. Excluded findings may include subgroups and subscales.
The sample on which the analysis was conducted.
The group to which the intervention group is compared, which may include a different intervention, business as usual, or no services.
The timing of the post-intervention outcome measure.
The number of students included in the analysis.
The mean score of students in the intervention group.
The mean score of students in the comparison group.
The WWC considers a finding to be statistically significant if the likelihood that the finding is due to chance alone, rather than a real difference, is less than five percent.
The WWC reviews studies for WWC products, Department of Education grant competitions, and IES performance measures.
The name and version of the document used to guide the review of the study.
The version of the WWC design standards used to guide the review of the study.
The result of the WWC assessment of the study. The rating is based on the strength of evidence of the effectiveness of the intervention. Studies are given a rating of Meets WWC Design Standards without Reservations, Meets WWC Design Standards with Reservations, or Does Not Meet WWC Design Standards.
A related publication that was reviewed alongside the main study of interest.
Study findings for this report.
Based on the direction, magnitude, and statistical significance of the findings within a domain, the WWC characterizes the findings from a study as one of the following: statistically significant positive effects, substantively important positive effects, indeterminate effects, substantively important negative effects, and statistically significant negative effects. For more, please see the WWC Handbook.
The WWC may review studies for multiple purposes, including different reports and re-reviews using updated standards. Each WWC review of this study is listed in the dropdown. Details on any review may be accessed by making a selection from the drop down list.
Tier 1 (Strong) indicates strong evidence of effectiveness, Tier 2 (Moderate) indicates moderate evidence of effectiveness, and Tier 3 (Promising) indicates promising evidence of effectiveness, as defined in the non-regulatory guidance for ESSA and the regulations for ED discretionary grants (EDGAR Part 77).