Project Activities
Using both secondary and primary data, the researchers examined interactions among factors of writing achievement and explored how these factors relate to college retention and completion. They used a theoretical model of writing that assumes multiple skills contribute to writing achievement, including writing-domain knowledge, domain-general knowledge, and intrapersonal factors. In a mixed-methods study, they used classroom artifacts (samples of student writing) and data from an automated writing evaluation (AWE) tool to refine this theoretical model and estimate correlations among the model's components and student outcomes.
Structured Abstract
Setting
Study 1 used secondary data collected from 22 diverse 4-year postsecondary institutions across the country. Study 2 used data from students attending six 4-year universities in urban, suburban, and rural settings.
Sample
The data for study 1 were collected from 1,791 students who completed both the ETS HEIghten Critical Thinking and Written Communication assessments in a previous pilot study. The data for study 2 were drawn from 735 students enrolled in first-year writing classes or disciplinary classes (e.g., humanities or science classes). These students represented a diverse group in terms of gender (41 percent male, 59 percent female), ethnicity (23 percent Hispanic), and race (5 percent Asian, 32 percent Black or African American, 38 percent White, 2 percent two or more groups).
Malleable factors
Underlying this research is the assumption that writing skills rely on writing-domain knowledge, domain-general knowledge, and intrapersonal factors. Combined, these three components may predict student writing achievement and, hence, students' ability to complete writing courses and persist in postsecondary classes. In terms of writing-domain knowledge, the researchers focused on five main constructs:
- Conventions (grammar, spelling, citations, etc.)
- Coherence (topic development, topical cohesion, etc.)
- Organization (thesis sentences, use of rhetorical and discourse markers such as therefore and however)
- Source use and integration (appropriate use of citations, etc.)
- Topicality (relevant vocabulary, etc.)
Students' skill in these constructs may vary depending on student background characteristics. For example, some students may have deficits in some, but not all, constructs or be weak or strong across the board. In terms of domain-general knowledge, the researchers explored general critical thinking skills that may correlate with writing ability. In terms of intrapersonal factors, they explored whether a student's level of engagement, goal setting, interest, motivation, and self-efficacy in writing correlated with writing ability. They did not explore relationships between writing achievement constructs and student demographic backgrounds.
Research design and methods
The researchers used both existing secondary data sets (study 1) and primary data (study 2). They examined samples of student writing using ETS's existing automated writing evaluation (AWE) capabilities, drawing on writing samples from prompts in the HEIghten Written Communication Assessment (studies 1 and 2) and from disciplinary course assignments (study 2). The AWE tool evaluates writing samples against the five writing-domain constructs, analyzing 175 linguistic features to build a profile of each writer's use of the constructs. These profiles provided detailed information about writing skills that may be essential for postsecondary success. In study 2, the researchers collected additional measures of students' domain-general knowledge and intrapersonal skills, as well as examples of coursework writing. While most participating students were enrolled in first-year writing courses, students in study 2 came from multiple disciplines, such as science, humanities, and psychology, to help ensure a broad range of genres and writing samples.
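To make the feature-profile idea concrete, the sketch below shows how a handful of simple linguistic features might be computed per essay and averaged into a writer-level profile. This is a minimal illustration, not ETS's AWE system: the feature definitions, their grouping under the constructs, and all names are assumptions for exposition only.

```python
# Hypothetical sketch of AWE-style feature profiling -- NOT ETS's actual
# system. The real tool computes 175 features; these toy proxies only
# illustrate how features might map to the five writing-domain constructs.
import re
from statistics import mean

# Assumed marker list for the organization proxy (illustrative only).
DISCOURSE_MARKERS = {"therefore", "however", "moreover", "consequently"}

def extract_features(text: str) -> dict:
    words = re.findall(r"[A-Za-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    n_words = max(len(words), 1)
    return {
        # Conventions proxy: average sentence length (very rough).
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # Organization proxy: rate of explicit discourse markers.
        "discourse_marker_rate": sum(w in DISCOURSE_MARKERS for w in words) / n_words,
        # Topicality proxy: lexical diversity (type-token ratio).
        "type_token_ratio": len(set(words)) / n_words,
        # Source-use proxy: parenthetical citations like "(Smith, 2020)".
        "citation_count": len(re.findall(r"\([A-Z][a-z]+,? \d{4}\)", text)),
    }

def build_profile(texts: list[str]) -> dict:
    """Average per-essay features into a writer-level profile."""
    feats = [extract_features(t) for t in texts]
    return {name: mean(f[name] for f in feats) for name in feats[0]}
```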
Control condition
Given the exploratory nature of this project, there is no control condition.
Key measures
The researchers evaluated writing-specific and domain-general knowledge using subtests of ETS's HEIghten assessment and evaluated intrapersonal factors using existing motivation and interest scales. They collected administrative data on students' overall GPA and GPA from writing-intensive courses, course completion, major course enrollment and completion, and retention in college for up to five semesters following primary data collection.
Data analytic strategy
The researchers used multiple analytic methods, such as principal components analysis and regression, to study the relationships among the AWE-identified linguistic features, and between those features and critical thinking, intrapersonal factors, writing skills (as captured by AWE feature measures), and other postsecondary outcomes.
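As a rough illustration of this strategy, the sketch below applies principal components analysis to a stand-in feature matrix and then regresses an outcome on the retained components. The shapes, variable names, and simulated data are assumptions, not the project's actual pipeline or results.

```python
# Minimal sketch of the PCA-plus-regression strategy. Shapes, names, and
# the simulated data are assumptions, not the project's actual pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(735, 175))  # stand-in for 175 AWE features per student
y = rng.normal(size=735)         # stand-in outcome, e.g., writing-course GPA

# Standardize, then reduce the correlated linguistic features to components.
X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=10)
components = pca.fit_transform(X_std)
print("Variance explained:", pca.explained_variance_ratio_.round(3))

# Regress the outcome on the retained components.
model = LinearRegression().fit(components, y)
print("R^2 on components:", model.score(components, y))
```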
Key outcomes
The main findings of this exploratory study are as follows:
- Different aspects of the writing construct are captured in different contexts (assessment vs. coursework writing) (McCaffrey et al., 2022). This finding informs how we think about evaluating student writing abilities.
- Coursework writing in different genres provides students with opportunities to practice different aspects of the writing construct; limiting opportunities to write in different genres hinders the development of students' writing-domain knowledge (Burstein et al., 2020).
- Writing-domain knowledge predicts students' general knowledge (critical thinking outcomes) and academic outcomes (assessment scores, college grade point average) and correlates with success predictors such as SAT/ACT scores (Burstein et al., 2019).
- Intrapersonal factors (such as attitudes) are associated with writing ability and outcomes, suggesting that writing curricula should consider the importance of intrapersonal factors (Ling et al., 2021).
- Survival analyses revealed relationships between vocabulary usage and student retention; see the sketch after this list. Specifically, greater use of personally reflective language in coursework and assessment writing increased the likelihood of dropout, while more sophisticated and varied vocabulary use reduced it (Burstein et al., 2023; McCaffrey et al., 2021).
- Coursework writing was shown to be a potential source for assessing malleable factors with automated writing evaluation algorithms, though the resulting data may be sensitive to the nature of the course assignments. The project team developed a rubric for evaluating the quality of assignments and pilot-tested it using the limited assignment materials the team collected; the assignment scores were generally low. The rubric could support future studies of coursework writing (Oddis et al., 2022).
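As a sketch of the survival-analysis approach referenced above, the following uses the lifelines library's Cox proportional-hazards model to relate vocabulary measures to time-to-dropout. All data and column names here are simulated assumptions, not the study's data or variables.

```python
# Illustrative Cox proportional-hazards sketch of the retention analysis,
# using the lifelines library. All data and column names are simulated
# assumptions, not the study's data or variables.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    # Assumed vocabulary covariates derived from AWE features.
    "reflective_language": rng.uniform(0, 1, n),
    "vocab_sophistication": rng.uniform(0, 1, n),
    # Follow-up time (censored at 5 semesters) and dropout indicator.
    "semesters_enrolled": rng.integers(1, 6, n),
    "dropped_out": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="semesters_enrolled", event_col="dropped_out")
cph.print_summary()  # hazard ratios above 1 indicate elevated dropout risk
```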
Products and publications
ERIC Citations: Find available citations in ERIC for this award.
Publicly Available Data: Publicly available data can be found at the ETS GitHub site.
Select Publications:
Book Chapter
Burstein, J., McCaffrey, D. F., Holtzman, S., & Beigman Klebanov, B. (2023). Making sense of college students' writing achievement and retention with automated writing evaluation. In V. Yaneva & M. von Davier (Eds.), Advancing Natural Language Processing in Educational Assessment (pp. 217-234). Taylor & Francis.
Burstein, J., Riordan, B., & McCaffrey, D. (2020). Expanding automated writing evaluation. In D. Yan, A. A. Rupp, & P. W. Foltz (Eds.), Handbook of Automated Scoring: Theory into Practice (pp. 329-346). CRC Press.
Journal Articles
Hazelton, L., Nastal, J., Elliot, N., Burstein, J., & McCaffrey, D. (2021). Formative automated writing evaluation: A standpoint theory of action. Journal of Response to Writing, 7(1), 37-91.
Ling, G., Elliot, N., Burstein, J. C., McCaffrey, D. F., MacArthur, C. A., & Holtzman, S. (2021). Writing motivation: A validation study of self-judgment and performance. Assessing Writing, 48, 100509.
McCaffrey, D. F., Zhang, M., & Burstein, J. (2022). Across performance contexts: Using automated writing evaluation to explore student writing. The Journal of Writing Analytics, 6, 167-199. DOI: 10.37514/JWA-J.2022.6.1.07
Oddis, K., Burstein, J., Holtzman, S., & McCaffrey, D. F. (2022). A framework for analyzing features of writing curriculum in studies of student writing achievement. The Journal of Writing Analytics, 6, 95-144. DOI: 10.37514/JWA-J.2022.6.1.05
Proceedings
Burstein, J., McCaffrey, D., Elliot, N., Beigman Klebanov, B., Molloy, H., Houghton, P., & Mladineo, Z. (2020). Exploring writing achievement and genre in postsecondary writing. In Companion Proceedings of the 10th International Conference on Learning Analytics & Knowledge (LAK20) (pp. 53-55).
Burstein, J., McCaffrey, D., Beigman Klebanov, B., & Ling, G. (2017). Exploring relationships between writing and broader outcomes with automated writing evaluation. In Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications (pp. 101-108).
Burstein, J., McCaffrey, D., Beigman Klebanov, B., Ling, G., & Holtzman, S. (2019). Exploring writing analytics and postsecondary success indicators. In Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK19) (pp. 213-214).
McCaffrey, D., Holtzman, S., Burstein, J., & Beigman Klebanov, B. (2021). What can we learn about college retention from student writing? In Companion Proceedings of the 11th International Conference on Learning Analytics & Knowledge (LAK21).
Questions about this project?
For answers to additional questions about this project or to provide feedback, please contact the program officer.