IES Grant

Title: Intelligent Scaffolding for Peer Reviews of Writing
Center: NCER
Year: 2012
Principal Investigator: Litman, Diane
Awardee: University of Pittsburgh
Program: Education Technology
Award Period: 3 years
Award Amount: $1,498,939
Type: Development and Innovation
Award Number: R305A120370
Description:

Purpose: The purpose of this project is to improve an existing software technology, Scaffolded Writing and Rewriting in the Disciplines (SWoRD), that facilitates the writing and revision of essays and compositions and handles the logistics of peer review (e.g., distributing essays to reviewers, and collecting anonymous reviewer comments and returning them to the original authors). Researchers will add new features that leverage advances in artificial intelligence, specifically Natural Language Processing and Machine Learning techniques such as automated detection of thesis statements and automated assessment of reviewer comments (e.g., does a feedback comment contain a helpful suggestion?). These techniques will in turn drive scaffolds for both authors and reviewers, with the expectation of ultimately improving students' writing and critical thinking skills.
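
As a concrete illustration of the kind of reviewer-comment assessment described above, the sketch below trains a toy bag-of-words classifier to flag whether a feedback comment contains a suggestion. This is a minimal, hypothetical example (the comments, labels, and model choice are all invented for illustration), not the project's actual system.

```python
# Hypothetical sketch: flagging whether a peer-review comment contains a
# suggestion, using a bag-of-words logistic regression (scikit-learn).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training examples: 1 = contains a suggestion, 0 = does not.
comments = [
    "You could add a topic sentence to the second paragraph.",
    "Consider citing a source for this claim.",
    "Maybe restate your thesis in the conclusion.",
    "The introduction is confusing.",
    "Nice use of transitions throughout.",
    "I did not understand the third section.",
]
labels = [1, 1, 1, 0, 0, 0]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Predict on an unseen comment.
print(model.predict(["You might want to add an example here."]))
```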

Project Activities: Activities will include iterative development of the software across two years, followed by a pilot study in the third year comparing the advanced version with the existing one. The iterative development will pair the addition of new software features with user testing, leading to revisions and corrections of those features and allowing the team to assess the usability and feasibility of the software. The researchers will examine the performance of the software features (e.g., the accuracy of automated thesis-statement detection) and their effect on students (e.g., do students provide better peer-review feedback with the advanced features, and do their revisions benefit from that feedback?). Additionally, the researchers will conduct focus groups and surveys with users.

Products: The outcome of this project will be an advanced version of SWoRD, which will use artificial intelligence and Natural Language Processing techniques to assess student essays and reviewer comments; provide scaffolded assistance to authors and reviewers; and help authors organize reviewer feedback to facilitate writing revisions. Additionally, the researchers expect to publish their findings in scientific journals and conference proceedings.

Structured Abstract

Setting: Iterative development and testing will take place in science and English/social science courses in both secondary and post-secondary schools. The post-secondary setting will give researchers easy access to large numbers of students taking lower-level physics and psychology courses, which will facilitate some of the iterative development. Pilot testing will occur in high schools.

Population: Students in the Pittsburgh area from middle school to second-year university courses will participate.

Intervention: The intervention will build on the existing version of SWoRD, a technology tool that gives students a platform for writing compositions and reports and that handles the logistics of peer review and revision. For example, the tool distributes essays to reviewers, records their anonymous feedback, and returns it to the appropriate authors, who can then use this information in revising their work. The project will add advanced technical features designed to assess certain critical features of essays (e.g., thesis statements) and of reviewer feedback (e.g., whether a comment localizes where its suggestion applies). Additionally, the improved intervention will help authors organize the feedback they receive from multiple reviewers.
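
One way to picture the localization assessment mentioned above is a simple check for whether a comment pinpoints its target in the essay. The sketch below is a hypothetical rule-based stand-in with an invented cue list; the abstract does not specify the project's actual approach.

```python
# Hypothetical rule-based sketch of feedback "localization": does a reviewer
# comment indicate WHERE in the essay it applies? The cue list is illustrative.
import re

LOCATION_CUES = re.compile(
    r"\b(paragraph|page|sentence|section|line|introduction|conclusion|thesis)\b"
    r'|"[^"]+"',  # a quoted span from the essay also localizes a comment
    re.IGNORECASE,
)

def is_localized(comment: str) -> bool:
    """Return True if the comment contains a cue pinpointing its target."""
    return bool(LOCATION_CUES.search(comment))

print(is_localized("The argument in paragraph 3 needs supporting evidence."))  # True
print(is_localized("This needs more evidence."))                               # False
```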

Research Design and Methods: Methods will include iterative development and testing, including surveys, focus groups, interviews, and quasi-experimental studies. During Years 1 and 2, usability studies with both high school and (early) college students will be conducted in the fall and spring semesters, with data analysis and software revisions occurring in the winter and summer. Specifically, the team anticipates making minor revisions (e.g., fixing software bugs) during the winter and major changes and upgrades (e.g., adding new capabilities and features) over the summer.

In Year 3, the researchers will conduct a pilot study with high school students from at least four teachers' classrooms, yielding at least 200 students. Half of the students will use the basic version of SWoRD (which currently exists) and the other half the advanced version (which will incorporate the revisions and additions from this project's development work). Both writing genres (argumentative essays and science reports) will be used with each version. Random assignment will be at the student level.

Control Condition: Students will use the currently existing version of SWoRD in the control condition.

Key Measures: Key measures of student writing and of students' understanding of the writing process will include trained graduate students' judgments of student essays and their revisions, and of peer reviewers' comments and feedback. Additionally, surveys will assess students' knowledge of writing and composition processes and their satisfaction with the system (including the advanced features). Measures of the developed technology (e.g., the machine learning components) will include overall performance (e.g., accuracy, precision, reliability, and validity) in detecting thesis statements in students' essays and feedback features in reviewers' comments.
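
For the technology measures, performance can be summarized by comparing the detector's output with the human judgments, as in this minimal sketch (the labels are invented placeholders, not project data):

```python
# Minimal sketch of measuring detector performance against human judgments.
from sklearn.metrics import accuracy_score, precision_score, recall_score

human  = [1, 0, 1, 1, 0, 0, 1, 0]  # 1 = rater marked the sentence as a thesis statement
system = [1, 0, 1, 0, 0, 1, 1, 0]  # automated detector's output on the same sentences

print("accuracy :", accuracy_score(human, system))
print("precision:", precision_score(human, system))
print("recall   :", recall_score(human, system))
```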

Data Analytic Strategy: Descriptive analyses of survey, focus group, and interview data will be used to revise and refine SWoRD. The pilot study data will be analyzed primarily with a t-test, because the anticipated design has a single factor (version of SWoRD) with two levels: the basic version (existing at the start of this project) versus the advanced version (incorporating the new features developed in this project). Both writing genres (argumentative essays and science reports) will be used, but the analysis will collapse across genre.
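
For concreteness, the planned single-factor comparison reduces to an independent-samples t-test on the two groups' outcome scores, as in this sketch (the scores are fabricated placeholders, not project data):

```python
# Sketch of the planned analysis: an independent-samples t-test comparing
# writing-quality ratings under the basic vs. advanced versions of SWoRD.
from scipy import stats

basic_scores    = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]  # placeholder ratings
advanced_scores = [3.6, 3.3, 3.8, 3.5, 3.4, 3.7]  # placeholder ratings

t, p = stats.ttest_ind(advanced_scores, basic_scores)
print(f"t = {t:.2f}, p = {p:.4f}")
```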

Products and Publications

Journal article, monograph, or newsletter

Loretto, A., DeMartino, S., and Godley, A. (2016). Secondary Students' Perceptions of Peer Review of Writing. Research in the Teaching of English, 51(2): 134–161.

Nguyen, H., Xiong, W., and Litman, D. (2017). Iterative Design and Classroom Evaluation of Automated Formative Feedback for Improving Peer Feedback Localization. International Journal of Artificial Intelligence in Education, 27(3): 582–622.

Schunn, C. D., Godley, A. J., and DeMartino, S. (2016). The Reliability and Validity of Peer Review of Writing in High School AP English Classes. Journal of Adolescent & Adult Literacy, 60(1): 13–23.

Zhang, F., Schunn, C.D., and Baikadi, A. (2017). Charting the Routes to Revision: An Interplay of Writing Goals, Peer Comments, and Self-Reflections from Peer Reviews. Instructional Science, 45: 679–707.

Proceedings

Baikadi, A., Schunn, C., and Ashley, K. (2016). Impact of Revision Planning on Peer-Reviewed Writing. In EDM 2016 Workshops and Tutorials co-located with the 9th International Conference on Educational Data Mining (pp. 1–5). Raleigh, NC: CEUR Workshop Proceedings.

Baikadi, A., Schunn, C., and Ashley, K. D. (2015). Understanding Revision Planning in Peer-Reviewed Writing. In Proceedings of the 8th International Conference on Educational Data Mining (EDM 2015) (pp. 544–547). Madrid, Spain: International Educational Data Mining Society.

Falakmasir, M. H., Ashley, K. D., Schunn, C. D., and Litman, D. J. (2014). Identifying Thesis and Conclusion Statements in Student Essays to Scaffold Peer Review. In International Conference on Intelligent Tutoring Systems (pp. 254–259). Cham: Springer.

Hashemi, H. B., and Schunn, C. D. (2014). A Tool for Summarizing Students' Changes Across Drafts. In International Conference on Intelligent Tutoring Systems (pp. 679–682). Cham: Springer.

Litman, D. (2016). Natural Language Processing for Enhancing Teaching and Learning. In Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI-16). Phoenix, AZ: AAAI Press.

Nguyen, H., Xiong, W., and Litman, D. (2016). Instant Feedback for Increasing the Presence of Solutions in Peer Reviews. In Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Demonstrations (pp. 6–10). San Diego, CA: Association for Computational Linguistics.

Nguyen, H., Xiong, W., and Litman, D. (2014). Classroom Evaluation of a Scaffolding Intervention for Improving Peer Review Localization. In International Conference on Intelligent Tutoring Systems (pp. 272–282). Cham: Springer.

Xiong, W., and Litman, D. J. (2013). Evaluating Topic-Word Review Analysis for Understanding Student Peer Review Performance. In Proceedings of the 6th International Conference on Educational Data Mining (EDM 2013) (pp. 200–207). Memphis, TN: International Educational Data Mining Society.

Xiong, W., and Litman, D. (2014). Empirical Analysis of Exploiting Review Helpfulness for Extractive Summarization of Online Reviews. In Proceedings of COLING 2014, the 25th International Conference on Computational Linguistics (pp. 1985–1995). Dublin, Ireland: ACL Anthology.

Zhang, F., and Litman, D. (2015). Annotation and Classification of Argumentative Writing Revisions. In Proceedings of the Tenth Workshop on Innovative Use of NLP for Building Educational Applications (pp. 133–143). Denver, CO: Association for Computational Linguistics.

Zhang, F., and Litman, D. (2014). Sentence-Level Rewriting Detection. In Proceedings of the Ninth Workshop on Innovative Use of NLP for Building Educational Applications (pp. 149–154). Baltimore, MD: Association for Computational Linguistics.

