Information on IES-Funded Research
Grant Closed

Accelerating Fluency Development in an Automated Reading Tutor

NCER
Program: Education Research Grants
Program topic(s): Education Technology
Award amount: $2,581,691
Principal investigator: Jack Mostow
Awardee: Carnegie Mellon University
Year: 2008
Project type: Development and Innovation
Award number: R305A080628

Purpose

Fluency (defined as the ability to read connected text quickly, easily, accurately, and expressively) is one of the most important skills needed to develop successful readers. However, acquiring this skill takes practice and guidance from teachers, and many schools lack the resources to give students the individualized practice and attention they need to excel. One possible way to fill this gap is tutoring. Because human tutors can be expensive and difficult to find, another possible solution is to create automated, computerized tutors that address students' needs while easing schools' financial and logistical concerns. This project will develop a computerized tutor intended to meet these needs.

Project Activities

This project will build upon an existing automated tutor, the Reading Tutor, that "listens" to a child read aloud and provides auditory and graphical feedback (e.g., pictures or animations of prosodic contours such as stress and pitch patterns) based on effective human interventions. The updated Reading Tutor will be designed to accelerate the development of reading fluency by continuously assessing the reader's progress and adjusting to the reader's current abilities, selecting stories of appropriate difficulty and providing practice at appropriately spaced intervals. A series of small pilot studies will be carried out to test the promise of the intervention for improving student reading outcomes. Additionally, the Reading Tutor will capture detailed, longitudinal data on users' development of reading skills. These data, along with data collected from the pilot studies, will be used to redesign the tutor through an iterative process.

Structured Abstract

Setting

The research will be conducted in two elementary schools in the greater Pittsburgh, Pennsylvania area.

Sample

Participants will include approximately 200 second- and third-grade students enrolled in both regular and special-education classrooms in public elementary schools that serve predominantly urban, low-income, African American students.

Intervention

The updated version produced under this award will incorporate several strategies, including dynamically adaptive spacing of exposure to words across reading practice sessions; integrated practice on connected text and on individual words presented in varying amounts of context (e.g., embedding the target vocabulary word in a longer, more informative phrase); and a redesign of how the Tutor selects stories so that their difficulty is appropriate for individual learners. Additionally, the revised Reading Tutor will model, elicit, and assess expressive prosody in order to support comprehension processes.
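
To illustrate what dynamically adaptive spacing might look like in practice, the sketch below implements a simple scheduler that lengthens the interval before a word is re-practiced after a fluent reading and shortens it after an error. The update rule and parameters are hypothetical; the abstract does not specify the Reading Tutor's actual spacing algorithm.

```python
# Hypothetical sketch of dynamically adaptive spacing: the interval before a word
# is scheduled for re-practice grows after a fluent reading and resets after an
# error. The update rule and parameters are illustrative assumptions, not the
# Reading Tutor's actual algorithm.
from dataclasses import dataclass


@dataclass
class WordSchedule:
    word: str
    interval: int = 1            # sessions to wait before the next practice
    sessions_until_due: int = 1


def update_schedule(item: WordSchedule, read_fluently: bool) -> None:
    """Adjust spacing for one word after a practice attempt."""
    if read_fluently:
        item.interval = min(item.interval * 2, 16)   # expand spacing on success
    else:
        item.interval = 1                            # re-practice soon after an error
    item.sessions_until_due = item.interval


def words_due_this_session(schedule: list[WordSchedule]) -> list[str]:
    """Advance the session clock and return the words due for practice."""
    due = []
    for item in schedule:
        item.sessions_until_due -= 1
        if item.sessions_until_due <= 0:
            due.append(item.word)
    return due


# Example: two tracked words, both due in the first session.
schedule = [WordSchedule("canyon"), WordSchedule("gather")]
for word in words_due_this_session(schedule):
    print("practice:", word)
```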

Research design and methods

Over the three-year project period, the research team will iteratively revise and evaluate different components of the fluency activities presented by the tutor. The current version of the Reading Tutor displays stories on a computer screen, uses speech recognition to listen to children read aloud, and responds with spoken and graphical assistance. This graphical assistance may take the form of pictures or animations in which text grows or shrinks and rises or falls to visually demonstrate prosodic contours (e.g., stress and pitch); these visual displays are intended to help children mimic correct prosody and pronunciation. Because the tutor can match readers' speech to the text being read, it can continuously assess students' reading progress, and it will capture detailed, longitudinal data on readers' development of reading skills. Improving the sophistication with which the tutor uses this record of individual learning trajectories will enable it to vary its instruction. The research team will compare alternative forms of oral reading practice, such as practicing words presented in varying amounts of context (e.g., varying the amount of surrounding information in a practice phrase), to ascertain how well practice on a word transfers to reading that word fluently in a story. Development will be guided by iterative design and by randomized, within-subject experimental trials comparing alternative tutorial choices.
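
To make the within-subject trials concrete, the following sketch randomly assigns each student's target words to hypothetical practice conditions that differ in the amount of surrounding context, so every child contributes observations to every condition. The condition names and the assignment scheme are illustrative assumptions, not the project's actual design.

```python
# Illustrative sketch of a within-subject comparison: each student's target words
# are randomly assigned to practice conditions that differ in the amount of
# surrounding context, so every child contributes data to every condition.
# Condition names and the assignment scheme are assumptions for illustration.
import random

CONDITIONS = ["isolated_word", "short_phrase", "full_sentence"]


def assign_practice_conditions(student_id: str, target_words: list[str],
                               seed: int = 0) -> dict[str, str]:
    """Randomly assign each target word to a practice condition for one student,
    roughly balancing the number of words per condition."""
    rng = random.Random(f"{student_id}-{seed}")
    words = list(target_words)
    rng.shuffle(words)
    return {word: CONDITIONS[i % len(CONDITIONS)] for i, word in enumerate(words)}


# Example: one student's assignment of four target words.
print(assign_practice_conditions("student_017", ["canyon", "gather", "predict", "shiver"]))
```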

Control condition

Participants in the control condition will use the current version of the Reading Tutor.

Key measures

Oral reading rate will be assessed using grade-appropriate fluency passages, and oral reading prosody will be scored using the National Assessment of Educational Progress (NAEP) fluency rubric (or one of its variants).
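
For reference, oral reading rate is conventionally summarized as words correct per minute (WCPM); the brief sketch below shows that computation as a generic illustration, not the project's scoring procedure.

```python
# Oral reading rate is conventionally summarized as words correct per minute (WCPM).
# This helper is a generic sketch of that computation, not the project's scoring code.
def words_correct_per_minute(words_read_correctly: int, reading_time_seconds: float) -> float:
    """WCPM = words read correctly divided by minutes spent reading the passage."""
    if reading_time_seconds <= 0:
        raise ValueError("reading time must be positive")
    return words_read_correctly / (reading_time_seconds / 60.0)


# Example: 83 words read correctly in 72 seconds is roughly 69.2 WCPM.
print(round(words_correct_per_minute(83, 72), 1))
```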

Data analytic strategy

The research team will evaluate feasibility with descriptive analyses of tutor usage. To compare the effects of different practice types, the researchers will fit individual students' data to an exponential regression or theory-based model of practice, with time to read a word in context as the dependent variable. Independent variables will include student identity or pretest score (to control for individual differences), word length (to control for lexical effects), and the amount of each type of prior practice on the word. The resulting parameter estimates will indicate the relative value of different types of practice, such as varying the amount of context.
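
As a rough illustration of this analytic approach, the sketch below fits a log-linear regression (equivalent to an exponential model of reading time) to a small, hypothetical table of word-reading events, using a pretest covariate, word length, and counts of two assumed types of prior practice as predictors. The data, column names, and exact specification are assumptions, not the project's analysis code.

```python
# A minimal sketch of the kind of practice model described above: log reading time
# is modeled as a linear function of a pretest covariate, word length, and the
# amount of each type of prior practice, which is equivalent to an exponential
# (multiplicative) effect of practice on reading time. All values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical log of word-reading events: one row per attempt to read a word.
events = pd.DataFrame({
    "log_read_time": np.log([1.8, 1.3, 1.6, 1.1, 2.4, 2.0, 1.5, 1.4]),
    "pretest": [41, 41, 41, 41, 28, 28, 28, 28],          # student pretest score
    "word_length": [6, 4, 7, 5, 5, 7, 6, 8],              # letters, lexical control
    "practice_isolated": [0, 1, 2, 1, 0, 1, 0, 2],        # prior isolated-word practice
    "practice_in_context": [0, 1, 0, 2, 0, 0, 2, 1],      # prior in-context practice
})

model = smf.ols(
    "log_read_time ~ pretest + word_length + practice_isolated + practice_in_context",
    data=events,
).fit()

# Negative coefficients on the practice terms would indicate that each additional
# exposure of that type shortens the time needed to read the word.
print(model.params)
```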

People and institutions involved

IES program contact(s)

Meredith Larson

Education Research Analyst
NCER

Products and publications

Products: The expected outcome of this project is a revised Reading Tutor that assesses student abilities and provides customized interaction including the selection of appropriately challenging texts, practice, and feedback. Additionally, published reports on issues related to the development of the Tutor will be produced.

Book chapter

Mostow, J., Beck, J.E., Cuneo, A., Gouvea, E., Heiner, C., and Juarez, O. (2010). Lessons From Project LISTEN's Session Browser. In C. Romero, S. Ventura, S.R. Viola, M. Pechenizkiy, and R.S.J.D. Baker (Eds.), Handbook of Educational Data Mining (pp. 389-416). New York: CRC Press, Taylor and Francis Group.

Journal article, monograph, or newsletter

Duong, M., Mostow, J., and Sitaram, S. (2011). Two Methods for Assessing Oral Reading Prosody. ACM Transactions on Speech and Language Processing (TSLP), 7(4): 11-22.

González-Brenes, J.P., and Mostow, J. (2011). Classifying Dialogue in High-Dimensional Space. ACM Transactions on Speech and Language Processing (Special Issue on Machine Learning for Adaptivity in Dialogue Systems), 7(3): 1-15.

Korsah, G.A., Mostow, J., Dias, M.B., Sweet, T.M., Belousov, S.M., Dias, M.F., and Gong, H. (2010). Improving Child Literacy in Africa: Experiments With an Automated Reading Tutor. Information Technologies and International Development, 6(2): 1-19.

Proceeding

González-Brenes, J., Duan, W., and Mostow, J. (2011). How to Classify Tutorial Dialogue? Comparing Feature Vectors vs. Sequences. In Proceedings of the 4th International Conference on Educational Data Mining (pp. 169-178). Eindhoven, Netherlands: Educational Data Mining.

González-Brenes, J.P., and Mostow, J. (2010). Predicting Task Completion From Rich but Scarce Data. In Proceedings of the 3rd International Conference on Educational Data Mining (pp. 291-292). Pittsburgh, PA: Educational Data Mining.

Mostow, J., Chang, K., and Nelson, J. (2011). Toward Exploiting EEG Input in a Reading Tutor. In Proceedings of the 15th International Conference on Artificial Intelligence in Education (pp. 230-237). Auckland, NZ: Artificial Intelligence in Education.

Mostow, J., González-Brenes, J., and Tan, B.H. (2011). Learning Classifiers From a Relational Database of Tutor Logs. In Proceedings of the 4th International Conference on Educational Data Mining (pp. 149-158). Eindhoven, Netherlands: Educational Data Mining.

Mostow, J., Xu, Y., and Munna, M. (2011). Desperately Seeking Subscripts: Towards Automated Model Parameterization. In Proceedings of the 4th International Conference on Educational Data Mining (pp. 283-287). Eindhoven, Netherlands: Educational Data Mining.

Xu, Y., and Mostow, J. (2011). Logistic Regression in a Dynamic Bayes Net Models Multiple Subskills Better. In Proceedings of the 4th International Conference on Educational Data Mining (pp. 337-338). Eindhoven, Netherlands: Educational Data Mining.

Xu, Y., and Mostow, J. (2011). Using Logistic Regression to Trace Multiple Subskills in a Dynamic Bayes Net. In Proceedings of the 4th International Conference on Educational Data Mining (pp. 241-245). Eindhoven, Netherlands: Educational Data Mining.

Related projects

Explicit Comprehension Instruction in an Automated Reading Tutor that Listens (R305B070458)

Developing Vocabulary in an Automated Reading Tutor (R305A080157)

Supplemental information

Co-Principal Investigators: Paula Schwanenflugel, University of Georgia, and Joseph Beck, Worcester Polytechnic Institute

To detect improvements in the new version of the Reading Tutor over the current one, a 2-way mixed-model ANOVA will be employed, with Test (pre- vs. post-test) as the within-subjects variable and Tutor Version (new vs. current Tutor) as the between-subjects variable.
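
As an illustration of this analysis, the sketch below runs such a mixed ANOVA on a small, hypothetical set of pre- and post-test fluency scores using the pingouin library; the data, column names, and choice of library are assumptions, not the project's analysis plan.

```python
# A minimal sketch of the 2-way mixed-model ANOVA described above, using the
# pingouin library: Test (pre vs. post) is the within-subjects factor and
# Tutor Version (new vs. current) is the between-subjects factor. The data,
# column names, and choice of library are assumptions for illustration.
import pandas as pd
import pingouin as pg

# Hypothetical long-format data: one fluency score per student per test occasion.
scores = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s2", "s3", "s3", "s4", "s4"],
    "test": ["pre", "post"] * 4,
    "version": ["new"] * 4 + ["current"] * 4,
    "wcpm": [52, 71, 48, 66, 55, 64, 50, 58],
})

# The Interaction row tests whether pre-to-post gains differ between the new and
# current versions of the Reading Tutor.
anova = pg.mixed_anova(data=scores, dv="wcpm", within="test",
                       between="version", subject="student")
print(anova)
```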

Questions about this project?

If you have additional questions about this project or would like to provide feedback, please contact the program officer.

Tags

Reading, Language, Data and Assessments, Education Technology
