Title: ASSISTment Meets Science Learning (AMSL)
Principal Investigator: Gobert, Janice
Awardee: Worcester Polytechnic Institute
Program: Science, Technology, Engineering, and Mathematics (STEM) Education [Program Details]
Award Period: 3 years
Award Amount: $1,187,434
Type: Development and Innovation
Award Number: R305A090170
Co-Principal Investigators: Heffernan III, Neil T.; Beck, Joseph; Koedinger, Kenneth R.
Purpose: The purpose of this study was to develop a computer-based intelligent tutoring system designed to support middle school students in the acquisition of flexible science inquiry and process skills. In the 21st century workplace, students need to understand science more deeply and possess well-honed learning strategies that allow them to apply their science knowledge in more flexible ways. By providing students with frequent, fine-grained performance assessments of science process skills, teaching and tutoring can be honed to students' individual needs.
Project Activities: In order to achieve the goals of assessing students' inquiry skills as envisioned by current frameworks (such as the NGSS framework), the researchers iteratively developed, tested, and refined a new, rigorous technology platform and a set of formative assessments for middle school earth science and life science. The platform, Inq-ITS (Inquiry Intelligent Tutoring System), enables teachers to assess their students' science inquiry skills rigorously, frequently, and in the context in which they are developing these skills. This work builds on and extends prior work on assessing students' scientific inquiry skills for middle school physical science.
A major goal of this effort was to develop interactive formative assessments that could assess students' inquiry skills in real time and run seamlessly over the web, so that they would scale to large numbers of learners. In addition to building the necessary technological infrastructure, key development activities included creating formative units that rely heavily on microworlds (i.e., computer-based manipulative models) to assess students' science inquiry skills.
Setting: The setting for this study included urban and suburban public middle schools in Massachusetts.
Sample: Across the development and pilot studies, 230 middle school students (and their teachers) in Massachusetts participated. During the classroom implementation phase of the project, 1,097 middle school students (and their teachers) in Massachusetts participated. Participants were drawn from three schools and one after-school program in central Massachusetts. The student population was ethnically diverse across sites, ranging from 40 to 88 percent White, 1.9 to 14.3 percent African American, 2.2 to 5.1 percent multi-racial, and 4.5 to 34.6 percent Hispanic. Combined, these students also represented a range of socio-economic statuses, with 12 to 60 percent of students eligible for free or reduced-price lunch programs.
Intervention: The researchers developed (1) new formative assessment modules, which relied heavily on the use of microworlds (i.e., simulations) to assess students on science skills needed to conduct inquiry, (2) corresponding pre- and post-test items for topics in the Massachusetts science frameworks for middle school earth and life science, and (3) a domain-general inquiry test used for getting baseline data on students' competencies.
The team developed assessment modules for life science (e.g., SimCell, Bug's Life, Ecosystems) and earth science (e.g., Seasons, Plate Tectonics), as well as a standardized-style inquiry skills test to be used for assessing students' baseline skills.
Research Design and Methods: Building upon prior development of the ASSISTments System for Math, the research team and content experts built microworlds whose content and skills aligned with the Massachusetts Comprehensive Assessment System (MCAS) science standards. The microworlds were piloted with students and their teachers. Think-aloud protocols were collected from a random sample of students within classrooms. Students and teachers were interviewed and videotaped interacting with the microworlds. Computer log files were also collected to provide fine-grained data used to determine how students were using the models to conduct scientific inquiry. Analysis of these data informed revisions of the ASSISTment for Science microworlds, tasks, and scaffolds. The researchers then conducted a series of experiments to examine the potential efficacy of the program for improving student science learning. In each experiment, students were randomly assigned to treatment (microworld modules with tutoring support and feedback) and control (microworld modules without tutoring support and feedback) conditions. The first experiment examined whether inquiry tutoring led to improved inquiry skills in earth and life science. The second examined whether the ASSISTment tutoring led to increased science content learning, as measured by MCAS scores. The third examined whether students' skills differed or changed as they moved to a new content area, from life science to earth science or vice versa.
Control Condition: To assess the promise of the program, students assigned to the control condition received the earth and life science microworld modules without the ASSISTment tutoring support and feedback system.
Key Measures: To determine the usability and feasibility of modules, students and teachers were interviewed and videotaped interacting with the microworlds. Computer log files were also collected to provide fine-grained data that was used to determine how students were using the models to conduct scientific inquiry. To assess the potential efficacy of the intervention, pre- and post-tests of students' skills for each of the five inquiry strands—collecting data, interpreting data, predicting/hypothesizing, mathematizing data, and designing/conducting experiments—were developed by the researchers. In addition, students' scores on science items for the MCAS test were collected.
Data Analytic Strategy: The researchers analyzed the computer log files to fine-tune and evaluate students' inquiry performance in each of the inquiry strands. Repeated-measures multivariate analysis of variance was used to analyze student learning gains on the pre- and post-tests of the five inquiry strands and on the MCAS science items. Learning factors analysis was conducted to identify where students switched content areas, as indicated by a sudden increase in haphazard inquiry behaviors, and learning decomposition analysis was used to estimate the relative efficacy of different types of practice on learning a skill.
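As a rough illustration of the learning decomposition idea, the sketch below fits an error-rate curve in which two types of prior practice contribute a weighted opportunity count t1 + β·t2, so the fitted β estimates how much one practice type is worth relative to the other. The data, parameter ranges, and grid-search fit are all hypothetical simplifications, not the project's actual analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=400, beta_true=0.5, a=0.6, b=0.25):
    """Generate hypothetical practice logs: counts of two prior practice
    types and whether the student erred on the current opportunity."""
    t1 = rng.integers(0, 8, n)   # prior practices of type 1 (e.g., tutored)
    t2 = rng.integers(0, 8, n)   # prior practices of type 2 (e.g., untutored)
    p_err = a * np.exp(-b * (t1 + beta_true * t2))  # error probability decays with practice
    errors = rng.random(n) < p_err
    return t1, t2, errors

def neg_log_lik(params, t1, t2, errors):
    a, b, beta = params
    p = np.clip(a * np.exp(-b * (t1 + beta * t2)), 1e-6, 1 - 1e-6)
    return -np.sum(np.where(errors, np.log(p), np.log(1 - p)))

def fit(t1, t2, errors):
    # Crude grid search; a real analysis would use a proper optimizer.
    best = None
    for a in np.linspace(0.1, 0.9, 9):
        for b in np.linspace(0.05, 0.5, 10):
            for beta in np.linspace(0.0, 2.0, 21):
                nll = neg_log_lik((a, b, beta), t1, t2, errors)
                if best is None or nll < best[0]:
                    best = (nll, a, b, beta)
    return best

t1, t2, errors = simulate()
nll, a, b, beta = fit(t1, t2, errors)
print(f"estimated beta = {beta:.2f}")  # beta < 1 suggests type-2 practice is worth less
```

The key output is β itself: a value near 1 means both practice types contribute equally to learning, while a value well below 1 means the second type is less productive per opportunity.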
The researchers conducted exploratory analyses of students' log files to track students' inquiry exploration paths using Markov chain modeling and K-means clustering. They analyzed classroom implementations of the formative assessment units for SimCell, Bug's Life, Ecosystems, Seasons, and Plate Tectonics using t-tests and multivariate analyses of variance (MANOVAs). They used t-tests to analyze classroom implementations of the domain-general inquiry test, and they tested for generalizability of assessment algorithms (from physical science to Ecosystems, and from physical science to Seasons) by doing text replay tagging of students' log files (for Ecosystems and for Seasons) and comparing these results to those for physical science. They also used student think-aloud protocols to analyze the pilot tests of SimCell (1.0), Bug's Life, EcoLife, Seasons, and Plate Tectonics.
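To make the Markov chain step concrete, here is a minimal sketch that estimates a first-order transition matrix from student action sequences. The four-action vocabulary and the example sequences are invented for illustration; the project's real log files are far richer. Clustering students by their transition probabilities is one way the K-means step could then operate.

```python
from collections import Counter

# Hypothetical inquiry-action vocabulary reconstructed from log files.
ACTIONS = ["hypothesize", "change_variable", "run_trial", "interpret"]

def transition_matrix(sequences):
    """Estimate P(next action | current action) from observed sequences."""
    counts = {a: Counter() for a in ACTIONS}
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    # Row-normalize counts into probabilities.
    matrix = {}
    for a, c in counts.items():
        total = sum(c.values())
        matrix[a] = {b: (c[b] / total if total else 0.0) for b in ACTIONS}
    return matrix

logs = [
    ["hypothesize", "change_variable", "run_trial", "interpret"],
    ["hypothesize", "run_trial", "run_trial", "interpret"],
    ["change_variable", "run_trial", "interpret", "hypothesize"],
]
m = transition_matrix(logs)
print(round(m["run_trial"]["interpret"], 2))  # -> 0.75
```

Each row of the matrix is a probability distribution over next actions, so a student whose "run_trial" row rarely leads to "interpret" is running experiments without reflecting on them, one signature of haphazard inquiry.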
Related IES Projects: The Development of an Intelligent Pedagogical Agent for Physical Science Inquiry Driven by Educational Data Mining (R305A120778)
ERIC Citations: Find available citations for this award in ERIC.
Project Website: https://www.inqits.com/
Publications:
Gobert, J., Sao Pedro, M., Baker, R.S., Toto, E., & Montalvo, O. (2012). Leveraging educational data mining for real time performance assessment of scientific inquiry skills within microworlds. Journal of Educational Data Mining, 4, 153–185.
Gobert, J., Sao Pedro, M., Raziuddin, J., & Baker, R.S. (2013). From log files to assessment metrics for science inquiry using educational data mining. Journal of the Learning Sciences, 22(4), 521–563.
Gobert, J.D., Kim, Y.J, Sao Pedro, M.A., Kennedy, M., & Betts, C.G. (2015). Using educational data mining to assess students' skills at designing and conducting experiments within a complex systems microworld. Thinking Skills and Creativity, 18, 81–90. doi:10.1016/j.tsc.2015.04.008
Sao Pedro, M., Baker, R.D., Gobert, J.D., Montalvo, O., & Nakama, A. (2013). Leveraging Machine-Learned Detectors of Systematic Inquiry Behavior to Estimate and Predict Transfer of Inquiry Skill. User Modeling and User-Adapted Interaction, 23(1), 1–39. https://doi.org/10.1007/s11257-011-9101-0
Bachmann, M., Gobert, J.D., & Beck, J. (2010). Tracking Students' Inquiry Paths through Student Transition Analysis. Proceedings of the 3rd International Conference on Educational Data Mining (pp. 269–270).
Davenport, J., Quellmalz, E., Clarke-Midura, J., Dede, C., Gobert, J., Koedinger, K., McCall, M., & Timms, M. (2012). The Future of Assessment: Measuring Science Reasoning and Inquiry Skills Using Simulations and Immersive Environments. In van Aalst, J., Thompson, K., Jacobson, M. J., & Reimann, P. (Eds.), The Future of Learning: Proceedings of the 10th International Conference of the Learning Sciences (ICLS 2012) Volume 2, Short Papers, Symposia, and Abstracts (pp. 110–117). Sydney, NSW, AUSTRALIA: International Society of the Learning Sciences.
Gobert, J., Montalvo, O., Toto, E., Sao Pedro, M., & Baker, R. (2010). The Science Assistments Project: Scaffolding scientific inquiry skills. In Aleven, V., Kay, J., & Mostow, J. (Eds.), Intelligent Tutoring Systems (LNCS 6095, p. 445). Berlin/Heidelberg: Springer.
Gobert, J., Sao Pedro, M., Raziuddin, J., & the Science Assistments Team (2010). Studying the interaction between learner characteristics and inquiry skills in microworlds. In K. Gomez, L. Lyons, & J. Radinsky (Eds.), Learning in the Disciplines: Proceedings of the 9th International Conference of the Learning Sciences (ICLS 2010) – Volume 2 (p. 46). Chicago, IL: International Society of the Learning Sciences.
Gobert, J., Raziuddin, J., & Koedinger, K. (2013). Auto-scoring discovery and confirmation bias during data interpretation in a science microworld. Artificial Intelligence in Education Lecture Notes in Computer Science, Volume 7926, pp 770–773.
Gobert, J., Sao Pedro, M., Montalvo, O., Toto, E., Bachmann, M., & Baker, R. (2011). The Science Assistments Project: Intelligent tutoring for scientific inquiry skills. In L. Carlson, C. Hoelscher, & T. (Eds.), Proceedings of the 33rd Annual Conference of the Cognitive Science Society. Austin, TX: Cognitive Science Society.
Montalvo, O., Baker, R.S.J.d., Sao Pedro, M.A., Nakama, A., & Gobert, J.D. (2010). Identifying Students' Inquiry Planning Using Machine Learning. Proceedings of the 3rd International Conference on Educational Data Mining, pp. 141–150.
Roll, I., Aleven, V., Koedinger, K., Berland, M., Martin, T., Benton, T., Petrick, C., Hershkovitz, A., Wixon, M., Baker, R., Gobert, J., Sao Pedro, M., Sherin, B., Blikstein, P., Worsley, M., & Pea, R. (2012). Building (timely) bridges between learning analytics, educational data mining, and core learning sciences perspectives. In The Future of Learning: Proceedings of the 10th International Conference of the Learning Sciences (pp. 131–141). Sydney, NSW, AUSTRALIA: International Society of the Learning Sciences.
Sao Pedro, M. A., Gobert, J. D., & Raziuddin, J. (2010). Comparing Pedagogical Approaches for the Acquisition and Long-Term Robustness of the Control of Variables Strategy. In K. Gomez, L. Lyons, & J. Radinsky (Ed.), Learning in the Disciplines: Proceedings of the 9th International Conference of the Learning Sciences (ICLS 2010) – Volume 1, Full Papers (pp. 1024–1031). Chicago, IL: International Society of the Learning Sciences.
Sao Pedro, M., Baker, R., & Gobert, J. (2012). Improving Construct Validity Yields Better Models of Systematic Inquiry, Even with Less Information. In Proceedings of the 20th Conference on User Modeling, Adaptation, and Personalization (UMAP 2012). Montreal, QC, Canada (pp. 249–260).
Sao Pedro, M., Baker, R., & Gobert, J. (2013). Incorporating Scaffolding and Tutor Context into Bayesian Knowledge Tracing to Predict Inquiry Skill Acquisition. In Proceedings of the 6th International Conference on Educational Data Mining. Memphis, TN, pp 185–192.
Sao Pedro, M., Baker, R., & Gobert, J. (2013). What Different Kinds of Stratification Can Reveal about the Generalizability of Data-Mined Skill Assessment Models. In Proceedings of the 3rd Conference on Learning Analytics and Knowledge. 190–194. Leuven, Belgium.
Sao Pedro, M., Gobert, J., & Baker, R. (2012). Assessing the Learning and Transfer of Data Collection Inquiry Skills Using Educational Data Mining on Students' Log Files. Paper presented at The Annual Meeting of the American Educational Research Association. Vancouver, BC, CA: Retrieved April 15, 2012, from the AERA Online Paper Repository.
Sao Pedro, M., Gobert, J., & Raziuddin, J. (2010). Long-term Benefits of Direct Instruction with Reification for Learning the Control of Variables Strategy. In Aleven, V., Kay, J., & Mostow, J. (Eds.), Intelligent Tutoring Systems (LNCS 6095, pp. 257–259). Berlin/Heidelberg: Springer.
Sao Pedro, M.A., Baker, R.S.J.d., Montalvo, O., Nakama, A., & Gobert, J.D. (2010). Using Text Replay Tagging to Produce Detectors of Systematic Experimentation Behavior Pattern. Proceedings of the 3rd International Conference on Educational Data Mining (pp. 181–190).
Sao Pedro, M.A., Gobert, J.D., & Betts, C. (2014). Towards Scalable Assessment of Performance-Based Skills: Generalizing a Detector of Systematic Science Inquiry to a Simulation with a Complex Structure. In Proceedings of the International Conference on Intelligent Tutoring Systems (pp. 591–600). Honolulu, HI: Springer.