Title: The Assess-as-You-Go Writing Assistant: A Student Work Environment that Brings Together Formative and Summative Assessment
Principal Investigator: Cope, William
Awardee: University of Illinois, Urbana-Champaign
Program: Education Technology
Award Period: 3 years
Award Amount: $1,500,000
Type: Development and Innovation
Award Number: R305A090394
Purpose: A key recent educational priority has been to use summative assessment both as a measure of learner progress and as an incentive for school improvement. It is widely agreed, however, that to support ongoing learning throughout the academic year, summative assessment should be supplemented by formative assessment: assessment for screening, monitoring, and diagnosis. This research team proposes to create a new educational technology tool intended to improve student writing performance and writing assessment. The Assess-As-You-Go Writing Assistant will be an online writing environment that, through a combination of tagging, social networking, and natural language processing technologies, gives learners constant feedback on their writing in the form of on-demand, as-you-go formative assessment. The Writing Assistant will also track individual learner progress, the progress of cohorts of students through the system, and the progress of individuals relative to their cohorts, thus providing summative assessment data intended to meet teacher, school, parent, and community accountability requirements.
Project Activities: This study will be conducted over a 3-year period. During Year 1, the team will focus on software development, building on a foundation of available web-based, open-source writing tools. Throughout this initial year, researchers will design and improve the Writing Assistant through iterative development, using laboratory trials, beta testing sites, and field sites. In Year 2, researchers will pilot the Writing Assistant with 12 eighth-grade teachers and their students in authentic settings. Teachers will be trained in an intensive summer institute, followed by supported in-school classroom use during the fall. In Year 3, expanded field testing with 30 teachers and their students will assess the feasibility of implementation, develop and validate the operational aspects of the Writing Assistant (multiple points of formative assessment by people as well as machine assessment), and explore the potential effects of the intervention on classroom writing instruction and student outcomes.
Products: Products include the fully developed Assess-As-You-Go Writing Assistant online writing environment. Published reports of the findings will also be produced.
Setting: This project will take place in 8th grade classrooms in Illinois.
Population: Participants include 8th grade teachers and their students. In Year 2, 12 eighth grade teachers and their students will participate in a pilot study. In Year 3, 30 teachers and their students will participate in an expanded field test.
Intervention: The Assess-As-You-Go Writing Assistant will be a web-based learning environment where students can create written texts and receive ongoing feedback from the system itself through natural language processing technologies, as well as from teachers, peers, and experts using social networking and tagging technologies. In addition, the system will include supporting professional development materials and products necessary for implementation of the intervention.
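The feedback loop described above can be illustrated with a toy, rule-based pass. This is a minimal sketch only: the function name, the two rules, and the feedback wording are all hypothetical stand-ins for the natural language processing the Writing Assistant would actually perform.

```python
import re

def formative_feedback(draft: str) -> list:
    """Return as-you-go feedback notes for a student draft (illustrative rules only)."""
    feedback = []
    # Split the draft into rough sentences on end punctuation.
    sentences = [s.strip() for s in re.split(r"[.!?]+", draft) if s.strip()]
    # Rule 1: flag very long sentences that may be hard to follow.
    for s in sentences:
        if len(s.split()) > 30:
            feedback.append(f"Consider splitting this long sentence: '{s[:40]}...'")
    # Rule 2: flag repeated sentence openers, a common pattern in early drafts.
    openers = [s.split()[0].lower() for s in sentences]
    for word in set(openers):
        if openers.count(word) >= 3:
            feedback.append(f"Several sentences begin with '{word}'; try varying openings.")
    if not feedback:
        feedback.append("No issues flagged; consider asking a peer for comments.")
    return feedback
```

A real system would combine such machine checks with the human feedback channels (teachers, peers, experts) the intervention describes.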
Research Design and Methods: The first year will be primarily devoted to software development. This process will be grounded in an iterative cycle of research, development, and testing informed by on-going formative data collection (using laboratory trials, beta testing sites, and field sites). In the second year of the project, limited pilot testing of the intervention and supporting professional development activities will occur with 12 eighth grade teachers and their students. Information gained from this process will be used to refine the intervention and professional development. In the final year of the project, the team will carry out a feasibility evaluation aimed at collecting pilot data on feasibility of implementation in authentic educational delivery settings and the Writing Assistant's promise to generate intended outcomes, including increased access to useful feedback in the writing process, improved writing performance, and student and teacher satisfaction.
Control Condition: Classrooms within each school will be randomly assigned to the treatment or control condition in the fall of 2011. Control students will not receive the intervention, but a pre-test and post-test will be given to both the control and treatment groups.
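The within-school assignment step can be sketched as follows. This is an illustrative implementation under stated assumptions (an even split per school; school and classroom identifiers are invented), not the project's actual randomization procedure.

```python
import random

def assign_within_schools(classrooms_by_school, seed=0):
    """Randomly split each school's classrooms between treatment and control."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    assignment = {}
    for school, classrooms in classrooms_by_school.items():
        shuffled = list(classrooms)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        # First half of the shuffled list gets the treatment, the rest control.
        for room in shuffled[:half]:
            assignment[room] = "treatment"
        for room in shuffled[half:]:
            assignment[room] = "control"
    return assignment
```

Randomizing within school (rather than across the whole sample) keeps school-level differences balanced between conditions.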
Key Measures: Researchers will use a standardized persuasive-essay writing prompt. Key outcomes of this project will be measured with researcher-developed writing assessments.
Data Analytic Strategy: Education experts will review student writing products and code features indicative of specific education outcomes. This information will enable development of the machine learning algorithm that will underpin the Writing Assistant; a successful algorithm will be defined by a high correlation between expert classifications and the classifications produced by the Writing Assistant. Data collected during the development phase will be synthesized and reported back to the design team through regularly reiterated cycles of planning and refinement. In Year 2, observations and other data collected at the field sites will be used to develop a process for assessing fidelity of implementation and to collect initial assessments of feasibility and potential effectiveness. In Year 3, teacher plans and the resulting implementation data will serve as the basis for the feasibility assessment, for testing and developing the machine algorithm, and for further refinement of the software, professional development materials, and processes. Student outcome data will be analyzed using hierarchical linear modeling (HLM), where level 1 is the child (using child characteristics such as free-lunch status, gender, and race as predictors) and level 2 is the classroom (using treatment/control status along with other measurable characteristics of the class and classroom teacher).
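The validation criterion above, that machine classifications should correlate highly with expert classifications, can be made concrete with a small sketch. The scores below are invented for illustration; a real check would use the expert-coded student essays.

```python
def pearson_r(xs, ys):
    """Pearson correlation between two score lists (e.g., expert vs. machine)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

expert = [4, 3, 5, 2, 4, 1, 3, 5]    # hypothetical expert ratings of essay features
machine = [4, 3, 4, 2, 5, 1, 3, 5]   # hypothetical Writing Assistant ratings
r = pearson_r(expert, machine)       # close to 1.0 indicates strong agreement
```

In practice the project would also need to choose a threshold for what counts as a "high" correlation before declaring the algorithm successful.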
Project Website: http://newlearningonline.com/assess-as-you-go/
Publications:

Cope, B., Kalantzis, M., & Magee, L. (2011). Towards a Semantic Web: Connecting Knowledge in Academic Research. Cambridge, UK: Woodhead Publishing.

Kalantzis, M., & Cope, B. (2012). Literacies. Cambridge, UK: Cambridge University Press.

Cope, B., & Kalantzis, M. (2015). Assessment and pedagogy in the era of machine-mediated learning. In Education as Social Construction: Contributions to Theory, Research, and Practice (pp. 350–374).

Cope, B., & Kalantzis, M. (2017). Conceptualizing e-learning. In e-Learning Ecologies (pp. 1–45). Routledge.

Cope, B., Kalantzis, M., & Abrams, S. S. (2017). Meaning-making and learning in the era of digital text. In Remixing Multiliteracies: Theory and Practice from New London to New Times (pp. 35–49).

Kalantzis, M., & Cope, B. (2011). The Work of Writing in the Age of Its Digital Reproducibility. In S. S. Abrams & J. Rowsell (Eds.), Rethinking Identity and Literacy Education in the 21st Century (pp. 40–87). New York: Teachers College Press.

Kalantzis, M., & Cope, B. (2015). Learning and new media. In The Sage Handbook of Learning (pp. 373–387).

Kalantzis, M., & Cope, B. (2016). New Media and Productive Diversity in Learning. In Diversity in der Lehrerinnenbildung (pp. 310–325).

Smith, A., Cope, B., & Kalantzis, M. (2017). The quantified writer. In Handbook of Writing, Literacies, and Education in Digital Cultures, 63.

Smith, A., & Kennett, K. (2017). Multimodal Meaning: Discursive Dimensions of e-Learning. In e-Learning Ecologies (pp. 88–117). Routledge.
Book, edition specified
Kalantzis, M., & Cope, B. (2012). New Learning: Elements of a Science of Education (2nd ed.). Cambridge, UK: Cambridge University Press.
Journal article, monograph, or newsletter
Cope, B., & Kalantzis, M. (2013). New Media, New Learning and New Assessments. E-Learning and Digital Media, 10(4), 328–331.
Cope, B., & Kalantzis, M. (2015). Sources of evidence-of-learning: Learning and assessment in the era of big data. Open Review of Educational Research, 2(1), 194-217.
Cope, B., & Kalantzis, M. (2015). Interpreting Evidence-of-Learning: Educational research in the era of big data. Open Review of Educational Research, 2(1), 218-239.
Cope, B., & Kalantzis, M. (2016). Big data comes to school: Implications for learning, assessment, and research. AERA Open, 2(2), 2332858416641907.
Cope, B., & Kalantzis, M. (2019). Education 2.0: Artificial Intelligence and the End of the Test. Beijing International Review of Education, 1(2-3), 528-543.
Cope, B., Kalantzis, M., McCarthey, S., Vojak, C., & Kline, S. (2011). Technology-Mediated Writing Assessments: Principles and Processes. Computers and Composition, 28, 79–96.
Magnifico, A. M., Woodard, R., & McCarthey, S. (2019). Teachers as co-authors of student writing: How teachers' initiating texts influence response and revision in an online space. Computers and Composition, 52, 107-131.
Olmanson, J. D., Kennett, K. S., & Cope, B. (2015). The Techno-Pedagogical Pivot: Designing and Implementing a Digital Writing Tool. World Academy of Science, Engineering and Technology, International Journal of Social, Behavioral, Educational, Economic, Business and Industrial Engineering, 9(7), 2273–2276.
Olmanson, J., Kennett, K., Magnifico, A., McCarthey, S., Searsmith, D., Cope, B., & Kalantzis, M. (2016). Visualizing revision: Leveraging student-generated between-draft diagramming data in support of academic writing development. Technology, Knowledge and Learning, 21(1), 99-123.
Vojak, C., Kline, S., Cope, B., McCarthey, S., & Kalantzis, M. (2011). New Spaces and Old Places: An Analysis of Writing Assessment Software. Computers and Composition, 28(2), 97–111.
Montebello, M., Cope, B., Kalantzis, M., Tzirides, A. O., Haniya, S., Amina, T., ... & Chen, M. (2018, April). Critical thinking through a reflexive platform. In 2018 17th International Conference on Information Technology Based Higher Education and Training (ITHET) (pp. 1-6). IEEE.