Title: Using Video Clips of Classroom Instruction as Item Prompts to Measure Teacher Knowledge of Teaching Mathematics: Instrument Development and Validation
Principal Investigator: Kersting, Nicole
Awardee: University of Arizona
Program: Effective Instruction [Program Details]
Award Period: 3 years
Award Amount: $1,474,620
Purpose: Over the past two decades, researchers have developed teaching standards detailing what teachers should know and be able to do. However, many currently available assessments of teacher knowledge measure low-level or marginally relevant knowledge rather than teachers' deep knowledge of subject matter and actual teaching skill. In addition, many assessments of teacher knowledge have been criticized for technical shortcomings in reliability, inter-rater reliability, or validity, further underscoring the need for better instruments. To address these needs, this study proposes a novel assessment approach to measure teachers' knowledge of teaching mathematics.
Project Activities: Following up on promising pilot data, this project will develop video-analysis assessments for three pre-algebra topic areas: (1) fractions, (2) ratio and proportion, and (3) equations. The project will also examine the reliability and validity of these novel assessments. The approach draws on findings from the expert-novice literature in cognitive psychology and education and uses teachers' lesson-analysis ability as a proxy for their knowledge of teaching mathematics. Video clips of classroom instruction serve as item prompts. Each video-analysis assessment (administered online) consists of a set of video clips and an analysis task; teachers analyze each clip and record their responses in text fields.
Products: The products of this project will be an instrument assessing teachers' knowledge of teaching mathematics and published papers.
Setting: The schools are in California.
Population: Participants are 200 elementary and middle school teachers.
Intervention: The intervention consists of the video-analysis assessments described under Project Activities: online instruments for fractions, ratio and proportion, and equations that use video clips of classroom instruction as item prompts.
Research Design and Methods: The research follows a mixed-methods approach. The researchers will collect video-analysis assessment data and criterion measures of teacher knowledge from approximately 200 teachers, and records of teaching practice and student data from approximately 40 of those teachers. The project consists of two phases. Phase 1 is designed to function as a proof of concept: it covers the development of a video-analysis assessment on fraction concepts and operations, including estimates of the instrument's internal consistency and inter-rater reliability. Phase 1 also includes a comprehensive validity study drawing on three sources: (1) a criterion measure of teachers' pedagogical content knowledge on fractions; (2) student performance on a fractions assessment and student achievement data from standardized tests; and (3) a record of teaching practice on the topic of fractions, consisting of a lesson plan, a videotaped lesson, and a teacher post-interview. During Phase 2, the scale-up phase of the project, two additional video-analysis assessments will be developed: one on ratio and proportion, and one on variables, expressions, and equations. Again, the instruments' internal consistency and inter-rater reliability will be estimated. For these assessments, validity will be evaluated against criterion measures of pedagogical content knowledge for the respective topic areas.
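For open-ended responses scored by multiple raters, the inter-rater reliability estimation mentioned above could use an agreement statistic such as Cohen's kappa. A minimal sketch with hypothetical rating data (this is an illustration of the statistic, not the project's actual analysis code):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical scores of the same items."""
    n = len(rater_a)
    # Observed proportion of agreement.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement from each rater's marginal category proportions.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    # Agreement beyond chance, scaled to the maximum possible.
    return (p_o - p_e) / (1 - p_e)

# Two raters scoring four teacher responses as 0/1 (hypothetical data).
print(cohens_kappa([1, 1, 0, 1], [1, 0, 0, 1]))  # -> 0.5
```

Kappa corrects raw percent agreement for the agreement two raters would reach by chance given their marginal scoring rates, which matters when one score category dominates.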
Key Measures: Validity will be evaluated through a criterion measure of pedagogical content knowledge, records of teaching practice, and student learning outcomes.
Data Analytic Strategy: Analyses of video-analysis data, criterion measures, and student data will be quantitative, using Classical Test Theory and Item Response Theory. For the validation study, video-analysis scores will be correlated with scores on the criterion measures and used to predict teaching practice and student outcomes. Analysis of the teaching-practice records will follow a qualitative approach through the development of a coding system.
Publications:
Kersting, N. (2008). Using Video Clips of Mathematics Classroom Instruction as Item Prompts to Measure Teachers' Knowledge of Teaching Mathematics. Educational and Psychological Measurement, 68(5): 845–886.
Kersting, N.B., Givvin, K.B., Sotelo, F., and Stigler, J.W. (2010). Teachers' Analysis of Classroom Video Predicts Student Learning of Mathematics: Further Explorations of a Novel Measure of Teacher Knowledge. Journal of Teacher Education, 61(1): 172–181.
Kersting, N.B., Givvin, K.B., Thompson, B., Santagata, R., and Stigler, J. (2012). Measuring Usable Knowledge: Teachers' Analyses of Mathematics Classroom Videos Predict Teaching Quality and Student Learning. American Educational Research Journal, 49(3): 568–590.
Kersting, N.B., Sherin, B., and Stigler, J.W. (2014). Automated Scoring of Teachers' Open-Ended Responses to Video Prompts: Bringing the Classroom-Video-Analysis (CVA) Assessment to Scale. Educational and Psychological Measurement, 74(6): 950–974.
** This project was submitted to and funded under Teacher Quality: Mathematics and Science Education in FY 2006.