
Impacts of a Problem-Based Instruction Approach to Economics on High School Students

Regional need and study purpose

According to the National Council on Economic Education, 22 states require student testing in economics, 41 states require that districts implement standards in economics, and the National Assessment of Educational Progress assesses student knowledge of economics (National Council on Economic Education 2007). Required for high school graduation in California and Arizona, economics has been a focus of attention because of the opportunity to improve instruction in an often poorly taught required course. In general, high school economics courses fail to teach students about their country's economic system, the workings of world trade, and the relationships between supply and demand and consumers and producers (National Council on Economic Education 1999). In addition, most teachers are unprepared to teach economics because good instructional materials are unavailable and professional development is scant at best. Identifying a reliable and valid solution to this problem is of great value regionally and nationally.

Using a randomized controlled trial, this study assesses student-level impacts of a problem-based instruction approach to high school economics. Intended to increase class participation and content knowledge, the curriculum approach has been shown to especially benefit low-achieving students (Ravitz and Mergendoller 2005). This study targets high schools in both urban and rural areas and uses teachers who taught economics in both fall 2007 and spring 2008.

Previous research on problem-based economics curriculum indicates that the curriculum is effective with both low- and high-achieving students and that its practices are correlated with better student retention of core concepts (Ravitz and Mergendoller 2005; Moeller 2005). Evidence also suggests that a problem-based economics curriculum benefits various student subgroups (Mo and Choi 2003; Ravitz and Mergendoller 2005; Moeller 2005).

Through student and teacher background surveys, student and teacher checklists of practices used and their helpfulness, and pre-, post-, and final (delayed post) content tests, Ravitz and Mergendoller's (2005) quasi-experimental study relates teacher and student background characteristics to learning outcomes. That study, with 15 teachers and 1,162 students, finds the largest gains in learning among students who reported low prior achievement, though those reporting high prior achievement also outperformed expectations. This suggests an overall curvilinear relationship between prior achievement and learning in problem-based instruction. Problem-based practices consistent with this study's intervention were associated with long-term learning gains, while other more traditional or non-problem-based practices were associated only with short-term gains.

This study is designed to test the effect of problem-based instruction on student learning and problem-solving skills in economics. Student achievement outcomes are mediated by changes in teacher knowledge and pedagogical practice. Three research questions guided the study, one on teacher outcomes and two on student outcomes.

A problem-based approach to curriculum is frequently a component of high school reform models (Expeditionary Learning Outward Bound 1999; Honey and Henríquez 1996; Newmann and Wehlage 1995), but teachers and schools find incorporating problem-based teaching into daily classroom instruction difficult (Hendrie 2003). Teachers, social science department chairs, and school instruction leaders will be able to review the findings of the study as they evaluate their options in implementing a required component of the high school curriculum.

Intervention description

Teachers in the treatment group attended a five-day workshop in summer 2007 and received curriculum materials for problem-based economics and training in the materials. Workshop leaders—trained by the developer, the Buck Institute for Education of Novato, California—were experienced teachers who had used the problem-based economics curriculum extensively. Four hour-long phone-based coaching seminars and asynchronous email communications provided follow-up support, allowing teachers to share and refine instruction approaches and work with the developer on strategies for pacing and content delivery. Participating teachers agreed to teach core concepts in economics as identified by national economics standards, to provide information on how they covered these concepts, and to be faithful to their (treatment or control) condition.

Problem-based learning uses problem-solving rather than traditional classroom instruction to teach content knowledge and skills. Students learn by doing. The following description of the problem-based approach illustrates how it differs from the typical direct instruction approach of most economics classrooms:

Each [curriculum] unit contains seven interrelated phases: entry, problem framing, knowledge inventory, problem research and resources, problem twist, problem log, problem exit, and problem debriefing. Student groups generally move through the phases in the order indicated, but may return to a previous phase or linger for a while in a phase as they consider a particularly difficult part of the problem. The teacher takes a facilitative role, answering questions, moving groups along, monitoring positive and negative behavior, and watching for opportunities to direct students to specific resources or to provide clarifying explanations….Teachers still “teach,” but the timing and the extent of their instructional interventions differ from those used in traditional approaches. Problem-based learning teachers wait for teachable moments before intervening or providing needed content explanations, such as when students want to understand specific content or recognize that they must learn something (Mergendoller, Maxwell, and Bellisimo in press, p. 1).

Over the past 10 years staff at the Buck Institute for Education have developed and refined the problem-based economics curriculum in response to standards developed by the National Council on Economic Education. Partnering with the Centers for Economic Education, the institute disseminates the curriculum across the country, with concentrations in states that require economics in their high school coursework.

Study design

The sample includes 78 teachers and around 6,400 students (in two cohorts of 3,200) from 66 high schools in California and Arizona. Regional Educational Laboratory (REL) West contacted teachers through district curriculum leads and social studies department chairs to recruit participants. The sample includes nearly three times as many male as female teachers. Teachers in both the treatment and control groups had an average of about 14 years of teaching experience, including about 7 years teaching economics. Around 70 percent of the students are in grade 12, and 30 percent in grade 11. More than half of the sample students are non-Hispanic, with an even distribution by gender. The intervention and data collection phases, in 2007 and 2008, are complete, and analysis is under way. The findings will be disseminated in late 2009.

The study has an experimental trial design in which teachers are randomly assigned to treatment or control groups. Teachers serve as the unit of randomization, and students, the primary unit of observation, are nested within teachers. The recruitment process demanded random assignment both within and between schools. When two or more teachers in a school agreed to participate, they were randomly assigned within the school. When only one teacher in a school agreed to participate—the typical case—the teacher was randomly assigned at the school level. Thus, using a teacher-level random assignment design, the study employs the school as a blocking factor when there are two or more teacher participants per school and a constructed stratum as a blocking factor when there is one teacher participant per school.
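The two-tier blocking scheme described above can be sketched in a few lines of code. This is a minimal illustration only: the block and teacher identifiers are hypothetical, and the study's actual randomization procedure may have differed in detail.

```python
import random

def assign_within_blocks(blocks, seed=0):
    """Randomly assign teachers to treatment or control within each block.

    blocks: dict mapping a block id (a school when it has two or more
    participating teachers, or a constructed stratum for single-teacher
    schools) to a list of teacher ids.
    Returns a dict mapping teacher id -> "treatment" or "control".
    """
    rng = random.Random(seed)
    assignment = {}
    for block_id, teachers in blocks.items():
        shuffled = teachers[:]          # copy so the input is untouched
        rng.shuffle(shuffled)
        half = len(shuffled) // 2       # first half treated, rest control
        for t in shuffled[:half]:
            assignment[t] = "treatment"
        for t in shuffled[half:]:
            assignment[t] = "control"
    return assignment

# Hypothetical example: one school with two teachers, one stratum of
# two single-teacher schools.
blocks = {"school_A": ["t1", "t2"], "stratum_1": ["t3", "t4"]}
result = assign_within_blocks(blocks)
```

Blocking in this way guarantees that each school (or stratum) contributes teachers to both conditions whenever its size allows, which is why teachers in the same school can end up in different conditions.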

With this design, teachers in the same school can be assigned to either condition, so they are asked not to collaborate or share materials. Researchers designed the study anticipating between one and three economics teachers per school. Given the pedagogical changes required for full implementation, the study was conducted over one summer (2007) and two consecutive academic semesters (fall 2007 and spring 2008). Treatment teachers received the professional development intervention in the summer and additional support from the developer and master economics teachers in the following two semesters of using the new instruction approach. Data collection for teacher measures covered a full academic year, while students, in two cohorts, were followed for one academic semester, the full length of the course (table 1).

Treatment teachers received a five-day professional development program in summer 2007; control teachers received a delayed treatment the following summer (2008). The intervention served as the classroom instruction program for a one-semester economics course. In fall 2007 students in grades 11 and 12 received either the problem-based economics curriculum or the typical curriculum, as did a second cohort of students enrolled in economics in spring 2008. Students in the second cohort had the advantage of teachers who had already practiced the instruction approach.

A strength of the design is that impact estimates can be calculated for treatment teachers who implemented the curriculum more than once—in fall 2007 and again in spring 2008. But the impact estimates will reflect that the second implementation occurred with second-semester seniors, whose imminent graduation may have distracted them during the posttest administration. This could attenuate some effects of the intervention. In addition, differences in program impacts for student subgroups are examined only with exploratory analyses.

Table 1. Study characteristics and data collection schedule

| Group | Students: performance assessment | Students: Test of Economic Literacy | Students: surveys | Teachers: Test of Economic Literacy | Teachers: surveys |
|---|---|---|---|---|---|
| Fall 2007 student cohort | December | September / December | September / December | na | na |
| Spring 2008 student cohort | May | January / May | January / May | na | na |
| Teachers | na | na | na | August 2007 / May 2008 | August 2007 / May 2008 |

na is not applicable.

With 78 teachers and a minimum of 50 students per semester in each two-class section (a total of around 6,400 students, or 3,200 per cohort), the sample size is sufficient for detecting program impacts in the range of 0.19–0.22 standard deviation units on academic outcomes for students. One way to interpret the magnitude of this effect is to compare it to the progress students make during an academic year. Hill et al. (2008) report that grade 10 students' scores on norm-referenced tests increase by 0.19 standard deviation in reading and 0.14 standard deviation in math over a calendar year. For teacher outcomes the sample size is sufficient for detecting larger impacts (0.57 standard deviation), but such impacts at the more proximal teacher level would be expected to produce smaller subsequent impacts at the more distal student level.
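A back-of-the-envelope check of these minimum detectable effect sizes can be made with the standard formula for a two-level cluster-randomized design. The intraclass correlation (0.15) and covariate R-squared values (0.5) used below are illustrative assumptions, not parameters reported by the study; under them the formula lands near the reported 0.19–0.22 range.

```python
import math

def mdes(n_clusters, n_per_cluster, icc, r2_l2=0.0, r2_l1=0.0,
         multiplier=2.8, p_treat=0.5):
    """Approximate minimum detectable effect size (in standard deviation
    units) for a two-level cluster-randomized design, using the common
    multiplier of ~2.8 for 80% power at alpha = .05 (two-tailed).

    icc: intraclass correlation; r2_l2 / r2_l1: variance explained by
    covariates at the cluster (teacher) and individual (student) levels.
    """
    denom = n_clusters * p_treat * (1 - p_treat)
    var = (icc * (1 - r2_l2)) / denom \
        + ((1 - icc) * (1 - r2_l1)) / (denom * n_per_cluster)
    return multiplier * math.sqrt(var)

# 78 teachers with ~82 students each (6,400 / 78); ICC and pretest R^2
# are assumed values chosen for illustration.
print(round(mdes(78, 82, icc=0.15, r2_l2=0.5, r2_l1=0.5), 2))  # → 0.18
```

Because the teacher-level sample is only 78, the detectable effect at the teacher level (0.57 standard deviation) is much larger, which is consistent with the formula's dependence on the number of clusters rather than the number of students.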

Key outcomes and measures

Box 1 lists the key outcome variables for the study. The primary outcome for both teachers and students is gains in economics content knowledge, measured by the Test of Economic Literacy. This test, developed and refined by the National Council on Economic Education, is now in its third edition and has high reliability (Cronbach's alpha = .89). Student problem-solving skills are measured with open-response performance assessments of applied economics concepts (performance task assessments), developed by the Center for Research on Evaluation, Standards, and Student Testing at the University of California, Los Angeles. Survey data on teachers' practices and attitudes are used to assess changes in engagement with the curriculum.

Box 1. Study outcomes and their measures

Teacher outcome measures

Teacher content knowledge in economics
  • Teacher Test of Economic Literacy

Teacher pedagogical practices and satisfaction with problem-based economics
  • Survey instruments

Student outcome measures

Student content knowledge in economics
  • Student Test of Economic Literacy

Student problem-solving skills (student tests)
  • Monetary policy—federal funds (conceptual understanding)
  • Monetary policy—employment (conceptual understanding)
  • Fiscal policy (conceptual understanding)
  • Consumer demand (conceptual understanding)
  • Opportunity costs (conceptual understanding)
Data collection approach

The study includes strategies to examine teacher and student outcome and attitudinal measures, each following a specific data collection protocol to ensure that the data are not compromised.

Analysis plan

The analyses for this study will compare outcomes for treatment students and teachers with their control counterparts after completing the economics course, using conditional multilevel regression models. Additional terms will be used to account for the nesting of individuals within higher units of aggregation (see Goldstein 1987; Raudenbush and Bryk 2002; Murray 1998). A random effect for teachers is included to account for the nesting of student observations within teachers. Potential fixed effects will include treatment group, state (California or Arizona), baseline (pretest) measures of outcome variables, and other student- and teacher-level covariates. Exploratory analyses are also planned to investigate differences in problem-based economics program impacts by gender, race/ethnicity, and English language learner and non–English language learner status—with expectations of finding more pronounced positive impacts on students who traditionally exhibit lower levels of academic achievement.
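The conditional multilevel model described above can be written in standard hierarchical linear model notation. This is a sketch: the exact covariate set (here a pretest and a state indicator) is an assumption based on the fixed effects named in the text.

```latex
% Level 1 (student i nested within teacher j)
Y_{ij} = \beta_{0j} + \beta_{1}\,\mathrm{Pretest}_{ij} + \varepsilon_{ij},
\qquad \varepsilon_{ij} \sim N(0, \sigma^{2})

% Level 2 (teacher j, the unit of random assignment)
\beta_{0j} = \gamma_{00} + \gamma_{01}\,T_{j} + \gamma_{02}\,\mathrm{State}_{j} + u_{0j},
\qquad u_{0j} \sim N(0, \tau^{2})
```

Here $T_{j}$ is the treatment indicator, $u_{0j}$ is the teacher random effect that accounts for the nesting of students within teachers, and $\gamma_{01}$ is the program impact estimate of interest.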

The procedures described by Schochet (2008) will be used to account for multiple hypothesis tests involving the numerous outcome variables assessed in the study. Within each of the three outcome domains—teacher content knowledge in economics, student content knowledge in economics, and student problem-solving skills—confirmatory impact analyses will apply multiple comparison procedures to adjust for errors that can arise from testing multiple hypotheses.

Contact information

Dr. Neal Finkelstein
Regional Educational Laboratory West
730 Harrison Street
San Francisco, CA 94107-1242
Voice: (415) 615-3171
Fax: (415) 565-3012
Email: nfinkel@wested.org

Region: West

References

Expeditionary Learning Outward Bound. (1999). Early indicators from schools implementing New American Schools designs. Cambridge, MA: Expeditionary Learning Outward Bound.

Goldstein, H. (1987). Multilevel models in educational and social research. London: Oxford University Press.

Hendrie, C. (2003, April 23). Small schools hard to start, report finds. Education Week, 22 (32). Retrieved April 17, 2009, from http://www.edweek.org/ew/articles/2003/04/23/32gates.h22.html.

Hill, C.J., Bloom, H.S., Black, A.R., and Lipsey, M.W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2 (3), 172–77.

Honey, M., and Henríquez, A. (1996, April). Union City interactive multimedia education trial: 1993–95 summary report (CCT Reports No. 3). Retrieved April 17, 2009, from http://www.edc.org/CCT/ccthome/tech_rept/CCTR3/.

Mergendoller, J., Maxwell, N., and Bellisimo, Y. (In press). The effectiveness of problem-based instruction: A comparative study of instructional methods and student characteristics. Interdisciplinary Journal of Problem Based Learning. Retrieved April 17, 2009, from http://www.bie.org/files/IJPBL%20PBE%20PaperFINAL-single%20spaced.pdf.

Mo, K., and Choi, Y. (2003). Comparing problem-based learning with traditional instruction: Focus on high school economics. Theory and Research in Citizenship Education, 35 (1), 89–113. English abstract available from http://www.bie.org/research/pbss/econ/summary.php?id=39.

Moeller, B. (2005). Understanding the implementation of problem-based learning in New York City high school economics classrooms. In J. Ravitz (Chair), Assessing Implementation and Impacts of PBL in Diverse K–12 Classrooms. Montreal, Canada: Buck Institute for Education. Retrieved April 17, 2009, from http://www.bie.org/AERA2005/Moeller_Paper.pdf.

Murray, D.M. (1998). Design and analysis of group randomized trials. New York: Oxford University Press.

National Council on Economic Education. (1999). Standards in economics: Survey of students and the public. New York: National Council on Economic Education. Retrieved April 17, 2009, from http://ncee.net/cel/results.php.

National Council on Economic Education. (2003). Survey of the states: Economic and personal finance education in our nation's schools in 2002. New York: National Council on Economic Education. Retrieved April 17, 2009, from http://www.ncee.net/about/survey2002/.

National Council on Economic Education. (2007). Survey of the states: Economic and personal finance education in our nation's schools in 2007. New York: National Council on Economic Education. Retrieved April 17, 2009, from http://www.councilforeconed.org/about/survey2007/NCEESurvey2007.pdf.

Newmann, F., and Wehlage, G. (1995). Successful school restructuring: A report to the public and educators by the Center on Organization and Restructuring of Schools. Madison, WI: Wisconsin Center for Education Research.

Raudenbush, S.W., and Bryk, A.S. (2002). Hierarchical linear models: Applications and data analysis methods. Thousand Oaks, CA: Sage Publications.

Ravitz, J., and Mergendoller, J. (2005). Evaluating implementation and impacts of problem based economics in U.S. high schools. In Ravitz, J. (Chair), Assessing Implementation and Impacts of PBL in diverse K–12 classrooms. Montreal, Canada: Buck Institute for Education. Retrieved April 17, 2009, from http://www.bie.org/AERA2005/Ravitz_Mergendoller.pdf.

Schochet, P.Z. (2008). Guidelines for multiple testing in impact evaluations of educational interventions (MPR Reference No. 6300-080). Princeton, NJ: Mathematica Policy Research, Inc.
