
Supporting Districts’ Efforts to Evaluate Professional Development

Candice Bocala
Senior Research Associate, REL Northeast & Islands

June 1, 2019

Schools and districts across the nation invest considerable resources supporting teachers in their work. Researchers estimate that most urban districts spend between $6,000 and $8,000 per teacher each year on professional development (PD), but suggest that these costs are often underestimated.1 Another study found that the average annual PD expenditure per teacher in the study districts was $18,000.2

Yet district officials often find it difficult to evaluate these PD efforts effectively and rigorously.3 Poorly designed evaluations prevent researchers and practitioners from drawing strong conclusions about the effectiveness of PD programs.4 District leaders need help determining what to document and measure at different phases of a PD initiative, from the planning stages to the delivery of new content to the collection of teacher and student outcomes.5 At the same time, under the Every Student Succeeds Act, districts must demonstrate how their professional learning strategies strengthen educator effectiveness.6 As a result, district leaders are placing new emphasis on program evaluation as they consider their future investments in PD.7

To support districts in our region with these efforts, the Professional Learning and Development Research Alliance at REL Northeast & Islands has developed two projects designed to build district leaders' capacity to evaluate their own teacher PD efforts, whether they hire an external evaluator or conduct PD evaluations internally.

The first project was a three-part webinar training series designed to help participants plan for and execute high-quality evaluation of educator PD. My REL colleague Katrina Bledsoe and I facilitated the webinars, which explored such topics as using logic models to plan an evaluation, developing evaluation questions, selecting evaluation designs, collecting and analyzing data, ensuring data quality, and reporting and using evaluation results. We also discussed some of the limitations districts face in evaluating PD in their local context and approaches they can take that are both reasonable and rigorous.

Recordings of these webinars can be found at the following links:

After the webinar series, alliance members expressed a continuing need for practitioner-friendly resources to support PD evaluation. They requested a resource that could help practitioners apply content from the webinar series and introduce new material on measuring the fidelity of implementation of a PD program. That request led to the second project, currently under development: a practitioner-friendly brief that will provide an overview of PD evaluation and explain how to understand and assess fidelity of implementation. The brief will use a case study of a hypothetical PD program to illustrate the evaluation process. It will also explain common evaluation tools, such as logic models, and offer suggestions for effective data collection and analysis.

Two alliance members, Gladys Cruz of New York and Deborah Richards of Connecticut, have told us how important this resource will be to their work. Cruz is superintendent of the Questar III BOCES, or Board of Cooperative Educational Services, which serves Columbia, Greene, and Rensselaer counties in upstate New York, and Richards is director of student services at the Capitol Region Education Council (CREC) in Hartford, Connecticut. The BOCES and CREC are education service agencies that provide PD supports to schools and districts in their regions.

“In this time of limited resources, it is imperative that we are making the best decisions possible with our limited professional development dollars,” Richards explained. “New, cost-effective resources to assist us in this process will be a welcome addition to our evaluation toolkit and strategic planning process.”

Learn more about REL Northeast & Islands’ research alliances and partnerships.


1Sawchuk, S. (2010, November 10). Full cost of professional development. Education Week. Retrieved November 26, 2018, from https://www.edweek.org/ew/articles/2010/11/10/11pd_costs.h30.html.

2TNTP. (2015). The mirage: Confronting the hard truth about our quest for teacher development. Brooklyn, NY: Author. Retrieved from http://tntp.org/assets/documents/TNTP-Mirage_2015.pdf.

3Haslam, M. (2010). Teacher professional development evaluation guide. Oxford, OH: National Staff Development Council/Learning Forward. Retrieved November 26, 2018, from https://learningforward.org/docs/pdf/evaluationguide.pdf; and Killion, J. (2017). Assessing impact: Evaluating staff development (3rd ed.). Thousand Oaks, CA: Corwin Press.

4Yoon, K. S., Duncan, T., Lee, S. W.-Y., Scarloss, B., & Shapley, K. (2007). Reviewing the evidence on how teacher professional development affects student achievement (Issues & Answers Report, REL 2007–No. 033). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southwest. Retrieved from http://ies.ed.gov/ncee/edlabs.

5Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin Press.

6Every Student Succeeds Act of 2015, Pub. L. No. 114-95, 129 Stat. 1802 (2015).

7Learning Forward & EducationCounsel. (2017). A new vision for professional learning: A toolkit to help states use ESSA to advance learning and improvement systems. Oxford, OH: Author. Retrieved November 26, 2018, from https://learningforward.org/docs/default-source/getinvolved/essa/essanewvisiontoolkit.