IES Grant

Title: Improving Methods for Policy Impact Evaluation with Group Panel Data in Education Research
Center: NCER
Year: 2020
Principal Investigator: Feller, Avi
Awardee: University of California, Berkeley
Program: Statistical and Research Methodology in Education      [Program Details]
Award Period: 3 years (07/01/2020 – 06/30/2023)
Award Amount: $896,026
Type: Methodological Innovation
Award Number: R305D200010
Description:

Co-Principal Investigators: Miratrix, Luke; Rothstein, Jesse

Purpose: When a randomized controlled trial is infeasible in an education setting, researchers can turn to quasi-experimental research designs. A common approach uses repeated observations of aggregate data, known as group panel data, collected before and after a new policy or intervention takes effect. To estimate effects, researchers typically rely on either a comparative interrupted time series (CITS) design or, increasingly, the synthetic control method (SCM), but there is no clear set of best practices for implementing these designs or for analyzing the resulting data. The purpose of this grant is to develop such guidance and to create a new estimation approach that combines CITS and SCM.
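For illustration, the sketch below shows the basic idea behind an SCM-style estimate on group panel data: choose non-negative weights on untreated groups that reproduce the treated group's pre-policy trajectory, then take the post-policy gap between the treated group and its weighted "synthetic" comparison as the effect estimate. This is a minimal illustration on simulated data, not the project's software or proposed estimator; the data-generating setup, variable names, and true effect size are assumptions made for the example.

```python
# Minimal SCM sketch on simulated group panel data (illustrative only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated panel: 20 untreated groups and 1 treated group,
# observed for 10 pre-policy and 5 post-policy periods.
n_controls, n_pre, n_post = 20, 10, 5
control = rng.normal(50, 5, (n_controls, n_pre + n_post))        # untreated outcomes
treated = control[:5].mean(axis=0) + rng.normal(0, 1, n_pre + n_post)
treated[n_pre:] += 3.0                                           # assumed true policy effect

# SCM weights: non-negative, sum to 1, chosen to match the treated
# group's pre-policy trajectory as closely as possible.
def pre_period_loss(w):
    return np.sum((treated[:n_pre] - w @ control[:, :n_pre]) ** 2)

res = minimize(
    pre_period_loss,
    x0=np.full(n_controls, 1 / n_controls),
    bounds=[(0, 1)] * n_controls,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}],
)
weights = res.x

# Effect estimate: treated minus synthetic control, averaged over post periods.
synthetic = weights @ control
effect = (treated[n_pre:] - synthetic[n_pre:]).mean()
print(f"Estimated post-policy effect: {effect:.2f}")
```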

Project Activities: The research team will conduct simulation studies and multiple within-study comparisons using real data. They will then work with applied researchers to develop clear study guidelines and reporting standards for CITS, SCM, and the newly developed combination of the two. The research team will also create user-friendly software for analyzing data from these designs. They will disseminate the guidelines, reporting standards, and software through seminars, short courses, conference presentations, and peer-reviewed journal articles.

Products and Publications

Arbour, D., Ben-Michael, E., Feller, A., Franks, A., & Raphael, S. (2021). Using Multitask Gaussian Processes to estimate the effect of a targeted effort to remove firearms. arXiv preprint arXiv:2110.07006.

Armstrong, T. B., Kline, P., & Sun, L. (2023). Adapting to Misspecification. arXiv preprint arXiv:2305.14265.

Ben-Michael, E., Arbour, D., Feller, A., Franks, A., & Raphael, S. (2023). Estimating the effects of a California gun control program with multitask Gaussian processes. The Annals of Applied Statistics, 17(2), 985–1016.

Ben-Michael, E., Feller, A., & Hartman, E. (2021). Multilevel calibration weighting for survey data. Political Analysis, 1–19.

Ben-Michael, E., Feller, A., & Rothstein, J. (2023). Varying impacts of letters of recommendation on college admissions (No. w30940). National Bureau of Economic Research.

Ben-Michael, E., Feller, A., & Rothstein, J. (2020). Varying impacts of letters of recommendation on college admissions: Approximate balancing weights for subgroup effects in observational studies. arXiv preprint arXiv:2008.04394.

Ben-Michael, E., Feller, A., & Rothstein, J. (2022). Synthetic controls with staggered adoption. Journal of the Royal Statistical Society Series B: Statistical Methodology, 84(2), 351–381.

Ben-Michael, E., Feller, A., & Rothstein, J. (2021). The augmented synthetic control method. Journal of the American Statistical Association, 116(536), 1789–1803.

Ben-Michael, E., Feller, A., & Stuart, E. A. (2021). A trial emulation approach for policy evaluations with group-level longitudinal data. Epidemiology (Cambridge, Mass.), 32(4), 533.

Bruns-Smith, D., Dukes, O., Feller, A., & Ogburn, E. L. (2023). Augmented balancing weights as linear regression. arXiv preprint arXiv:2304.14545.

Feller, A., & Stuart, E. A. (2021). Challenges with evaluating education policy using panel data during and after the COVID-19 pandemic. Journal of Research on Educational Effectiveness, 14(3), 668–675.

Ham, D. W., & Miratrix, L. (2022). Benefits and costs of matching prior to a difference in differences analysis when parallel trends does not hold. arXiv preprint arXiv:2205.08644.

Kim, E. J., & Miratrix, L. W. (2023). The Causal Impact of Charter Schools on Private Tutoring Prevalence. (EdWorkingPaper: 23-756). Retrieved from Annenberg Institute at Brown University: https://doi.org/10.26300/qs5q-ga02

Lu, B., Ben-Michael, E., Feller, A., & Miratrix, L. (2023). Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation. Journal of Educational and Behavioral Statistics, 10769986231155427.

Lu, B., Ben-Michael, E., Feller, A., & Miratrix, L. (2021). Is it who you are or where you are? Accounting for compositional differences in cross-site treatment variation. arXiv preprint arXiv:2103.14765.

Miratrix, L. W. (2022). Using simulation to analyze interrupted time series designs. Evaluation Review, 46(6), 750–778.

Rothstein, J., & Whitmore Schanzenbach, D. (2022). Does Money Still Matter? Attainment and Earnings Effects of Post-1990 School Finance Reforms. Journal of Labor Economics, 40(suppl 1).

Singh, R., & Sun, L. (2019). Double robustness for complier parameters and a semiparametric test for complier characteristics. arXiv preprint arXiv:1909.05244.

Soriano, D., Ben-Michael, E., Bickel, P. J., Feller, A., & Pimentel, S. D. (2023). Interpretable sensitivity analysis for balancing weights. Journal of the Royal Statistical Society Series A: Statistics in Society.

Soriano, D., Ben-Michael, E., Bickel, P. J., Feller, A., & Pimentel, S. D. (2021). Interpretable sensitivity analysis for balancing weights. arXiv preprint arXiv:2102.13218.

Sun, L., & Shapiro, J. M. (2022). A linear panel model with heterogeneous coefficients and variation in exposure. Journal of Economic Perspectives, 36(4), 193–204.
