Products and publications
Journal article, monograph, or newsletter
Carnegie, N.B., Harada, M., and Hill, J.L. (2016). Assessing Sensitivity to Unmeasured Confounding Using a Simulated Potential Confounder. Journal of Research on Educational Effectiveness, 9(3), 395-420.
Dorie, V., Harada, M., Carnegie, N.B., and Hill, J. (2016). A Flexible, Interpretable Framework for Assessing Sensitivity to Unmeasured Confounding. Statistics in Medicine, 35(20), 3453-3470.
Dorie, V., Hill, J., Shalit, U., Scott, M., and Cervone, D. (2017). Automated Versus Do-It-Yourself Methods for Causal Inference: Lessons Learned From a Data Analysis Competition. arXiv preprint arXiv:1707.02641.
Hill, J., & Hoggatt, K. J. (2018). The Tenability of Counterhypotheses: A comment on Bross' discussion of statistical criticism. Observational Studies, 4(2), 34-41.
Hill, J., and Su, Y.-S. (2013). Assessing Lack of Common Support in Causal Inference Using Bayesian Nonparametrics: Implications for Evaluating the Effect of Breastfeeding on Children's Cognitive Outcomes. Annals of Applied Statistics, 7(3): 1386-1420.
Hill, J., Linero, A., & Murray, J. (2020). Bayesian additive regression trees: A review and look forward. Annual Review of Statistics and Its Application, 7, 251-278.
Kern, H.L., Stuart, E.A., Hill, J., and Green, D.P. (2016). Assessing Methods for Generalizing Experimental Impact Estimates to Target Populations. Journal of Research on Educational Effectiveness, 9(1), 103-127.
Middleton, J.A., Scott, M.A., Diakow, R., and Hill, J.L. (2016). Bias Amplification and Bias Unmasking. Political Analysis, 24(3), 307-323.
Scott, M.A., Diakow, R., Hill, J.L., and Middleton, J.A. (2018). Potential for Bias Inflation With Grouped Data: A Comparison of Estimators and a Sensitivity Analysis Strategy. Observational Studies, 4(1), 111-149.
Supplemental information
Co-Principal Investigator: Scott, Marc
Sensitivity analysis is an umbrella term for methods that assess how much inferences would change under alterations to structural or parametric assumptions. These strategies allow researchers to quantify the uncertainty surrounding both the assumptions used and the resulting inferences. Even when an analysis relies on assumptions that are untestable or difficult to assess, information is still available, both in the data and in substantive knowledge of the field, that can help gauge how reasonable those assumptions are and estimate how inferences might change if they were violated in specific ways.
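To make the idea concrete, the following minimal Python sketch illustrates the general logic of such a sensitivity analysis; it is not the project's own method or software. It uses the standard linear omitted-variable-bias adjustment, and all data-generating values and sensitivity parameters (zeta_y, delta) are invented for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Simulated example: u confounds treatment z and outcome y; x is observed.
x = rng.normal(size=n)
u = rng.normal(size=n)                                   # unmeasured in the analysis below
z = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 * x + 1.0 * u))))
y = 2.0 * z + 1.0 * x + 1.5 * u + rng.normal(size=n)     # true treatment effect = 2.0

# Naive estimate adjusts for x only, so the omitted u biases the z coefficient.
fit = sm.OLS(y, sm.add_constant(np.column_stack([z, x]))).fit()
tau_naive = fit.params[1]
print(f"naive estimate: {tau_naive:.2f} (true effect is 2.0)")

# Sensitivity adjustment via the linear omitted-variable-bias formula:
# a confounder with outcome coefficient zeta_y and treated-minus-control
# imbalance delta (given x) biases the naive estimate by about zeta_y * delta.
for zeta_y, delta in [(0.5, 0.2), (1.0, 0.5), (1.5, 0.8)]:
    tau_adj = tau_naive - zeta_y * delta
    print(f"assumed zeta_y={zeta_y:.1f}, delta={delta:.1f} -> adjusted estimate {tau_adj:.2f}")
```

Varying the assumed strength of the hypothetical confounder and watching how the estimate moves is the basic exercise; the project's publications above develop richer, more flexible versions of this idea.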
In addition, the project developed practical guidelines for using sensitivity analyses in applied settings by testing the performance of competing methods in empirical settings with known answers and by identifying better benchmarks for the plausibility of sensitivity-analysis parameters. The project also developed user-friendly software implementing these strategies to make them accessible to education researchers, along with procedures for representing results graphically to facilitate interpretation, which will be built into the software.
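As a rough illustration of the kind of graphical summary described above (again a sketch, not the project's software), the Python snippet below sweeps a grid of hypothetical sensitivity parameters, contours the adjusted estimate, and marks the curves along which the effect would lose statistical significance or vanish entirely. The naive estimate, standard error, and parameter ranges are invented; it reuses the omitted-variable-bias adjustment from the sketch above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical inputs: a naive estimate and its standard error (e.g. from a
# fit like the one above), plus ranges for the assumed confounder strengths.
tau_naive, se = 1.2, 0.15
zeta_y = np.linspace(0.0, 2.0, 101)   # confounder-outcome association
delta = np.linspace(0.0, 1.0, 101)    # treated-control imbalance in the confounder
ZY, D = np.meshgrid(zeta_y, delta)

# Adjusted estimate under each pair of sensitivity parameters.
tau_adj = tau_naive - ZY * D

fig, ax = plt.subplots(figsize=(6, 4))
cs = ax.contour(ZY, D, tau_adj, levels=10, colors="gray")
ax.clabel(cs, inline=True, fontsize=8)
# Curves along which the effect would lose significance (~1.96 * se) or vanish.
ax.contour(ZY, D, tau_adj, levels=[1.96 * se], colors="orange", linestyles="dashed")
ax.contour(ZY, D, tau_adj, levels=[0.0], colors="red", linewidths=2)
ax.set_xlabel("confounder-outcome association (zeta_y)")
ax.set_ylabel("treated-control imbalance in confounder (delta)")
ax.set_title("Adjusted treatment effect across sensitivity parameters")
plt.tight_layout()
plt.show()
```

A plot of this kind lets a reader see at a glance how strong an unmeasured confounder would have to be before the study's conclusions change.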
Questions about this project?
For additional questions about this project, or to provide feedback, please contact the program officer.