Adolescents with disabilities face a number of obstacles on their path to adulthood. These youth are more likely to drop out of high school and less likely to make successful transitions to employment and postsecondary education. NCSER recently joined with the Department of Education's National Institute on Disability and Rehabilitation Research (NIDRR) and the Office of Special Education Programs (OSEP) to convene a panel of prominent practitioners and scholars to learn about their experience working with, and conducting research on, this vulnerable group.
The participants possessed a broad range of knowledge across types of disabilities, aspects of adolescent development (e.g., motivation), work in school, community, and employment settings, and research methodologies appropriate for studying adolescents in transition to adulthood. The panel focused on understanding the issues at a systems level, looking at the essential components of a school (or district) model program that would promote positive outcomes for all youth.
Many promising ideas were generated that will inform future research efforts.
In November 2012, NCSER released a report to help researchers translate effect size statistics into more interpretable forms that are helpful to practitioners, policymakers, and researchers. Specifically aimed at researchers who conduct and report education intervention studies, Translating the Statistical Representation of the Effects of Education Interventions into More Readily Interpretable Forms provides researchers with suggestions on ways to include measures of practical effects in the analysis of their research findings.
When researchers describe the findings of their studies on the effects of educational interventions, they report on whether there were statistically significant differences between the intervention group and the control group on the outcomes of interest (for example, academic achievement). Statistical significance, however, is not generally understood by those who have not been trained in conducting research. In addition, as the report's lead author Mark Lipsey of Vanderbilt University explains, "Practical and statistical significance are not the same thing," and statistical significance provides "little insight into the practical magnitude of the effect. For example, even if the results indicate a statistically significant effect of an intervention, is this effect meaningful in terms of real student learning or achievement, or is there only a trivial impact?"
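Lipsey's point can be illustrated with a small calculation. The sketch below (not taken from the report; the numbers are hypothetical) computes Cohen's d and a large-sample two-sided p-value for a study comparing two groups, showing that with enough students an intervention effect can be statistically significant even when the standardized effect is trivially small:

```python
import math

def two_sample_z(mean_a, mean_b, pooled_sd, n_per_group):
    """Return Cohen's d and a large-sample two-sided p-value
    for the difference between two group means."""
    d = (mean_a - mean_b) / pooled_sd          # standardized mean difference
    z = d * math.sqrt(n_per_group / 2)         # z-statistic for equal-sized groups
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p
    return d, p

# Hypothetical example: a 0.5-point gain on a test with SD 10 (d = 0.05),
# measured with 5,000 students per group.
d, p = two_sample_z(100.5, 100.0, 10.0, 5000)
print(f"d = {d:.2f}, p = {p:.4f}")  # d = 0.05, p = 0.0124
```

Here the result clears the conventional p < 0.05 bar, yet an effect of d = 0.05 would generally be considered negligible in practical terms.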
The report explains ways in which researchers can help show the practical significance of their findings to a wide audience by providing appropriate effect size statistics and describing results in terms of benchmarks or units that are more easily understood and directly relevant to the intended outcomes of an educational intervention. Even researchers who do regularly include measures of practical significance in their scientific papers may learn new ways of expressing their ideas.
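One widely used translation of this kind (a common convention, not necessarily one of the report's specific recommendations) expresses an effect size as a percentile gain: the percentile rank that the average intervention student would reach in the control group's distribution, minus 50. A minimal sketch, assuming a normally distributed outcome:

```python
import math

def normal_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def improvement_index(d):
    """Percentile-point gain of the average intervention student
    relative to the control distribution (Cohen's U3 minus 50)."""
    return 100 * (normal_cdf(d) - 0.5)

# An effect size of d = 0.25 moves the average student from the 50th
# to roughly the 60th percentile of the control distribution:
print(f"{improvement_index(0.25):.1f} percentile points")  # 9.9 percentile points
```

Statements like "a gain of about 10 percentile points" are typically far more meaningful to practitioners and policymakers than the raw effect size statistic alone.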
Lipsey and colleagues hope this report encourages investigators of educational interventions to routinely make these translations when they write their results for dissemination. "We hope the suggestions provided in this report will encourage researchers to take that extra step," said Lipsey. "The more interpretable our research results are, the more informative they are for researchers and non-researchers alike."