IES Blog

Institute of Education Sciences

Improving the WWC Standards and Procedures

By Chris Weiss and Jon Jacobson

Standards and procedures are the foundation of the What Works Clearinghouse's (WWC) work to provide scientific evidence for what works in education. They guide how studies are selected for review, what elements of an effectiveness study are examined, and how systematic reviews are conducted. The WWC's standards and procedures are designed to be rigorous and reflective of best practices in research and statistics, while also being aspirational, pointing the field of education effectiveness research toward an ever-higher quality of study design and analysis.

To keep pace with new advances in methodological research and provide necessary clarifications for both education researchers and decision makers, the WWC regularly updates its procedures and standards and shares them with the field. We recently released Version 4.0 of the Procedures and Standards Handbooks, which describes the five steps of the WWC’s systematic review process.

For this newest version, we have divided information into two separate documents (see graphic below).  The Procedures Handbook describes how the WWC decides which studies to review and how it reports on study findings. The Standards Handbook describes how the WWC rates the evidence from studies.

The new Standards Handbook includes several improvements, including updated and overhauled standards for cluster-level assignment of students; a new approach for reviewing studies that have some missing baseline or outcome data; and revised standards for regression discontinuity designs. The new Procedures Handbook includes a revised discussion of how the WWC defines a study.  All of the changes are summarized on the WWC website (PDF).

Making the Revisions

These updates were developed in a careful, collaborative manner that included experts in the field, external peer review, and input from the public.

Staff from the Institute of Education Sciences oversaw the process with the WWC’s Statistical, Technical, and Analysis Team (STAT), a panel of highly experienced researchers who revise and develop the WWC standards. In addition, the WWC sought and received input from experts on specific research topics, including regression discontinuity designs, cluster-level assignment, missing data, and complier average causal effects. Based on this information, drafts of the standards and procedures handbooks were developed.

External peer reviewers then provided input that led to additional revisions and, in the summer, the WWC posted drafts and gathered feedback from the public. The WWC’s response to some of the comments is available on its website (PDF).   

Version 4.0 of the Handbooks was released on October 26. This update focused on a few key areas of the standards, and updated and clarified some procedures. However, the WWC strives for continuous improvement, and as the field of education research continues to evolve, we expect that new techniques and tools will be incorporated into future versions of the Handbooks.

Your thoughts, ideas, and suggestions are welcome and can be submitted through the WWC help desk.

Expanding Student Success Rates to Reflect Today’s College Students

By Gigi Jones

Since the 1990s, the Integrated Postsecondary Education Data System (IPEDS) has collected and published graduation rates for colleges and universities around the country. These rates were based on traditional college students—first-time, full-time degree- or certificate-seeking undergraduate students (FTFT) who, generally, enrolled right after high school.

While these data are insightful, some have argued that the FTFT graduation rate provides only part of the picture because it doesn't consider non-traditional students, including part-time students and transfers. This is an important point because, over the past decade, growth in the number of non-traditional students has outpaced that of traditional students, driven mostly by growth among those who have transferred schools.

The new IPEDS Outcome Measures survey was designed to help address these concerns. Starting with the 2015-16 collection cycle, entering students at more than 4,000 degree-granting institutions must be reported in one of four buckets, also called cohorts (see Figure below).

The FTFT cohort is similar to what has been collected since the 1990s, but the Outcome Measures adds three new student groups to the equation: 

  • First-time, part-time students (FTPT), who take less than a full-time credit workload each term (typically fewer than 12 credits) and who have no prior postsecondary attendance; 
  • Non-first-time students, also known as transfer-in students, who are enrolled at a full-time level (NFTFT); and
  • Non-first-time students, also known as transfer-in students, who are enrolled at a part-time level (NFTPT).

For these four cohorts, postsecondary institutions report the awards conferred at two points in time after the students entered the institution: 6 years and 8 years. If students did not receive an award, then institutions must report their enrollment status in one of three ways: 1) Still enrolled at the same institution; 2) Transferred out of the institution; or 3) Enrollment status is unknown.
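The cohort and status-reporting rules above can be sketched as a small classification function. This is purely an illustration of the logic described in this post; the function names, field names, and status labels are assumptions, not official IPEDS reporting specifications:

```python
def ipeds_cohort(first_time: bool, full_time: bool) -> str:
    """Assign an entering student to one of the four Outcome Measures cohorts."""
    if first_time:
        return "FTFT" if full_time else "FTPT"
    return "NFTFT" if full_time else "NFTPT"

def outcome_status(award_year, years_since_entry, enrollment):
    """Report status at a checkpoint (6 or 8 years after entry).

    award_year: years after entry when an award was conferred, or None.
    enrollment: one of 'still_enrolled', 'transferred_out', 'unknown',
    used only when no award has been conferred by the checkpoint.
    """
    if award_year is not None and award_year <= years_since_entry:
        return "award_conferred"
    return enrollment

# Example: a part-time transfer student with no award who is still enrolled
print(ipeds_cohort(first_time=False, full_time=False))   # NFTPT
print(outcome_status(None, 8, "still_enrolled"))         # still_enrolled
```

The two dimensions (first-time vs. non-first-time, full-time vs. part-time) yield the four cohorts, and the same status logic is applied at each checkpoint.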

These changes help respond to those who feel that the FTFT graduation rates do not reflect the larger student population, particularly at public 2-year colleges, which serve a larger non-traditional college student population. Since 2008, steps have been taken to construct and refine the data collection on non-traditional college students through a committee of higher education experts (PDF) and public Technical Review Panels (see summaries for panels 37, 40, and 45).

The 2016-17 preliminary Outcome Measures data were released on October 12, as part of a larger report on IPEDS winter data collection. The data for individual schools can be found on our College Navigator site.  The final data for 2015-16 will be released in early 2018. Sign up for the IES News Flash to be notified when the new data are released or follow IPEDS on Twitter. You can also visit the IPEDS Outcome Measures website for more information. 

While this is an important step, we are continuing to improve the data collection process. Starting with the 2017-18 Outcome Measures collection, the survey includes more student groups (e.g., Pell Grant vs. non-Pell Grant recipients), a third award status point (4 years after entry), and the identification of the type of award conferred (certificate, Associate's, or Bachelor's). Watch for the release of these data in fall 2018.

EDITOR'S NOTE: This post was updated on October 12 to reflect the release of Outcome Measures data.

Why Can’t You Just Use Google Instead of ERIC?

By Erin Pollard, ERIC Program Officer

The Education Resources Information Center (ERIC) provides the public with free, online access to a scholarly database of education research. We are frequently asked why the government sponsors such a tool when people can use Google or a subscription-based scholarly database.  

Commercial search engines and scholarly databases are important, but they would not function as efficiently without ERIC's metadata to power their searches. Because of the costs associated with indexing, commercial and scholarly search engines would likely prioritize work from major publishers and may not index work from small publishers on a regular basis.

But ERIC has built national and global relationships with key publishers, research centers, government entities, universities, education associations, and other organizations to disseminate their materials. We are currently under agreement with 1,020 different publishers, many of whom are small and only publish a single journal or report series.

For more than 50 years, ERIC has been acquiring grey literature (e.g., reports from the Institute of Education Sciences (IES) and other government reports, white papers, and conference papers) and making it centrally available to the public free of charge. Therefore, an ERIC user is just as likely to find a relevant conference paper from a smaller publisher as a journal article from a major publisher. (See the infographic (PDF) above to learn more about who uses ERIC.)

ERIC also ensures that all records meet a set of quality guidelines before they are indexed, and provides tools, such as a peer-review flag, that help users evaluate the quality of the material. Underlying all of ERIC's records is a set of metadata that helps guide users to the resources they are seeking. The metadata also include descriptors from ERIC's Thesaurus, a widely recognized, controlled vocabulary of subject-specific tags in the education field. Descriptors are added to each record and used by search engines to pinpoint results.

Lastly, and most importantly, ERIC provides access to more than 380,000 full-text resources, including journal articles and grey literature, and makes these articles available in perpetuity. ERIC has been around for more than 50 years and has collected materials in hard copy, microfiche, and PDF. These materials remain publicly available even after organizations or journals cease operations or redesign their websites in ways that make materials inaccessible. In any given month, over 25% of ERIC's new records are peer reviewed and provide free full text. Additionally, about 4% of journals provide peer-reviewed full text after an embargo. This includes work from IES grantees that normally appears in journals behind a paywall, but that ERIC can make available through the IES Public Access Policy.

ERIC's comprehensive collection, metadata, and access to full-text articles make it an important resource for researchers, students, educators, policy makers, and the general public.

Want to learn more about ERIC? Watch this short video introduction or check out our multimedia page for access to other videos, infographics, and webinars.  

Recognizing Our Outstanding IES Fellows

Each year, the Institute of Education Sciences recognizes some of its fellows for their academic accomplishments and contributions to education research. This year, IES has selected Rachel Baker as the 2016 Outstanding Fellow from its Predoctoral Interdisciplinary Research Training Programs in the Education Sciences.

Dr. Baker (pictured right) received her doctorate in Higher Education Policy and the Economics of Education from the Stanford Graduate School of Education. She is currently an Assistant Professor of Educational Policy at the University of California, Irvine, where she studies inequalities in postsecondary access and success using behavioral economic models of decision making and quasi-experimental and experimental methods. Dr. Baker will receive her award and present her research at the annual IES Principal Investigators meeting in Washington, D.C. in January 2018.

For the first time this year, IES is also recognizing two finalists for the outstanding fellow award, Dr. Elizabeth Tighe and Dr. Karrie E. Godwin.

Dr. Tighe received her doctorate in Cognitive Psychology from Florida State University. She is currently an Assistant Professor of Educational Psychology at Georgia State University where she focuses on advancing our understanding of the literacy skills and instructional needs of struggling adult readers who attend Adult Basic and Secondary Education programs.

Dr. Godwin received her doctorate in Developmental Psychology from Carnegie Mellon University. She is currently an Assistant Professor of Educational Psychology in the School of Lifespan Development and Educational Sciences at Kent State University. Her research examines how cognitive and environmental factors shape children’s development and learning in the laboratory and in the classroom.

We asked all three awardees how participating in an IES predoctoral training program helped their development as researchers.  For more information about the IES predoctoral training program, visit our website.

Rachel Baker, Fellow in Stanford University Predoctoral Training Program in Quantitative Education Policy Analysis

With full acknowledgement that it is impossible to know for certain how my development as a researcher has been shaped by participating in the IES pre-doctoral fellowship (where’s the counterfactual?), I can point to three factors that I think have been critical:  (1) the community of Stanford’s IES pre-doctoral fellows and  associated faculty, (2) the tightly structured curriculum and frequent opportunities to engage with high quality research, and (3) the freedom, within this structure, to engage with my own research questions.

From my first day at the Center for Education Policy Analysis (CEPA) at Stanford’s Graduate School of Education, I was exposed to a pervasive culture of intellectual rigor and the pursuit of high-quality research. But this culture of exactitude was paired with diversity of thought, a true investment in the ideas and work of students, and a sense of collegiality and collaboration. Every day, I felt both challenged and supported by the Stanford IES group. This professional network has been essential to my growth.

CEPA's core curriculum prepared me well to conduct high-quality, policy-focused research. In particular, the classes on designing and implementing quasi-experimental studies have influenced my work tremendously, from the large, obvious ways, such as how I conceptualize and design research, to the small details of implementation that can make or break a study. The series of required classes was complemented by frequent, less formal engagement with the practice of research in the form of weekly seminars in which students presented work in progress and an unparalleled seminar series with speakers from other institutions.

But within this tight community and academic structure, a real benefit of the IES fellowship was my ability to engage with the research questions that I was most interested in. The financial security of the fellowship meant that I could work on projects, directed by faculty or of my own design, that I thought were timely, important, and interesting. In my five years, I worked closely with four CEPA faculty members, each of whom influenced the way I ask and answer questions in essential and unique ways.

I am grateful to the IES pre-doctoral fellowship, and especially the Stanford IES group, for the five years of opportunities, resources, and professional community.

Elizabeth Tighe, Fellow in Florida State University Program to Increase Research Capacity in Educational Science

The IES predoctoral training fellowship provided multiple avenues for me to work with interdisciplinary research teams and take courses that developed my quantitative skills within the realm of education research. Three benefits in particular were integral in shaping my development as a researcher: rigorous quantitative training; generous financial support for designing and implementing my own studies as well as providing networking opportunities at conferences; and the interdisciplinary and collaborative nature of the coursework and research.

For my graduate studies, I chose Florida State University for the innovative and interdisciplinary research conducted through the Florida Center for Reading Research. By assisting on various projects, I honed my quantitative skills and learned new theoretical perspectives from multiple disciplines. For example, I gained experience with eye-tracking equipment, assessment and measurement of reading-related constructs, and evaluation of classroom curricular materials. The flexibility of the fellowship allowed me to develop my own program of research, which focused on struggling adult readers enrolled in adult literacy programs. In addition, I utilized my quantitative skills on large-scale, existing datasets of students enrolled in K-12 education. These experiences provided ample opportunities to bridge my interests in adult literacy and quantitative methodology and to publish and present at different outlets.    

The financial support for conferences afforded me extensive networking opportunities with colleagues. This helped me to establish grant-writing collaborations, provide statistical consulting for projects, participate in cross-university symposia, and form professional friendships from which I continue to reap benefits today. I attended practitioner-oriented conferences in adult education and a training workshop for using a large-scale dataset on adults’ literacy, numeracy, and digital problem-solving skills. In conjunction with a colleague at the University of Iowa, my current lab received a grant to use this large-scale dataset to examine the literacy skills of United States inmates. 

My accrued training, experiences, and interdisciplinary collaborations are directly applicable to my current role as an Assistant Professor of Psychology and Assistant Director of the Adult Literacy Research Center (ALRC) at Georgia State University. I work alongside an interdisciplinary team of scholars within the Language and Literacy Initiative, which affords me the opportunity to work with professors and to mentor graduate students in multiple departments. My lab is currently working with the ALRC and Applied Linguistics Department to investigate the comprehension monitoring skills of struggling adult readers using eye-tracking equipment. As a result of my IES training, it was imperative for me to find a position that supported my multidisciplinary research interests.

Karrie E. Godwin, Assistant Professor, Kent State University, Fellow in Carnegie Mellon Program in Interdisciplinary Education Research (PIER)

My participation as an IES predoctoral fellow in the PIER program at Carnegie Mellon University was seminal in guiding my development as a researcher. For me, there are three aspects of the program which were especially formative.

First, I was fortunate to be surrounded by thoughtful, caring, and passionate mentors who embody the goals of the PIER program. I am especially grateful to Anna Fisher, David Klahr, and Sharon Carver, who provided incredible guidance, advanced my critical thinking skills, taught me the importance of good experimental design and research hygiene, and how to effectively disseminate my work—not only to other scientists, but also to practitioners and stakeholders.

Second, PIER is committed to training scholars in rigorous research methodology that bridges theory and practice. My training has enabled me to conduct disciplined research in which I examine children’s cognitive development and the underlying mechanisms of change which inform educational practices by identifying causal and malleable factors that can be leveraged to promote better learning outcomes for children.

Lastly, PIER exposed me to a diverse set of scholars, affording unique and dynamic opportunities for interdisciplinary research collaborations that would have been highly improbable in more traditional, siloed environments. For example, in addition to working with other psychologists, I formed productive collaborations with colleagues in robotics, human-computer interaction, and statistics. This was a unique and powerful component of the program. Additionally, this feature of the program helped me develop my skills in communicating with groups and communities outside my specific domain, where traditional jargon is typically ineffective. This skill has been incredibly important in helping me communicate my research not only to scientists from different disciplines but also to practitioners and the media.

In my new role as an assistant professor at Kent State University, I am drawing upon my experience in PIER and using the skills I gained during my fellowship to build my program of research. I am continuing to investigate children’s cognitive development in order to create more optimal learning environments and instructional materials that aim to enhance children’s learning outcomes. In addition, my new position allows me to help train students to become producers of high-quality research and to help future educational practitioners be thoughtful consumers of research. I am certain the skills I gained as an IES fellow at Carnegie Mellon will enable me to fulfill my commitment to improve children's learning outcomes through disciplined research.

Compiled by Katina Stapleton, National Center for Education Research

Updating Our Recommendations to Prevent Dropping Out

By Dana Tofig, Communications Director, IES

Almost a decade ago, the What Works Clearinghouse (WWC) released Dropout Prevention, one of its first practice guides, which offered six recommendations for keeping students in school based on findings from high-quality research and best practices in the field. Since its release in August 2008, the practice guide has been downloaded thousands of times by practitioners across the country. In fact, the guide was downloaded more than 1,500 times in 2016 alone.

However, over the past decade, the research and knowledge base has grown in the area of dropout prevention, which is why the WWC decided to update this guide to reflect the latest evidence and information about keeping students in school and on track toward graduation.

This updated guide, Preventing Dropout in Secondary Schools, builds on the 2008 guide in two significant ways.

First, it reflects improvements in practices related to monitoring at-risk students, including advances in using early warning indicators to identify students at risk of dropping out. Second, it covers nine additional years of research that were not part of the previous guide. In fact, 15 of the 25 studies used to support the recommendations in this updated guide were published after the first guide was released. In addition, studies from the previous guide were reviewed again against the current, more rigorous WWC evidence standards.

Preventing Dropout in Secondary Schools offers four evidence-based recommendations that can be used by schools and districts:

  • Monitor the progress of all students, and proactively intervene when students show early signs of attendance, behavior, or academic problems;
  • Provide intensive, individualized support to students who have fallen off track and face significant challenges to success;
  • Engage students by offering curricula and programs that connect schoolwork with college and career success and that improve students’ capacity to manage challenges in and out of school; and
  • For schools with many at-risk students, create small, personalized communities to facilitate monitoring and support.

Each of these recommendations includes specific strategies for implementation and examples of how this work is being done around the country (see one such example in the image to the right).

Like all of our practice guides, the recommendations were developed with a panel of educators, academics, and experts who brought a wealth of knowledge and experience to the process. Two of the panelists on this updated guide were also involved in the development of the first dropout prevention guide—Russell Rumberger, from the University of California, Santa Barbara, and Mark Dynarski, of Pemberton Research LLC. The other panelists for the new guide were Howard (Sandy) Addis, of the National Dropout Prevention Center and Network; Elaine Allensworth, from the University of Chicago; Robert Balfanz, from The Johns Hopkins University; and Debra Duardo, superintendent of the Los Angeles County Office of Education.

Please download this free guide today and let us know what you think. We would especially love to hear from people who are using the recommended strategies in schools. You can reach us through the WWC Help Desk or email us at Contact.IES@ed.gov.