IES Blog

Institute of Education Sciences

Leading experts provide evidence-based recommendations on using technology to support postsecondary student learning

By Michael Frye and Sarah Costelloe. Both are part of the Abt Associates team working on the What Works Clearinghouse.

Technology is part of almost every aspect of college life. Colleges use technology to improve student retention, offer active and engaging learning, and help students become more successful learners. The What Works Clearinghouse’s latest practice guide, Using Technology to Support Postsecondary Student Learning, offers several evidence-based recommendations to help higher education instructors, instructional designers, and administrators use technology to improve student learning outcomes.

IES practice guides incorporate research, practitioner experience, and expert opinions from a panel of nationally recognized experts. The panel that developed Using Technology to Support Postsecondary Student Learning included five experts with many years of experience leading the adoption, use, and research of technology in postsecondary classrooms. Together, guided by Abt Associates’ review of the rigorous research on the topic, the panel developed five evidence-based recommendations:

Practice Recommendations:

1. Use communication and collaboration tools to increase interaction among students and between students and instructors (minimal evidence).
2. Use varied, personalized, and readily available digital resources to design and deliver instructional content (moderate evidence).
3. Incorporate technology that models and fosters self-regulated learning strategies (moderate evidence).
4. Use technology to provide timely and targeted feedback on student performance (moderate evidence).
5. Use simulation technologies that help students engage in complex problem-solving (minimal evidence).

Each recommendation is assigned an evidence level of minimal, moderate, or strong. The level of evidence reflects how well the research demonstrates the effectiveness of the recommended practices. For an explanation of how levels of evidence are determined, see the Practice Guide Level of Evidence Video. The evidence-based recommendations also include research-based strategies and examples for implementation in postsecondary settings. Together, the recommendations highlight five interconnected themes that the practice guide’s authors suggest readers consider:

  • Focus on how technology is used, not on the technology itself.

“The basic act of teaching has actually changed very little by the introduction of technology into the classroom,” said panelist MJ Bishop, “and that’s because simply introducing a new technology changes nothing unless we first understand the need it is intended to fill and how to capitalize on its unique capabilities to address that need.” Because technology evolves rapidly, understanding specific technologies is less important than understanding how technology can be used effectively in college settings. “By understanding how a learning outcome can be enhanced and supported by technologies,” said panelist Jennifer Sparrow, “the focus stays on the learner and their learning.”

  • Technology should be aligned to specific learning goals.

Every recommendation in this guide is based on one idea: finding ways to use technology to engage students and enhance their learning experiences. Technology can engage students more deeply in learning content, activate their learning processes, and provide the social connections that are key to succeeding in college and beyond. To do this effectively, any use of technology suggested in this guide must be aligned with learning goals or objectives. “Technology is not just a tool,” said Panel Chair Nada Dabbagh. “Rather, technology has specific affordances that must be recognized to use it effectively for designing learning interactions. Aligning technology affordances with learning outcomes and instructional goals is paramount to successful learning designs.”

  • Pay attention to potential issues of accessibility.

The Internet is ubiquitous, but many households, particularly low-income households, recent immigrant families, and those in rural communities, may not be able to afford or otherwise access digital communications. Course materials that rely heavily on Internet access may put these students at a disadvantage. “Colleges and universities making greater use of online education need to know who their students are and what access they have to technology,” said panelist Anthony Picciano. “This practice guide makes abundantly clear that colleges and universities should be careful not to be creating digital divides.”

Instructional designers must also ensure that learning materials on course websites and course/learning management systems can accommodate students who are visually and/or hearing impaired. “Technology can greatly enhance access to education both in terms of reaching a wide student population and overcoming location barriers and in terms of accommodating students with special needs,” said Dabbagh. “Any learning design should take into consideration the capabilities and limitations of technology in supporting a diverse and inclusive audience.”

  • Technology deployments may require significant investment and coordination.

Implementing any new intervention takes training and support from administrators and teaching and learning centers. That is especially true in an environment where resources are scarce. “In reviewing the studies for this practice guide,” said Picciano, “it became abundantly clear that the deployment of technology in our colleges and universities has evolved into a major administrative undertaking. Careful planning that is comprehensive, collaborative, and continuous is needed.”

“Hardware and software infrastructure, professional development, academic and student support services, and ongoing financial investment are testing the wherewithal of even the most seasoned administrators,” said Picciano. “Yet the dynamic and changing nature of technology demands that new strategies be constantly evaluated and modifications made as needed.”

These decisions are never easy. “Decisions need to be made,” said Sparrow, “about investment cost versus opportunity cost. Additionally, when a large investment in a technology has been made, it should not be without investment in faculty development, training, and support resources to ensure that faculty, staff, and students can take full advantage of it.”

  • Rigorous research is limited and more is needed.

Despite technology’s ubiquity in college settings, rigorous research on the effects of technological interventions on student outcomes is rather limited. “It’s problematic,” said Bishop, “that research in the instructional design/educational technology field has been so focused on things, such as technologies, theories, and processes, rather than on the problems we’re trying to solve with those things, such as developing critical thinking, enhancing knowledge transfer, and addressing individual differences. It turns out to be very difficult to cross-reference the instructional design/educational technology literature with the questions the broader field of educational research is trying to answer.”

More rigorous research is needed on new technologies and how best to support instructors and administrators in using them. “For experienced researchers as well as newcomers,” said Picciano, “technology in postsecondary teaching and learning is a fertile ground for further inquiry and investigation.”

Readers of this practice guide are encouraged to adapt the advice provided to the varied contexts in which they work. The five themes discussed above serve as a lens to help readers approach the guide and decide whether and how to implement some or all of the recommendations.

Download Using Technology to Support Postsecondary Student Learning from the What Works Clearinghouse website at https://ies.ed.gov/ncee/wwc/PracticeGuide/25.


Sharing strategies to increase research-based educational practices

By Cora Goldston, REL Midwest

Highlighted Resources

How can states, districts, and schools identify effective practices to address challenges and achieve their goals? Education research can point the way, but sometimes finding and accessing relevant research can be a frustrating and time-consuming process. And even when practitioners can find research, it can be difficult to determine a study’s rigor and the strength of research evidence supporting interventions.

Equipping practitioners to use research evidence

Through the Midwest Alliance to Improve Knowledge Utilization (MAIKU), the Regional Educational Laboratory (REL) Midwest is partnering with practitioners to help states, districts, and schools use research to inform practice. The goal is to make it easier for educators to find research relevant to their priorities, assess the level of evidence that supports potential practices, and implement those practices that are based on strong evidence.

REL Midwest and MAIKU are supporting the use of research in education practice in several ways. For example, REL Midwest provided coaching sessions for the Ohio Department of Education (ODE) on understanding the Every Student Succeeds Act (ESSA) tiers of evidence. In addition, REL Midwest created a crosswalk that shows how the ESSA evidence tiers align with ratings from research clearinghouses, such as the What Works Clearinghouse. In turn, ODE is using this information to help Ohio districts that are applying for Striving Readers grants. To receive the grants, districts must demonstrate that they plan to use research-based practices to improve student literacy. As a result of REL Midwest’s support, ODE has strengthened its capacity to help districts determine the level of evidence supporting certain practices and, thus, to submit stronger grant applications.
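
To give a flavor of what such a crosswalk captures, the sketch below (in Python) maps a study’s design and WWC rating to an approximate ESSA tier. This is a simplified illustration, not ODE’s or REL Midwest’s actual crosswalk: the function and labels are invented, and it assumes only the commonly cited correspondence between evidence tiers and study designs. The real ESSA tiers also depend on factors this sketch omits, such as statistically significant positive findings, sample size, and multi-site requirements.

# A simplified, hypothetical sketch of a WWC-to-ESSA evidence crosswalk.
# It encodes only the commonly cited alignment: Tier 1 (strong) requires a
# well-designed RCT, Tier 2 (moderate) a quasi-experimental study, and
# Tier 3 (promising) a correlational study with statistical controls.
# Actual tier determinations also weigh findings, sample, and setting.

def essa_tier(design: str, wwc_rating: str) -> str:
    """Map a study design and WWC rating to an approximate ESSA tier."""
    if design == "RCT" and wwc_rating == "Meets Standards Without Reservations":
        return "Tier 1: Strong evidence"
    if design in ("RCT", "QED") and wwc_rating == "Meets Standards With Reservations":
        return "Tier 2: Moderate evidence"
    if design == "correlational with controls":
        return "Tier 3: Promising evidence"
    return "Tier 4: Demonstrates a rationale (or does not meet standards)"

print(essa_tier("QED", "Meets Standards With Reservations"))
# -> Tier 2: Moderate evidence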

REL Midwest is providing similar support across the region. In Michigan, we are conducting coaching sessions for the state Department of Education to help agency leadership choose priorities from the state’s Top 10 in 10 plan, identify research-based practices that support those priorities, and collaborate to implement new state-level practices. In Wisconsin, REL Midwest hosted a training series for the Department of Public Instruction to increase the agency’s capacity to collect, analyze, and use data to adjust state-level policies and practices. And in Illinois, REL Midwest is holding a training series for the State Board of Education on research methods, data collection, and data analysis and how to use the findings to inform agency practices.

June webinar on increasing evidence use

MAIKU is also working with researchers to support evidence use in education practice. On June 19, 2018, REL Midwest and MAIKU hosted a webinar to discuss how researchers can share evidence with practitioners in useful and accessible ways.

The webinar featured a presentation by Alan J. Daly, Ph.D., of the University of California at San Diego, and Kara Finnigan, Ph.D., of the University of Rochester. Dr. Daly and Dr. Finnigan discussed how information-sharing networks are structured among school and district staff and the challenges for practitioners in accessing and using research-based practices.   

Building on this context, Dr. Daly and Dr. Finnigan shared insights about the most effective ways to maximize the reach of research. One of their key findings is that the pattern of people’s social ties makes a difference for sharing and using research-based practices. Finnigan and Daly noted that the set of relationships we have can increase access to research evidence if the right ties are present but can constrain access to resources when those ties are not present. The quality of relationships also matters; high levels of trust are essential for more in-depth exchanges of information. The takeaway: fostering both the quantity and quality of social relationships is important for sharing research evidence.  

During the webinar, Jaime Singer, senior technical assistance consultant at REL Midwest, also shared actionable strategies that researchers can use to support evidence use in practice, including training and coaching sessions, checklists, blog posts, and clearinghouses of effective practices.

The webinar included a panel discussion about REL Midwest’s ESSA evidence tiers coaching sessions and crosswalk for ODE. REL Midwest researcher Lyzz Davis, Ph.D., provided a researcher perspective on developing resources to meet ODE’s needs. Heather Boughton, Ph.D., and Melissa Weber-Mayrer, Ph.D., at ODE provided practitioner perspectives on how REL Midwest’s work has strengthened the agency’s capacity to help districts find and use evidence-based interventions.

Looking for evidence outside of the scope of the WWC?

by Chris Weiss and Erin Pollard, What Works Clearinghouse

The What Works Clearinghouse (WWC) strives to be a central and trusted source of research evidence for what works in education. But did you know that the WWC is one of several repositories of evidence produced by the federal government? Our mission at the WWC is to review the existing research on different programs, products, practices, and policies in education to provide educators with the information they need to make evidence-based decisions. However, there are several other government repositories that review evidence on interventions that impact children and schools, reviews that may be of use and interest to WWC users.


Different Clearinghouses for Different Needs.

The missions of the different clearinghouses, and the reasons their reviews differ, stem from the unique mission of each agency and the specific focus of each clearinghouse. The Department of Education focuses primarily on prekindergarten through postsecondary education; however, many public health and crime prevention programs are implemented through schools. So, for example, you would find information about a school-based bullying prevention program on the National Institute of Justice’s CrimeSolutions website. The WWC would not review the evidence of this program’s effectiveness because its aim is to reduce bullying and victimization rather than to improve education-focused outcomes.


Some interventions are reviewed by multiple clearinghouses.

Users are often surprised that an intervention might be reviewed by multiple clearinghouses. For example, the WWC reviewed the evidence and created an intervention report on Career Academies, a school-within-a-school program in which students take both career-related and academic courses and acquire work experience. But reviews of the program are also included in other clearinghouses. The Department of Labor’s CLEAR reviewed the study because the intervention increased students’ earnings. Similarly, the National Institute of Justice’s CrimeSolutions reviewed the intervention because it increased the earnings of young men, an economic factor linked to a lower risk of criminal activity. Each clearinghouse looked at different outcomes from the same study to highlight the domains most relevant to its goals.


Each repository is different. The WWC may be your best bet – or others may fit your needs better.

We encourage users to look at the other clearinghouses to find information on outcomes that are outside of our scope. These sites have a lot of great information to offer. Here is a list of the other repositories for finding evidence:

  • Clearinghouse for Labor Evaluation and Research (CLEAR) – Department of Labor. CLEAR's mission is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies, outcome analyses, implementation studies, and causal impact studies.
  • Compendium of Evidence-Based Interventions and Best Practices for HIV Prevention – Centers for Disease Control and Prevention. Evidence-based interventions and best practices in the Compendium are identified by CDC’s Prevention Research Synthesis (PRS) Project through a series of ongoing systematic reviews. Each eligible intervention is evaluated against explicit a priori criteria and has shown sufficient evidence that it works. Interventions may fall into one or more chapters: Risk Reduction (PrEP-related outcomes and outcomes such as injection drug use, condom use, and HIV/STD/hepatitis infection); Linkage to, Retention in, and Re-engagement in HIV Care (outcomes such as entering and staying in HIV care); Medication Adherence (outcomes such as adherence to HIV medication and HIV viral load); and the most recently added Structural Interventions (outcomes such as HIV testing, social determinants of health, and stigma). Information sheets are available for all identified evidence-based interventions and best practices on the PRS Compendium website.
  • CrimeSolutions – National Institute of Justice, Department of Justice. The clearinghouse, accessible via the CrimeSolutions.gov website, presents programs and practices that have undergone rigorous evaluations and meta-analyses. The site assesses the strength of the evidence about whether these programs achieve criminal justice, juvenile justice, and crime victim services outcomes in order to inform practitioners and policymakers about what works, what doesn't, and what's promising.
  • Evidence Exchange - Corporation for National and Community Service. A digital repository of sponsored research, evaluation reports, and data. These resources focus on national service, volunteering, and civic engagement.
  • Home Visiting Evidence of Effectiveness (HomVEE) – Administration for Children and Families, Department of Health and Human Services. HomVEE provides an assessment of the evidence of effectiveness for home visiting models that target families with pregnant women and children from birth to kindergarten entry (that is, up through age 5).
  • Teen Pregnancy Prevention (TPP) Evidence Review – Department of Health and Human Services. A transparent systematic review of the teen pregnancy prevention literature to identify programs with evidence of effectiveness in reducing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors.
  • The Community Guide - Community Preventive Services Task Force (CPSTF). A collection of evidence-based findings to help you select interventions to improve health and prevent disease in your state, community, community organization, business, healthcare organization, or school. The CPSTF issues findings based on systematic reviews of effectiveness and economic evidence that are conducted with a methodology developed by the CPSTF.
  • youth.gov – Interagency. The youth.gov Program Directory features evidence-based programs whose purpose is to prevent and/or reduce delinquency or other problem behaviors in young people.

The WWC Evidence Standards: A Valuable and Accessible Resource for Teaching Validity Assessment of Causal Inferences to Identify What Works

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.


The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria used by the WWC to review studies. The Standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing how well the methodological characteristics and implementation of social science research support inferences about causal relationships (Boruch, 1997; Valentine & Cooper, 2008). Over the past 16 years, the Standards have gone through four iterations of improvement to keep pace with advances in methodological practice and have been through rigorous peer review. The most recent version is codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).


Across the different versions of the Handbook, the methodological characteristics of an internally valid study, one designed to identify the causal effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: a strong design starts with how the study groups are formed; it uses reliable and valid measures of outcomes; it has low attrition if it is a randomized controlled trial (RCT); it shows baseline equivalence in the analysis sample if it is a quasi-experimental design (QED); and it has no confounds.
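
To make the two quantitative checks concrete, here is a minimal sketch, in Python, of how a reviewer might compute attrition for an RCT and baseline equivalence for a QED (or a high-attrition RCT). The function names and study numbers are invented for illustration; the cutoffs in the comments paraphrase the Handbook 4.0 baseline equivalence rule, and the Handbook itself defines the exact attrition boundary, which combines overall and differential attrition.

# Illustrative only: invented functions and numbers, not WWC tooling.
from math import sqrt

def attrition_rates(randomized_t, analyzed_t, randomized_c, analyzed_c):
    """Overall and differential attrition for a two-group study."""
    rate_t = 1 - analyzed_t / randomized_t          # treatment attrition
    rate_c = 1 - analyzed_c / randomized_c          # comparison attrition
    overall = 1 - (analyzed_t + analyzed_c) / (randomized_t + randomized_c)
    differential = abs(rate_t - rate_c)
    return overall, differential

def baseline_hedges_g(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Hedges' g for a baseline measure in the analysis sample."""
    pooled_sd = sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                     / (n_t + n_c - 2))
    g = (mean_t - mean_c) / pooled_sd
    return g * (1 - 3 / (4 * (n_t + n_c) - 9))      # small-sample correction

# Hypothetical study: 120 students randomized per group.
overall, differential = attrition_rates(120, 100, 120, 95)
print(f"overall attrition = {overall:.1%}, differential = {differential:.1%}")

# Baseline equivalence in the analysis sample (invented numbers).
g = baseline_hedges_g(50.2, 9.8, 100, 51.0, 10.1, 95)
print(f"baseline Hedges' g = {g:.3f}")
# Handbook 4.0 rule of thumb: |g| <= 0.05 satisfies equivalence;
# 0.05 < |g| <= 0.25 requires statistical adjustment; |g| > 0.25 fails.

With these invented numbers, overall attrition is about 19 percent with a differential of roughly 4 percentage points, and the baseline difference of |g| ≈ 0.08 falls in the range that would require statistical adjustment under the rule above.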


These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.


In spring 2017, nine master's and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0, augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine what interventions work, harm, or have no discernible effect (Mosteller & Boruch, 2002). The Standards Handbook 4.0, along with online course modules, was an excellent resource to augment the lectures and provide Lynch School students with hands-on learning.


At the end of the course, students were offered the choice to complete the WWC Certification Exam for Group Design or to take a final exam developed by the instructor. All thirteen students chose to complete the WWC Certification Exam, and approximately half became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to assessing the validity of causal inferences drawn from RCTs and QEDs and (2) the chance to develop design skills that can be used in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource for helping students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.


References

Boruch, R. F. (1997). Randomized experiments for planning and evaluation: A practical guide. Thousand Oaks, CA: Sage Publications.

Mosteller, F., & Boruch, R. F. (2002). Evidence matters: Randomized trials in education research. Washington, DC: Brookings Institution Press.

Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130-149.

Making the WWC Open to Everyone by Moving WWC Certification Online

In December 2016, the What Works Clearinghouse made a version of its online training publicly available through the WWC website. This enabled everyone to access the Version 3.0 Group Design Standards reviewer training and learn about the standards and methods that the WWC uses. While this was a great step toward increasing access to WWC resources, users still had to complete a day-and-a-half, in-person training to become a WWC certified reviewer.

To continue our efforts to promote access and transparency and make our resources available to everyone, the WWC has now moved all of its group design training online. Everyone now has access to the same training and certification tests. Certification is available free of charge and is open to all users. It is our hope that this effort will increase the number of certified reviewers and help raise general awareness of the WWC.

Why did the WWC make these resources publicly available? As part of IES's effort to increase access to high-quality education research, we wanted to make it easier for researchers to use our standards. That meant opening up training opportunities, and offering the training online was a way to achieve this goal while using limited taxpayer resources most efficiently.

The online training consists of nine modules. These videos feature an experienced WWC instructor and use the same materials as our in-person courses, adapted to Version 4.0 of the Group Design Standards. After completing the modules, users can download a certificate of completion, take the online certification test, or go through the full certification process.

Becoming a fully certified reviewer requires users to take a multiple-choice online certification test and then use the new Online SRG application to conduct a full review using the same tools that the WWC team uses. The WWC team will then grade the exam to make sure applicants fully understand how to apply the Standards before certifying them to review for the Clearinghouse.

Not interested in becoming a certified reviewer? The online training still has several benefits. Educators can embed our videos in their course websites and use our training materials in their curricula. Researchers can use our Online SRG tool with their publications to determine a preliminary rating and understand what factors could cause their study to receive the highest rating. They could also use the tool when conducting a systematic evidence review.

Have ideas for new resources we could make available? Email your ideas and suggestions to Contact.WWC@ed.gov!

by Erin Pollard, WWC Project Officer