
2007 IES Research Training Institute: Cluster Randomized Trials

NCER
In-person
Wyatt Center

1930 South Dr
Nashville, TN 37212
United States

Jun 17, 2007 - Jun 29, 2007

About this event

The purpose of this Training Institute is to increase the national capacity to develop and conduct rigorous evaluations of the effectiveness of education interventions by training researchers to conduct cluster (group) randomized trials in education settings.

Who should attend this event?

The National Center for Education Research encourages applications from postdoctoral fellows, junior researchers, and senior researchers who would benefit from the knowledge and skills addressed in the Training Institute, and particularly encourages women, minorities, and individuals with disabilities to apply.

Individuals selected for participation who require reasonable accommodations for disabilities should contact Dr. Caroline Ebanks at 202-219-1410 between 7:30 a.m. and 4:00 p.m. Eastern Time, or via e-mail at Caroline.Ebanks@ed.gov, at least 60 days before the course begins.

Session overviews

Session 1
Jun 17, 2007, 8:00 PM - 9:30 PM EDT
In-Person

Changing the Nature of Education Research

Session 2
Jun 18, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Specifying the conceptual and operational models; formulating precise questions

Agenda

8:00 AM

This session covers: 

  1. Developing the rationale for the importance of the intervention, including deciding if an intervention is ready for an RCT (randomized controlled trial); 
  2. Determining and justifying the type of study (development, efficacy, or scale-up), including a review of what is already known in the area, relevant pilot data and preliminary studies; 
  3. Specifying the "theory of change" underlying the intervention, including a conceptual model specifying key cause-effect constructs and their linkages and an operational model of the processes and activities that affect the outcomes; and 
  4. Framing the question precisely so that a trial can be designed to provide an answer that will be useful.

Session materials

Course Session #1 (June 18, 2007): Specifying the conceptual and operational models and formulating precise questions

Download

Session panelists

Mark Lipsey

Director of the Center for Evaluation Research and Methodology and a Senior Research Associate at the Vanderbilt Institute for Public Policy Studies
Vanderbilt University
Session 3
Jun 18, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Describing and quantifying outcomes

Agenda

10:30 AM

This session covers considerations for identifying relevant variables and selecting tests and measures in education trials, including 

  1. reliability, validity, sensitivity, and relevance of measures; 
  2. specifying proximal (mediating) and distal outcomes/variables; 
  3. alignment and overalignment of measures with the intervention; 
  4. continuity across ages/grades for follow-up measures; 
  5. developmental appropriateness of measures; 
  6. feasibility of use; 
  7. respondent burden; 
  8. efficiency (minimizing overlap among measures); 
  9. attention to possible unexpected as well as expected outcomes;
  10. issues associated with correlated measures and creating composite measures; and 
  11. measurement issues associated with special populations.

Session materials

Course Session #1 (June 18, 2007): Specifying the conceptual and operational models and formulating precise questions

Download

Session panelists

Mark Lipsey

Director of the Center for Evaluation Research and Methodology and a Senior Research Associate at the Vanderbilt Institute for Public Policy Studies
Vanderbilt University
Session 4
Jun 18, 2007, 1:30 PM - 3:15 PM EDT
In-Person

Assessment of treatment implementation/assessment of control condition

Agenda

1:30 PM

This session covers strategies used to assess instruction, process, and treatment fidelity, including systematic observation, logs and diaries, questionnaires, and interviews. It includes discussion of 

  1. the concepts of implementation versus fidelity; 
  2. the importance of clear specification of the intervention as a basis for assessing fidelity/implementation; and
  3. measuring the relevant experience in the control group.

Session materials

Dr. David Cordray

Download

Session panelists

David Cordray

Professor of Public Policy and Professor of Psychology
Vanderbilt University
Session 5
Jun 18, 2007, 3:45 PM - 5:45 PM EDT
In-Person

Introducing the Group Activity Assignment

Agenda

3:45 PM

The Group Activity Assignment is designed to provide attendees with an opportunity to gain experience in applying the concepts and strategies that are presented in the various technical sessions to a specific RCT. With the assistance of the instructors, the activity will mimic the process of developing a feasible and technically sound RCT. Each group will formulate an intervention topic, decide on the type of RCT that is most appropriate, articulate the theory of action, and specify intervention and control contrasts. As the sessions progress, the group will have an opportunity to apply the measurement, sampling, design and analysis issues that have been discussed in the technical sessions. Continuous feedback and resources (e.g., extant data, normative conventions, and decision frameworks) will be made available to assist in the development of the overall RCT design. Toward the end of the Training Institute, the final design from each group will be presented to the full group.

Session panelists

David Cordray

Professor of Public Policy and Professor of Psychology
Vanderbilt University
Session 6
Jun 19, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Basic experimental design with special considerations for education studies

Agenda

8:00 AM

Sessions 6 to 8 will cover the logic of randomized experiments and their advantages for making causal inferences, and review the basics of experimental design and analysis of variance, focusing on the two most widely used designs: the hierarchical design and the (generalized) randomized blocks design. Issues that arise because of the hierarchical structure of populations in education (students nested in classes nested in schools) are discussed. Additional topics include: 

  1. Issues of which units to randomize (classrooms, schools, or individual students); 
  2. How to do the randomization; 
  3. Multiple levels of randomization (students to teachers, then teachers to conditions) and what the implications are for design and inference; 
  4. Randomization within schools/classrooms (contamination concerns) versus randomization between schools/classrooms (power issues); 
  5. Nature and role of blocking and covariates (including covariate and blocking considerations at student, classroom, and school levels); 
  6. Crossovers and attrition after randomization; 
  7. Handling multiple cohorts of treatment and control groups; and 
  8. How hypothesized aptitude x treatment interactions are included in design.
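
As an illustration of items 2, 4, and 5 above, randomization of clusters to conditions within blocks can be sketched in a few lines. This is a minimal sketch with hypothetical school and district names, not drawn from the Institute materials:

```python
import random

def block_randomize(units, block_of, seed=2007):
    """Randomly assign units to 'treatment'/'control' within blocks.

    units    : list of unit IDs (e.g., school names)
    block_of : dict mapping each unit to its block (e.g., district)
    Returns a dict unit -> condition, balanced within each block.
    """
    rng = random.Random(seed)
    assignment = {}
    blocks = {}
    for u in units:
        blocks.setdefault(block_of[u], []).append(u)
    for block_units in blocks.values():
        shuffled = block_units[:]
        rng.shuffle(shuffled)           # random order within the block
        half = len(shuffled) // 2
        for u in shuffled[:half]:
            assignment[u] = "treatment"
        for u in shuffled[half:]:
            assignment[u] = "control"
    return assignment

# Hypothetical example: 8 schools blocked by district
schools = [f"school_{i}" for i in range(8)]
districts = {s: ("district_A" if i < 4 else "district_B")
             for i, s in enumerate(schools)}
print(block_randomize(schools, districts))
```

Fixing the seed makes the assignment reproducible and auditable, which matters when the randomization procedure must be documented for a trial.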

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 7
Jun 19, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Basic experimental design with special considerations for education studies

Agenda

10:30 AM

Continuation of Session 6.

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 8
Jun 19, 2007, 1:30 PM - 3:15 PM EDT
In-Person

Basic experimental design with special considerations for education studies

Agenda

1:30 PM

Continuation of Session 7.

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 9
Jun 20, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Statistical analysis overview I

Agenda

8:00 AM

This session will provide an introduction to hierarchical linear model analysis and how it relates to analysis of variance and regression approaches to the same designs. The session will 

  1. Give an overview of multilevel models as a means to analyze data with dependencies due to repeated measures; 
  2. Introduce the intra-class correlation (ICC) to describe the degree of dependency or correlation in repeated measures within each level of these models; 
  3. Describe how different types of multilevel models address these dependencies by modeling the ICC; 
  4. Review two-level models such as individual growth curve models and models that account for nesting of children in classrooms; and 
  5. Review three-level models such as individual growth curve models that describe change over time of children nested in classrooms.
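
The intra-class correlation described above can be made concrete with a short simulation. This is an illustrative sketch (numpy only; the variance components and sample sizes are invented): the classical ANOVA estimator divides the estimated between-cluster variance by the total variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 40 classrooms of 20 students each with a classroom random effect.
n_clusters, n_per = 40, 20
sigma2_between, sigma2_within = 0.1, 0.9   # true ICC = 0.10
cluster_effects = rng.normal(0, np.sqrt(sigma2_between), n_clusters)
y = cluster_effects[:, None] + rng.normal(0, np.sqrt(sigma2_within),
                                          (n_clusters, n_per))

# ANOVA (moment) estimator of the variance components.
cluster_means = y.mean(axis=1)
msb = n_per * cluster_means.var(ddof=1)          # between-cluster mean square
msw = ((y - cluster_means[:, None]) ** 2).sum() / (n_clusters * (n_per - 1))
sigma2_b_hat = max((msb - msw) / n_per, 0.0)
icc_hat = sigma2_b_hat / (sigma2_b_hat + msw)
print(f"estimated ICC = {icc_hat:.3f}")          # should be near the true 0.10
```

In practice the sessions' software (HLM, SAS Proc Mixed) estimates the same quantities by maximum likelihood rather than by this moment method.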

Session panelists

Margaret Burchinal

Research Professor in Psychology
University of North Carolina, Chapel Hill
Session 10
Jun 20, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Statistical analysis overview II

Agenda

10:30 AM

Continuation of morning session (including concepts of ICC).

Session panelists

Margaret Burchinal

Research Professor in Psychology
University of North Carolina, Chapel Hill
Session 11
Jun 20, 2007, 1:30 PM - 3:15 PM EDT
In-Person

Modeling growth in trials

Agenda

1:30 PM

This session will provide an introduction to growth models and longitudinal analyses, and applications of hierarchical models to the analysis of trials with longitudinal components. The session will include examples of how to use the HLM and SAS Proc Mixed programs to analyze two-level models in which children are nested in classrooms and three-level models in which there are repeated assessments of children nested in classrooms.

Session panelists

Margaret Burchinal

Research Professor in Psychology
University of North Carolina, Chapel Hill
Session 12
Jun 20, 2007, 7:30 PM - 8:30 PM EDT
In-Person

IES Grant Opportunities

Session 13
Jun 21, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Sample size and statistical power I

Agenda

8:00 AM

Sessions 13 and 14 will cover:

  1. Computing statistical power for cluster randomized trials; 
  2. The role of between-unit variance components and intraclass correlation; 
  3. Planning sample sizes with adequate power; 
  4. The effect of blocking (matching) and covariates on power; and 
  5. How the choice of analysis influences power. 

The sessions will also include: 

  6. Discussion of effect size; 
  7. How to determine and justify the minimum effect size to detect; 
  8. Designing around power considerations associated with the minimum detectable effect size; and 
  9. Cost considerations.
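
The calculations these sessions cover can be sketched with the textbook minimum detectable effect size (MDES) formula for a two-level cluster randomized design. This is a simplified illustration (normal approximation, equal allocation, no covariates) with invented parameter values, not an excerpt from the session materials:

```python
from statistics import NormalDist

def mdes_cluster_rct(J, n, icc, alpha=0.05, power=0.80):
    """Minimum detectable effect size (in SD units) for a two-arm
    cluster randomized trial: J clusters of n students each, equal
    allocation, intraclass correlation icc, no covariate adjustment."""
    z = NormalDist().inv_cdf
    multiplier = z(1 - alpha / 2) + z(power)
    # Var of the treatment-control difference in cluster-mean terms:
    # 4 * (icc + (1 - icc) / n) / J, in outcome-SD units.
    return multiplier * (4 * (icc + (1 - icc) / n) / J) ** 0.5

# Illustrative: 40 schools of 60 students each, ICC = 0.15
print(round(mdes_cluster_rct(40, 60, 0.15), 3))  # → 0.359
```

The formula makes the key design lesson visible: with a nontrivial ICC, adding clusters (J) shrinks the MDES far faster than adding students per cluster (n).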

Session materials

Dr. Howard Bloom

Download

Session panelists

Howard Bloom

Chief Social Scientist
MDRC
Session 14
Jun 21, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Sample size and statistical power II

Agenda

10:30 AM

Continuation of Session 13.

Session materials

Dr. Howard Bloom

Download

Session panelists

Howard Bloom

Chief Social Scientist
MDRC
Session 15
Jun 21, 2007, 10:30 AM - 12:30 PM EDT

Sampling and external validity

Agenda

10:30 AM

This session includes discussion of: 

  1. The nature of sampling in experiments; 
  2. The equivalence of the concept of blocks in experiments and clusters in sampling; 
  3. The logic of generalization from blocks (clusters) as fixed effects; 
  4. The logic of generalization from blocks as random effects; 
  5. Considerations and procedures for sampling clusters (districts, schools, classrooms, instructional small groups) from a population and units within clusters (classrooms within schools, students within classrooms); 
  6. Sample representativeness and sample diversity; 
  7. Oversampling; 
  8. Testing interactions between sample characteristics and treatment response to explore generalizability; and 
  9. Sampling issues in multi-site studies.

Session materials

Dr. Howard Bloom

Download

Session panelists

Howard Bloom

Chief Social Scientist
MDRC
Session 16
Jun 22, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Alternatives to randomized trials I

Agenda

8:00 AM

Sessions 16 and 17 will provide an overview of the design alternatives that have the highest internal validity under favorable circumstances and may be considered when a randomized design is not feasible (regression discontinuity, nonrandomized comparison groups with statistical controls, and time series). The discussion of these designs will focus on their general character and logic, the circumstances in which they are applicable, and their relative advantages and disadvantages.

Session materials

Lipsey 2

Download

Session panelists

Mark Lipsey

Director of the Center for Evaluation Research and Methodology and a Senior Research Associate at the Vanderbilt Institute for Public Policy Studies
Vanderbilt University
Session 17
Jun 22, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Alternatives to randomized trials II

Agenda

10:30 AM

Continuation of Session 16.

Session materials

Lipsey 2

Download

Session panelists

Mark Lipsey

Director of the Center for Evaluation Research and Methodology and a Senior Research Associate at the Vanderbilt Institute for Public Policy Studies
Vanderbilt University
Session 18
Jun 22, 2007, 4:30 PM - 6:00 PM EDT
In-Person

Networking Session

Session 19
Jun 25, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Recruitment of sites and participants

Agenda

8:00 AM

This session covers strategies for

  1. Recruiting and retaining schools into trials; 
  2. Encouraging school personnel and parents to participate (e.g., incentives, benefits); 
  3. Tracking participants; 
  4. Encouraging participation in posttest assessments; and 
  5. Dealing with instability in the sample (e.g., students and teachers transferring; schools merging, dividing, or closing). 

In addition, the session will include discussion of the ethical concerns of schools and parents regarding participation in randomized trials.

Session materials

Fred Doolittle

Download

Session panelists

Fred Doolittle

Director of the Policy Research and Evaluation Department
MDRC
Session 20
Jun 25, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Data collection in the field

Agenda

10:30 AM

This session covers practical aspects of data gathering designed to minimize error, bias, and loss of cases, including 

  1. Training research staff; 
  2. Pilot-testing data gathering procedures; 
  3. Providing for ongoing quality assurance monitoring; 
  4. Use of state-of-the-art data gathering methods (e.g., computer-assisted personal interviewing, or CAPI) designed to reduce errors and bias by enhancing the completeness and consistency of responses and automatically recording participants' data; and 
  5. Strategies for conducting small-scale validation studies to assess data quality.

Session materials

Dr. Ina Wallace

Download

Session panelists

Ina Wallace

Senior Research Psychologist
RTI International
Session 21
Jun 25, 2007, 1:30 PM - 2:45 PM EDT
In-Person

Recruiting participants and collecting data from the trenches

Agenda

1:30 PM

This session is an informal discussion with Vanderbilt researchers regarding recruiting and maintaining participants and collecting data.

Session 22
Jun 26, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Analyzing intervention effects I

Agenda

8:00 AM

Sessions 22 to 24 constitute a hands-on introduction to the analysis of randomized experiments in educational research. Session 22 is an introduction to the use of HLM to conduct analyses of experiments using multilevel and mixed model analysis of variance approaches.

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 23
Jun 26, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Analyzing intervention effects II

Agenda

10:30 AM

Continuation of morning session.

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 24
Jun 26, 2007, 1:30 PM - 3:15 PM EDT
In-Person

Analyzing intervention effects III

Agenda

1:30 PM

Continuation of the morning sessions.

Session materials

Dr. Larry Hedges

Download

Session panelists

Larry Hedges

Board of Trustees Professor of Statistics; Faculty Fellow, Institute for Policy Research
Northwestern University
Session 25
Jun 27, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Handling missing data in the analysis of trials

Agenda

8:00 AM

Sessions 25 and 26 will focus on a discussion of analysis with missing data, and an efficiency (planned missing data) design for measurement called the 3-form design. Session 25 will cover 

  1. Missing data theory; 
  2. Analysis with multiple imputation (with Joe Schafer's NORM program and SAS PROC MI); and 
  3. Analysis with full information maximum likelihood (FIML) procedures (in the context of structural equation modeling). 

This session will also address material related to participant attrition.
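
The pooling step that multiple imputation relies on (Rubin's rules) can be sketched with a toy example. This is a stylized numpy-only illustration with invented data and a deliberately naive imputation model (draws from the observed mean and SD); it is not a substitute for NORM, PROC MI, or FIML:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: outcome with 30% of values missing completely at random.
y = rng.normal(50, 10, 200)
missing = rng.random(200) < 0.30
y_obs = y[~missing]

m = 20                       # number of imputations
estimates, variances = [], []
for _ in range(m):
    y_imp = y.copy()
    # Naive imputation model: draw from N(observed mean, observed sd).
    y_imp[missing] = rng.normal(y_obs.mean(), y_obs.std(ddof=1),
                                missing.sum())
    estimates.append(y_imp.mean())                    # estimate of interest
    variances.append(y_imp.var(ddof=1) / len(y_imp))  # its sampling variance

# Rubin's rules: combine within- and between-imputation variance.
qbar = np.mean(estimates)            # pooled point estimate
W = np.mean(variances)               # average within-imputation variance
B = np.var(estimates, ddof=1)        # between-imputation variance
T = W + (1 + 1 / m) * B              # total variance of the pooled estimate
print(f"pooled mean = {qbar:.2f}, pooled SE = {np.sqrt(T):.2f}")
```

The key point the sessions make is visible in the last lines: the pooled standard error exceeds the naive single-imputation SE because it includes the between-imputation uncertainty B.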

Session materials

John Graham

Download

Session panelists

John Graham

Professor, Department of Biobehavioral Health
Pennsylvania State University
Session 26
Jun 27, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Missing data design

Agenda

10:30 AM

Session 26 will include a discussion of 3-form design, and related measurement designs. The 3-form design, a kind of matrix sampling, allows researchers to leverage limited resources to collect data for 33% more survey questions than can be answered by any one respondent. This session will cover implementation strategies, provide examples, and provide strategies for estimating the benefit of the design compared to other possible measurement designs. Specific advantages for group randomized trials will be discussed.
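
The block structure of the 3-form design can be sketched directly: the item pool is split into a common block X and rotating blocks A, B, and C, and each form carries X plus two of the three rotating blocks. The item names below are hypothetical, and equal block sizes are assumed for simplicity:

```python
from itertools import combinations

def three_form_design(items):
    """Split a survey into a common block X plus rotating blocks A, B, C;
    each of the three forms contains X and two rotating blocks, so every
    respondent sees 3/4 of the item pool."""
    k = len(items) // 4
    X, A, B, C = (items[i * k:(i + 1) * k] for i in range(4))
    rotating = {"A": A, "B": B, "C": C}
    forms = []
    for pair in combinations("ABC", 2):        # AB, AC, BC
        forms.append(X + rotating[pair[0]] + rotating[pair[1]])
    return forms

# Hypothetical 12-item survey -> three 9-item forms
items = [f"q{i}" for i in range(1, 13)]
for i, form in enumerate(three_form_design(items), 1):
    print(f"Form {i}: {form}")
```

This makes the "33% more" figure concrete: the full pool (4 blocks) is 4/3 the size of any single form (3 blocks), and every pair of rotating blocks co-occurs on some form, which is what preserves the covariance information needed for the missing data analyses of Session 25.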

Session materials

John Graham

Download

Session panelists

John Graham

Professor, Department of Biobehavioral Health
Pennsylvania State University
Session 27
Jun 27, 2007, 1:30 PM - 2:45 PM EDT

Reporting guidelines

Agenda

1:30 PM

This short session covers reporting guidelines for field trials (e.g., CONSORT), including

  1. Describing the source (e.g., initial selection, refusal at assignment, attrition) and magnitude of participant loss; 
  2. Practices for describing the attributes of participants in the achieved samples; 
  3. Adequate description of the treatment; and 
  4. Implementation data.

Session panelists

David Cordray

Professor of Public Policy and Professor of Psychology
Vanderbilt University
Session 28
Jun 28, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Reports of Student Work Groups: Designing Randomized Control Trials

Session 29
Jun 28, 2007, 10:30 AM - 12:30 PM EDT
In-Person

Reports of Student Work Groups: Designing Randomized Control Trials

Session 30
Jun 28, 2007, 1:30 PM - 3:30 PM EDT
In-Person

Reports of Student Work Groups: Designing Randomized Control Trials

Session 31
Jun 28, 2007, 3:45 PM - 4:45 PM EDT
In-Person

Reports of Student Work Groups: Designing Randomized Control Trials

Session 32
Jun 29, 2007, 8:00 AM - 10:00 AM EDT
In-Person

Final Review and Evaluation
