National Board for Education Sciences
March 23, 2011 Minutes of Meeting

Location:
Institute of Education Sciences Board Room
80 F Street NW
Washington, DC

Participants:
National Board for Education Sciences (NBES) Board Members Present:
Jonathan Baron, Chair
Bridget Terry Long, Vice Chair
Deborah Loewenberg Ball
Anthony Bryk
Adam Gamoran
Philip Handy
Sally E. Shaywitz
Robert Anacletus Underwood

NBES Board Members Absent:
Kris Gutiérrez
Margaret McLeod

Ex-Officio Members Present:
John Q. Easton, Director, Institute of Education Sciences (IES)
Jack Buckley, Commissioner, National Center for Education Statistics (NCES)
Rebecca Maynard, Commissioner, National Center for Education Evaluation and Regional Assistance (NCEE)
Lynn Okagaki, Commissioner, National Center for Education Research (NCER); Acting Commissioner, National Center for Special Education Research (NCSER)
Jim Griffin, Delegate, National Institute of Child Health and Human Development (NICHD)
Joan Ferrini-Mundy, Delegate, National Science Foundation (NSF)

NBES Staff:
Monica Herk, Executive Director, NBES

Designated Federal Official:
Mary Grace Lucier

IES Staff:
Sue Betka
Matt Devine
Stephanie Schmidt
Ellie McCutcheon
Allen Ruby
Audrey Pendleton
Elizabeth Warner
Gil Garcia
Anne Ricciuti

Invited Presenters:
Anthony Wilder Miller, Deputy Secretary, Department of Education
Ruth Curran Neild, Associate Commissioner, NCEE
James H. Shelton, III, Office of Innovation and Improvement (OII), Department of Education
Jefferson Pestronk, OII, Department of Education

Department of Education Staff:
Paul Wood, Office of Communications and Outreach (OCO)
Marcia Sprague, Office of the General Counsel (OGC)

Members of the Public:
Kim Bromann, Coalition for Evidence-Based Policy
Sarah D. Sparks, Ed Week
Karen Studwell, American Psychological Association (APA)
Carla Jacobs, Lewis-Burke Associates
Teresa Duncan, ICF International
La Tosha Plavnik, Consortium of Social Science Associations (COSSA)
Gerald Sroufe, American Educational Research Association (AERA)
Sarah Spreitzer, Lewis-Burke Associates
Dan T. Mullaney, Pearson Education
Debrale Graywolf, Evance Research, Inc. (Present via telephone)

Call to Order, Approval of Minutes, Swearing in of New Members, Chair Remarks
Mr. Jon Baron, Chair, NBES

Mr. Baron called the meeting to order at 8:35 a.m. and asked for a roll-call of Board members and ex-officio members. Ms. Lucier noted that a quorum was present. Other meeting participants and attendees also introduced themselves. Mr. Baron announced that Deputy Secretary Miller was present to swear in new members Tony Bryk and Robert Underwood. Kris Gutiérrez was not present.

After swearing in the new members, Deputy Secretary Miller spoke on the importance of research and evaluation in advancing education. The Administration is committed to evidence-based policymaking. Understanding what works in education is an important component of decisionmaking. A core part of the strategy is infusing a dedication to research and evaluation throughout the agency. IES's work in advancing that agenda is critical, as is IES's influence on implementation. He and Secretary Duncan are dedicated to having the best research possible and making the most of that research. Mr. Baron said that the Department, Congress, the agencies, and Office of Management and Budget (OMB) recognize that the Department has played a lead role in using rigorous evidence to improve policy, and that tools developed by IES have been models for tools in other agencies. He thanked the Deputy Secretary for the Department's support.

Mr. Baron called for a motion on the minutes of the previous meeting. Dr. Gamoran moved approval. Dr. Long seconded and the motion carried unanimously.

Mr. Baron welcomed the two new members and congratulated Dr. Buckley, who was confirmed as the Commissioner of the National Center for Education Statistics. He introduced Dr. Herk, the new Executive Director of the National Board for Education Sciences, who has a background in public policy, research, and intelligence. Dr. Herk thanked the Board and said her career has had a common theme of bringing evidence and evidence-based approaches to policymaking, which she will continue to do with IES. She urged the Board members to feel free to contact her for support.

Mr. Baron asked the two new members to introduce themselves. Dr. Bryk spoke briefly on the creation of a Center on School Improvement at the University of Chicago (UC). While working at UC, he launched what is now called the Urban Education Institute and with colleagues founded the Consortium on Chicago School Research, which was unusual in its time in that it brought together high-level academic talent focused on education in the city and informing reform. Among other things, the Consortium built a large relational database on student achievement.

Dr. Underwood spoke briefly on his legislative background representing Guam in the U.S. House of Representatives. He emphasized that his goals as a Board member are trying to fit small and unique populations into the national research agenda as well as applying educational research to help those populations.

Once the attendees had introduced themselves, Mr. Baron made opening comments and summarized the agenda topics. He has served on the Board since 2004 and has seen IES make remarkable progress toward building a body of scientifically valid knowledge on which educational practices do and do not work. Through the National Center for Education Evaluation (NCEE) and the National Center for Education Research (NCER), IES has funded 20 large field experiments evaluating more than 75 interventions across education policy, including school choice, charter schools, different reading and math curricula, educational software, and professional development programs. These studies have provided convincing evidence on the effects of different interventions on outcomes, such as high school completion and student achievement. IES leadership, staff, and researchers have created a pipeline to the truth about what programs and strategies work.

Overview of Recent Developments at the Institute of Education Sciences (IES), including the IES Centers
Dr. John Q. Easton, Director, IES

Dr. Easton reported that IES is actively working with the Department of Education's White House Liaison and staff to have reappointments made to bring the Board to full membership. He welcomed Mr. Baron in his new role as chairman and Dr. Buckley, the new Commissioner for the National Center for Education Statistics (NCES). Dr. Buckley has a background in education, applied statistics, and education policy and served as deputy director at the Center for 2 years. Stuart Kerachsky, the former acting NCES Commissioner and Board member, is now the Director of the Policy and Program Study Service in the Department of Education, which strengthens IES's relationship with the Department.

Preliminary Discussion of Proposed Agenda for the Regional Educational Laboratories (RELs)
Dr. Ruth Curran Neild, Associate Commissioner for Knowledge Utilization, NCEE

Dr. Easton introduced Dr. Neild, who works on the regional Labs, the What Works Clearinghouse, Education Resources Information Center (ERIC), and the IES library system. Dr. Neild referred the members to the statement of work (SOW) and a one-page summary for the RELs request for proposals (RFP). A draft SOW has been released, and the RFP is scheduled to be released later in the week.

What is being proposed for the next generation of RELs is consistent with the current RELs: emphasis on rigorous evaluations of impact; research using state and district databases; and translation of research in a way that is accessible, useful, and engaging for practitioners. There has been a slight shift in emphasis. Although the current generation of RELs has investigated a broad set of topics, the new generation will focus on three to five key topic areas over the 5 years of their contracts. Another new emphasis is to build capacity in the states and districts where the RELs work through a sustained engagement between RELs and education practitioners using research alliances, which are groups of individuals who share a particular educational concern and who work together on that concern over time. Rigorous studies will still be embedded in each REL's SOW, as appropriate for advancing knowledge in the topic areas. In addition, work by RELs with practitioners on state databases is intended to form a package of work advancing a particular area of knowledge.

Dr. Easton hoped the Labs and alliances will work closely with the agencies, building capacity to evaluate programs as part of their rollout and implementation. It is integral to the process to develop strong evaluations as districts or states look at implementing new policies. Mr. Baron commented that the SOW reflects the vision put forward in the approved priorities, especially around collaborations between researchers and practitioners. The opportunistic approach to evaluation reflects the field's need to play a larger role in determining what is evaluated. He solicited comments from the Board.

Mr. Handy commented that at $67 million for 10 RELs per year, there may not be enough money for rigorous analysis. Dr. Neild said RCTs (randomized controlled trials) can cost anywhere from $200,000 to $20 million or more. However, to lower costs the studies can use state administrative data, other existing data, and naturally occurring events.

Dr. Ball made three points.

  • The goal of improving the way research knowledge gets used is based on an old, maybe outdated, idea about the relationships between research and practice. Improving practice involves a very different set of relationships between work in the academic disciplines and what the practitioners have capacity for.
  • Much of the research that finds no effect of the intervention involves "interventions" that are under-specified.
  • She wondered why the research topics are left open for the RELs to choose rather than allocating funding for specific topics, which target the most crucial education problems based on a set of priorities. To better focus on top priorities, she suggested that the Board create a set of specific priorities (as opposed to broad domains like "teacher quality") based on what has already been learned from education research.

Dr. Bryk said there is a subtle difference between pushing research into practice versus research on practice improvement. The difference involves privileging method over focusing on practical problems and asking how evidence can be used to contribute to improvement. An issue like teacher quality can be reframed as something specific like improving the initial socialization of teachers into teaching, making them more effective and increasing the likelihood of their retention. This framing reveals variations in outcomes that can shape a research and development (R&D) agenda with a specific problem at the center and targeted measures people can actually work on. Mr. Baron asked for his opinion on the research alliance concept. Dr. Bryk said it would depend on how the alliances operate and how they connect to a field of practice.

Dr. Gamoran said focusing on three to five problems will address the main weakness of the previous Labs: namely, the lack of coherence across and within the Labs' research. The alliances will also help establish more permanence for the Labs' work. He suggested two potential models for the alliances: the Consortium on Chicago School Research and the Strategic Education Research Partnership. He said that when the questions are narrowly defined to ask whether certain intervention steps lead to certain specified outcomes, the RCT is a good methodology.

Dr. Maynard said the capabilities requirements for the RELs are specified in a way that should discourage the weak research and bad ideas around implementation. The new structure will require the RELs to have staff with sufficiently broad and deep expertise to address questions of priority to the regions in valuable and useful ways.

Dr. Long said the RELs are moving in the right direction. Leaving the nature of the alliances open will help to match the right method to the right problem. However, more specificity is needed on what the alliances will be like. Without unduly restricting the topics, the Board can try to point toward certain priorities. It takes time for researchers to build relationships with schools, which can affect early evaluation results. Oversight and compliance are important to ensuring that the RELs create something of value, but compliance should not be overly burdensome. She wondered whether customers, schools, and districts will be consulted on whether or not value was created.

Mr. Baron said having concrete examples within the RFP of what is desired would help proposers. The RELs look not only at causal inference but also work on the use of evidence and on process improvements for interventions, strategies, and field experiments. Mr. Baron said that a few studies that have come out of the Labs are exemplary, and he cited a well-executed study by one of the Labs on a kindergarten language development curriculum.

Dr. Underwood agreed that there can be a lack of clarity on what the agenda is, and maybe more guidance would be helpful. He wondered about the role of higher education in the research alliances and what is meant by building capacity and analytic capacity in school districts and states. He also wondered what enhancing the Labs' capacities means to the local districts.

Dr. Bryk encouraged thinking about the strategic alliances in terms of a networked improvement community. He wondered what the local capacity that is built will look like. The participants have to be made clear on why they are doing this, what they seek to accomplish, and what constitutes evidence of continuous improvement in a constructive direction.

Dr. Ball agreed that the changes in the instructions for the new RFP are encouraging. The language in the new RFP about the desired work should differentiate itself from the language in the previous RFP about "knowledge acquisition." Practitioners are still using the conventional notion of "research into practice." Rebuilding the RELs is a way to educate the field about the new thinking. She also said that what counts as an intervention has been defined too narrowly, not taking into account the effect of the intervention on the practitioners (and the implications for take-up and implementation). The RELs are well-situated to think about this. Dr. Gamoran agreed that the interventions must be more broadly conceptualized. There must be discussion on how to study interventions that are more likely to have causal effects.

Dr. Long raised the following question: if, for example, 5 of the 10 RELs do not submit any proposals that reach IES's desired standards of quality, is it possible for IES not to award contracts in those regions? She said that could send a clear signal that IES is not going to fund Labs that do not meet the standard. Dr. Gamoran said IES can make no award if no proposals meet the standard.

Mr. Baron said the Laboratories are an important part of IES's mission and funding. Because the appropriation bill has not been passed, there is no authority to extend the contracts. The Board discussed a resolution to Congress on that matter.

National Center for Education Evaluation and Regional Assistance Report and Discussion
Dr. Rebecca Maynard, Commissioner, NCEE

Dr. Maynard highlighted other issues at NCEE. RELs have received disproportionate attention over the past few months. However, there is also a robust evaluation portfolio. Going forward, there will be increased attention on issues related to teacher preparation and effectiveness, including four new projects that will be launched over the next 6 months: (1) a study of promising teacher programs looking to address issues of preservice; (2) a project on the impact of teacher and leader evaluation systems; (3) design work for a new type of evaluation in English language acquisition; and (4) design work for an impact study on professional development.

The What Works Clearinghouse is undertaking a web redesign to address the increasing diversity of the content and better serve the interests of the increasingly diverse user base. Among the new features will be a Find What Works button, which makes the database easier to sort through by effectiveness. NCEE is working to create more opportunities for training in the Clearinghouse review procedures, including through online training. NCEE also is piloting the options for sharing coded studies and trying to find ways to efficiently increase what is in the system and make it more usable.

Due to budgetary uncertainty, staff and the RELs are prioritizing their current activities to minimize the risk of unfinished work. NCEE staff is also doing strategic planning on the Library.

Mr. Baron called for discussion. In response to a question from Mr. Handy, Dr. Easton said there is no marketing officer in IES. Many of the changes in the What Works Clearinghouse, in particular, are user-driven. Dr. Maynard said there is a lot of feedback through the What Works Clearinghouse help desk and through user surveys. There also is promotional work with teachers and meetings with education leaders, state school officers, AERA, and constituent groups on the Clearinghouse. RELs are major disseminators of What Works Clearinghouse products, and they provide feedback from users to the Clearinghouse. Mr. Handy suggested developing metrics to determine whether improvements are being made in the Clearinghouse and in how it processes input and research. Dr. Maynard said many of the changes to the What Works Clearinghouse website are in response to the reported frustration of users who have sought to answer particular questions through the database. Mr. Baron suggested that the next meeting's agenda include the question, "How do you measure use of the What Works Clearinghouse and improve its dissemination capability?" He pointed out that the number of hits alone does not measure the usefulness of a site.

Dr. Shaywitz asked how the goals of the Clearinghouse and its target consumers and audiences are determined. She also asked how the target audiences are made aware of the Clearinghouse, and how the information is set up to be useful to different groups, including parent groups and teachers. She emphasized the importance of dissemination and awareness and suggested that the group discuss the overall goals and targets of the Clearinghouse at the next meeting.

Mr. Baron raised the issue of the Clearinghouse identifying the more important outcomes, longer term outcomes, and final outcomes of interventions. For example, a large effect size on a less important outcome may be of lower policy importance than a small effect on an important outcome or a case where effects are sustained over time. Dr. Maynard indicated that the system will be built out to include ways to help users distinguish sustained effects on important outcomes while continuing to provide detail on other outcomes. Dr. Shaywitz said some of the short-term outcomes are basic building blocks in the process of learning, so the issue of which outcomes are more or less important is more complex than just which are intermediate and which are final outcomes.

National Center for Education Research Report and Discussion
Dr. Lynn Okagaki, Commissioner, NCER

Dr. Okagaki gave an overview on the two research centers in IES. The purpose of the Special Education Research Center is to support research; identify effective interventions for improving outcomes in children and youth with disabilities; and expand basic scientific understanding of infants, children, and youth with disabilities. There are long-term research programs that cover basic developmental outcomes, language development, reading development, behavioral outcomes, math and science education, and transition outcomes. In addition, NCSER supports research training grant programs to build research capacity. NCSER is responsible for two longitudinal national studies: the National Longitudinal Transition Study and the Pre-Elementary Longitudinal Study. The projects cover low-incidence disability as well as learning disabilities. This year's goal is to increase the number of funded applications by 15 to 20 percent.

This year, NCSER will support a week-long summer research training institute in single case experimental design, a methodological approach which is especially useful in low-incidence populations. It will be held at the University of Wisconsin and will include 30 to 40 participants.

There are two new research programs: one on families with children with disabilities and another on technology in special education. In addition, NCSER held a small business innovation research competition in special education. The Special Education Center has four researchers working on synthesis of research on reading for children who have or are at risk for disabilities. This should be released in September.

For the Education Research Center, Dr. Okagaki said that 2 years ago, NCER had its first competition for evaluation of state and local education programs and policies. One of the evaluations that was launched from this first cohort of projects was the evaluation of Tennessee's voluntary pre-K program. Data from the first cohort of the evaluation indicate that there were substantial improvements for children in the pre-K program. The researchers are working closely with the state Department of Education and presented at the Society for Research on Educational Effectiveness.

The Reading for Understanding Initiative is a large program on reading comprehension. It has five core teams; each team covers several grade levels and works in multiple domains, including literature, science, math, history, and social studies. To make the interventions usable by regular teachers in ordinary schools, the teams partner closely with practitioners. The teams all work together, sharing results. The teams develop their multi-grade interventions based on whatever theoretical approach they proposed. Using results from the pilot studies, they will refine the interventions. The teams have been making impressive progress. Some teams will begin their efficacy studies in the second and third years of the interventions. Dr. Easton said the teams are working together in a continuous improvement environment. At the same time, IES is looking for evaluation strategies to look across the subprojects and get a sense of contextual and process factors related to successful outcomes.

Dr. Okagaki indicated that NCER is starting a new research synthesis. A substantial body of work has been accumulated in early childhood research, and the Center is working to share the information with the Department. Dr. Shaywitz suggested that findings being presented to the Department be reported on at the next meeting.

Dr. Bryk asked if there is a working theory of practice improvement that integrates the work. Dr. Okagaki said one of the major accomplishments has been the development of the assessment framework for reading, in which people are getting to consensus on what "reading for understanding" means. That framework is evolving. Also, all the core teams within NCER will do efficacy evaluations.

National Center for Education Statistics Report and Discussion
Dr. Jack Buckley, Commissioner, NCES

Dr. Buckley gave an update on challenges and priorities at NCES. The top priorities are the integrity of the data, relevance, timeliness, rigor, and innovation. NCES data must be relevant to policymakers' questions and available when the decisions are being made. NCES must continue to be on the cutting edge of sampling theory, psychometrics, measurement, instrument design, survey methodology, and analysis.

There have been five rounds of State Longitudinal Data Systems (SLDS) grants funding the building of systems in the states, but NCES has only recently started establishing common education data standards for interoperability and analyzability of the data. NCES has completed a beta version of the standards.

Another challenge is the proliferation of administrative data in education, an area NCES has not embraced in the past; the teacher compensation study is one example. These data can be used to improve the quality of survey and assessment data.

The third challenge is the momentum behind the Common Core State Standards. Race to the Top has funded two consortia that are attempting to turn the Common Core State Content Standards into assessment frameworks. NCES is responsible for administering the National Assessment of Educational Progress (NAEP), but there are questions about the future of NAEP. NCES is working closely with the National Assessment Governing Board and the consortia on what to do. It is not yet clear whether there will be a universally adopted standard or various consortia of states with different standards. NCES is prepared to work with either situation and to provide technical assistance to the Department's policy leaders.

NCES supports the Education Dashboard and hosts its website. NCES has been working on computer-administered assessment. To standardize the testing conditions as much as possible, NCES developed Sojourn, which reboots the computers on a standardized operating system. NCES is working with the Census Bureau and the Bureau of Labor Statistics to develop means of measuring educational outcomes in non-degree post-secondary education. There will be a field test in the Survey of Income and Program Participation.

One last study tries to link NAEP and the Trends in International Mathematics and Science Study (TIMSS) to provide international benchmarking at the state level or lower. There is a validation study with eight states. The short-term goal is to allow states to compare themselves to countries. In the long term, it will be possible to gauge the progress of states compared to international competitors. Dr. Easton pointed out that the study allows reporting of NAEP results on the TIMSS scale and vice versa.

Dr. Gamoran suggested that NCES also consider consistency as a priority. He said he would like to have a discussion on the possibility of linking NAEP to state longitudinal data systems.

Mr. Baron recessed the meeting from 11:09 a.m. to 11:23 a.m.

Vote on the Board Recommendation on REL Funding

Mr. Baron called for a motion on the proposed Board recommendation urging Congress to continue funding the RELs at current levels and to extend existing Laboratory contracts for one year. Dr. Underwood moved approval and Dr. Long seconded. Mr. Baron read the recommendation into the record. Dr. Bryk suggested reframing the recommendation to show why it is programmatically important to continue the funding. Mr. Baron explained that the resolution will be attached to a letter explaining the context and the rationale. The motion carried unanimously.

How Can IES Research/Evaluation Increase the Likelihood of Identifying Interventions that Produce Important Positive Effects? How Can the Policy/Research Community Make Better Use of Well-Designed Studies Showing No Effects or Adverse Effects?
Dr. Rebecca Maynard, Commissioner, NCEE

Dr. Maynard said NCEE assesses what is in place to make sure it is doing what is expected and assesses new ideas and proposals to facilitate smart decisionmaking by policymakers and practitioners. One way to get more positive research findings is to put more energy into development and design work. The development/testing/feedback process is a way to find better strategies for testing. Some tests take place in a scientific setting, but because not all environments can be controlled, there have to be real-world studies as well. It is important to pay attention to design features, such as sample size. Many studies in the field are statistically underpowered. It is important to think about continuous learning, incremental change, testing as you go, and continuous improvement. Both big ideas and incremental improvement are needed to move forward. It is important to be more attentive to testing new strategies before implementing them and inserting causal inference research into those areas. There should be more vigilance about seizing opportunities and greater receptivity to using the findings. Although the culture in the research community celebrates positive outcomes, negative and null findings should be taken as seriously as positive findings.

Roundtable Discussion

Mr. Baron said one distinguishing characteristic of IES is that it does not incentivize putting a positive spin on results. He asked Dr. Long, who has developed and tested an intervention before, to contribute her thoughts to the discussion. Dr. Long said the problem is that IES funds a lot of research with small or no effects. There must be some victories to justify continued research. She said there are three pots of money: congressional mandates, competitive IES grants, and IES studies. Improving investments in all three pots will depend on what signals are sent. Signals and information sent to policymakers should help them craft mandates that are evidence-based and continually evaluated. Signals to researchers and grantees should have an explicit criterion that reviewers look at the likelihood of success, and perhaps there should also be incentives to ensure that the right types of projects are funded. The Board can send signals regarding how to avoid large investments in trials that produce minuscule results.

Mr. Baron invited Dr. Gamoran to offer his thoughts. Dr. Gamoran said IES has had a transformative influence on the field of education research, but some early ideas were naïve. It is time to take stock of IES's approach to identifying, developing, and verifying intervention sets that promote outcomes for children. Three improvements could bring progress: changing the ways proposals are requested, changing the way proposals are evaluated, and changing what the researchers do.

RFPs should require a greater emphasis on theory. Interventions have a larger chance of success if they draw on discipline-based theories. Going forward, research should be informed by social theories, psychological theories, and theories from other disciplines. There should be stronger evidence from efficacy trials that the intervention works before investing in scale-up. Finally, there should be more thought on the organizational system within which interventions are scaled up.

Mr. Baron invited Dr. Ball to discuss approaches likely to yield positive effects from her experience in the knowledge uptake process and improving teacher quality. Dr. Ball said teacher quality is an especially important subject. Teachers make a difference in student outcomes; however, studies are producing null results, which misinforms policymakers. Teaching is a complex practice; therefore, research should look at all of the characteristics of teachers and how those characteristics relate to performance. The professional development community believes it knows what the best practices are, but research shows that not to be the case. The problem is that no one has developed strong theories about the design of interventions in teacher training. There have to be better design hypotheses tied to instruction in a real-world context. The interventions must be specific and clearly specified.

The primary problem is that it is unknown what produces outcomes in students. This information is necessary in order to know how to prepare teachers. She suggested demanding that proposals be related to practice and highly specified around known student outcomes. There is a need for more reliable and valid measures of variables, such as student achievement. Unless interventions get inside the classroom interaction between teachers and students, they will probably not produce effects. IES can push the field to develop interventions that get better effects.

Mr. Baron asked Dr. Bryk to speak on designing and improving research for better designed, fully developed interventions with more promise to produce impacts. Dr. Bryk pointed out that research for policy and research for practice are two different things. Over the history of the field, interventions have had variable effects. It is not a question of what works but of what works for whom, under what set of conditions. The culture and practice (of improvement research) should change so that people ask four basic questions: What specifically am I trying to accomplish? How do I understand the problem and the system within which it is embedded? What hypotheses undergird proposed changes? How will I know if these changes are actually an improvement? Introducing these rudiments of disciplined inquiry into practice is at the core of quality improvement.

While there are programs that do a good job of preparing teachers, where they are and what effects they have are largely unknown. Because randomized controlled trial (RCT) studies typically involve relatively small voluntary populations, the studies do not demonstrate how the interventions will work for different people in different circumstances. Any small trial may have relatively weak evidence associated with it, but if there were a fleet of such studies, these small pieces of information could be organized to accelerate the learning process. Currently, the What Works Clearinghouse leaves this evidence on the cutting-room floor.

Mr. Baron gave examples of IES studies in the past that had inadequate preliminary evidence of efficacy. The National Reading Panel's meta-analysis was based on small preliminary studies with short follow-up and only intermediate outcomes. That evidence was then used to justify Reading First and many other interventions. Other examples included the violence prevention evaluation; IES's evaluation of reading and math software, Cognitive Tutor; and NCER's social and character development evaluation. He suggested that IES staff emphasize the existence of adequate efficacy evidence before a larger study is undertaken.

Dr. Shaywitz said people will continue to use programs without proven efficacy unless there are studies. It is important to measure the effectiveness of what is being used in practice and to make practitioners aware of the results. There is a difference between "science-based research" and actual effectiveness and impact. Mr. Handy said it is not clear that legislators are paying attention to the research. Dr. Long said influencing policymakers, guiding research, and making decisions about what IES will do are the three key points. Dr. Bryk said ideas, rather than evidence, usually drive education policy, even when extant evidence refutes the ideas. Dr. Ball said policy is often based on misconceptions. IES may have to become involved in myth-busting to improve decisionmaking.

Dr. Herk said IES cannot answer every question in education research; the Board should identify the important questions to focus on.

Dr. Shaywitz clarified that online courses, charter schools, and the like are not interventions or treatments but formats.

Mr. Baron moved the discussion to findings in the NCEE evaluations that are commonly misunderstood. Dr. Shaywitz said a lot of knowledge on what does and does not work is not being used. She suggested forming a Board committee to work with NCEE on myth-busting and disseminating negative results, emphasizing that dissemination of negative results is important.

Dr. Okagaki commented that looking back over past evaluations, many of them are evaluations of education "formats" rather than evaluations of interventions. Dr. Underwood agreed that myth-busting is a function of dissemination and part of IES's responsibility, but it will not end the debates. Ultimately, votes, not evidence, determine policy. Practitioners do not have much faith in research either. Dr. Long spoke to the need for the Clearinghouse to not only be addressing what works but also what doesn't work and providing alternative solutions. She suggested that the Clearinghouse be more proactive in disseminating findings to drown out "noise" in the field.

Mr. Baron asked whether there was support for a Board committee on the issues of "noise," myth-busting, and communicating findings. Dr. Bryk suggested an offline conversation to address the issue and to bring ideas to the next meeting.

The Board went off the record at 1:07 p.m. for lunch and ethics training.

Advanced Research Projects Agency-Education (ARPA-ED), a New White House Initiative to Catalyze the Development and Deployment of New Tools and Technologies that Could Significantly Improve Student Learning
Mr. James H. Shelton, III, Assistant Deputy Secretary for Innovation and Improvement

Mr. Shelton said there are three ways to achieve innovation: the field scan, which identifies and vets what is being done in the field to figure out what can be taken to scale; applied research into interventions to benefit students; and a direct development function, like the role Defense Advanced Research Projects Agency (DARPA) plays for the military to accelerate progress. The purpose of ARPA-ED is to perform that third function. Successful ARPAs are focused on projects and goals. The researchers are temporary, and there is a great deal of flexibility to leverage resources and to develop the team needed to solve the problems. DARPA has more failures than successes, and ARPAs must be tolerant of failure. Although IES and the National Science Foundation (NSF) do research and development work, there is still a gap with regard to directed development. Thus, a program like DARPA, with a focus on education, can lead to real breakthroughs.

Mr. Pestronk added that ARPA-ED is in the President's 2012 budget (funding is slated at $90 million), which comes from the Fund for the Improvement of Education (FIE) and the Wireless Innovation Fund. Mr. Shelton noted that a significant opportunity for partnerships exists with other agencies. One such program, called Engage, focuses on finding a platform to produce rigorous evidence concerning the performance of different curricular strands for young children in STEM (Science, Technology, Engineering, and Mathematics) fields using gaming platforms.

Roundtable Discussion

Mr. Baron asked if ARPA-ED is in the proposed budget. Mr. Shelton said ARPA-ED is in the proposed budget as part of a large emphasis on research and development for education, including a proposed increase in funding for IES. If there is an ESEA reauthorization, ARPA-ED will be included in it. Mr. Baron commented that there is an opportunity for input at the early stages. Dr. Bryk asked about priorities for directed development. Mr. Shelton said the protocol is not yet finalized. The ultimate decision resides with the Secretary of Education and the Director of ARPA-ED. The priorities will probably come from expert input regarding the largest opportunities for breakthroughs and what is already being done that can be capitalized upon. Mr. Shelton said the emphasis will be on educational technology, but there are a number of other opportunities as well. It may be better to proceed by delivering something tangible before moving into softer areas.

Dr. Gamoran asked how to compress the research and development timeframe in an area that requires research on humans. He then asked about the traits of unsuccessful ARPAs. Mr. Shelton said Homeland Security's ARPA has not been successful because there was no direct line of oversight to the Secretary and so the proper resources were not allocated to it, which is a common problem. For some ARPAs, there is an inability to invest in innovative ideas that appear to have promise. It is important to have the structure and flexibility to attract the right people. Mr. Pestronk said the Biomedical Advanced Research and Development Authority (BARDA) did not succeed because its mission was too far beyond current science. However, the field has made progress in synthetic vaccine generation as a result of BARDA's work. A project can fail and still produce substantial scientific inroads into various questions. Addressing the question of compressing the research and development timeframe, Mr. Shelton said he would forward some writings to the Board. An ARPA operates differently from other research organizations. First, ARPAs attract people at the top of the field who have a fundamentally different approach that can transform the field. Second, ARPAs do not explore questions; they produce solutions, which is different from exploratory research. Third, ARPAs have flexibility to bring in people from different backgrounds to build teams. Team management is a core competence for program managers. Top program managers attract partners who are the top-performing practitioners in their fields.

Mr. Baron asked if there is a way to determine which seemingly promising projects will lead to breakthroughs. Mr. Shelton said there should be a progression of levels of evidence to see whether progress is being made toward breakthroughs. ARPAs do not work toward incremental improvement, so projects with marginal results in the early stages are shut down, and there are progressively more rigorous evaluations moving forward. He looks forward to partnering with IES on these evaluations and on developing new ways to think about leveraging evidence to figure out methodologies that work. ARPA-ED and IES can work together early on to get a clear idea of what rigor looks like at every stage of development, to work on evaluations supporting design, and to think about how to do evidentiary work in the ARPA context. Mr. Baron said IES's partnership on evaluation with the Office of Innovation and Improvement (OII) can serve as a model.

Mr. Handy commented that the Department of Defense (DoD) has a school system that is being revamped. Mr. Shelton said one of the President's commitments to support military families was to put the Department of Defense Education Activity (DoDEA) on the cutting edge of what works with technology in education. DoD is trying to leverage the best of what it knows in its solicitations to attract partners to help advance the school system.

Dr. Maynard wondered at what point NCEE should become involved. NCEE has not been active in the developmental stage, but it has partnered with the Investing in Innovation (i3) program and can have a productive relationship going forward. Dr. Gamoran expressed concern that the problems ARPA-ED is tackling may be too large to address in a reasonable timeframe. For example, there are a number of variables to look at in the gaming effort. Mr. Shelton said picking the right problem is very important.

Dr. Long raised a few elements of complexity that make education work different. There is no captive audience to test the technology; participation is voluntary. Because interventions can work for some students and not others, the studies must be powered to capture different kinds of students and contexts.

Mr. Baron said in the DARPA setting, an achieved goal is followed by DoD becoming a major user of the end product and wondered what the path to moving the improvement into widespread use will be in the ARPA-ED context. Mr. Shelton agreed that the market in the education field is fragmented and does not always choose approaches based on their effectiveness. He is already trying to address some of the challenges that exist outside of ARPA-ED, and within ARPA-ED, he is working on a commercialization function. Change in the education landscape will drive growth and change in the marketplace. The Investing in Innovation fund and other funding pools can cause a shift in marketing dollars toward product quality and evaluation, as could performance-based measures.

Mr. Baron recessed the meeting from 2:36 p.m. to 2:53 p.m.

Increasing the Effectiveness of Federal Education Programs through Development/Use of Rigorous Evidence about "What Works"/NBES Discussion of How IES, the Department, and Congress Can Best Accomplish this Objective in the Context of Reauthorization of the Elementary and Secondary Education Act (ESEA)

Mr. Baron noted that part of the Board's role is to help advance rigorous research and the use of research as part of federal education programs. IES is the research arm of the Department of Education, but the Department has many other federal education programs, such as Title I. There is often no requirement that these programs generate valid evidence about which program models, strategies, and practices work or don't work. To address that, Mr. Baron, after speaking with each of the Board members, drafted a resolution, NBES Recommendations for Education Legislation, which all of the members have before them.

The draft recommendation to Congress addresses the issue that many studies of these large funding streams indicate that the funding streams fall short with regard to improving educational outcomes. In several cases, these programs show weak or no positive outcomes. The broad recommendation is that Congress should include authorizing language based on two concepts: (1) funding incentives for grant applicants to use program models or strategies supported by evidence of effectiveness; and (2) sufficient funding to evaluate the impact of selected interventions through studies overseen by IES that allow for strong causal conclusions, including RCTs where appropriate. Mr. Baron said the idea is for programs to build knowledge in collaboration with IES so that the programs can evolve through use of evidence toward greater effectiveness over time.

Mr. Handy questioned the premise that the Board has to pass resolutions in order to advocate to Congress. Mr. Baron said that recently, with regard to the Regional Educational Laboratory issue, the Department's Office of General Counsel indicated that the Board had to pass an official recommendation before following up with Congress. Dr. Gamoran said two such Board resolutions have recently been successful: one related to the criteria for evaluating the i3 program and the other related to evaluation of the American Recovery and Reinvestment Act of 2009 (ARRA) investment in education.

Mr. Baron said there have been many pieces of legislation calling for "evidence-based research," but when analyses have been done regarding whether these laws actually shifted what received funding, basically there was no effect. In i3, the provisions to shape how funds were allocated had a meaningful effect. The programs that were funded had strong evidence, or at least moderate and potentially strong evidence. Especially for the larger grants, there was clearly strong attention to evidence, probably by IES-selected reviewers.

There have been situations in which Congress or OMB has put IES in charge of assessments and evaluations. Dr. Easton said the Department of Education has moved in the past year to make IES the home for evaluations.

Dr. Maynard pointed out that funding only initiatives with demonstrated effectiveness may dampen the enthusiasm of organizations to be innovative. It is important to encourage evaluation of innovative, undocumented strategies and to conduct such evaluations in a safe environment, meaning that innovators are not punished if their innovation is not found to be effective. Mr. Baron said much of the money in Investing in Innovation is going to programs backed by only preliminary evidence. In i3, the innovation piece might be an opportunity for straightforward evidence-building. Dr. Maynard said throwing away everything that either has not been evaluated or is not totally new would be a lost opportunity rather than an incentive to innovation. The alternative would be to incentivize smart people with practice experience to exercise their judgment in developing interventions, but with a commitment on their part to build the knowledge that is missing and to be reflective about evaluation results so they can further improve the intervention. Dr. Bryk agreed, saying that it is an oxymoron in i3 that only organizations with effective innovations can apply to the Innovation Fund. That means no new organizations can ever be funded. This precludes local entities from becoming more evidence-based. Dr. Long said there is a case for spending money on things that have a chance of succeeding because they are evidence-based. We do not want to exclude possible innovations, but there should be some theory of change or compelling case for why someone thinks an intervention works, so that i3 does not just randomly "throw money" at proposals.

Dr. Maynard said that, even for innovative programs, there still has to be a strong supporting theory that can be tested empirically. Mr. Baron wondered whether the language could be revised to allow innovative approaches as long as the grantee agrees to being evaluated as a condition of the award. Dr. Maynard said states and districts will always be making changes to how they educate students. Incentives can be built in to encourage schools and districts to participate in evaluations—for example, providing an incentive to phase in a new strategy or policy for a random subset of the students, teachers, schools, or districts, creating a sample for evaluation. Dr. Bryk suggested that the language be changed so that grantees using interventions not currently on the What Works Clearinghouse list of effective interventions have strong funding incentives to evaluate whatever they do. There was discussion on the terms "new" and "locally developed."

Dr. Ball said there is a middle ground between somebody simply saying "I have a good idea," and highly rigorous evidence. The middle ground is a well-reasoned and conceptually connected argument. In the effort to press for higher standards of rigor, much of the field has been left behind. She asked whether it is possible to develop a way of thinking about how to support innovations that have not yet reached the highest standards of evidence. She was interested in identifying efforts that are promising enough to be the basis of future development.

Dr. Long said the general language of the recommendation is appropriate, but asked whether the Board should take additional steps beyond just sending the resolution to Congress. Mr. Baron said that was open to Board discussion.

Dr. Underwood said that the Department's Blueprint for Reform indicates that the Secretary will submit a biannual plan for ESEA evaluation of performance measures and will establish an independent panel on the plan. He asked how the Board's resolution fits in with the evaluation plans described in the Blueprint. Mr. Baron said that IES's evaluations can be complementary studies on which "pieces" do and do not work and those studies could inform the Department's larger evaluation of the overall funding streams.

Dr. Bryk suggested providing incentives to get districts to think analytically about interventions and become active agents developing rigorous evidence. Because the field is changing in that direction, incentives can push the districts in a more disciplined direction. Mr. Baron said there was Board agreement on appropriate incentives to work toward building evidence and to use the evidence that exists. The concern was the effect on innovation. The language of the resolution was changed to describe sufficient funding to evaluate previously untested but highly promising interventions through studies overseen by IES.

Mr. Handy noted that the resolution was too narrow in the larger context of the ESEA reauthorization and that the Board should make a broader statement. Mr. Baron said this particular recommendation was on a narrow subject. Dr. Easton and Mr. Handy felt that a reasoned, comprehensive resolution would have more impact than numerous small resolutions. Mr. Baron said his thinking was that the resolution would be an opening piece to allow weighing in at an early stage, leaving the door open for something more comprehensive later on. Dr. Gamoran said between this meeting and the next it will be clearer whether or not ESEA will be reauthorized.

Mr. Baron asked for a sense of the Board's thoughts on the revised NBES Recommendation for Education Legislation: "That Congress include the following reforms in the authorizing language of Education Department grant programs, wherever feasible and cost-effective, to advance the use of evidence in decisionmaking: 1) Funding incentives for grant applicants to use interventions supported by evidence of effectiveness, as judged by IES standards such as those used in the Department's Investing in Innovation program; and 2) Sufficient funding to evaluate previously untested but highly promising interventions through studies overseen by IES that allow for strong causal conclusions, including randomized controlled trials, where appropriate." Dr. Underwood said getting this piece out would not preclude putting out a broader memo in June. At this point, only congressional staff will read the recommendation.

Mr. Baron asked for a sense of the Board's thoughts on whether to vote on the resolution or defer consideration until the next meeting. The Board was in favor of moving forward. Dr. Bryk said the discussion had raised a third provision: incentives to the districts to, either individually or in networks, engage in systematic evaluations of the new initiatives. There was general agreement to include the third provision. Mr. Baron said provision 3) would be written "Funding incentives to state and local educational agencies to engage in systematic evaluation and improvement of local initiatives consistent with evidence standards established by IES." Dr. Shaywitz moved approval. Dr. Bryk seconded and the motion carried.

Closing Remarks, Including Next Steps

Mr. Baron opened the floor for discussions of the next meeting's agenda, starting with discussion on activities at NCER and NCES. Dr. Gamoran said for NCES, the future work list includes longitudinal studies, the priorities, issues of dissemination, issues of how NCES studies are coordinated with the state longitudinal databases, and how NCES surveys can take advantage of new technologies. In the case of NCER, the discussion might focus on a thorough review of the grant-making process, including stability of the panels over time, the expertise of program officers, and their ability to guide researchers. Dr. Gamoran said these should be hour-long, in-depth discussions on a single set of programs addressing hard issues the centers are grappling with. It is not necessary to have preliminary answers before the meeting.

Dr. Buckley said having preliminary answers in advance can help the discussion, especially for newer members. Mr. Baron said there may be some agreed-upon topics beforehand. Dr. Ball said it would be more helpful to have a written report before the meeting rather than a lengthy presentation and more discussion at the meeting. Mr. Baron proposed having an email exchange to agree on topics the Board is interested in to help determine the topic for the next meeting's roundtable. Dr. Bryk said that if there is an hour on the agenda for the topic, the presentation should not exceed 20 minutes and should end with well-thought-out questions. Mr. Baron said there should be preparation for the roundtable discussion with the topics determined in advance.

Dr. Bryk said he would like to have a discussion on where IES is right now and how it should move forward: the difference between asking "what works?" and "what works, for whom, under what circumstances?" and how that might modify the identity IES projects going forward. Mr. Baron suggested a top-level discussion on that topic of how the pieces fit together and what IES's focus should be. Dr. Bryk suggested looking closely at the characteristics of students, teachers, and contexts in the interventions to understand variability and how to reduce the negative tail of that variability. This is different from evaluating the average effect.

Another suggested agenda topic was marketing and how to enable the field to distinguish studies with good evidence from the "noise." This could be addressed as how to build understanding in the field among policymakers, practitioners, and others on how to distinguish scientifically valid findings from other claims of effectiveness and how to judge scientific findings.

Mr. Baron said the next meeting may be in late June and asked that the members check their schedules in order to select a date. He thanked staff for setting up and facilitating the meeting and adjourned the meeting at 4:31 p.m.

The National Board for Education Sciences is a Federal advisory committee chartered by Congress, operating under the Federal Advisory Committee Act (FACA; 5 U.S.C., App. 2). The Board provides advice to the Director on the policies of the Institute of Education Sciences. The findings and recommendations of the Board do not represent the views of the Agency, and this document does not represent information approved or disseminated by the Department of Education.

1 The resolution as voted on is: "That Congress continue funding for the Regional Educational Laboratories at current levels as part of any congressional spending agreement for FY 2011, and authorize the Institute of Education Sciences to extend the existing laboratory contracts for one additional year beyond their scheduled completion date."

2 Dr. Gamoran was not in the room and did not participate in the vote on this resolution.