Institute of Education Sciences (IES) Board Room
80 F Street, NW
Washington, DC 20001
National Board for Education Sciences (NBES) Members Present
David J. Chard, Ph.D. (Chair)
Susanna Loeb, Ph.D. (Vice Chair, by phone)
Adam Gamoran, Ph.D.
Kris D. Gutierrez, Ph.D. (by phone)
Michael Feuer, Ph.D.
Darryl J. Ford, Ph.D.
Bridget Terry Long, Ph.D. (by phone)
Deborah Phillips, Ph.D.
Judith Singer, Ph.D.
Robert Teranishi, Ph.D. (by phone)
Hirokazu Yoshikawa, Ph.D. (by phone)
NBES Members Absent
Anthony S. Bryk, Ed.D.
Larry V. Hedges, Ph.D.
Robert A. Underwood, Ed.D.
Ex Officio Members Present
Thomas Brock, Ph.D., Commissioner, National Center for Education Research (NCER)
Peggy Carr, Ph.D., Acting Commissioner, National Center for Education Statistics (NCES)
Joan McLaughlin, Ph.D., Commissioner, National Center for Special Education Research (NCSER)
Brett Miller, Ph.D., Health Scientist Administrator, Child Development & Behavior Branch, Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), National Institutes of Health (NIH)
Teri Morisi, Bureau of Labor Statistics (for Erica Groshen, Ph.D.)
Ruth Curran Neild, Ph.D., Deputy Director, Policy and Research, Delegated Duties of the IES Director
Elizabeth VanderPutten, Deputy Director, Research in Informal and Formal Settings, National Science Foundation (NSF, for Joan Ferrini-Mundy, Ph.D.)
Michael Casserly, Executive Director, Council of the Great City Schools (CGCS)
Brian Harris-Kojetin, Ph.D., Deputy Director, Committee on National Statistics, National Academy of Sciences
Nancy Potok, Ph.D., Deputy Director, U.S. Census Bureau
Kenann McKenzie-Thompson, Ph.D., NBES Executive Director
Ellie Pelaez, Management & Program Analyst, Office of the Director, IES, Designated Federal Official
Elizabeth Albro, Ph.D., Associate Commissioner, Teaching and Learning, NCER
Jacquelyn Buckley, Ph.D., Office of the Commissioner, NCSER
Chris Chapman, Associate Commissioner, Sample Surveys Division, NCES
Dana Kelly, Chief, Assessments Division, International Assessment Branch, NCES
Daniel McGrath, Chief, Reporting and Dissemination Branch, NCES
Anne Ricciuti, Ph.D., Deputy Director for Science
Ross Santy, Associate Commissioner, Administrative Data Division, NCES
Thomas D. Snyder, Program Director, Annual Reports and Information, NCES
Andrew White, Senior Research Statistician, Statistical Standards and Data Confidentiality, NCES
Call to Order
David Chard, Ph.D., NBES Chair
Dr. Chard called the meeting to order at 9:00 a.m., and Ellie Pelaez called the roll. Dr. Chard pointed out that because the meeting had been rescheduled from October 2, 2015 (to avoid the impact of a possible federal government shutdown), some Board members and presenters were unable to attend.
Dr. Chard announced that Ruth Curran Neild, Ph.D., was named the deputy director of policy and research for IES and has been delegated the duties of the IES director. Dr. Neild thanked the Board for its counsel to IES. She explained that all of the IES staff are moving to a new office in southwest Washington, DC. The move marks the first time that all the IES offices will be housed in one building, which should help foster more collaboration and connections across IES.
Dr. Neild said that IES supports and conducts a great deal of creative, thoughtful, well-implemented work. Its statistics and research are top-notch, and it invests heavily in development and innovation of useful products and approaches for education. It supports the improvement of methods and measures. The What Works Clearinghouse has become part of the education landscape, and dissemination efforts are shaped by recent research on how policymakers and practitioners use data. In short, IES provides a critical infrastructure for education statistics and research in the United States. However, IES has been challenged to provide a clear narrative about its investments and why they are critical to education improvement. To that end, IES is revamping its website to better explain the types of work IES does and emphasize its extensive data collection efforts. In addition, IES recently hired a new communications director. Dr. Neild called on the Board to think about how IES should position itself over the next 18 months, in anticipation of a change in administration.
Dr. Chard briefly summarized the agenda. He thanked the IES commissioners for sending advance materials for review to facilitate conversation during this meeting.
IES Commissioners' Reports
National Center for Education Research
Thomas Brock, Ph.D., NCER Commissioner
Dr. Brock noted that NCER has begun its fiscal year (FY) 2016 grant competition. The budget for FY 2016 is not yet known, but NCER anticipates funding at the same level as FY 2015. Dr. Brock reminded the Board that NCER had more funding for awards in FY 2015 because many older grants expired. Most of FY 2016's funding will go to continuing rather than new awards.
Dr. Brock described two new competitions for FY 2016: the Pathways to Education Science Training Program, which aims to create a more diverse pool of individuals entering the education research profession, and Research Networks Focused on Critical Problems of Policy and Practice. Two initial research network efforts will evaluate the transition from pre-K to elementary school and college completion.
Dr. Brock reiterated decisions announced earlier this year to accommodate the limited funding: reducing maximum funding for Education Research Grant awards, capping the number of awards in some competitions, and not soliciting applications for Goal 2, Development and Innovation, in the Education Research Grants program.
The U.S. Department of Education's (ED's) Office of Elementary and Secondary Education and Office of Planning, Evaluation, and Policy Development have been working with NCER on a special competition for rigorous evaluation of state programs and policies, especially in states that received waivers under No Child Left Behind to pursue novel reform efforts. The competition targeted (1) college- and career-ready standards and assessments, (2) major school turnaround initiatives, and (3) teacher and principal accountability systems. To accelerate the competition (to allow programs to begin evaluations at the start of the 2016 school year), NCER released a request for applications (RFA) in March with a June deadline for applications. Seventeen applications went to panel review, and three were awarded by September.
Adam Gamoran, Ph.D., asked whether the 2-year budget allows NCER to take any new steps. Dr. Brock responded that the longer budget enhances the ability to plan and reduces the risk of a federal shutdown, but the budget details (including the exact amount NCER would receive in FY 2016 or FY 2017) are not yet known. Dr. Neild added that the 2-year budget will be part of the overall federal spending allocation, and it is not clear what IES will receive.
Also in response to Dr. Gamoran, Dr. Brock said the new research networks are similar to the research and development (R&D) centers funded by NCER but are larger in scope and are designed to promote collaboration among researchers in different institutions, rather than a single grant to one institution. The networks will have a lead who will be responsible for coordinating the work of the different teams and promoting dissemination of findings.
Dr. Gamoran said researchers appreciate staggered deadlines. Dr. Brock agreed that varying deadlines among competitions benefits researchers, and he noted that the past year featured slightly different deadlines for research and training grants (a difference of 2 weeks) and for the special competition for evaluation of state education programs. Dr. Brock noted that there are efficiencies to having a single deadline in terms of managing the application review process—an important consideration when grant funds are limited. He also said that it is difficult to make funding decisions early in the year when it is not known how many high-quality applications may come in later in the year. This problem was less of a concern when IES's grant budget was larger and could more easily accommodate staggered deadlines.
Darryl J. Ford, Ph.D., asked for more details about the accelerated competition for the state education evaluation. Dr. Brock explained that one of ED's goals was to provide funding in time for evaluations to start with the 2015–16 academic year. An RFA was released in March 2015, applications were due in June, and funding decisions were made by the end of July. Dr. Brock said three awards were given, each for $5 million over 5 years, and each addressing one of the three priority targets described in the RFA. He said that collaborating with ED offices was helpful with outreach to potential applicants and generated a lot of good will. The accelerated timeline forced NCER and the Standards and Review Office (SRO) to rethink their processes and put in place some innovations, such as an application review meeting conducted by videoconference, that may benefit future competitions.
In response to Dr. Chard, Dr. Brock said the number of applications for FY 2016 is roughly on par with previous years, with the notable exception of the decrease in Education Research Grant applications because Goal 2 projects were not competed. Dr. Brock mentioned that the Pathways to Education Science competition generated some applications from institutions that had not previously taken part in IES competitions.
Once NCER has completed the Pathways to Education Science competition, Dr. Brock will inform the Board about the institutions that are selected.
Responding to Michael Feuer, Ph.D., Dr. Brock said the Pathways program will award up to $1.2 million over 4 years. The program targets younger students, so much of the funding goes toward exposing them to the field through, for example, summer research workshops and apprenticeships. Academic advising is also a feature of the programs. Dr. Feuer said it is important to track funding success rates, as NIH and NSF do. He added that IES and others are boosting the number of people in the education research field but not the amount of money available for research. Dr. Brock said that, historically, the funding rate for NCER is about 8–10%, although it was higher in FY 2015 (about 15%) because more money was available for new awards.
National Center for Special Education Research
Joan McLaughlin, Ph.D., NCSER Commissioner
Given the anticipated funding for FY 2016, Dr. McLaughlin said NCSER will continue to fund two competitions—one for special education research grants and one for training. Training awards will focus on early-career investigators, postdoctoral research, and methods training in single case design. The number of applications for research grants for FY 2016 increased over FY 2015 but has not yet rebounded to the number received before the spending cuts.
Dr. McLaughlin said that NCER and NCSER developed another off-cycle competition to support low-cost, short-duration evaluations of education interventions. The RFAs (one for education interventions, one for special education interventions) were released in September. The goal is to encourage partnerships between research institutions and state and local education agencies. Each center will make up to four awards of no more than $250,000 each. Dr. McLaughlin said attendance at a recent webinar indicated the competition speaks to a need in the field.
Evaluation of NCER and NCSER Research Pipeline
Dr. McLaughlin said NCER and NCSER embarked on an analysis of the research pipeline—that is, how IES grantees are pursuing research across the five IES goals (exploration, development, efficacy, effectiveness, and measurement). The effort sought to reveal how IES is building the knowledge base and whether it should encourage more work on certain types or areas of research.
Elizabeth Albro, Ph.D., said that from 2004 to 2015, the largest share of awards (43%) supported development, and 28% went to efficacy research. Comparable portions went to measurement and exploration (13% and 15%, respectively), while only 1% went to effectiveness research (or scaling up). The analysis focused on three questions:
Is the Pipeline Working?
Dr. Albro said there is movement across the research goals, but not as much as she and others thought there would be. About one third of projects are related to other research projects, which is good given the short time that IES has been funding research. Of all NCER and NCSER development projects, for example, 17% have received efficacy awards; 20% of efficacy studies represent tests of IES-developed interventions. A portion of projects were linked in complex ways; for example, an efficacy study might lead to ideas for exploratory work or a development project might spur a new measure.
Why Are There No Efficacy Awards That Lead to Effectiveness Awards?
Jacquelyn Buckley, Ph.D., explained that IES is not seeing movement from efficacy to effectiveness studies because IES is still young. Only efficacy research funded very early on in IES history would be eligible for effectiveness awards, in part because the RFAs for effectiveness awards used to require evidence from at least two efficacy trials. Now, IES requires only one efficacy trial, which may help, said Dr. Buckley.
In addition, effectiveness trials require independent evaluation, which poses a challenge for some grantees. (IES is discussing how to work with grantees to meet the challenge.) Some areas of education research, such as special education transition, are still in early stages and are not yet ready for effectiveness studies.
Dr. Buckley pointed out that other arms of ED support effectiveness evaluation at scale. For example, the National Center for Education Evaluation and Regional Assistance (NCEE) is funding evaluation of interventions to teach academic language in elementary schools, and many of the interventions were developed with NCER or NCSER funding. Four Investing in Innovation validation awards are building on findings from NCER efficacy awards.
How Do the R&D Centers Fit Into the IES Pipeline?
Dr. Buckley said the R&D centers are critical, because they provide incentives for new investments in cutting-edge research. For example, NCER's Knowledge Utilization Centers look at how practitioners use research. The Virtual Learning Laboratory will use widely adopted online instruction delivery platforms to test a range of education technology products. The R&D centers build on research supported by IES' primary research competitions. Two examples are the Center for the Study of Adult Literacy and the Center on Secondary Education for Students with Autism Spectrum Disorders.
Dr. Buckley concluded that grantees are building on existing work and research is moving through the pipeline. She welcomed suggestions for additional analyses or other approaches to the data. She also asked Board members for their opinions on whether IES should provide additional support to grantees to move research through the pipeline.
Judith Singer, Ph.D., pointed out that looking only at IES grants limits the ability to understand the ecosystem of education research. She questioned the use of continued funding as the measure of success because it encourages grantees to "keep going back to the trough." The impact of IES resources would be better measured by published research, interventions in schools, and technology transfer, for example. Dr. Chard pointed out that the goal of the analysis was to determine whether IES' goal structure functions as intended, not whether IES funding is having an impact. Dr. Singer responded that there should still be other outcomes besides additional grants. Dr. Albro said IES is looking at both narrow and broad outcomes.
Dr. Singer said the very small number of grants for scaling up interventions raises questions about the goal structure. It seems to reveal a mismatch between the requirements and what the average developer wants to do. Dr. Singer said IES should not remove the requirement for independent evaluation, but she was concerned that grantees are not applying for resources to prove their interventions work. She encouraged IES to think more about incentives.
Dr. Albro said she has anecdotal evidence indicating that IES grantees receive other sources of funding (both public and private), but it is difficult to get complete data for some 1,100 projects across the wide range of possible funders. Elizabeth VanderPutten said NSF faces the same challenge.
Dr. Albro pointed out that a small percentage of funded projects address the goal of scaling up, but those grants represent the largest dollar amounts. Effectiveness trials are very costly, she noted.
Dr. Gamoran said the Common Guidelines for Education Research and Development were a landmark for IES and NSF. The document is an example of IES taking a stand on what constitutes quality in education research. He suggested the quotation cited in the background document be included in the Board's annual report:
A "pipeline" of evidence ... begins with basic and exploratory research, moves to design and development of interventions or strategies, and, for interventions or strategies with initial promise, results in examination of the effectiveness for improving learning or another related education outcome. However, as we describe later in this document, the reality of scientific investigation is more complicated, less orderly, and less linear than such a "pipeline" suggests.
Dr. Gamoran said there is a tension between the perception that only the same successful researchers get additional funding and the desire of IES and others to move research through a pipeline. He said there is no right answer, but he suggested IES staff take the matter into consideration. Dr. Gamoran recommended bringing in an external consultant to review initial studies (e.g., a random sample or all grants awarded under one goal for a specified period) and then determine whether the next stage of research was funded and, if so, by whom.
Susanna Loeb, Ph.D., said a broader assessment would identify not just other funding sources, but also show how research moves forward, especially for academic researchers who may not have the organizational capacity to pursue research further. Dr. Albro pointed out that the analysis looked at links between projects, but those links did not always involve the same principal investigator (PI) or research team.
Bridget Terry Long, Ph.D., hoped the research portfolio would include projects that involve risk and innovation. Peer review panel members find it easier to give high scores to proposals for research they already understand, but some calculated risks are appropriate, even with limited resources, she said.
Brett Miller, Ph.D., said, from a funder perspective, there should be some sort of timeframe around the expected flow of research to see what is moving and what barriers exist. He said NICHD also struggles with understanding linkages around funding.
Dr. Buckley said the analysis looked for clear evidence of direct relation among projects. In doing so, Dr. Albro said, the analysis sometimes identified "cousins" rather than "parents and children." That is, while some projects have clear direct relationships, such as when an intervention developed through an IES development project is subsequently tested through an IES efficacy project, others have more indirect relationships, such as when an investigator receives funding to explore data available from four IES-funded efficacy studies to identify links between characteristics of those interventions and implementation fidelity.
Dr. Singer recommended surveying grantees or asking applicants to list other sources of funding. She also suggested interviewing some applicants who did not receive funding to find out what happened with their research, which could shed light on the review process.
Ms. VanderPutten said NSF requires applicants to list prior NSF support, which is a start toward tracking the flow of funding. The list of awardees could be shared with ED. Dr. Miller said NIH has tried to survey grantees, but the process is cumbersome and time-consuming. However, if staff (e.g., interns) are available, there are some time-intensive ways to search online databases to identify NSF or NIH funding. Also NIH asks applicants about current support, which is useful. Dr. Miller said NICHD staff interviewed some unsuccessful applicants, which was time-consuming; the results may be of interest to IES staff.
Dr. Chard supported the notion of a broader evaluation by an external contractor. Dr. Feuer said such an evaluation would ideally include IES, NSF, and NIH. Coordination among the three agencies could dispel fears that only the agency that capitalizes on the end-results gets the public credit for its investment. Dr. Gamoran suggested including the Department of Housing and Urban Development. He thought that NSF is more likely to fund the basic research that informs development and efficacy work supported by IES. However, the approach would show how work funded by IES yields positive results.
Dr. Brock suggested that IES would welcome further discussion with the Board about whether the research centers should continue to fund Goal 4 (effectiveness) research, given limited resources and the possibility that scale-up evaluations can be funded elsewhere.
Dr. Neild liked the idea of a cross-agency review and agreed to explore it further. While investigators are more comfortable with efficacy studies than other types of research, Dr. Neild said, IES staff is discussing whether it should be more intentional about trying to move interventions with positive results in small-scale studies to larger-scale trials, even if not by the same PI. She said NCEE does large scale-up evaluations because its contractors have the infrastructure to support them.
The Board will discuss with IES leadership the possibility of an external evaluation of education research grants funded by IES, NSF, NIH, and perhaps other federal agencies.
Dr. Feuer said IES should be proud of the results of this analysis, because it is taking the lead despite being new to the funding arena.
Standards and Review Office
Anne Ricciuti, Ph.D., Deputy Director for Science
Dr. Ricciuti described the responsibilities of the Standards and Review Office (SRO) and where it is situated within IES. Dr. Ricciuti pointed out that the SRO is not an official office in IES but rather the name used to refer to peer review activities and staff under the Deputy Director for Science. All peer review activities and staff are situated outside of the four research centers so that peer review of reports and research grant competitions is as objective as possible.
Dr. Ricciuti said the SRO reviews reports from all four centers using a process modeled on that of journals. Reports undergo one of two types of review: external or internal. In general, reports that present new data analyses in a form other than basic cross-tabulations are sent to external experts for peer review. Research reviews and syntheses also receive external review. Reports with limited data analyses and indicator reports generally receive internal review. SRO staff works with the centers to ensure reports meet SRO standards before they are released. Every year, about 80 reports are in some stage of the review process.
Most of the research grant competitions come from NCER and NCSER, although NCES holds the Statewide Longitudinal Data Systems competition. The grant review process is modeled on that of NIH and NSF. In FY 2015, the SRO processed 1,099 applications; 992 applications were reviewed by 425 reviewers on 23 review panels.
Dr. Singer wondered whether journals would be better suited to review reports, because they already have a well-developed system for reviewing data and would better disseminate research findings. She noted that journal publication factors into tenure review for academic researchers. Dr. Ricciuti responded that some reports have to be released by IES, such as Congressionally mandated evaluations. She said IES established a rigorous peer review process to elevate the standing of its reports.
Dr. Neild added that the cost of internal peer review is relatively small, particularly in comparison to the costs of large-scale data collections and studies, and substantially improves the quality of reports. The SRO has demonstrated that it is scrupulously fair, nonpartisan, and very careful in its review. Dr. Neild also noted that NCEE contractors are encouraged to submit articles to journals for publication. She hoped academia would recognize the extent of the IES peer review process and weigh IES reports more heavily in tenure and promotion decisions.
Dr. Gamoran supported the role of the SRO in quality control and peer review of reports. All reports, including those of contractors, should be reviewed by staff separate from the research centers and should carry the IES "stamp of approval." The basic descriptive information published by NCES should be disseminated through reports that are reviewed by an impartial party like the SRO. Dr. Gamoran said reports are more likely to be produced by contractors than academics in tenure-track positions; he agreed that more reports should be submitted to journals so they become part of the scientific literature. However, said Dr. Gamoran, there is room for debate over how universities should evaluate peer reviewed products written for government agencies. Organizations that seek to promote the use of research evidence in policymaking should recognize that policymakers are more likely to turn to IES reports than journals, Dr. Gamoran noted.
Dr. Feuer agreed that the SRO should continue to play a role in publishing independently reviewed reports that carry deserved weight, especially because of the growing number of "advice givers" in the field (e.g., think tanks). Dr. Ricciuti said SRO staff looks not only at the technical quality of reports but also at the objectivity of the interpretation and presentation of findings.
Returning to the research grant process, Dr. Ricciuti described the typical timeline for review. She pointed out that screening the applications involves multiple offices. The SRO staff must assess whether the application meets the technical requirements of the competition. From start to finish, the process takes 8–9 months, comparable to that of the NIH.
Dr. Ricciuti described how the number and timing of NCER and NCSER competitions and submission dates changed because of dramatic funding cuts in FY 2013. The SRO adjusted to accommodate the changes. The timeline continues to be about 8.5 months from the time the RFA is published to the release of regret letters to applicants who did not receive awards. In FY 2015, despite an increase in applications, the SRO managed to allow more time for processing and reviewer assessment and earlier panel reviews. It was also able to release review summaries and scores through the new online review system before funding decisions were made (as NIH does).
Dr. Ricciuti said two FY 2015 off-cycle competitions, both with accelerated timelines, contributed to an extremely demanding year for the SRO. For one, the time from receipt of applications to release of regret letters was 6 weeks; for the other, 13 weeks.
In FY 2016, the SRO aims to shorten the application processing time by 3 weeks, give reviewers more time to evaluate applications, and hold review panels earlier in the year. It also plans to release scores online within a few days of the meeting and summary reviews about 1 month after the panel meets. Dr. Ricciuti said FY 2016 also brings two new off-cycle competitions, both of which differ from other types of IES competitions. Both will require substantial strategic planning, and both have accelerated timelines.
Dr. Ricciuti described some of the strategies used to accelerate the review process. For example, for the evaluation of state and local education programs and policies, the SRO was able to recruit reviewers before the applications were submitted because the focus was specific and limited and the number of applications anticipated was small. In addition, SRO used videoconferencing technology for the panel review.
Dr. Ricciuti welcomed suggestions to shorten the timeline for grant review while maintaining the quality of the process, avoiding overburdening reviewers, and providing adequate time for reviewer evaluation. Because she often hears complaints that the process takes too long, she asked what the target timeline should be.
Dr. Feuer said the credibility of an organization is largely a function of the rigor of its review process. Once that credibility is compromised, the rest of the organizational investment is compromised. He recommended the SRO weigh any suggestions to shorten the review process very carefully to assess the risks and benefits. Identifying and addressing conflicts of interest is a very important part of the process, he added. Dr. Ricciuti said the SRO has an extensive, proactive process for assessing conflicts of interest.
Dr. Gamoran said the timeline seems appropriate given the volume of applications and the quality of the feedback to applicants. He said the resubmission process has improved greatly, to the credit of Dr. Ricciuti and the SRO, but continued vigilance is warranted. Dr. Ricciuti summarized the results of resubmissions (see the table). For both NCER and NCSER, resubmissions are funded at higher rates than first submissions. Dr. Ricciuti noted that the new IES website will include such data.
Table: Number of Applications Funded and Funding Rates by Submission Status for NCER and NCSER Main Education Research Competitions, FY 2009 through FY 2015

| NCER 305A | | | NCSER 324A | | |
|---|---|---|---|---|---|
| Overall Funding Rate | Resubmission Funding Rate | First Submission Funding Rate | Overall Funding Rate | Resubmission Funding Rate | First Submission Funding Rate |