National Board for Education Sciences

III. Scientifically Valid Research, Unbiased Evaluations and Accurate Education Statistics

A. Background

ESRA requires the Director of the Institute "to establish necessary procedures for technical and scientific peer review of the activities of the Institute" to assure that "scientifically based research standards" are applied to the work of the Institute, including the funding of grant applications and the products that are published by the Institute. To that end, the Standards and Review Office was created and staffed within the Institute's Office of the Deputy Director for Science. The Standards and Review Office developed, implemented, and refined the peer review procedures beginning shortly after the enactment of ESRA. The review procedures apply to all reports supported by the Institute, as well as all grant proposals submitted for funding.

B. Peer Review of Products

ESRA requires that "all research, statistics, and evaluation reports conducted by, or supported through, the Institute shall be subjected to rigorous peer review before being published or otherwise made available to the public." In addition, the Act requires that Institute products be "objective, secular, neutral, and non-ideological and are free of partisan political influence and racial, cultural, gender, or regional bias." Locating the Standards and Review Office within the Office of the Deputy Director for Science, independent of any of the Institute's four operating Centers, allows the Institute staff responsible for approval and scientific peer review of products (action editors) to maintain an independent, objective point of view. Action editors are senior staff members with several years of experience conducting research, publishing, and carrying out scientific reviews. Center staff responsible for the oversight of research projects, research contractors, and report authoring are necessarily exposed to influences and relationships that could easily create the appearance of a conflict of interest in a review process. This carefully engineered organizational independence allows the Institute's products to undergo objective, arm's-length peer review, setting a standard for rigor and placing the Institute on par with the nation's other premier federal science agencies.

Centers submit a report to the Standards and Review Office after they have conducted their own Center-level review, and the relevant Commissioner has approved the report for submission. The relevant Center Commissioner recommends to the Deputy Director for Science whether a report should receive external or internal scientific review, or should be exempt from review. Reports that present new analyses of data are sent to external scientists for peer review. All reports with limited descriptive data analyses are reviewed internally by Standards and Review staff.

For an external scientific review, the action editors identify potential peer reviewers, locating researchers who have published in top research journals in the relevant content area, have used similar methodological approaches, and have substantial experience conducting studies similar to the one presented in the report. The Deputy Director for Science reviews the names, publication lists, and examples of published work of potential reviewers, and approves the recruitment of specific reviewers. Action editors recruit at least two approved external reviewers for each report. The external reviewers are asked to focus on the significance of the work for the field of education (both the research itself and the product at hand); the technical quality of the research design, data collection, and data analyses; the appropriateness of the conclusions; and the clarity of the presentation. They are also asked to assure that any language in the report that advances causal claims is supported by the research methods and analyses described in the report.

Standards and Review Office action editors conduct their own review of each report simultaneously with the external reviewers, and then write a disposition memorandum synthesizing their own review and those of the external reviewers. In their reviews, action editors focus on issues of technical quality, and are also responsible for ensuring that the reports are neutral and objective, and do not contain policy implication statements or recommendations, or statements of advocacy for particular positions, programs, or policies. The disposition memorandum indicates whether the report has been approved for publication or requires revision.

For an internal scientific review, the disposition memorandum is based on the action editor's review of the report. For both internal and external reviews, the disposition memorandum is reviewed and approved by the Deputy Director for Science, and then sent to the Center responsible for the report. Standards and Review action editors, in consultation with the Deputy Director for Science, are responsible for reviewing and approving revisions made to the report and for recommending final approval of the report for publication by the Institute.

The Institute understands the importance of a timely review process. In 2005, reports averaged 24.9 working days up to the disposition memorandum (a 12% decrease from the previous year), 29.6 working days in the Standards and Review Office (a 23% decrease from 2004), and 60.7 total working days until final approval (a 35% decrease from 2004). The Standards and Review Office is committed to continuing to shorten review times.
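The stated percent decreases also let one back out the implied 2004 baselines, which the text does not report directly. A quick arithmetic check (the 2004 figures below are derived from the 2005 averages, not taken from the report):

```python
# Derive implied FY2004 averages from the FY2005 figures and the stated
# percent decreases: days_2004 = days_2005 / (1 - decrease).
# The FY2004 values are computed here, not reported by the Institute.
stages = {
    "to disposition memorandum": (24.9, 0.12),
    "in Standards and Review Office": (29.6, 0.23),
    "to final approval": (60.7, 0.35),
}

implied_2004 = {
    stage: round(days / (1 - dec), 1) for stage, (days, dec) in stages.items()
}
print(implied_2004)
```

The check implies roughly 28.3, 38.4, and 93.4 working days for the three stages in 2004, consistent with the magnitude of improvement the text describes.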

Implementation of the new peer review process has resulted in substantial changes to some Institute products. As with the review process used by top research journals, reports sometimes require more than one round of revisions prior to receiving approval for publication from the Standards and Review Office. Sometimes these revisions involve issues of technical quality, and sometimes the revisions are focused on ensuring that the work is presented in a clear, objective, neutral, and unbiased manner. As an example, all program evaluation reports approved by the new review process will clearly lay out the purpose of the report, the study context, research design, data collection procedures, outcome measures, and data analyses; report the findings in a neutral manner; and leave implications for policy and practice to the reader.

The scientific review process for reviewing products represents a departure from the way in which Department research, statistics, and evaluation products were handled prior to passage of ESRA. While products were subject to various kinds of review prior to publication or public release, there was no systematic, independent scientific peer review of research, statistics, and evaluation products, and there were no requirements that such products be neutral and objective. The new process is intended to mirror the scientific peer review process used by top research journals for the acceptance of research articles for publication.

The Institute's peer review process for reports was initiated in early 2004. In 2004, the Standards and Review Office reviewed 44 reports, and in 2005, 96 reports.

C. Peer Review of Grant Applications

Under ESRA, activities of the Institute that are carried out through grants, contracts, or cooperative agreements, at a minimum, shall be awarded on a competitive basis and, when practical, through a process of peer review. Further, the Director is required to establish a peer review procedure (involving highly qualified individuals with an in-depth knowledge of the subject to be investigated) for reviewing and evaluating all applications for grants and cooperative agreements that exceed $100,000.

The Standards and Review Office is responsible for implementing the scientific peer review of grant applications. As with the peer review of reports, a key feature of the grant application peer review system is the separation between the program officers and administrators within the Institute who administer grant programs, work with grantees, and disseminate the results of research, and those who are responsible for the peer review of applications for funding under those grant programs.

In FY2002, the Institute established a new system for the scientific review of grant applications that is similar to the process of grant application peer review at the National Institutes of Health. Standards and Review Office staff identify and recruit highly qualified reviewers primarily on the basis of the quality of the research they have conducted and published in scientific peer-reviewed journals and the degree to which they are in-depth experts in the relevant research methods and subject matter.

These reviewers are assigned to various panels designated to review applications for similar research topics. Reviewers are asked to consider the significance of the work for the field of education, the quality of the research plan, the quality of the personnel, and the adequacy of the available resources for the project. Two to three primary reviewers provide independent narrative reviews and initial rating scores for each of the four criteria above, as well as an initial overall quality rating for their assigned application. Based on the initial overall scores provided by the primary reviewers, Standards and Review staff prepare a preliminary rank order of the applications assigned to each review panel, and based on this rank ordering, approximately the top 25 applications are identified for discussion by the full panels at the review panel meetings.
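The triage step described above — average each application's initial overall scores from its primary reviewers, rank the applications, and flag roughly the top 25 per panel for full-panel discussion — can be sketched as follows. The function name, the score values, and the assumption that higher scores indicate higher merit are illustrative, not drawn from the Institute's actual system.

```python
def triage(applications, top_n=25):
    """Rank applications by the mean of their primary reviewers' initial
    overall scores (assuming higher = better) and return the IDs of the
    top_n applications flagged for discussion by the full panel."""
    ranked = sorted(
        applications,
        key=lambda app: sum(app["scores"]) / len(app["scores"]),
        reverse=True,
    )
    return [app["id"] for app in ranked[:top_n]]

# Example: three applications, each scored by 2-3 primary reviewers.
apps = [
    {"id": "app-1", "scores": [2.0, 3.0]},       # mean 2.50
    {"id": "app-2", "scores": [1.0, 1.5, 1.0]},  # mean ~1.17
    {"id": "app-3", "scores": [4.0, 4.5, 4.0]},  # mean ~4.17
]
print(triage(apps, top_n=2))  # → ['app-3', 'app-1']
```

In practice the preliminary rank order only determines which applications reach full-panel discussion; the panels' deliberations, not these initial scores, produce the final merit ratings.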

In FY2003, the Institute created an entirely electronic application submission and review process. The electronic system allows applicants to submit letters of intent and grant applications on-line. It also allows reviewers to access applications assigned to their panel, submit their reviews of applications, view preliminary scores and reviews submitted by other reviewers (after submitting their own reviews), and revise their own narrative comments during the panel review meeting. The system allows Institute staff to closely monitor the progress of the review process, and to quickly calculate the preliminary scientific merit scores used to triage the top-ranked applications for consideration by the full panels at the panel review meetings.
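One rule the electronic system enforces — a reviewer may see other panelists' preliminary scores and comments only after submitting their own review — can be sketched as follows. The class and method names are hypothetical illustrations, not the Institute's actual implementation.

```python
class PanelReviews:
    """Minimal sketch of a submit-before-view rule: reviewers may read
    other panelists' reviews only after submitting their own."""

    def __init__(self):
        self._reviews = {}  # reviewer name -> (score, comments)

    def submit(self, reviewer, score, comments):
        # Resubmitting overwrites the earlier entry, mirroring the ability
        # to revise one's own comments during the panel meeting.
        self._reviews[reviewer] = (score, comments)

    def view_others(self, reviewer):
        if reviewer not in self._reviews:
            raise PermissionError("Submit your own review before viewing others'.")
        return {name: rev for name, rev in self._reviews.items() if name != reviewer}

panel = PanelReviews()
panel.submit("Reviewer A", 4.5, "Strong design.")
try:
    panel.view_others("Reviewer B")  # has not yet submitted
except PermissionError as err:
    print(err)
panel.submit("Reviewer B", 3.0, "Underpowered sample.")
print(panel.view_others("Reviewer B"))  # now sees only Reviewer A's entry
```

Gating visibility on submission preserves the independence of each reviewer's initial judgment while still allowing scores and comments to be compared and revised once everyone has weighed in.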

Between FY2002 and FY2005, single-session review panels were constituted as needed for each round of review, and each review panel considered one grant competition (topic). In FY2006, the Institute created five standing review panels, to which panel members may be appointed for multiple, consecutive review sessions, to complement the use of single-session review panels as appropriate. The standing panels are composed of principal panel members, who serve staggered three-year terms; rotating panel members, who serve for one review session; and ad hoc members, who serve for one review session to review a subset of applications requiring specialized expertise.

The following table summarizes the growth of the review process over the last five years.

Year      Applications   Competitions   Reviewers   Panels
FY2002    226            4              73          4
FY2003    479            7              121         7
FY2004    600            8              145         8
FY2005    696            11             162         11
FY2006*   703            —              185         5 standing, 5 single-session

* In 2005, the National Center for Special Education Research (NCSER) was established, and the Standards and Review Office became responsible for the peer review of grant applications from this Center starting in FY2006.

The new peer review system for grants has been in place for sufficient time to consider its yield. Between FY2001 and FY2004, to examine whether the Institute was fulfilling its goal to fund rigorous research, a number of distinguished researchers were asked to rate the overall quality of newly funded research projects. The percent of newly funded research projects rated as being of high quality from FY2001 to FY2004 was 36%, 50%, 70%, and 70%, respectively.

The review process, and its expectation of higher-quality proposals, has improved the quality of the responses from the field. In each of fiscal years 2002 through 2004, the Institute funded about 8% of the grant applications received; in FY2005, it funded about 12%.

In addition, the Institute tracks the percentage of new research proposals funded by its National Center for Education Research (NCER) that receive an average score of excellent or higher from a panel of independent reviewers. From FY2003 to FY2005, this percentage was 88%, 97%, and 100%, respectively.

D. Board Assessment and Approval of Peer Review Processes

While the process for scientifically reviewing grant proposals was first implemented in 2002 and the Institute's peer review process for reports was initiated in early 2004, the Board did not formally approve these procedures until its September 2005 meeting. This time lag allowed the Board to review extensive documentation of these processes and to assess not only the quality of their design but also the fidelity of their implementation prior to granting formal approval. Board members received grant reviews and multiple iterations of report reviews. Members also attended panel meetings and thus were able to assess the quality of the deliberations.

Many of the Board members are researchers who have received funding from the National Science Foundation (NSF) and/or the National Institutes of Health (NIH) and are familiar with the processes of these federal research agencies. They were impressed with what the Institute had put in place and were able to validate that these processes would assure quality, objectivity, validity, and integrity in scientific publications.

In addition, ex-officio members Arden Bement (NSF) and Duane Alexander (NICHD) publicly acknowledged that the approved procedures were of the highest merit and comparable to those of their agencies. While the processes are exemplary, the Board and Director recognize that the key to strong systems is the selection of strong panel members. Members made several recommendations related to panel composition and asked that the Director institute quality assurance procedures to ensure that these well-developed processes continue to be implemented with fidelity. The Director has committed to ongoing monitoring and reporting of the quality and composition of the panels.