National Board for Education Sciences
November 16, 2015 Minutes of Meeting

Location
Institute of Education Sciences (IES) Board Room
80 F Street, NW
Washington, DC 20001

Participants
National Board for Education Sciences (NBES) Members Present
David J. Chard, Ph.D. (Chair)
Susanna Loeb, Ph.D. (Vice Chair, by phone)
Adam Gamoran, Ph.D.
Kris D. Gutierrez, Ph.D. (by phone)
Michael Feuer, Ph.D.
Darryl J. Ford, Ph.D.
Bridget Terry Long, Ph.D. (by phone)
Deborah Phillips, Ph.D.
Judith Singer, Ph.D.
Robert Teranishi, Ph.D. (by phone)
Hirokazu Yoshikawa, Ph.D. (by phone)

NBES Members Absent
Anthony S. Bryk, Ed.D.
Larry V. Hedges, Ph.D.
Robert A. Underwood, Ed.D.

Ex Officio Members Present
Thomas Brock, Ph.D., Commissioner, National Center for Education Research (NCER)
Peggy Carr, Ph.D., Acting Commissioner, National Center for Education Statistics (NCES)
Joan McLaughlin, Ph.D., Commissioner, National Center for Special Education Research (NCSER)
Brett Miller, Ph.D., Health Scientist Administrator, Child Development & Behavior Branch, Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), National Institutes of Health (NIH)
Teri Morisi, Bureau of Labor Statistics (for Erica Groshen, Ph.D.)
Ruth Curran Neild, Ph.D., Deputy Director, Policy and Research, Delegated Duties of the IES Director
Elizabeth VanderPutten, Deputy Director, Research in Informal and Formal Settings, National Science Foundation (NSF, for Joan Ferrini-Mundy, Ph.D.)

Invited Presenters
Michael Casserly, Executive Director, Council of the Great City Schools (CGCS)
Brian Harris-Kojetin, Ph.D., Deputy Director, Committee on National Statistics, National Academy of Sciences
Nancy Potok, Ph.D., Deputy Director, U.S. Census Bureau

NBES Staff
Kenann McKenzie-Thompson, Ph.D., NBES Executive Director
Ellie Pelaez, Management & Program Analyst, Office of the Director, IES, Designated Federal Official

IES Staff
Elizabeth Albro, Ph.D., Associate Commissioner, Teaching and Learning, NCER
Jacquelyn Buckley, Ph.D., Office of the Commissioner, NCSER
Chris Chapman, Associate Commissioner, Sample Surveys Division, NCES
Dana Kelly, Chief, Assessments Division, International Assessment Branch, NCES
Daniel McGrath, Chief, Reporting and Dissemination Branch, NCES
Anne Ricciuti, Ph.D., Deputy Director for Science
Ross Santy, Associate Commissioner, Administrative Data Division, NCES
Thomas D. Snyder, Program Director, Annual Reports and Information, NCES
Andrew White, Senior Research Statistician, Statistical Standards and Data Confidentiality, NCES

Call to Order
David Chard, Ph.D., NBES Chair

Dr. Chard called the meeting to order at 9:00 a.m., and Ellie Pelaez called the roll. Dr. Chard pointed out that because the meeting had been rescheduled from October 2, 2015 (to avoid the impact of a possible federal government shutdown), some Board members and presenters were unable to attend.

Opening Remarks

Dr. Chard announced that Ruth Curran Neild, Ph.D., was named the deputy director of policy and research for IES and has been delegated the duties of the IES director. Dr. Neild thanked the Board for its counsel to IES. She explained that all of the IES staff are moving to a new office in southwest Washington, DC. The move marks the first time that all the IES offices will be housed in one building, which should help foster more collaboration and connections across IES.

Dr. Neild said that IES supports and conducts a great deal of creative, thoughtful, well-implemented work. Its statistics and research are top-notch, and it invests heavily in development and innovation of useful products and approaches for education. It supports the improvement of methods and measures. The What Works Clearinghouse has become part of the education landscape, and dissemination efforts are shaped by recent research on how policymakers and practitioners use data. In short, IES provides a critical infrastructure for education statistics and research in the United States. However, IES has been challenged to provide a clear narrative about its investments and why they are critical to education improvement. To that end, IES is revamping its website to better explain the types of work IES does and emphasize its extensive data collection efforts. In addition, IES recently hired a new communications director. Dr. Neild called on the Board to think about how IES should position itself over the next 18 months, in anticipation of a change in administration.

Dr. Chard briefly summarized the agenda. He thanked the IES commissioners for sending advance materials for review to facilitate conversation during this meeting.

IES Commissioners' Reports

National Center for Education Research
Thomas Brock, Ph.D., NCER Commissioner

Dr. Brock noted that NCER has begun its fiscal year (FY) 2016 grant competition. The budget for FY 2016 is not yet known, but NCER anticipates funding at the same level as FY 2015. Dr. Brock reminded the Board that NCER had more funding for awards in FY 2015 because many older grants expired. Most of FY 2016's funding will go to continuing rather than new awards.

Dr. Brock described two new competitions for FY 2016: the Pathways to Education Science Training Program, which aims to create a more diverse pool of individuals entering the education research profession, and Research Networks Focused on Critical Problems of Policy and Practice. Two initial research network efforts will evaluate the transition from pre-K to elementary school and college completion.

Dr. Brock reiterated decisions announced earlier this year to accommodate the limited funding: reducing maximum funding for Education Research Grant awards, capping the number of awards in some competitions, and not soliciting applications for Goal 2, Development and Innovation, in the Education Research Grants program.

The U.S. Department of Education's (ED's) Office of Elementary and Secondary Education and Office of Planning, Evaluation, and Policy Development have been working with NCER on a special competition for rigorous evaluation of state programs and policies, especially in states that received waivers under No Child Left Behind to pursue novel reform efforts. The competition targeted (1) college- and career-ready standards and assessments, (2) major school turnaround initiatives, and (3) teacher and principal accountability systems. To accelerate the competition (to allow programs to begin evaluations at the start of the 2016 school year), NCER released a request for applications (RFA) in March with a June deadline for applications. Seventeen applications went to panel review, and three were awarded by September.

Discussion

Adam Gamoran, Ph.D., asked whether the 2-year budget allows NCER to take any new steps. Dr. Brock responded that the longer budget enhances the ability to plan and reduces the risk of a federal shutdown, but the budget details (including the exact amount NCER would receive in FY 2016 or FY 2017) are not yet known. Dr. Neild added that the 2-year budget will be part of the overall federal spending allocation, and it is not clear what IES will receive.

Also in response to Dr. Gamoran, Dr. Brock said the new research networks are similar to the research and development (R&D) centers funded by NCER but are larger in scope and are designed to promote collaboration among researchers in different institutions, rather than a single grant to one institution. The networks will have a lead who will be responsible for coordinating the work of the different teams and promoting dissemination of findings.

Dr. Gamoran said researchers appreciate staggered deadlines. Dr. Brock agreed that varying deadlines among competitions benefits researchers, and he noted that the past year featured slightly different deadlines for research and training grants (a difference of 2 weeks) and for the special competition for evaluation of state education programs. Dr. Brock noted that there are efficiencies to having a single deadline in terms of managing the application review process—an important consideration when grant funds are limited. He also said that it is difficult to make funding decisions early in the year when it is not known how many high-quality applications may come in later in the year. This problem was less of a concern when IES's grant budget was larger and could more easily accommodate staggered deadlines.

Darryl J. Ford, Ph.D., asked for more details about the accelerated competition for the state education evaluation. Dr. Brock explained that one of ED's goals was to provide funding in time for evaluations to start with the 2015–16 academic year. An RFA was released in March 2015, applications were due in June, and funding decisions were made by the end of July. Dr. Brock said three awards were given, each for $5 million over 5 years, and each addressing one of the three priority targets described in the RFA. He said that collaborating with ED offices was helpful with outreach to potential applicants and generated a lot of good will. The accelerated timeline forced NCER and the Standards and Review Office (SRO) to rethink its processes and put in place some innovations, such as an application review meeting conducted by videoconference, that may benefit future competitions.

In response to Dr. Chard, Dr. Brock said the number of applications for FY 2016 is roughly on par with previous years, with the notable exception of the decrease in Education Research Grant applications because Goal 2 projects were not competed. Dr. Brock mentioned that the Pathways to Education Science competition generated some applications from institutions that had not previously taken part in IES competitions.

Follow-Up Item
Once NCER has completed the Pathways to Education Science competition, Dr. Brock will inform the Board about the institutions that are selected.

Responding to Michael Feuer, Ph.D., Dr. Brock said the Pathways program will award up to $1.2 million over 4 years. The program targets younger students, so much of the funding goes toward exposing them to the field through, for example, summer research workshops and apprenticeships. Academic advising is also a feature of the programs. Dr. Feuer said it is important to track funding success rates, as NIH and NSF do. He added that IES and others are boosting the number of people in the education research field but not the amount of money available for research. Dr. Brock said that, historically, the funding rate for NCER is about 8–10%, although it was higher in FY 2015 (about 15%) because more money was available for new awards.

National Center for Special Education Research
Joan McLaughlin, Ph.D., NCSER Commissioner

Given the anticipated funding for FY 2016, Dr. McLaughlin said NCSER will continue to fund two competitions—one for special education research grants and one for training. Training awards will focus on early-career investigators, postdoctoral research, and methods training in single case design. The number of applications for research grants for FY 2016 increased over FY 2015 but has not yet rebounded to the number received before the spending cuts.

Dr. McLaughlin said that NCER and NCSER developed another off-cycle competition to support low-cost, short-duration evaluations of education interventions. The RFAs (one for education interventions, one for special education interventions) were released in September. The goal is to encourage partnerships between research institutions and state and local education agencies. Each center will make up to four awards of no more than $250,000 each. Dr. McLaughlin said attendance at a recent webinar indicated the competition speaks to a need in the field.

Evaluation of NCER and NCSER Research Pipeline

Dr. McLaughlin said NCER and NCSER embarked on an analysis of the research pipeline—that is, how IES grantees are pursuing research across the five IES goals (exploration, development, efficacy, effectiveness, and measurement). The effort sought to reveal how IES is building the knowledge base and whether it should encourage more work on certain types or areas of research.

Elizabeth Albro, Ph.D., said that from 2004 to 2015, 43% of awards supported development, and 28% went to efficacy research. Comparable portions went to measurement and exploration (13% and 15%, respectively), while only 1% went to effectiveness research (or scaling up). The analysis focused on three questions:

  • Is the pipeline working as intended? Are projects with promising findings moving to the next phase of research? If not, why?
  • How well does the IES pipeline connect with other sources of funding (such as NSF and NICHD)?
  • Should IES be more intentional about helping research projects with promising findings to receive IES funding at the next stage of investigation? If so, how?

Is the Pipeline Working?

Dr. Albro said there is movement across the research goals, but not as much as she and others thought there would be. About one third of projects are related to other research projects, which is good given the short time that IES has been funding research. Of all NCER and NCSER development projects, for example, 17% have received efficacy awards; 20% of efficacy studies represent tests of IES-developed interventions. A portion of projects were linked in complex ways; for example, an efficacy study might lead to ideas for exploratory work or a development project might spur a new measure.

Why Are There No Efficacy Awards That Lead to Effectiveness Awards?

Jacquelyn Buckley, Ph.D., explained that IES is not seeing movement from efficacy to effectiveness studies because IES is still young. Only efficacy research funded very early on in IES history would be eligible for effectiveness awards, in part because the RFAs for effectiveness awards used to require evidence from at least two efficacy trials. Now, IES requires only one efficacy trial, which may help, said Dr. Buckley.

In addition, effectiveness trials require independent evaluation, which poses a challenge for some grantees. (IES is discussing how to work with grantees to meet the challenge.) Some areas of education research, such as special education transition, are still in early stages and are not yet ready for effectiveness studies.

Dr. Buckley pointed out that other arms of ED support effectiveness evaluation at scale. For example, the National Center for Education Evaluation and Regional Assistance (NCEE) is funding evaluation of interventions to teach academic language in elementary schools, and many of the interventions were developed with NCER or NCSER funding. Four Investing in Innovation validation awards are building on findings from NCER efficacy awards.

How Do the R&D Centers Fit Into the IES Pipeline?

Dr. Buckley said the R&D centers are critical, because they provide incentives for new investments in cutting-edge research. For example, NCER's Knowledge Utilization Centers look at how practitioners use research. The Virtual Learning Laboratory will use widely adopted online instruction delivery platforms to test a range of education technology products. The R&D centers build on research supported by IES' primary research competitions. Two examples are the Center for the Study of Adult Literacy and the Center on Secondary Education for Students with Autism Spectrum Disorders.

Dr. Buckley concluded that grantees are building on existing work and research is moving through the pipeline. She welcomed suggestions for additional analyses or other approaches to the data. She also asked Board members for their opinions on whether IES should provide additional support to grantees to move research through the pipeline.

Discussion

Judith Singer, Ph.D., pointed out that looking only at IES grants limits the ability to understand the ecosystem of education research. She questioned the use of continued funding as the measure of success because it encourages grantees to "keep going back to the trough." The impact of IES resources would be better measured by published research, interventions in schools, and technology transfer, for example. Dr. Chard pointed out that the goal of the analysis was to determine whether IES' goal structure functions as intended, not whether IES funding is having an impact. Dr. Singer responded that there should still be other outcomes besides additional grants. Dr. Albro said IES is looking at both narrow and broad outcomes.

Dr. Singer said the very small number of grants for scaling up interventions raises questions about the goal structure. It seems to reveal a mismatch between the requirements and what the average developer wants to do. Dr. Singer said IES should not remove the requirement for independent evaluation, but she was concerned that grantees are not applying for resources to prove their interventions work. She encouraged IES to think more about incentives.

Dr. Albro said she has anecdotal evidence indicating that IES grantees receive other sources of funding (both public and private), but it is difficult to get complete data for some 1,100 projects across the wide range of possible funders. Elizabeth VanderPutten said NSF faces the same challenge.

Dr. Albro pointed out that a small percentage of funded projects address the goal of scaling up, but those grants represent the largest dollar amounts. Effectiveness trials are very costly, she noted.

Dr. Gamoran said the Common Guidelines for Education Research and Development were a landmark for IES and NSF. The document is an example of IES taking a stand on what constitutes quality in education research. He suggested the quotation cited in the background document be included in the Board's annual report:

A "pipeline" of evidence ... begins with basic and exploratory research, moves to design and development of interventions or strategies, and, for interventions or strategies with initial promise, results in examination of the effectiveness for improving learning or another related education outcome. However, as we describe later in this document, the reality of scientific investigation is more complicated, less orderly, and less linear than such a "pipeline" suggests.

Dr. Gamoran said there is a tension between the perception that only the same successful researchers get additional funding and the desire of IES and others to move research through a pipeline. He said there is no right answer, but he suggested IES staff take the matter into consideration. Dr. Gamoran recommended bringing in an external consultant to review initial studies (e.g., a random sample or all grants awarded under one goal for a specified period) and then determine whether the next stage of research was funded and, if so, by whom.

Susanna Loeb, Ph.D., said a broader assessment would identify not just other funding sources, but also show how research moves forward, especially for academic researchers who may not have the organizational capacity to pursue research further. Dr. Albro pointed out that the analysis looked at links between projects, but those links did not always involve the same principal investigator (PI) or research team.

Bridget Terry Long, Ph.D., hoped the research portfolio would include projects that involve risk and innovation. Peer review panel members find it easier to give high scores to proposals for research they already understand, but some calculated risks are appropriate, even with limited resources, she said.

Brett Miller, Ph.D., said, from a funder perspective, there should be some sort of timeframe around the expected flow of research to see what is moving and what barriers exist. He said NICHD also struggles with understanding linkages around funding.

Dr. Buckley said the analysis looked for clear evidence of direct relation among projects. In doing so, Dr. Albro said, the analysis sometimes identified "cousins" rather than "parents and children." That is, while some projects have clear direct relationships, such as when an intervention developed through an IES development project is subsequently tested through an IES efficacy project, others have more indirect relationships, such as when an investigator receives funding to explore data available from four IES-funded efficacy studies to identify links between characteristics of those interventions and implementation fidelity.

Dr. Singer recommended surveying grantees or asking applicants to list other sources of funding. She also suggested interviewing some applicants who did not receive funding to find out what happened with their research, which could shed light on the review process.

Ms. VanderPutten said NSF requires applicants to list prior NSF support, which is a start toward tracking the flow of funding. The list of awardees could be shared with ED. Dr. Miller said NIH has tried to survey grantees, but the process is cumbersome and time-consuming. However, if staff (e.g., interns) are available, there are some time-intensive ways to search online databases to identify NSF or NIH funding. Also NIH asks applicants about current support, which is useful. Dr. Miller said NICHD staff interviewed some unsuccessful applicants, which was time-consuming; the results may be of interest to IES staff.

Dr. Chard supported the notion of a broader evaluation by an external contractor. Dr. Feuer said such an evaluation would ideally include IES, NSF, and NIH. Coordination among the three agencies could dispel fears that only the agency that capitalizes on the end-results gets the public credit for its investment. Dr. Gamoran suggested including the Department of Housing and Urban Development. He thought that NSF is more likely to fund the basic research that informs development and efficacy work supported by IES. However, the approach would show how work funded by IES yields positive results.

Dr. Brock suggested that IES would welcome further discussion with the Board about whether the research centers should continue to fund Goal 4 (effectiveness) research, given limited resources and the possibility that scale-up evaluations can be funded elsewhere.

Dr. Neild liked the idea of a cross-agency review and agreed to explore it further. While investigators are more comfortable with efficacy studies than other types of research, Dr. Neild said, IES staff is discussing whether it should be more intentional about trying to move interventions with positive results in small-scale studies to larger-scale trials, even if not by the same PI. She said NCEE does large scale-up evaluations because its contractors have the infrastructure to support them.

Follow-Up Item
The Board will discuss with IES leadership the possibility of an external evaluation of education research grants funded by IES, NSF, NIH, and perhaps other federal agencies.

Dr. Feuer said IES should be proud of the results of this analysis, because it is taking the lead despite being new to the funding arena.

Standards and Review Office
Anne Ricciuti, Ph.D., Deputy Director for Science

Dr. Ricciuti described the responsibilities of the Standards and Review Office (SRO) and where it is situated within IES. Dr. Ricciuti pointed out that the SRO is not an official office in IES but rather the name used to refer to peer review activities and staff under the Deputy Director for Science. All peer review activities and staff are situated outside of the four research centers so that peer review of reports and research grant competitions is as objective as possible.

Dr. Ricciuti said the SRO reviews reports from all four centers using a process modeled on that of journals. Reports undergo one of two types of review: external or internal. In general, reports that present new data analyses in a form other than basic cross-tabulations are sent to external experts for peer review. Research reviews and syntheses also receive external review. Reports with limited data analyses and indicator reports generally receive internal review. SRO staff works with the centers to ensure reports meet SRO standards before they are released. Every year, about 80 reports are in some stage of the review process.

Most of the research grant competitions come from NCER and NCSER, although NCES holds the Statewide Longitudinal Data Systems competition. The grant review process is modeled on that of NIH and NSF. In FY 2015, the SRO processed 1,099 applications; 992 applications were reviewed by 425 reviewers on 23 review panels.

Dr. Singer wondered whether journals would be better suited to review reports, because they already have a well-developed system for reviewing data and would better disseminate research findings. She noted that journal publication factors into tenure review for academic researchers. Dr. Ricciuti responded that some reports have to be released by IES, such as Congressionally mandated evaluations. She said IES established a rigorous peer review process to elevate the standing of its reports.

Dr. Neild added that the cost of internal peer review is relatively small, particularly in comparison to the costs of large-scale data collections and studies, and substantially improves the quality of reports. The SRO has demonstrated that it is scrupulously fair, nonpartisan, and very careful in its review. Dr. Neild also noted that NCEE contractors are encouraged to submit articles to journals for publication. She hoped academia would recognize the extent of the IES peer review process and weigh IES reports more heavily in tenure and promotion decisions.

Dr. Gamoran supported the role of the SRO in quality control and peer review of reports. All reports, including those of contractors, should be reviewed by staff separate from the research centers and should carry the IES "stamp of approval." The basic descriptive information published by NCES should be disseminated through reports that are reviewed by an impartial party like the SRO. Dr. Gamoran said reports are more likely to be produced by contractors than academics in tenure-track positions; he agreed that more reports should be submitted to journals so they become part of the scientific literature. However, said Dr. Gamoran, there is room for debate over how universities should evaluate peer reviewed products written for government agencies. Organizations that seek to promote the use of research evidence in policymaking should recognize that policymakers are more likely to turn to IES reports than journals, Dr. Gamoran noted.

Dr. Feuer agreed that the SRO should continue to play a role in publishing independently reviewed reports that carry deserved weight, especially because of the growing number of "advice givers" in the field (e.g., think tanks). Dr. Ricciuti said SRO staff looks not only at the technical quality of reports but also at the objectivity of the interpretation and presentation of findings.

Returning to the research grant process, Dr. Ricciuti described the typical timeline for review. She pointed out that screening the applications involves multiple offices. The SRO staff must assess whether the application meets the technical requirements of the competition. From start to finish, the process takes 8–9 months, comparable to that of the NIH.

Dr. Ricciuti described how the number and timing of NCER and NCSER competitions and submission dates changed because of dramatic funding cuts in FY 2013. The SRO adjusted to accommodate the changes. The timeline continues to be about 8.5 months from the time the RFA is published to the release of regret letters to applicants who did not receive awards. In FY 2015, despite an increase in applications, the SRO managed to allow more time for processing and reviewer assessment and earlier panel reviews. It was also able to release review summaries and scores through the new online review system before funding decisions were made (as NIH does).

Dr. Ricciuti said two FY 2015 off-cycle competitions, both with accelerated timelines, contributed to an extremely demanding year for the SRO. For one, the time from receipt of applications to release of regret letters was 6 weeks; for the other, 13 weeks.

In FY 2016, the SRO aims to shorten the application processing time by 3 weeks, give reviewers more time to evaluate applications, and hold review panels earlier in the year. It also plans to release scores online within a few days of the meeting and summary reviews about 1 month after the panel meets. Dr. Ricciuti said FY 2016 also brings two new off-cycle competitions, both of which differ from other types of IES competitions. Both will require substantial strategic planning, and both have accelerated timelines.

Dr. Ricciuti described some of the strategies used to accelerate the review process. For example, for the evaluation of state and local education programs and policies, the SRO was able to recruit reviewers before the applications were submitted because the focus was specific and limited and the number of applications anticipated was small. In addition, SRO used videoconferencing technology for the panel review.

Dr. Ricciuti welcomed suggestions to shorten the timeline for grant review while maintaining the quality of the process, avoiding overburdening reviewers, and providing adequate time for reviewer evaluation. Because she often hears complaints that the process takes too long, she asked what the target timeline should be.

Discussion

Dr. Feuer said the credibility of an organization is largely a function of the rigor of its review process. Once that credibility is compromised, the rest of the organizational investment is compromised. He recommended the SRO weigh any suggestions to shorten the review process very carefully to assess the risks and benefits. Identifying and addressing conflicts of interest is a very important part of the process, he added. Dr. Ricciuti said the SRO has an extensive, proactive process for assessing conflicts of interest.

Dr. Gamoran said the timeline seems appropriate given the volume of applications and the quality of the feedback to applicants. He said the resubmission process has improved greatly, to the credit of Dr. Ricciuti and the SRO, but continued vigilance is warranted. Dr. Ricciuti summarized the results of resubmissions (see the table). For both NCER and NCSER, resubmissions are funded at higher rates than first submissions. Dr. Ricciuti noted that the new IES website will include such data.

Number of Applications Funded and Funding Rates by Submission Status for NCER and NCSER Main Education Research Competitions, FY 2009 through FY 2015
(N = number of applications funded; % = funding rate)

                         NCER 305A                          NCSER 324A
               Overall   Resubmission  First Subm.   Overall   Resubmission  First Subm.
Fiscal Year    N    %    N    %        N    %        N    %    N    %        N    %
2009          71   13   31   21       40   10       33   11   12   17       21   10
2010          77    9   29   14       48    8       33   10   22   25       11    5
2011          74    9   34   15       40    7       39   12   32   25        7    3
2012          59    8   31   14       28    6       45   12   27   18       18    8
2013          57    9   46   21       11    3       14    4   11    9        3    2
2014          52    8   41   18       11    3       N/A  N/A  N/A  N/A      N/A  N/A
2015          81   15   62   28       19    6       30   12   25   33        5    3


Dr. Singer wondered if the overall process would benefit from having multiple submission deadlines throughout the year. She noted that the panel review of applications serves as a triage process, because most applications are not funded on the first submission. A more formal triage process would not speed up the process but could allow for decisionmaking in a more timely fashion.

Dr. Singer pointed out that most organizations do a poor job of screening out reviewers with a conflict of interest that stems from having had their own applications for similar research rejected. Such conflicts are not declared and may be hard to find. Dr. Singer also said IES has a treasure trove of data in its reviewer scores, and she suggested Dr. Ricciuti think about how to delve into that. For example, if some reviewers' scores highly correlate with those of others, it may not be necessary to have both reviewers. Sometimes, dissonance is better than consonance.

Regarding whether the same reviewers should evaluate a resubmission, Dr. Singer said fresh eyes should review it. The current triage system effectively guarantees continuity. She said there is a role for both approaches (continuity vs. fresh eyes).

Dr. McLaughlin noted that because of the limited funding, NCER and NCSER must continue to have one submission deadline so that they can ensure sufficient funds for awards.

Dr. Chard encouraged the SRO to keep working with NSF and NICHD and to continue putting more information online. The data show that the resubmission process has been working since 2009, despite some perceptions to the contrary.

Lunch

The Board recessed for lunch at noon, during which members reviewed the Board charter in a closed session. The public meeting resumed at 1:03 p.m.

Overview of the National Center for Education Statistics
Peggy Carr, Ph.D., Acting NCES Commissioner

Dr. Carr gave a brief history of NCES, which traces its origins to Congressional language from 1867 directing the collection of data on the status of education in the states and territories. The NCES commissioner meets monthly with the heads of the 12 other federal statistical agencies. The NCES staff numbers 112, and Dr. Carr is seeking to increase that number because NCES carries a very heavy load.

According to legislation, NCES' mandate is as follows:

  • Produce and disseminate relevant and timely information
  • Conduct credible and accurate statistical activities
  • Conduct objective statistical activities to collect data that are impartial, clear, and complete
  • Protect the trust of information providers by ensuring the confidentiality and exclusive statistical use of their responses

The mission of NCES is to collect, analyze, report, and disseminate education information and statistics in a manner that:

  • meets the highest methodological standards;
  • is timely, relevant, and useful to practitioners, researchers, policymakers, and the public;
  • is objective, secular, neutral, and nonideological; and
  • is free of partisan political influence and racial, cultural, gender, or regional bias.

In addition, NCES data collection is guided by four core values: quality, usefulness, predictability, and timeliness. For example, a 2004 push to improve timeliness and predictability led NCES to decrease the timeline dramatically for reporting results of the National Assessment of Educational Progress (NAEP). Every NCES commissioner has made quality a high priority. Dr. Carr said she is also focusing on usefulness of data.

Outlining the organization of NCES, Dr. Carr noted that the Statistical Standards and Data Confidentiality unit was created to enhance NCES' credibility. She pointed out that 85% of the staff has advanced degrees. Data collection efforts cover the lifespan (see Figure 2). Despite the wide range, Dr. Carr said NCES does not collect all data as often as she would like.

Figure 2: Selected NCES Data Collections

Products created by NCES range from First Looks (short, initial reports on data) to detailed statistical analyses and R&D reports. The center also offers numerous services, such as providing data management expertise to other ED staff, leading data work in the field, providing leadership for education assessment, and offering technical expertise in data collection and systems. It organizes conferences on best practices and facilitates training, both of which strengthen relationships at the school and district levels.

The NCES budget is divided into two separate lines. For FY 2015, the statistics budget (including international assessments) is about $100 million, and the national assessments budget is about $129 million.

Dr. Carr described the numerous types of contracts and agreements NCES relies on to do its work. The most challenging and costly component of the work is the technology that NCES brings to schools to conduct assessments. Dr. Carr said NCES is affected by the pervasive decline in survey response rates; in some cases, response rates are so low that results cannot be reported. Other challenges include complex internal processes (such as clearance and confidentiality issues), the quality of external data provided to NCES, programmatic vs. statistical use of data, funding uncertainty, and understaffing.

Discussion

Hirokazu Yoshikawa, Ph.D., asked why the NCES national assessment budget increased substantially from 2008 to 2009. Dr. Carr responded that when the NAEP was tied to No Child Left Behind, NCES received an additional $68 million. Also, in 2009, the NCES budget was increased to fund a full schedule of assessments. In response to Dr. Feuer, Dr. Carr clarified that the money from the statistics budget line can be used to support national assessments, but not the other way around. She added that NCES accounts for nearly half of the IES budget, and NAEP is a big part of the NCES budget.

NCES Assessment Division

Dr. Carr said that combining the NAEP and international assessments into one division was a good move, because the assessments share the same designs, vision, and goals. The NCES assessments monitor programs in the United States and internationally to provide indicators and to measure trends over time. Dr. Carr said NCES informs the world about what students know and what they can do.

Within the Assessment Division, NAEP accounts for about 60% of the work, which includes developing test items, collecting data, and managing NAEP state coordinators and NAEP ambassadors. The international branch has grown tremendously in the past 10 years, starting with the Trends in International Mathematics and Science Study (TIMSS) and more recently with strong interest in the Program for International Student Assessment (PISA). Both the NAEP and international assessments are led by governing bodies. A separate branch of NCES provides administrative support to coordinate contracts and budgets, for example.

Dr. Carr said NAEP is mandated by Congress, and states are required to participate in certain assessments every 2 years. Results of NAEP assessments are reported within 6–12 months of data collection via the First Look reports. The assessments are supported by NAEP state coordinators, who are not only liaisons with schools but also play an integral role in data collection and quality assurance. The first long-term trend assessments in reading and mathematics began in 1968.

Dr. Carr described NAEP as a three-legged stool that relies on the National Assessment Governing Board (NAGB) for policymaking, external contractors for services ranging from item development to scoring, and NCES for operations support. At the peak of the effort, 7,000 people collect NAEP data over the course of 6 weeks.

International Assessments

Daniel McGrath, chief of the Reporting and Dissemination Branch, said international assessments begin with children in fourth grade and continue through adulthood. Assessments spanning elementary through high school include TIMSS, PISA, the Progress in International Reading Literacy Study (PIRLS), and the Teaching and Learning International Survey (TALIS). The Program for the International Assessment of Adult Competencies (PIAAC) assesses adult literacy.

The International Association for the Evaluation of Educational Achievement conducts TIMSS and PIRLS. It has a general assembly in which NCES participates to develop policy and strategy. The Organization for Economic Cooperation and Development (OECD) sponsors PISA, PIAAC, and TALIS. Mr. McGrath and his colleagues represent NCES on various OECD boards that guide policy.

The International Assessment branch contributes to study design and development and leads data collection and operations. It also advocates for the development of educational measures, with NAEP as the gold standard and model for most international studies. Finally, the branch provides analysis, reporting, and dissemination of results.

Study design and development includes matrix sampling used by NAEP and all the international assessments. All the IES centers use the evidence-centered design created by the branch. The development process is extensive and can take up to 4 years. A number of task forces provide input on study design.

Developmental research to advance educational assessment is conducted through five contracts. Other efforts to advance assessment include the Survey Assessment Innovation Lab, a virtual laboratory to assess student skills and knowledge. Through PISA, the branch is looking at student behaviors during computer-based assessment to determine, for example, whether there are optimal strategies and what contributes to low performance.

In support of dissemination, the branch has created many tools and reports targeted to various audiences. They are available on the NCES website.

The changing landscape of education poses challenges. For example, the rapid, uneven transition to digital devices in classrooms and homes means habits are changing. There is an opportunity to collect a lot of data through digital devices. The variability in transitions to new standards, curricula, and assessments also poses a challenge.

Discussion

In response to Dr. Gamoran, Dr. Carr explained that NAEP and TIMSS measure the same age ranges. She believed the two studies could be integrated without abandoning either. Mr. McGrath added that PISA and TIMSS provide different information.

Dr. Singer pointed out that the National Academy of Education, under the direction of Dr. Feuer, is evaluating NCES efforts, including international assessments. One initiative will address the number of assessments and opportunities to combine them. Dr. Gamoran said the initiative reflects a refreshing openness to researchers on the part of NCES, and he hoped NCES would continue to support such initiatives. Dr. Feuer added that there may be some collaboration with the National Academy of Sciences and opportunities to bring in experts in demography and other social sciences. The most powerful rhetoric to support education reform comes from headlines comparing U.S. students with those of other countries, he said.

Mr. McGrath noted that TIMSS provides an international angle and is not very expensive; NCES has already saved money on administrative support by leaning on NAEP. Dr. Carr said the burden of testing is more significant than the cost. However, she concurred that TIMSS acts as an indicator and provides needed information.

Sample Surveys Division

Chris Chapman, associate commissioner, said the division designs and conducts sample surveys of individuals, schools, and school systems and then disseminates the findings. The surveys seek to gather data not found in administrative records, such as individual attitudes and motivations. The staff is divided according to the type of data collection: longitudinal surveys and cross-sectional surveys.

Longitudinal surveys have a range of cohorts, from newborns to college-age students. In early education, data come from parents and providers; older students provide data directly. The longest-duration studies collect data for up to 10 years.

Regular cross-sectional surveys include the National Household Education Surveys, the National Teacher and Principal Survey (NTPS), the Private School Survey, the School Crime Supplement, and the School Survey of Crime and Safety. As-needed studies include short, topical NTPS modules; Fast Response Surveys and Postsecondary Quick Information Surveys; and Career and Technical Education Statistics. In addition, the branch develops cross-sectional surveys for ED's Education School Climate Survey and the Recent Graduate Employment and Earnings Survey.

The division spends a lot of time disseminating results. Online, the Datalab tool allows users to sort through data while maintaining correct statistical weights and variances. With eDAT, users can download raw data for analysis.

Discussion

Dr. Gamoran appreciated the openness of the branch and NCES in general. He asked what the survey branch needed most; Mr. Chapman said the branch needs more staff.

Administrative Data Division

Ross Santy, associate commissioner, said the mission of the division is to oversee the collection of "universe" data from state and local education agencies and postsecondary institutions efficiently and with minimum burden. Much of the division's efforts involve working with others in ED and users of the data. The division's three branches correlate with the data sources. The State Longitudinal Data Systems Branch supports state capacity for data system management. The Postsecondary Branch and the Elementary and Secondary Branch operate data collection and reporting in schools.

Data collected include numbers of students (meeting specific criteria), administrators, and faculty; levels and amounts of resources; and counts by status. These data are important to establish a strong, reliable data chain.

The Postsecondary Branch runs the Integrated Postsecondary Education Data System, which collects data from all institutions that participate in federal student financial assistance programs. Data are released through First Look reports, methodology reports, data files, and technical review panel reports. The branch also maintains online tools; it is currently reconfiguring its website to better support mobile access.

The Elementary and Secondary Branch runs the Common Core of Data (CCD) and EDFacts and supports civil rights data collection. The branch works with other programs to determine what they need and what data they can provide. The branch seeks to improve CCD data management and is working with EDFacts Datamarts for ED programs.

The State Longitudinal Data Systems Branch focuses on the Statewide Longitudinal Data Systems, which recently added 16 new teams. In addition to annual papers, updates, and online tools, the branch helps states improve data collection.

The Administrative Data Division also supports developing work. For example, it is ramping up its mapping and geography efforts to incorporate more geographic census data. A new mapping tool, MapED, allows users to explore information at national and local levels. Data integration projects include the Wage Index, which incorporates teacher salary and geographic data. The ED Data Inventory is an example of the division's development contributions to data use and transparency.

The division seeks to develop and maintain a national network, which involves a lot of relationship-building and maintenance. Mr. Santy concluded by pointing out that NCES does not directly control any of the systems or organizations that manage the data needed to create high-quality education statistics.

Annual Reports and Information Unit

Thomas D. Snyder, program director, said the unit oversees numerous reports, including the annual Condition of Education report mandated by Congress. The Digest of Education Statistics is widely used, especially by graduate students. Examples of topic-specific reports are the Indicators of School Crime and Safety, which was compiled with data from the Department of Justice, and Trends in High School Dropout and Completion Rates in the United States. The Projections of Education Statistics report is often incorporated into other reports and used by businesses, educators, and policymakers. This year, the unit revitalized its series Status and Trends in the Education of Racial and Ethnic Groups. It also oversees production of other special reports using new NCES data.

The unit responds to data requests from policymakers, ED program staff, and others. Mr. Snyder said NCES often gets media coverage because it is able to respond to media requests for information quickly. The unit also coordinates the publication planning and review system and provides formula grant allocation services. It supports various popular ED websites, including Fast Facts. The unit works with the Federal Interagency Forum on Child and Family Statistics and the OECD.

The unit coordinates social media and outreach for NCES. With the growing use of mobile devices, NCES increasingly relies on mobile-friendly videos, its blog, and Twitter to provide updates and announcements and to reach a broader audience.

Statistical Standards and Data Confidentiality Unit

Andrew White, senior research statistician, sitting in for Chief Statistician and Program Director Marilyn Seastrom, said the unit's mission is to support NCES in the collection and analysis of education statistics in a manner that meets the highest methodological standards, while ensuring that the data are timely, objective, secular, neutral, nonideological, relevant, and useful. Activities include promoting statistical standards and protecting data confidentiality. The unit offers technical support to NCES and other statistics organizations and coordinates quality review.

The unit provides training through the NCES Distance Learning Dataset Training System, which helps users identify and use relevant NCES data. Training also explains how to find and use online data tools. To support training, the unit produces the NCES Statistical Standards, Tabular Guidelines, and Handbook of Survey Methods. It also supports the ED Data Inventory and the National Forum on Education Statistics.

NCES Stakeholders
Nancy Potok, Ph.D., Deputy Director, U.S. Census Bureau

Dr. Potok said the Census Bureau and NCES are not only partners and colleagues, but NCES is also a client of the Census Bureau, in that it relies on the Bureau's data collection efforts for some initiatives. Both organizations are part of the federal statistics system, and both participate in the Interagency Council on Statistical Policy and its many working groups.

The Census Bureau is among the largest statistical agencies, with an annual budget over $1 billion and 17,000 employees. Much discussion among the statistical agencies revolves around independence and the relationship of statistics to policy, the need for objective data, the stress of changing administrations, and the influence of department leaders on statistical agencies. The agencies tend to band together to uphold the standards and independence of federal statistics and to move methodology forward.

The Census Bureau is involved in at least a dozen NCES surveys. The two agencies are working together to improve data collection, advance methodology, make data collection more efficient for respondents (especially for school districts taking multiple surveys), and improve the quality and timeliness of data. In response to Dr. Chard, Dr. Potok explained that the Census Bureau has interagency agreements with NCES to support data collection implementation and processing, survey design, timing, and other areas. In some cases, the Census Bureau acts as a contractor for NCES.

Brian Harris-Kojetin, Ph.D., Deputy Director, Committee on National Statistics, National Academy of Sciences

Dr. Harris-Kojetin formerly served in the Office of Management and Budget (OMB) and worked closely with the 13 statistical agencies. In that position, he reviewed numerous ED and NCES evaluations and surveys, and he offered his perspectives on the basis of his interactions with NCES staff and programs as well as interagency interactions. Dr. Harris-Kojetin said the federal statistical system can be fragmented but works well.

All the statistical agencies face challenges from declining cooperation, flat budgets, increasing costs, and growing concern about the burden on respondents. Dr. Harris-Kojetin said NCES is a key innovator in the field. Thanks to NCES, the OMB disseminated a memo on increasing the use of administrative records for statistical purposes, establishing a broader, government-wide foundation for the NCES approach.

Dr. Harris-Kojetin raised the issue of a potential loss of stature and autonomy for NCES since it became part of IES. He said IES is a model for other agencies, but he was not sure IES has been good for NCES. The statistical community is concerned about the effect of IES on the mission of NCES, he noted.

Michael Casserly, Executive Director, Council of the Great City Schools

Mr. Casserly said CGCS is a coalition of large, urban, public school systems and is an avid user of NCES data. He expressed appreciation for Dr. Carr's leadership and said he believes she is one of the most valuable public servants in federal government. The CGCS is an NCES stakeholder at the ground level, mostly around NAEP data. It works closely with NCES and NAGB on the Trial Urban District Assessment (TUDA).

Mr. Casserly said the CGCS suggested to NAGB 15 years ago that it create TUDA to demonstrate that urban public education is committed to high standards. The assessment enabled comparison of urban school districts across state lines, which other assessments did not allow. It also provided a mechanism for evaluating reforms.

Data from NCES enabled the CGCS to answer questions about academic attainment and other issues in urban public schools. Mr. Casserly said he is grateful for NAEP's contribution and enormously pleased with NCES' continuing collaboration with the CGCS. Other NCES products, such as data on staffing, budget, finance, and salaries, provide unique databases allowing comparisons across urban school districts. Mr. Casserly said NCES is a reliable partner, particularly for NAEP.

Mr. Casserly acknowledged that it takes a long time to analyze data and get it right, but the lack of timeliness can be a barrier to utility. He suggested establishing an ongoing collaboration between NCES and large urban school districts on issues such as data quality and usefulness. In the 1970s, the CGCS had a standing advisory group of urban school researchers and testing directors who met with NCES staff to discuss details and utility of the data. It was a mechanism for NCES and stakeholders to share information.

Discussion

Asked for an example of combining data activities, Dr. Potok said the Census Bureau is doing a lot to link records. The Bureau is required to use data available from existing records whenever possible instead of collecting the same data again. The American Community Survey (ACS) is a rich data set that NCES might use to look at education programs in impoverished geographic areas. More can be done to link data, said Dr. Potok.

Dr. Potok noted that bipartisan legislation introduced in the U.S. Senate would create a clearinghouse for federal records, which the Census Bureau supports. Education records are the only dataset the Census Bureau cannot access because privacy protections override the Bureau's mandate. A mechanism that combines ED data, Census data, and ACS data could be very useful, said Dr. Potok.

The Census Bureau has research data centers across the country that allow researchers to delve into records linked by the Bureau staff. Most of the data centers are located at universities, while some are at agency offices. The Bureau recently proposed expanding the data centers to include all the federal statistical research agencies. By doing so, data could be linked in a secure environment, with privacy and confidentiality protected, opening up new doors for research, said Dr. Potok. Dr. Feuer said education researchers are very sensitive to privacy and confidentiality issues.

Dr. Feuer asked Mr. Casserly for advice on how to protect TUDA data against the inevitable desire to use those data as a scorecard. Mr. Casserly said the CGCS knew that asking for comparable data and public reporting would pose that risk. However, the ability to collect data to improve education remains critical. Mr. Casserly said that NCES gets better every year at presenting the data in a manner that does not invite the scorecard mentality or feed the urge to draw simplistic conclusions. Mr. Casserly believes TUDA is the way to address the enormous pressure to improve academic attainment, close achievement gaps, and respond to public concerns.

Dr. Gamoran asked whether a data clearinghouse is needed if the Census Bureau is already working on expanding its data centers. Dr. Potok said the Census Bureau has a legislative mandate to use administrative records when possible but no corresponding legislation that requires other agencies to provide that data to the Bureau. Gaining access to records can take years of negotiation, said Dr. Potok. A clear Congressional mandate could set an expectation for sharing records.

Dr. Potok continued that when data are collected, they become the Census Bureau's property, which triggers stringent privacy protections and makes it very hard to get the data back to those who provided them or to researchers. Any mechanism that addresses privacy, security, and informed consent in a way that the public finds satisfactory would be welcome, said Dr. Potok. A clearinghouse with standards, expectations, and protections that allows research access would be a good step.

Dr. Harris-Kojetin said the research community is very enthusiastic about the proposed legislation, which raises the profile of statistical data. He hoped there was an opportunity to leverage existing structures and current protection methods.

Asked to expand on his contention that IES may not be good for NCES, Dr. Harris-Kojetin explained that the Education Sciences Reform Act (ESRA), which is up for reauthorization, transferred some of the autonomy and independence of NCES to IES. Dr. Harris-Kojetin said that IES has helped build a strong scientific research arm for ED, and the consolidation of various research components within ED has been helpful. However, NCES has lost some of its autonomy and clout. A recent law that gave the Census Bureau's leadership fixed terms of office also took away the Senate confirmation requirement for the NCES commissioner. The proposed legislation to reauthorize ESRA would demote the NCES commissioner to an appointee of the IES director, which Dr. Harris-Kojetin considers a major threat. The federal statistical agencies should have independent heads, he said.

Dr. Gamoran said the Board used to advise Congress on how to treat IES, but the members were split on the issue of appointment of leadership. Half did not want to give up the rare position that afforded someone in education such high status, while the other half argued that IES would function better if the centers were all on equal footing. Dr. Gamoran suggested that at a future meeting, the Board discuss whether the NCES commissioner should be a Presidential appointee.

Dr. Ford asked if there were other concerns beyond the loss of stature. Dr. Harris-Kojetin said NCES could become buried further down in the bureaucracy, as is true for other federal statistical agencies. Dr. Neild disagreed, saying that it is entirely different when a statistical agency is placed within a non-independent policy office, as opposed to being part of a larger scientific agency that has statutory independence and publication authority. By virtue of authorizing legislation, all of IES, including NCES, is granted this independence. She said that a compelling case had not been made for actual harm to the work of NCES.

Because the Census Bureau is responsible for sensitive economic data, the potential for political manipulation is always a concern, said Dr. Potok. It is important to have a person of stature at the helm who can remain independent from politics. Without sufficient stature, an agency's products can be manipulated by others to suit a specific agenda. When the agency head is confirmed by the Senate, that individual has more "muscle" and can take findings directly to the Secretary. Dr. Harris-Kojetin emphasized that the concern about independence is real. He said ED could reinforce its position by ensuring that there is no mechanism for manipulation or control of NCES data.

Mr. Casserly urged the Board to give more attention to stakeholders at the local level who would like to use more NCES data. Local school administrators, researchers, and assessment directors, for example, even in big cities, are unable to take full advantage of NCES data. There are steps NCES should consider to be more relevant to people in the field, which would also increase the number of people who advocate for NCES when it seeks higher appropriations.

Dr. Potok strongly agreed with Mr. Casserly and pointed out that responsiveness will continue to decline unless data collection agencies can give respondents back something of value. The more NCES and other agencies can put data out in formats that people can use to help them do their jobs and inform their decisions, the more the agencies can justify asking for the data. Linking administrative records would lead to more timely information at narrower geographic levels that could be fed back to the primary data users, not just researchers, Dr. Potok added.

Mr. Casserly said TUDA provides very detailed data that allow users to look into all kinds of questions of immediate interest to those who participated. With other efforts, NCES should aim to provide more data on useful topics, such as financing and staffing. However, Mr. Casserly noted that the more granular the data, the more suspicious users are of the accuracy of the data, and school districts will not use them. When NCES gets it right, it does well, said Mr. Casserly, but when studies are problematic, the utility of the data decreases.

Deborah Phillips, Ph.D., pointed out that low response rates contribute to all sorts of data collection problems, such as utility and timeliness. They may also reflect survey fatigue or personnel turnover. Mr. Casserly agreed that responsiveness is a serious concern, but when school districts want real, useful data, the burden is not an issue. For example, the CGCS collects substantial academic and operational data annually, yet school districts do not complain, because they appreciate the frequency, granularity, timeliness, and nature of the data.

Dr. Harris-Kojetin said household survey responses are declining in every rich country. Dr. Potok said a "bad" response rate for the Census Bureau is about 70%, but she also agreed that response rates are declining. The level of effort required to get responses is becoming more intense and expensive. Dr. Potok said education survey fatigue could be addressed with a more holistic view of the survey landscape for schools. That is, researchers should look at the schools in the sample and consider how often they are asked to respond and also the timing of surveys. For schools that must take part in multiple surveys, researchers could at least acknowledge the burden and work with the schools to provide meaningful data back to them. She conceded that doing so is difficult when various contractors conduct the surveys.

Dr. Feuer said some political resistance increases the perception of the burden of government-sponsored surveys and leads to dismissal of their utility. He suggested that the CGCS may get better responses because it is not a government agency. Mr. Casserly pointed out that his organization cannot mandate compliance. Stakeholders tell the CGCS what data they need to inform practice. Mr. Casserly again encouraged the Board to create a mechanism that would give local schools and communities more opportunities to share information with NCES and a better sense of what NCES is doing.

Dr. Phillips agreed that steps can be taken to lessen the imposition, such as engaging participants so they feel more ownership, providing timely results, and avoiding overlap between state tests and other surveys, for example. Dr. Harris-Kojetin said some of the problems are structural. Specifically, PISA was inconvenient and offered no payoff to schools, so it frequently did not meet thresholds for reporting. However, in countries where PISA serves as the equivalent of the NAEP, there are no problems.

Dr. Yoshikawa said a National Academy of Sciences panel on integrating immigrants into U.S. society recommended collecting data on the children of immigrants by parents' birthplace through the ACS. He asked what efforts are underway to gather more data on youth. He also asked whether NCES provides input into OECD initiatives other than PISA. Dr. Carr responded that NCES is active in many OECD initiatives. Dana Kelly, chief of the International Assessment Branch and chair of an OECD committee on early childhood education, said NCES works with various entities to avoid duplicating efforts and to ensure that survey items are appropriate for U.S. populations.

Mr. Chapman said NCES does collect data on parents' birthplaces, but the samples are not big enough to provide meaningful data by country or region of origin, because it is a nationally representative sample. He said NCES could work with the Census Bureau on the ACS, but it would need a legislative mandate to add questions. Mr. Chapman thought that, potentially, some data could be collected from administrative records. Mr. Santy added that school systems that get Title III funding (to address students with limited English proficiency) might have data on parents' birthplace. Other programs may have academic achievement results for English language learners, but the two datasets do not overlap. Mr. Santy said NCES has not yet found a way to get detailed data about the children of immigrants that is not burdensome.

Dr. Phillips asked where data on children with special needs fit into the survey landscape. Dr. Carr replied that NAEP has the largest sample of such children but cannot break down the data too far. Administrative data collections capture some students with special needs, and so does the middle school survey. There is a representative set of indicators for students with special needs, Dr. Carr added. Dr. McLaughlin pointed out that a future iteration of the middle school survey will have a section on students with special needs. Mr. Casserly noted that broader data are available from ED's Office of Special Education Programs and Office for Civil Rights. Mr. Santy said a lot of data could be gathered from various existing sources with more resources and time. He suggested a restricted-use license could be granted to a researcher who would like to work with the data.

Dr. Chard and Dr. Carr thanked the stakeholders for bringing new ideas to the table.

Closing Remarks & Adjournment
David Chard, Ph.D., NBES Chair

Invited to give her impressions of the day, Dr. Neild said she felt a key takeaway is the need for IES to work with NIH and NSF on how the organizations' funding streams influence one another. She said NCES is incredibly productive and creative, despite its very small staff, as is the SRO. Although government is not known for being nimble and creative, said Dr. Neild, IES is both, doing a tremendous amount to support the research infrastructure. She reiterated her request that the Board consider how IES should position itself to preserve its importance and the high quality of its work in the next administration.

Dr. Chard thanked all the Board members for their participation and gave special thanks to Dr. Gamoran and Anthony S. Bryk, Ed.D., who have completed their final terms. He said both have always brought good questions and insight to the table. Dr. Chard, Larry V. Hedges, Ph.D., and Dr. Yoshikawa have completed their first terms on the Board and do not yet know if they will be reappointed. If Dr. Chard is not reappointed, Vice Chair Dr. Loeb will lead the next meeting, and elections for a new chair and vice chair will be held.

Dr. Chard said the Board has been looking for ways to integrate itself more constructively into IES' work. The Board charter has provisions for standing committees, which could include committees that align with each of the centers. If such committees were developed, the commissioners would work closely with the Board to identify which Board members should serve on which committees. Such an arrangement could mean that the committees and the commissioners hold their own discussions and present jointly at Board meetings. An ad hoc committee could be formed to address special issues of interest to IES, allowing Board members to dig deeper into IES projects and play a more productive role.

Dr. Chard said the next NBES meeting is scheduled for February but no date has been determined. He adjourned the meeting at 4:22 p.m.

Report prepared for NBES by Dana Trevas, Shea & Trevas, Inc.

The National Board for Education Sciences is a Federal advisory committee chartered by Congress, operating under the Federal Advisory Committee Act (FACA; 5 U.S.C., App. 2). The Board provides advice to the Director on the policies of the Institute of Education Sciences. The findings and recommendations of the Board do not represent the views of the Agency, and this document does not represent information approved or disseminated by the Department of Education.