National Board for Education Sciences
February 24, 2012 Minutes of Meeting

Location
Institute of Education Sciences (IES) Board Room
80 F Street NW
Washington, DC 20001

Participants
National Board for Education Sciences (NBES) Members Present
Bridget Terry Long, Ph.D., Chair
Kris D. Gutierrez, Ph.D., Vice Chair
Deborah Loewenberg Ball, Ph.D. (via telephone)
Robert Granger, Ed.D.
Margaret "Peggy" R. McLeod, Ed.D.
Robert A. Underwood, Ed.D.

NBES Members Absent
Anthony S. Bryk, Ed.D.

Ex-Officio Members Present
John Q. Easton, Ph.D., Director, IES, U.S. Department of Education (ED)
Elizabeth Albro, Ph.D., Acting Commissioner, National Center for Education Research (NCER), IES
Alison Aughinbaugh, Ph.D., Research Economist, Office of Employment and Unemployment Statistics, Division of National Longitudinal Surveys, Bureau of Labor Statistics
Sean P. "Jack" Buckley, Ph.D., Commissioner, National Center for Education Statistics (NCES), IES
Rebecca Maynard, Ph.D., Commissioner, National Center for Education Evaluation and Regional Assistance (NCEE), IES
Peggy McCardle, Ph.D., M.P.H., Branch Chief, Child Development & Behavior Branch, Center for Research for Mothers and Children, Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD), National Institutes of Health (NIH)
Joan Ferrini-Mundy, Ph.D., Assistant Director, Directorate for Education and Human Resources, National Science Foundation (NSF)
Deborah Speece, Ph.D., Commissioner, National Center for Special Education Research (NCSER), IES
Robert Kominski, Ph.D., Assistant Chief, Social, Economic and Housing Statistics Division, U.S. Census Bureau

NBES Staff
Monica Herk, Ph.D., Executive Director, Designated Federal Official (DFO)

IES Staff
Sue Betka
Lisa Bridges, Ph.D.
Wai Chow
Allison Orechwa, Ph.D.
Anne Ricciuti, Ph.D.

Invited Presenters
Paul Carttar, M.B.A., Director, Social Innovation Fund (SIF), Corporation for National and Community Service
Naomi Goldstein, Ph.D., Director, Office of Planning, Research and Evaluation, Administration for Children and Families, Department of Health and Human Services (HHS)
Lloyd Horwich, Office of Legislation and Congressional Affairs, ED
Diane Massell, Ph.D., Research Associate, School of Education, University of Michigan
Ruth Neild, Ph.D., Associate Commissioner, NCEE, IES

Members of the Public
Karen Akins, ED
Donald Duggins, Crowe Horwath LLP
Jean Gossman, Education Daily
Sarah Hutcheson, Society for Research in Child Development
Kim Hymes, Council for Exceptional Children (CEC)
Megan Foster, Society for Research in Child Development
Carla Jacobs, Lewis-Burke Associates
Jady Johnson, Reading Recovery Council of North America
Jim Kohlmoos, National Association of State Boards of Education
Augustus Mays, WestEd
Margaret Olmos, ED
Sheritta Cooper Porter, AFYA, Inc.
Tim Silva, Mathematica
Sarah Sparks, Education Week
Sarah Spreitzer, Lewis-Burke Associates
Gerald Sroufe, American Educational Research Association (AERA)
William D. Welkowitz, Villanova Law School
Phoebe Winter, Pacific Metrics

Meeting Summary

Call to Order, Approval of Agenda, Chair's Remarks
Bridget Terry Long, Ph.D., NBES Chair

Dr. Long called the meeting to order at 8:33 a.m., and Dr. Herk, NBES DFO, called the roll. Dr. Herk acknowledged Mary Grace Lucier, the previous NBES DFO, who retired after 28 years. Ms. Lucier served many boards, including the predecessor to NBES, and she will be missed, said Dr. Herk. NBES members unanimously approved the agenda for this meeting. NBES members also unanimously approved the summary of the October 14, 2011 NBES meeting with no changes.

Dr. Long thanked the Board members for electing her as the chair. She also thanked Dr. Herk, Dr. Easton, and Dr. Gutierrez (the recently elected vice chair) for partnering with her to create an agenda of substance.

Dr. Long described the purpose of NBES as outlined in federal statute and the requirement that the Board submit an annual report on its accomplishments. The meeting's agenda and the annual report that is currently being developed reflect three themes the Board has discussed over the past year and will continue to discuss during the coming year:

  • Impact, including how IES disseminates and scales up promising practices
  • IES's funding of research
  • Advocacy for the support and use of research

Update: Recent Developments at IES
John Q. Easton, Ph.D., IES Director

Dr. Easton presented the portion of the President's budget request for Fiscal Year (FY) 2013 that applies to IES. The total, $621.1 million, represents a nearly 5 percent increase over FY 2012. Dr. Easton said he was pleasantly surprised to see proposed increases for the Research, Development, and Dissemination line and for the Statistics line. Funding for the Regional Education Laboratories (RELs) would remain flat, as it has for many years. The Assessment line would be reduced by $6 million for FY 2013; most of the Assessment funding supports the National Assessment of Educational Progress (NAEP), but some supports the National Assessment Governing Board, which would see a $1 million reduction. Special Education Research funding would also remain flat, as would funding for Special Education Studies and Evaluation, while the budget for the Statewide Longitudinal Data Systems (SLDS) would increase. Dr. Easton pointed out that while the proposed budget represents an overall increase for IES, individual line items show both increases and decreases.

Discussion

Dr. Easton explained that many people are consulted in the development of the President's budget. The rationale for particular funding levels is described to some extent in the President's budget, but the thinking behind the funding is not always clear, he said. Dr. Buckley noted that the increase in funding for Statistics includes $6 million to NCES to support state participation in the Program for International Student Assessment. NCES would offer the funding to states willing to participate in a cost-sharing arrangement.

Dr. Easton said he and his staff must make budget decisions for IES despite the uncertainty of funding in the current economic environment. He emphasized that the President's budget is only a request, and it is unlikely that the federal FY 2013 budget will be approved soon. In terms of the motivation behind some of the proposed cuts, Dr. Easton said the budget request suggests that some programs could be conducted at lower costs, but the language is not specific.

Commissioner Updates
National Center for Education Evaluation and Regional Assistance (NCEE)
Rebecca Maynard, Ph.D., NCEE Commissioner

NCEE awarded 10 REL contracts that launched in January 2012. An opening conference in January brought together the director and nine team members from each REL; key task leaders, not just senior management, attended. The conference focused on expectations and challenges. Dr. Maynard emphasized that NCEE has developed an expert REL management team that understands and supports the principles of strong organizations. Under the previous round of REL contracts, 26 randomized controlled trials were conducted; the final two are nearing completion. The research results are available on NCEE's website.

The directors of the National Library of Education and the Education Resources Information Center (ERIC) retired in December, and efforts are underway to fill those positions. NCEE is piloting a process for updating What Works Clearinghouse (WWC) practice guides and is reviewing proposals for a new procurement to update the English Language Learner Practice Guide. It has also implemented a new strategy to speed up the WWC Quick Reviews. To make full use of all the studies reviewed for the WWC, NCEE is preparing to publish single-study reviews.

Dr. Maynard said NCEE is embarking on a modest cross-agency project to develop some standards and practical tools for evaluation. NCEE continues to work with other agencies, including the U.S. Social Security Administration, the U.S. Department of Health and Human Services (HHS), the U.S. Department of Labor, the U.S. Department of Justice (DOJ), and the U.S. Department of Homeland Security on evidence standards, reviews, and dissemination. A group of NCEE staff is working with the NSF on common evidence standards. Finally, NCEE just released two more Technical Methods reports on the use of state test data and will soon release two more on non-experimental methods.

Discussion

Dr. Maynard clarified that one aspect of the interagency efforts is to seek some consensus among the federal agencies funding programming and research concerning what constitutes a reasonable study design or reasonable standards for federally funded studies. Another aspect of the interagency efforts is looking for ways to make it easier to access and exchange existing evidence across different federal agency platforms, such as the What Works Clearinghouse or DOJ's platform. The final aspect of interagency efforts is around cost-effectiveness and the need to do more with less.

National Center for Education Research (NCER)
Elizabeth Albro, Ph.D., NCER Acting Commissioner

Dr. Albro said the first wave of a new round of NCER research awards will be announced shortly and NCER is working on the Requests for Applications (RFAs) for 2013. NCER research grantees will be meeting just before the Society for Research in Educational Effectiveness (SREE) meeting in March; slides from presentations made at the SREE meeting will be posted on the SREE website (https://www.sree.org/).

Beyond the research portfolio, NCER also takes part in the IES Small Business Innovation Research (SBIR) program. Of 35 projects funded since 2002, 12 have developed commercially viable products for schools, such as technology for assessing children with disabilities. Another 12 projects are working toward commercialization of a product, while the remaining projects did not succeed. Dr. Albro said that a roughly two-thirds success rate is a reasonable distribution of outcomes for a program designed to support innovation. Two high-profile SBIR-funded projects are Filament Games, which won the grand prize for the 2011 National STEM Video Game Challenge, and Insight Learning Technology, which has been highlighted by national media outlets. Dr. Albro added that several products initially developed with SBIR funding are undergoing efficacy studies or further development using NCER research grant funding.

Dr. Albro said the Center for Analysis of Longitudinal Data in Education Research (CALDER) recently held its fifth conference. She highlighted research from the conference on the effects of a school district's decision to require algebra in middle school. The research takes advantage of the natural experiment set up by the district-wide policy change and is available on the CALDER website (http://www.caldercenter.org/events/5th-annual-calder-conference.cfm).

Discussion

Dr. McCardle commented that the National Institutes of Health (NIH) SBIR program awards grants, while IES's program establishes contracts. Dr. Albro explained that the SBIR programs are administered by the Small Business Administration, and the businesses retain the intellectual property rights to the products developed. The contract mechanism gives the federal program office a greater say in the project, particularly in terms of research support, because many innovative technology developers have limited research expertise.

National Center for Special Education Research (NCSER)
Deborah Speece, Ph.D., NCSER Commissioner

Dr. Speece said one of her goals after becoming commissioner of NCSER was to identify important research issues in special education. In November, NCSER convened a technical workgroup of scholars to talk about pressing issues for children and youth with disabilities. Additional input came from a group of NCSER grantees and others attending the Pacific Coast Research Conference in February 2012, who met to discuss, among other things, how IES can help them with their research. Also, the Council for Exceptional Children (CEC) has organized a meeting to bring teachers together with NCSER staff and staff from ED's Office of Special Education Programs (OSEP) to discuss the needs of practicing professionals that may inform NCSER and OSEP activities. Dr. Speece said another goal of hers was to increase the number of grants NCSER funds, and she anticipated good news on that topic.

Dr. Speece said she also hoped to increase NCSER's communication with the fields of education and special education and across the government. She has initiated conversations and collaboration with OSEP and ED's National Institute on Disability and Rehabilitation Research (NIDRR). NCSER is discussing with Charlie Lakin, the new director of NIDRR, how to address achievement and outcomes among adolescents with disabilities. Discussion is also underway with NSF and NIH about capacity-building.

NCSER will have several sessions at CEC's annual conference in April. These sessions will include a grant writing workshop, presentations that showcase some of the research sponsored by NCSER, and a "What's New at NCSER" presentation by Dr. Speece. Dr. Speece has also been raising awareness about NCSER by presenting at numerous conferences and by meeting with stakeholders when they come to Washington, DC.

Discussion

Dr. Speece said that one of the themes that arose from the technical workgroup in November was the need to address the intractable problems of children and adolescents with learning disabilities—that is, building a science of intensive instruction. Another theme was the adolescence issue. The final theme was the importance of focusing on the context of interventions for children—the school context and the district context. Dr. Speece acknowledged that evaluating the effectiveness of teachers is an enormous issue and special education teachers are concerned about how they will be evaluated when the children they teach are also in general education classes. She hoped that NCSER would fund research on measures for special education teachers—such as malleable factors that connect teacher behaviors and student outcomes.

National Center for Education Statistics (NCES)
Jack Buckley, Ph.D., NCES Commissioner

NCES has issued several high-profile reports recently, including the 2011 Nation's Report Card, which presented NAEP results for reading and mathematics in grades four and eight. NCES also published findings about school crime and victimization from the School Crime Supplement to the National Crime Victimization Survey; data for that survey were collected by NCES, the Bureau of Justice Statistics, and the U.S. Census Bureau.

The quality of state-level longitudinal data is a high priority, said Dr. Buckley, and the SLDS grant program has increased the federal role in improving state and local data quality. NCES convened the National Forum on Education Statistics (immediately before the 25th annual Management Information Systems Conference) to bring states and districts together to discuss systems and data quality issues. For the first time, the RELs joined the Forum as associate members. NCES hopes that including RELs will help to build a bridge between data collection and use.

NCES is also publishing numerous best practices as guides for states and districts on data use. A set-aside from the SLDS grant program funds technical assistance, and NCES and NCEE are working together to focus more of that technical assistance on data use.

Education is among the few sectors that lack a common standard for data collection, Dr. Buckley pointed out, so efforts are underway to create a voluntary set of Common Education Data Standards (CEDS) to allow states to exchange information both for research and for logistic purposes (e.g., transferring student records). The first step is defining common data elements; for something like gender, the standards would establish a single definition and codes that all systems could use. Other elements are more complicated and require more negotiation among stakeholders, such as early childhood certification status or child care center accreditation, said Dr. Buckley.

Dr. Buckley emphasized that the CEDS are not a mechanism for data collection and the federal government is not creating a centralized education database. Rather, the effort brought together stakeholders (e.g., state and local education agencies, individual institutions, HHS, and the U.S. Department of Labor) to define the elements that should be included in databases across systems to facilitate data exchange. NCES sought broad representation from the field and held three rounds of public comment to ensure it received feedback. NCES is also developing tools to demonstrate the purpose and utility of the standards. Version 2 of the CEDS was released on January 31, 2012.

NCES also worked with other organizations that have developed standards (the Schools Interoperability Framework and the Postsecondary Electronic Standards Council) and education associations, such as the Council of Chief State School Officers and the State Higher Education Executive Officers. The Gates Foundation and the Dell Foundation have funded communication and implementation efforts. Ultimately, the goal of the standards is to create a voluntary common vocabulary, tools, and a model that allows stakeholders to use data systems across states.

Discussion

Dr. Buckley clarified that it is not enough to build a single set of standards based only on the common elements on which all states can agree. Version 2 of CEDS includes standards that are useful and used by enough stakeholders to have merit, but not every element or standard may be relevant to every system. At present, the CEDS have over 600 discrete elements, and some systems have more—and more nuanced—elements that will be incorporated into the next version of the CEDS. Dr. Buckley also noted that federal, state, and local stakeholders representing early childhood education were involved in the CEDS effort. The website, http://ceds.ed.gov/, provides more information and tools.

Dr. McCardle agreed to provide Dr. Buckley with contact information for a subset of independent schools focused on children with learning differences that are developing a new data system. The National Association of Independent Schools has had a database for over a decade. Dr. Buckley said he has reached out to independent schools, but they have not provided much input on the CEDS. As an example of what the CEDS contain, Dr. Buckley described the data element for capturing Hispanic or Latino ethnicity, which system developers or programmers could use to implement the standards. He also agreed to provide the NBES members with a copy of his presentation, which includes a link to the CEDS website.

NBES 2012 Annual Report: Review of Initial Draft
Introduction and Framing
Bridget Terry Long, Ph.D., NBES Chair

Dr. Long explained that past annual reports were primarily descriptions of the Board makeup and IES's activities that were compiled by IES staff. At the October 2011 meeting, the Board agreed to develop a more substantive report. The final report will be published in June and will reflect the deliberations and decisions of the Board during its June 2011, October 2011, and February 2012 meetings. Dr. Herk and Dr. Long reviewed minutes, agendas, and presentations to create a draft that is organized around the Board's statutory responsibilities as outlined in the legislation that created NBES. The draft was sent to Board members for review before the meeting and members were asked to comment on specific areas.

Presentation of Draft
Monica Herk, Ph.D., NBES Executive Director

Dr. Herk summarized the content of the sections of the draft report and their sources. Sections I through III (Background on the Board, National Education Center Updates, and Description of IES-Funded Research) were drawn primarily from meeting summaries. The Description of Advocacy for Education Research (Section VII) was drafted on the basis of letters the Board sent to the ED secretary regarding Elementary and Secondary Education Act (ESEA) waivers and evaluation. Dr. Herk said she relied on her own judgment to determine when and how much of the Board discussion to include in the annual report, and she welcomed feedback. Dr. Long added that she and Dr. Herk tried to identify and highlight the recurring themes of Board discussions. Dr. Granger hoped that, where possible and for future reference, the annual report would capture the rationale for including or excluding areas of discussion.

Dr. Herk outlined the remaining sections of the draft report that were assigned to specific Board members for review:

  1. Research Topics Identified by the Board
    A. Implementation Research and Quality Improvement Science
    B. Teacher Quality, Preparation, and Effectiveness: "Instructional Quality"
  2. Dissemination of IES Research
  3. Partnerships Between Researchers and Practitioners

Dr. Long noted that the areas that Board members were specifically asked to review represent the key issues addressed by the Board. She added that this meeting represents the last opportunity for face-to-face, group discussion of the report's contents.

Member Comments and Discussion
Section IV-A: Research Topics Identified by the Board: Implementation Research and Quality Improvement Science

Dr. Ball felt that this section of the draft report placed more emphasis on including implementation in research than the Board had actually expressed; in her view, the Board's discussion had focused more on developing a better understanding of what implementation is. As written, she said, Section IV-A signals that the Board is interested in better understanding the factors that affect implementation, whereas she understood the Board's discussions to call not only for research on those factors but also for research that improves the field's ability to conceptualize, theorize about, and study how interventions are taken up and what effects they have.

Dr. Granger described some opportunities outside of IES to address the questions raised by Dr. Ball about implementation. He requested that the bulleted statement, "Education intervention should be defined more broadly to include systems for take-up and implementation of the intervention," be clarified.

Dr. Long emphasized that the report is not a criticism of IES but rather an effort to signal to the field the areas in which the Board feels more investment is needed. Dr. McCardle suggested clarifying that IES is doing a lot to address the identified research needs and that the Board believes IES should continue to build on those efforts. Dr. McLeod said the Dissemination of IES Research section (Section V) similarly could be improved by summarizing the ongoing efforts of IES to address questions raised. Dr. Long said such information can be drawn from the commissioners' updates to the Board and included as needed in the annual report. Dr. Gutierrez said the entire report would benefit from the addition of more context to frame the issues identified.

Dr. Granger praised the draft for shedding light on the ongoing tension, confusion, and lack of consensus within the field of education research concerning the various approaches to quality improvement. The annual report is an opportunity to communicate to the field about different and complementary approaches.

Dr. Long noted that discussions from the current meeting would be incorporated into the annual report and that the draft would be shared with the commissioners and IES staff for comment later in the process.

Section IV-B: Research Topics Identified by the Board: Teacher Quality, Preparation, and Effectiveness: "Instructional Quality"

Dr. Gutierrez provided several editorial suggestions that would provide more context for the reader, such as listing the four new NCEE studies referenced in the report draft. Dr. Ball suggested some refinements, noting that improving instructional quality requires more knowledge about (1) the interactions between particular aspects of instruction and students' learning, (2) the resources and environment required by teachers to improve the quality of instruction, and (3) the interventions or supports that result in high-quality teaching. She offered to provide some suggested wording to capture the concepts for the report.

Dr. Underwood commented that the draft report lacks a discussion of research on teacher evaluation systems, which is a major current concern of teachers and others, and suggested that it might be something that should be added. Dr. Ball clarified Dr. Underwood's concern as the need for research on evaluation systems and how they relate to instructional quality; Dr. Underwood agreed with her restatement of his point.

Dr. Granger pointed out that a common theme cuts across both the Implementation Research and Quality Improvement Science and Teacher Quality, Preparation, and Effectiveness: "Instructional Quality" subsections of Section IV: namely, the importance of taking a broader view to better understand what works under what circumstances. That is, just as education policies broadly affect the success of an intervention in a given environment, so too is a broad understanding of instructional quality important for understanding individual teacher performance.

Section V: Dissemination of IES Research

Dr. Gutierrez suggested elaborating on the statement that the Board "commends IES for communicating negative research findings" to clarify that there is something to be learned from studies showing that an educational practice has not had a positive impact on student achievement. In addition, the report should clarify that the Board commends IES for including, in the REL contracts, a focus on translating research findings into practice. Dr. Gutierrez also suggested that the draft add more context to the discussion of target audiences and gaps in dissemination because some of the statements are unclear.

Dr. Granger suggested that the term "dissemination," which seems like one-way distribution, be replaced where possible in the report by the term "communication," which suggests a two-way street and seems to be closer to the goals expressed by IES and its commissioners. For example, changing the language in the draft to ask how IES evaluates its "communication activities" (as opposed to its "dissemination activities") broadens the question from "Who is reading IES materials?" to also include "How much is IES hearing from the field, and how much is the communication a two-way street?"

Section VI: Partnerships Between Researchers and Practitioners

Dr. McLeod pointed out that many of the questions and issues raised in the Dissemination of IES Research section (Section V) have been or are currently being addressed by IES. In contrast, the questions raised in the Partnerships Between Researchers and Practitioners section (Section VI) will be addressed this coming year or later. So, it may be appropriate to frame this section differently than Section V. Dr. McLeod added that some of the questions raised in Section VI hinge on external factors beyond the control of IES (e.g., the capacity of state education agencies (SEAs) to conduct research in the face of severe budget cuts); she suggested framing the questions in such a way that IES can respond. She also hoped the report would include more information about the RELs.

Dr. Underwood suggested the report better describe the role of universities as potential partners. Dr. Long said there was discussion about the fact that academic researchers are encouraged by their institutions to publish findings in academically oriented journals that have little impact on real-world practice. Dr. Granger said universities generally do not fund much educational research (but rather rely on external grant funding); he agreed that the annual report should highlight the concern that universities steer researchers toward issues that may not have practical applications.

Dr. McCardle pointed out that NICHD, NSF, and IES solicit research topics from researchers and also practitioners in the field, so two-way communication does take place. Dr. Granger added that IES is providing incentives for researchers to form more alliances and to prevent the phenomenon of university researchers using IES funding to pursue studies that have no impact on educational practice.

Dr. Gutierrez suggested revising wording in the report to highlight collaborations between researchers and practitioners to develop research agendas of mutual interest rather than the current framing, which pits the two against each other. Examples of collaborations should be included. Dr. Gutierrez also suggested maintaining, but toning down, the description of the gulf between research and practice.

Next Steps

Dr. Long said changes submitted by Board members will be incorporated into the report and that Dr. Herk might contact individual members to follow up on their feedback. Dr. Herk stressed that comments should be sent to her and Dr. Long, not to the entire Board. Dr. Long said she would try to develop a timeline for the production of the report, which must be sent to a printer by June 1, 2012. IES staff, commissioners, and other ex-officio Board members will have an opportunity to review the draft before it is finalized. Dr. Granger suggested that Board members have an opportunity to vote on endorsing the final draft before it is submitted; Karen Akins of ED agreed to discuss the matter with the Office of the General Counsel to determine an approval process that falls within the guidelines for federal advisory committees.

The Importance of Disseminating Research Results: How Can We Better Reach Practitioners and Policy-Makers?

New REL Contracts
Ruth Neild, NCEE Associate Commissioner

Dr. Neild described the structure of the REL program. For the cycle that begins in 2012 and continues through 2017, RELs will be encouraged to focus on a small number of topic areas and to go "deep" rather than "broad" in their analyses. In addition, most of the work of the RELs will be conducted through research alliances, which are partnerships of researchers and practitioners that are sustained over time.

Dissemination remains a key component of the REL program. Effective dissemination requires an understanding of how SEAs and local education agencies (LEAs) use research findings and how RELs can help. Simply put, RELs must not only supply research findings but also understand and encourage the demand for data—that is, how to ensure that the research meets the needs and expectations of SEAs and LEAs.

On the supply side, REL research must address questions important to practice. The use of research alliances allows for an ongoing conversation between researchers and practitioners through which important issues are identified, refined into the form of a research question, and tied to an action or decision that can be taken. Results should be made available in a timely manner; NCEE has struggled with timeliness and is seeking to institute a more nimble review process that retains the integrity of the original review. Research by the RELs should be well informed by the local context, and the findings should be clearly presented and appropriate for the target audience. The whole IES team is thinking about these issues and emphasizing to RELs the need to consider communication strategies and develop meaningful products.

On the demand side, primarily through technical assistance, the RELs will utilize research alliances as well as face-to-face conversations with SEAs and LEAs to augment practitioners' capacity to access, ask questions of, and interpret data and research. The expectation is that this capacity will increase the frequency of using data as a regular part of educational practice. SEAs and LEAs need to know what data are available and what kinds of questions can (or cannot) be answered by their data. RELs will be working closely with SEAs and LEAs to ensure they can review research findings and their own data with confidence. RELs should also boost the ability of SEAs and LEAs to pose questions about their own data.

Much of what the RELs are doing has not changed since the previous cycle, said Dr. Neild; rather, NCEE is encouraging the RELs to do things in a different way. For example, RELs should develop a clear research agenda with specific and practical goals. Research alliances should focus on a clear action or goal that is related to improving educational practice and, ultimately, student academic outcomes. NCEE is encouraging RELs to focus on three to five topics and to evaluate them more deeply, rather than exploring a broad range of topics but only at a high level.

To increase the timeliness of findings and retain interest in research, RELs are being encouraged to present their work in shorter, more accessible products. RELs should work in partnership with practitioners to simplify some topics—e.g., an explanation of basic descriptive data—and disseminate early findings to spur interest. By putting out smaller chunks of information over time, said Dr. Neild, practitioners have more opportunity to engage with researchers and discuss practical concerns and applications.

NCEE also suggests that RELs develop "suites" of related products and consider how products can be targeted to different practitioner audiences. More attention should be paid to usability and readability. RELs should seek a variety of media and tools to broaden dissemination. Dr. Neild offered the following draft categorization of the various "product lines" that will result from research:

  • What's Happening?—descriptive studies of baselines and trendlines and the implementation of policies, programs, and practices
  • What's Known?—literature reviews, including systematic reviews
  • Making Connections—correlational studies
  • Making an Impact—studies of effectiveness
  • Applied Research Methods—methods-related lessons and studies
  • Briefly Stated—summaries of research crafted for different audiences

A key question to be explored in this new contract cycle is how practitioner-researcher alliances can operate at scale, especially when alliances include members that are separated geographically by hundreds of miles.

Organizing and Using Evidence for School Improvement: SEAs in the 21st Century
Diane Massell, Ph.D., Senior Research Associate, School of Education, University of Michigan

Dr. Massell noted that SEAs have long been criticized for being fragmented and reactive organizations, but there has been very little research on how they actually function. At the same time, SEAs have been given more resources and more responsibility to guide districts and schools and provide research-based information to help them improve. To fill this gap in our understanding of SEAs, Dr. Massell and her colleagues, Dr. Margaret Goertz and Dr. Carol Barnes, are studying how SEAs seek out and use evidence to improve their low-performing elementary schools. The investigators used surveys, interviews, and document reviews of SEAs in three states to study how state education officials access three types of evidence used in decision-making: research, data, and advice from practitioners. The investigators are also exploring the impact of organizational structures and informal social networks on information exchange.

Of the three types of evidence, Dr. Massell said that research-based knowledge was by far the largest source of evidence that SEA officials sought out. Dr. Massell's survey of SEA officials indicates that they tend to seek information from internal agency sources more than from external sources. Nevertheless, the investigators' case studies and interviews suggest that external sources play a critical role in translating and packaging research on school improvement for SEAs. Most of the external sources used come from other government bodies (e.g., RELs, IES, ED, LEAs) and also from professional organizations.

The three states differed in their selection and use of government sources of research. For example, two of the three SEAs studied frequently turned to No Child Left Behind Comprehensive Assistance Centers, a network of technical assistance providers that seek to help states access and use high-quality research. One focus of these centers is to help states develop systems of improvement supports for low-performing schools. Dr. Massell hypothesized, based on her interviews, that the remaining state, which did not use its No Child Left Behind Comprehensive Assistance Center as much, had a relatively more established school improvement system; therefore, it did not reach out as often to these centers for advice on this topic. However, this state did turn more often to several different RELs. She said the differences across states in the sources of research evidence they used were likely related to the presence (or absence) of established relationships with the evidence sources and the background and experience of SEA staff.

Dr. Massell pointed out that about three quarters of all the external sources of research evidence identified by SEA officials were mentioned by only one individual. By comparison, about 13 to 19 percent of internal staff sources were mentioned by only one person; including those named two or three times, the proportion is approximately 41 percent. Despite this pattern of singular connections, each SEA had key staff members who were turned to quite often by their colleagues for research advice. Dr. Massell displayed a sociogram showing the people and organizations that SEA officials in one state would turn to in order to obtain research evidence. Individuals working in assessment and accountability and school improvement directors were often the central knowledge brokers for school improvement research. These findings contradict perceptions that SEAs are fragmented and siloed; instead, staff members identify and utilize sources of information across SEA organizational divisions.

Social capital within a network—that is, members' perceptions of the quality, strength, influence, and efficacy of information they obtain from the network—can facilitate more robust information-sharing, said Dr. Massell. Survey measures of network strength (frequency of communication and influence) showed that relationships internal to the SEA were considerably stronger than external ties. Although the networks that SEA officials used to obtain research evidence were the largest in terms of the number of sources, the networks that SEA officials used to obtain practitioner advice were stronger in terms of more frequent and more influential exchange. Dr. Massell pointed out that it is not surprising that people turn to their colleagues and peers for information and value their knowledge. Finally, SEA officials trusted the quality of various types of information they were receiving, but they were more mixed in their opinions regarding whether the information would be effective in solving the educational problems they faced.

Dr. Massell described the qualities of useful research from the perspective of the user:

  • "It looks like me." The context appears to be relevant.
  • "It shows me what to do." The findings describe specific actions, give clear examples, and outline the steps involved.
  • "It's cheap and addresses my problem." The findings include the cost of implementation and explain how it tackles the issue. (Fiscal considerations are powerful factors when acting on knowledge, said Dr. Massell.)

Finally, Dr. Massell summarized some key findings that IES may consider:

  • SEAs have key knowledge brokers in certain topics; can they be identified and engaged in research?
  • Communities of practice are an important component in evidence-based problem solving.
  • Clarifying the context of research helps the user connect by making the research more relevant, useful, and legitimate to the user.
  • IES may wish to learn more about the different networks within SEAs that state officials use to search out information about education research.

Commentary
Peggy McLeod, Ed.D., NBES Member

Dr. McLeod said the presentations by Dr. Neild and Dr. Massell inspired her to believe that the RELs could restore what should be the natural relationship between practice and research, in place of what too often has been an unfortunate disconnect between the two. RELs have the potential not only to disseminate information to SEAs, LEAs, and schools, but also to bring input back from those institutions to IES, other federal agencies, and the No Child Left Behind Comprehensive Assistance Centers. The RELs in this role could even serve as a model of how to establish strong relationships among federal agencies, researchers, and users of research.

In the absence of practitioners' perspectives, much research can be irrelevant and meaningless. In addition, practitioners sometimes take up educational interventions of dubious value because they have no research or guidance on which to base decisions about what works. Dr. McLeod hoped IES would support the RELs' potential for restoring the connection between research and practice and for serving as a new model for federal investments.

NBES Discussion

Dr. Granger noted that the William T. Grant Foundation funded Dr. Massell's research and other similar work, specifically focusing on the users of research. He noted that the WWC is pressing for coherent, clear products, but many practitioners don't use it because—as Dr. Massell suggested—it doesn't describe how a particular intervention fits their local settings, the real costs of adoption and implementation are not described, or they rely more on their social networks for information. More research on knowledge utilization may help identify the key information brokers and suggest how to reach them. Dr. Granger asked whether IES sees knowledge utilization as a potential research area and whether NCEE evaluation will take into account some of the knowledge utilization findings.

Dr. Neild said NCEE has developed a team of experts to support the RELs that includes people with experience in research alliances and data, among other topics, and their experience feeds into evaluation efforts. She is encouraged that some RELs built some formative evaluation into their proposals. Dr. Easton said IES has not discussed knowledge utilization as a new research topic, but discussions have focused on how to foster knowledge utilization throughout the agency. The topic currently rests within NCEE, and IES has taken steps to work more closely with NCES and the SLDS program on the issue.

Dr. Maynard noted that it is important not only to know where users get their information but also how producers of information communicate with existing knowledge brokers. She said RELs do incorporate evaluation, and NCEE is thinking about how to better integrate evaluation into the RELs' work. Dr. Albro noted that NCER has RFAs that focus on knowledge acquisition and utilization. NCER's Reading for Understanding initiative is one effort that involves teams of researchers who are trying to speed up the process of knowledge dissemination. Dr. Albro added that Doing What Works and the practice guides inform one another, so there is feedback about research and products. Dr. Ferrini-Mundy suggested that IES consider the NSF's Science of Science and Innovation Policy portfolio as a source of funding for research on knowledge utilization.

Dr. Ferrini-Mundy said the NSF director would like to see more commercial applications of funded research. The NSF Innovation Corps (I-Corps) uses private sector funding to broker connections between researchers and people with business knowledge and experience; such a program can help propel research forward quickly. There has been some talk among the education staff at NSF of an E-Corps, similar to the I-Corps but focused on getting education research into practice and policy.

Dr. Massell said her research seeks to learn more about the role of key knowledge brokers. Dr. Granger said more efforts are needed to bring researchers together to better understand knowledge utilization and talk about persistent problems of connecting research with practice. He said the William T. Grant Foundation is working with IES to create a learning community among the existing research alliances.

Lunch

During the lunch break, NBES members participated in ethics training delivered by Marcia Sprague of the Ethics Division of ED's Office of the General Counsel.

Scaling Up Promising Models: What Can the Field of Education Learn From the Experiences of Other Federal Agencies?
Naomi Goldstein, Ph.D., Director, Office of Planning, Research and Evaluation, Administration for Children and Families, HHS

Dr. Goldstein summarized her experience with two HHS evidence review efforts—one on the effectiveness of home visiting and one on preventing teen pregnancy. "What works?" is not a simple question, Dr. Goldstein emphasized. It encompasses many other questions, such as what is the experience of the comparison group, in what context does an intervention work, and what are the "active ingredients" essential to the approach or intervention? In addition, what constitutes success—is it impact of a certain magnitude, or being sustained over a minimum time, or across certain domains?

HHS was required by law to implement evidence-based programs to prevent teen pregnancy and to promote early childhood and maternal health and well-being through home visits. Evaluating the evidence was complicated by the programmatic and research context of the two fields. With teen pregnancy, outcomes are relatively narrow and easily defined (e.g., initiation of sexual activity, use of contraception, pregnancies, births, and sexually transmitted infections). In home visiting, the law defined eight domains, each with multiple outcomes (e.g., child health, school readiness, maternal health, family self-sufficiency, and links to social services). The research context for the two topics also differs, said Dr. Goldstein. There are fewer teen pregnancy prevention studies and few models are broadly replicated or include long-term follow-up. In contrast, home visiting has a comparatively large body of research and some models are widely used and have been extensively studied.

Dr. Goldstein described four broad, distinct approaches that funding agencies can use to identify and build on the evidence base for an area of research in the context of a tiered initiative (i.e., a funding approach that gives the most money to the interventions with the strongest supporting evidence). In a competitive grant process, such as that used for the Investing in Innovation (i3) initiative, the funder states the criteria for evidence in the RFA and the applicants make the case. An advantage of this approach is that the competition encourages applicants to take evidence seriously to meet the criteria. However, reviewers' expertise in assessing the quality of evidence may vary, and their assessment is limited to evidence supplied in the application (i.e., they may not be able to review the evidence directly).

Another approach is to rely on expert panels to assess the evidence base for a given intervention. The approach has the advantage of introducing expert judgment into the process, but could raise concerns about consistency and transparency. It may be prudent to conduct a systematic review of the evidence in order to provide comprehensive, consistent information for the expert panel to review.

Some funding agencies encourage grantees to identify and implement evidence-based models. Dr. Goldstein said, in such situations, the same organizations are responsible for determining evidence and implementing the programs, so this approach may effectively spread a culture of evidence. On the other hand, this approach may allow many definitions and interpretations of evidence.

Finally, the funding agency can undertake a systematic review of the evidence. HHS opted for this path, building on the WWC approach. The agency established criteria for acceptable types of evidence and a minimum standard for outcomes. The criteria for the two programs are similar in many respects but differ in others, reflecting differences in the legislation and in the programmatic and research context. For example, in both programs, evidence from good quality randomized controlled trials and quasi-experiments is acceptable. In teen pregnancy prevention, where there are relatively few outcome domains and little replication of findings, HHS decided that favorable impacts in one domain are sufficient. In home visiting, where there are many outcome domains and some replication of findings, HHS decided that a model must have favorable impacts either across more than one outcome domain or replicated across multiple studies. HHS did not require that programs show sustained impact over time, nor did it set standards on the magnitude of impact that would be considered sufficient. In publishing its findings, HHS described the factors that informed its assessments in detail so that others can review the models and the evidence for themselves.

As programs are implemented at scale, said Dr. Goldstein, it becomes more important to focus on the "what" in the question "What works?" For example, what is an acceptable intentional adaptation of the intervention? What is an acceptable level of fidelity? What is an acceptable application of the model to new populations or settings? Who determines what is acceptable? The model developers can bring specialized expertise to oversight and technical assistance, but the federal funder also has some responsibility.

Dr. Goldstein also said funders have to decide what kind of national or local ongoing evaluation is appropriate as programs scale up. While some believe that an evidence-based program does not require further impact evaluation, Dr. Goldstein said, ongoing evaluations may be warranted to evaluate the impact of adaptations, to strengthen the evidence base in light of new methodological standards, or to update evidence as the social context changes.

Paul Carttar, M.B.A., Director, SIF, Corporation for National and Community Service

Mr. Carttar said the SIF focuses more on the process of scaling up and evaluation than it does on the programs themselves. Of the interventions demonstrated to address problems affecting low-income communities, few have succeeded in tackling those problems on a large scale. The SIF was developed to mobilize public and private resources to identify effective interventions that benefit low-income communities and grow them to reach more people. In addition to providing grant funding, the SIF is expected to contribute knowledge and practices that influence how other federal agencies make grants and to enhance the effectiveness of nonprofit organizations and donors through knowledge dissemination.

The i3 funds programs differentially on the basis of the quality of evidence the programs can demonstrate. In contrast, the SIF identifies programs that consistently meet minimal standards for preliminary evidence and (1) works with those programs to refine the intervention and demonstrate effectiveness while (2) helping those programs grow and build capacity.

Also, unlike the i3, the SIF makes grants to intermediaries (grantees) that make grants to service-providing nonprofit organizations (subgrantees). The SIF has distributed $95 million so far to 16 intermediaries, which have funded 150 service-providing organizations. Notably, the SIF does not select or directly manage the programs that ultimately receive funding. The SIF helps the grantees work with the subgrantees to plan evaluation, scaling, and capacity-building and then to begin implementing programs at scale.

The biggest challenge the SIF faces is achieving the goals of scaling up programs and expanding the evidence base through the intermediaries. After much discussion, the SIF determined that the ultimate goal of scaling up is to achieve social impact, and building human capacity and physical infrastructure are means to that end.

The SIF also determined that preliminary evidence is a sufficient minimum standard for selecting programs. Mr. Carttar said the SIF believes that its approach must be grounded in real-world issues. Those who would require a higher level of evidence may not fully appreciate the efforts of nonprofits trying to solve real-world social problems. The SIF is not intended to be a laboratory experiment, said Mr. Carttar; the funded programs are applying the evidence they have to serve people in need while building a larger database of what works.

The next challenge lies in setting clear overall program parameters about the nature of scaling up and the expectations for the programs, especially given that the SIF exerts no control over the service-providing organizations. Other challenges include selecting grantees capable of executing the program and helping grantees select high-quality subgrantees that not only have evidence to support their approaches but also are committed to evidence-based practice and management. Efforts are made to ensure that the subgrantees have clear plans for scaling up their programs, and the SIF provides some input into, and management of, that process. Also, the programs must have a clear learning agenda, because the opportunity to learn about what works is even greater than the opportunity to help people, said Mr. Carttar. Supporting implementation and monitoring progress also pose challenges.

Mr. Carttar offered the following advice about scaling up promising models:

  • Set clear program expectations and parameters.
  • Be realistic about everything, such as the environment in which the intervention takes place and the goals and capacity of the service-providing organization.
  • Take advantage of the opportunity to generate and capture knowledge.

Commentary
Robert Granger, Ed.D., NBES Member

Dr. Granger applauded the efforts described by both speakers, saying the initiatives represent an extraordinary moment in U.S. policymaking in which funding has been dedicated to scaling up evidence-based programs. Not only do such initiatives provide needed social services, but they also offer opportunities for research and development. Failing to take advantage of the opportunities to learn would be a big mistake, said Dr. Granger.

Dr. Granger raised the concern that all of the evidence-based funding initiatives come from different sources, and those funding sources are not stable. He hoped that, at the federal level, efforts are underway to learn from the scale-up initiatives regarding how to effectively produce change at scale and then build those results into ongoing funding streams, such as Title I or the Child Care and Development Block Grant.

None of the funders has yet determined which approach to assessing the evidence base is most effective, but with reasonable impact evaluation, it is likely that we will learn more about the importance of the evidence and how to use that information in funding decisions. The tiered funding approach seeks to invest in efforts that have a strong evidence base, while also encouraging others to develop a stronger evidence base, said Dr. Granger. From these programs, we will learn how well that approach works.

Dr. Granger pointed out that even those programs that have been successfully disseminated are not fully "at scale." But the efforts underway by HHS, the SIF, i3, and others will gather more information about contextual factors that influence the effectiveness of various interventions—in other words, what works where, when, and with whom. For example, starting a new teen pregnancy prevention program in a community that already has several related programs may not make a net difference, even though the program may be effective in other circumstances. Moreover, the mobility and transience of the community may affect how successful the programs are and data collection may reveal what kind of model might work best for a given community.

As we gather data, said Dr. Granger, it behooves us to think about gathering information that will be useful for others who are developing and implementing models, not just for those who were funded. He noted that the initiatives described seem to be committed to learning, and he hoped that the initiatives were sharing information among themselves.

Discussion

Mr. Carttar said the SIF is conducting a self-evaluation over a 5-year period. The first step was to define success, which is measured in five topic areas:

  • Outcomes for the people served
  • Expansion of the evidence base
  • Advancing understanding about scaling and what works
  • Role of the intermediaries
  • Influence on other funders

The SIF hopes to learn not just what worked but what underlying factors distinguished what worked from what did not. The first year of evaluation cost about $500,000, or 1 percent of the SIF budget. In addition, each of the SIF subgrantees is subject to rigorous evaluation. Mr. Carttar also noted that there are opportunities for funding agencies to share information.

Dr. Goldstein agreed with Dr. Granger that the initiatives represent an important opportunity to learn about contextual factors that can only be gleaned from large-scale efforts. However, she cautioned not to expect too much; HHS plans to evaluate home visiting programs at 85 sites around the country, but that effort includes four distinct models applied in various settings among various populations, so the scale is not quite as large or illuminating as one might think. Dr. Granger concurred, but said that findings will accumulate over time and meta-analysis can help sort out what is effective. He said the goal for now is to identify and fund promising practices, gather information, and improve our approach over time.

IES-Funded Research: Reviewing Current Activities and Considering Avenues for Improvement
Introduction
Bridget Terry Long, Ph.D., NBES Chair

Dr. Long framed the topic by pointing out that three components work together to influence the research that IES funds: the proposals received in response to RFAs, reviewer selection and training, and the review process itself. The following presentations provide an overview of the current portfolio of research grants being funded by IES through NCER and NCSER.

NCER's Research Portfolio
Elizabeth Albro, Ph.D., NCER Acting Commissioner

Dr. Albro presented a table listing all of NCER's investments, by category, since 2002. Categories included Education Research, Research and Development Centers, and Predoctoral Research Training, with Education Research receiving the majority of funding: $803.9 million or 56 percent of NCER's investments since 2002.

Dr. Albro said that the remainder of her talk would focus on the Education Research grants. The Education Research program funds research in 10 topic areas that range from basic academic areas (e.g., reading, math) to improving education systems, policies, organization, management, and leadership. Topic names have changed over the years, but the research opportunities have not really shifted. She provided a breakdown by topic of the number of grants funded, noting that the count for a topic does not necessarily reflect the total number of funded grants that address that topic. Some topics (e.g., English-language learners, education technology) have not been separate program areas for very long and therefore have fewer funded grants, but research on those issues had been funded under other topics before NCER established the "new" topic category.

NCER has five research goals that have been included in all RFAs since 2004:

  • Exploration
  • Development and Innovation
  • Efficacy and Replication
  • Scale-Up Evaluation
  • Measurement

Since 2004, 46 percent of grants funded to date were submitted under the Development and Innovation goal, while 26 percent were submitted under the Efficacy and Replication goal. Thirteen percent of funded grants were submitted under the Exploration goal and another 13 percent were submitted under the Measurement goal. Only 2 percent of funded projects to date were submitted under the Scale-Up Evaluation goal.

Breaking down the percentage of funded grants by the grade level to which the research applies, 21 percent of the projects are looking at a combination of grade levels, either at transition points or across the spectrum of education. Beyond that, 12 percent are examining early childhood education, 31 percent are focusing on the elementary grades, 32 percent are looking at middle or high school grades (or both), and 4 percent are addressing postsecondary education.

Dr. Albro noted that while NCER funded 80 projects under its math and science topic, a total of 198 grants address math or science in some capacity. For example, two thirds of the cognition and learning research portfolio is exploring how to improve math and science learning. The teacher quality program, several state and local evaluations, and the education technology topics all include grants involving math and science. Similarly, while fewer than 20 grants fall under the topic of English-language learners, 57 grants address the topic in some way. NCER also has a strong investment in reading and writing, Dr. Albro said.

In terms of tracking the progress of funded research over time, Dr. Albro said she found it rewarding to know that 16 interventions initially supported through NCER Development awards are being, or have been, evaluated through Efficacy awards. Another 15 projects were funded under multiple goals (e.g., first under Exploration and later under Measurement).

Finally, Dr. Albro noted that over the past 5 years, NCER funded between 9 and 13 percent of the applications it received. The number of applications spiked in 2010 and 2011, and the proportion of those rated outstanding or excellent by review panels was about 9 percent.

NCSER's Research Portfolio
Deborah Speece, Ph.D., NCSER Commissioner

Dr. Speece described NCSER's mission, noting that it covers 18 statutory duties assigned by the Individuals with Disabilities Education Act (IDEA) of 2004. With that many topics to cover, NCSER sometimes has to choose between breadth and depth of research.

Dr. Speece summarized NCSER's funding by program type from 2006 to 2011; nearly all of NCSER's funding goes toward special education research. NCSER has four Research and Development Centers, two of which received their first grants in just the past 2 years. Dr. Speece said NCSER awarded Small Business Innovation Research (SBIR) grants for the first time in 2011.

NCSER's four Research and Development Centers tackle difficult problems that NCSER's leadership believes deserve more emphasis: math, assessment, response to intervention in early childhood, and serious behavioral disorders at the secondary level. Each receives about $10 million over 5 years. NCSER requested proposals for four more centers: two will focus on autism spectrum disorders, one on deaf and hard-of-hearing students, and one on families of students with emotional/behavioral disorders.

NCSER special education research funding falls into 11 topic areas, including the two most recently established topics: families and technology. Dr. Speece noted that, as Dr. Albro pointed out, researchers could receive funding for research in any of these topics previously (before they were formally designated by IES), but NCSER designating the areas as topics helps draw attention to them.

The data Dr. Speece presented on NCSER investment by topic show that the oldest topics have the largest proportion of grant investment and the newest have the least. The most "mature" portfolios are early intervention, and social and behavioral outcomes. Dr. Speece pointed out that NCSER funds research aimed at children from birth through high school and is the only IES Center that includes research on children ages 0–3 years old.

NCSER has the same five goals as NCER, and the funding breakdown by goal is similar as well. The largest investment is in Development and Innovation. There is a trend of more funding for Efficacy and Replication grants (including rigorous testing of interventions originally funded under Development and Innovation). Like NCER, very little funding goes to Scale-Up Evaluation at this point.

An evaluation of investment by developmental level shows that early childhood, elementary school, and middle or high school (or both) each receive about one fourth to one third of total funding. Dr. Speece said she would like to see more research on adolescents.

Over half of NCSER's special education research dollars (54 percent) are focused on infants, toddlers, children, and youth with identified disabilities. Thirty-eight percent of the funding goes to studies that focus on children who either have a disability or are considered at risk for disability. Dr. Speece clarified that "at risk" must be based on a convincing argument that a proposed identification variable is related to risk for a specific disability, not on a generalized statement from an applicant linking, for example, poverty with later disability. A few very early grants focused on at-risk children only.

Federal law identifies 13 special education categories, and Dr. Speece presented the NCSER investment for 11 of those. The largest investment (23 percent) goes to behavioral disorders. To some extent, the level of investment is related to the length of time that the topic has existed as a named topic for funding.

Finally, like NCER, NCSER funds from 9 to 13 percent of the applications it receives. Dr. Speece did not know why the number of applications dipped in 2008. The median percentage of grants funded is 10 percent.

Discussion

Dr. Albro said that the addition of topic areas to IES's portfolio reflects the priorities of the IES director. For example, the influence of former IES Director Grover Whitehurst is evident in the focus on basic academic topics (reading, writing, math, and science). Dr. Easton added that topics are sometimes carved out because grant applications are not addressing issues of concern to IES, such as educational policies, leadership, and systems. Dr. Speece said she believed some of the NCSER topic areas arose out of interactions with the field, responses to RFAs, and IES priorities.

Dr. Gutierrez wondered whether the fact that NCER funded only 9 percent of applications received in 2010 and 2011 indicates that IES may be overlooking relevant areas on which researchers are working. Dr. Albro and Dr. McCardle both said that applicants are referred to related funding opportunities from NCER, NICHD, or NSF when appropriate. Dr. Albro said she rarely receives feedback that applicants are interested in studying topics that are not covered by IES funding or related programs.

Dr. McCardle asked whether availability of funds affected the number of awards. Dr. Granger said that if NCER had the financial capacity to fund many more grants than it actually does, then efforts should be made to improve the quality of applications submitted. Dr. Easton pointed out that the percentage of applications funded was lower in 2010 and 2011 than in previous years, but the actual number of applications funded increased in those years.

Dr. Maynard noted that a high percentage of applicants receive funding for revised submissions; Dr. McCardle said it is not uncommon for younger, newer investigators to send in lower-quality applications because they have not talked with program officers about requirements or sought technical assistance in advance. Dr. Albro said that many agencies had additional federal stimulus funding in 2010 and 2011, and Dr. Maynard said universities were encouraging investigators to submit applications for everything during those years.

Dr. Long asked whether the experience of program officers for a given topic might influence the number of grants funded for that topic. Dr. Speece said the percentage of grants funded ranges from 9 to 12 percent across all the topics; no single topic has a consistently higher or lower percentage of funded applications.

Dr. Albro explained that NCER has been able to fund all the applications considered to be excellent or outstanding by the review panels. There is no specified cap on funding. Dr. Speece said NCSER has its own funding line and could have funded more grants if it had received more applications. As the number of applications has increased, so has the number funded. She clarified that, to date, NCSER, like NCER, has never had excellent or outstanding applications that it could not afford to fund.

Members debated whether funding 9–12 percent of applications is appropriate. Dr. McCardle said NICHD funds about 13 percent of applications, and a funding cut-off of 9 percent may be low but "not terrible." She did not believe other federal agencies were funding much higher percentages; the highest rates she recalled were 20–22 percent, but those figures date from years ago, when more money was available. Dr. Long said programs are clearly making an effort to educate potential applicants about writing strong proposals. The Board may wish to discuss whether it can make recommendations to encourage the submission of better applications.

Dr. Long questioned whether the very small percentage of grants going toward scaling up projects reflects the lack of effective programs or the novelty of scaling up programs in education. Dr. McCardle pointed out that previous efforts to fund scale-up were extremely expensive and applicants may feel limited by the amount of funding available. Dr. Albro said NCER receives very few applications for scale-up projects. However, she described a relatively new program in which a state or local district covers the cost of implementing an educational program of its choice and IES pays for the evaluation of the program. Ten of these partnerships, which provide an avenue for evaluating programs that are relevant to schools, have been funded over the past 3 years. Dr. McCardle said that approach may identify potential programs that should be scaled up.

Regarding the percentage of applications funded in 2010 and 2011, Dr. Albro cautioned that the data represent only a 5-year period. Dr. Long believed the message for the field is that it is in everyone's best interest for applicants to take advantage of technical assistance offered by program officers so that their initial submissions are strong.

Legislative Update: Status of IES Appropriations and Reauthorizations
Lloyd Horwich, Office of Legislation and Congressional Affairs, ED

Mr. Horwich said that the Obama administration supports a strong, bipartisan reauthorization of ESEA. However, ESEA reauthorization is long overdue. Therefore, President Obama and ED Secretary Duncan created an ESEA flexibility package that states can use to begin implementing reforms to help children. The deadline for requests for the second round of flexibility waivers is in late February 2012. Mr. Horwich said he believes 42 states either intend to apply for waivers or have already done so.

The Senate Health, Education, Labor, and Pensions (HELP) Committee passed a bipartisan reauthorization of ESEA in October 2011. The House is marking up a series of bills that would reauthorize ESEA in pieces, but the bills do not have bipartisan support. Both ED Secretary Arne Duncan and Senator Tom Harkin, who chairs the Senate HELP Committee, are concerned about the lack of bipartisan support for the House ESEA reauthorization bills.

The Senate ESEA reauthorization bill consolidates several smaller authorities for evaluation into a more comprehensive evaluation authority, which could give IES the ability to be more strategic in conducting evaluations and to take advantage of economies of scale. The House ESEA reauthorization bills do not focus on IES or research.

Mr. Horwich said that neither the House nor the Senate authorizing committee has begun to prepare for reauthorizing the Education Sciences Reform Act (ESRA). In addition, ED has not developed a proposal regarding ESRA reauthorization.

Mr. Horwich said the 2013 ED budget request totals $69.8 billion in discretionary funding, an increase of $1.7 billion (2.5 percent) over 2012. In addition, President Obama has proposed a one-time investment of $14 billion to align education programs with workforce demands, to raise the profile of the teaching profession, and to increase college affordability and quality.

ED continues to make strong investments in foundational programs, such as Title I, IDEA, and Pell grants, as well as in reform efforts, such as i3 and the Promise Neighborhood program. ED Secretary Duncan sees IES as a high priority, said Mr. Horwich. He described the proposed IES budget for FY 2013, which represents an overall 4 percent increase over 2012:

  • Research, Development, and Dissemination would receive $12.5 million more, for a total of $202 million.
  • Statistics would increase by $6 million, for a total of $114.7 million.
  • The proposed budget for the RELs ($57.4 million), Research in Special Education ($49.9 million), and Special Education Studies and Evaluations ($11.4 million) would remain the same as the FY 2012 budget.
  • Assessment would receive $132.3 million.
  • Statewide Data Systems would increase from $38 million to $53 million.

The President's FY 2013 budget proposal would establish the Advanced Research Projects Agency for Education (ARPA-ED) and ED would request that funds appropriated for i3 be used to support ARPA-ED. Support for ARPA-ED is included in the Senate HELP Committee bill for ESEA reauthorization.

Discussion

Dr. Long said the House Subcommittee on Education held a hearing in November about ESRA reauthorization (the legislation that created and authorizes IES). Mr. Horwich said he believed that hearing reflected the subcommittee chair's interest in data, but ESRA reauthorization is not on any committee's agenda yet and ED has not submitted a reauthorization proposal.

Dr. Granger emphasized that under current legislation, Board members' terms are tied to the "slot" they are occupying and not to when they were actually confirmed to serve in that slot. The confirmation process can move very slowly, so that by the time someone is appointed they may effectively have only a very short (e.g., 1 year) term. At the same time, "log jams" of terms expiring simultaneously also occur. As a result, if no new members are confirmed, the Board will have only two members as of November 2012. An easy fix would be to reset the clock, so that Board members have 4-year terms that begin when they are appointed. Dr. Long added that the Board is already down to 7 members (out of a potential 15), which makes it very difficult for the Board to function. Mr. Horwich said ED will take that concern into account as it develops a proposal for reauthorization.

Dr. Underwood pointed out that there had been some congressional movement toward simplifying the appointment process. Mr. Horwich said some legislation to that effect had passed in the Senate. A streamlined appointment process would help address the Board membership problem.

Mr. Horwich noted that neither the Senate nor the House has scheduled a mark-up of the appropriations legislation for the coming fiscal year, and that the process sometimes occurs very late. He also noted that ARPA-ED would likely fall under the Office of Innovation and Investment, which is spearheading the proposal, and not under IES. Dr. Easton said one argument he has heard for separating ARPA-ED from IES is that it would allow the creation of a new grant-making culture at the advanced research projects agency, one more focused on innovation than the mainstream research funding agency is.

Discussion: Role of the NBES

Dr. Long asked Board members to consider the following questions:

  • What role should the Board take in advocating for education research?
  • Should the Board be involved in advocating for the use of research in forming policy and legislation?

By way of example, Dr. Long noted that in the summer of 2011, the Board wrote a letter to the Secretary recommending that states granted an ESEA flexibility waiver be encouraged to collaborate with ED to evaluate at least one of their programs; that language was incorporated into the final document. A follow-up letter from the Board clarified how states could work with IES to develop their research capacity.

Dr. Granger described how the Board came to the decision to mark up the ESRA legislation that was eligible for reauthorization in 2008. At that time, the suggested changes were marginal, as the Board supported the basic nature of ESRA and believed that IES was off to a good start. The Board's suggested mark-ups for ESRA from May 2008 were included in the Board members' binders for their consideration. Dr. Granger said the 2008 mark-up was sent to the Secretary and to Congress and there was some public comment and discussion about it.

Dr. Granger recalled there had been some debate at the time about appointment of Board members and IES commissioners. He noted that some commissioners are appointed by the President and confirmed by the Senate while other commissioners are appointed by the director of IES.

Dr. Long said the Board was probably not large enough to set up a subcommittee to undertake a mark-up of ESRA legislation (as the 2008 Board did), nor did it have the kind of expertise required to do so. However, the Board can explore other options to signal Board priorities or support. Dr. Long reminded the Board that she and the previous Board Chair, Jonathan Baron, met with congressional staff and drafted the letters sent to the Secretary regarding waivers and evaluation. Dr. McLeod said that with no reauthorization bill under discussion, the less formal approach may be the Board's only option.

Dr. Granger said that when he served as chair, he encouraged the Board to make formal resolutions on areas of interest. In that way, Board officers could speak with the authority of the Board. Dr. Underwood said advisory committees are often discouraged from advocacy, but if the Board has the opportunity to do so, it should take advantage of it. Dr. Long noted that she and Mr. Baron cleared their approach with ED before moving forward.

Dr. Long added that the Board's support and recommendations should point to specific examples of IES successes when possible. Dr. Granger proposed that at the June 2012 Board meeting, members discuss and update the 2008 mark-up of the ESRA. Then, regardless of when the reauthorization occurs, future Board members can refer to a recent discussion of Board priorities and recommendations on the matter. Dr. Granger also suggested asking Dr. Easton and the IES commissioners for input. Members discussed identifying some general principles about ESRA reauthorization—either in addition to, or instead of, marking up the legislation. Dr. Granger also recommended inviting comments from stakeholders, such as AERA, or presenting previous input from stakeholders, as the Board reviews the ESRA mark-up.

Dr. Underwood noted that Section 303 of ESRA defines "states" for the purpose of NAEP as the 50 states plus the District of Columbia and Puerto Rico. The definition leaves out the U.S. territories, which had been included in NAEP until ESRA was enacted. Dr. Underwood hoped to identify a remedy that would gather more data on all American students.

Closing Remarks, Next Steps, and Adjournment
John Q. Easton, Ph.D., IES Director, and Bridget Terry Long, Ph.D., NBES Chair

Dr. Long said the Board will work on finalizing the 2012 Annual Report. It may be possible to have a public teleconference before the next in-person meeting in June to discuss the final report. Dr. Long said she and others will give more thought to how to work in a review of the 2008 ESRA mark-up.

Dr. Easton thanked the Board for its work on the annual report, noting that this first effort to write the report in a new way will not be easy, but IES appreciates it. He particularly thanked Dr. Long and Dr. Gutierrez for their work organizing this meeting.

Dr. Long thanked the staff of AFYA, Inc., the meeting contractor that handled the logistics of this meeting; Ellie McCutcheon, IES Associate Research Specialist, for managing travel arrangements; Wilma Greene, IES Management/Program Analyst, for serving as liaison to the contractor; and Dr. Herk for all of their hard work. Dr. Long adjourned the meeting at 4:34 p.m.

The National Board for Education Sciences is a Federal advisory committee chartered by Congress, operating under the Federal Advisory Committee Act (FACA; 5 U.S.C., App. 2). The Board provides advice to the Director on the policies of the Institute of Education Sciences. The findings and recommendations of the Board do not represent the views of the Agency, and this document does not represent information approved or disseminated by the Department of Education.