National Board for Education Sciences
Tuesday November 8, 2016 Minutes of Meeting

Location
Suite 206
Heritage Reporting Corp.
1220 L Street, N.W.
Washington, D.C.

Participants
Ellie Pelaez: Moderator/ Designated Federal Officer
David Chard: President of Wheelock College
Michael Feuer: Dean of the Graduate School of Education and Human Development, George Washington University; President of the National Academy of Education
Kris D. Gutierrez: Professor of Literacy and Learning Sciences, University of Colorado, Boulder
Larry Hedges: Board of Trustees Professor, Northwestern University
Bridget Terry Long: Academic Dean and the Xander Professor of Education at Harvard Graduate School of Education
Jeannie Oakes: Presidential Professor Emeritus in Educational Equity at UCLA
Judith D. Singer: James Bryant Conant Professor of Education and Senior Vice Provost for Faculty Development and Diversity at Harvard University
Robert Teranishi: Morgan and Helen Chu Endowed Chair in Asian American Studies, UCLA Graduate School of Education and Information Studies
Ruth Neild: Commissioner, National Center for Education Evaluation and Regional Assistance (NCEE)
Thomas Brock: Commissioner, National Center for Education Research
Peggy Carr: Acting Commissioner, NCES
Dr. Heit (phonetic)
Joan McLaughlin: Commissioner, National Center for Special Education Research
Joy Lesnick: Associate Commissioner, Knowledge Utilization Division, NCEE
Rebecca Rust

Proceedings

(2:00 p.m.)

MS. PELAEZ: Good afternoon. This is Ellie Pelaez, the Designated Federal Officer for the National Board for Education Sciences. I just want to make a quick note. The WebEx password is lower case "nbes" for anyone who's having trouble accessing the WebEx. That's just a quick reminder.

I'd like to call the meeting to order. Thank you, everyone, for joining us today. I'd like to begin by taking roll.

Dr. David Chard? Please take your cells off mute, another reminder. Dr. Chard?
(No response.)
MS. PELAEZ: Dr. Feuer?
DR. FEUER: Here.
MS. PELAEZ: Dr. Gutierrez?
DR. GUTIERREZ: Here.
MS. PELAEZ: Dr. Hedges?
DR. HEDGES: Here.
MS. PELAEZ: Dr. Long?
(No response.)
MS. PELAEZ: Dr. Oakes?
(No response.)
MS. PELAEZ: Dr. Phillips?
(No response.)
MS. PELAEZ: Dr. Singer?
(No response.)
MS. PELAEZ: Dr. Teranishi?
(No response.)
MS. PELAEZ: Dr. Underwood?
(No response.)
MS. PELAEZ: Just one more reminder to take your cells off mute. I think we have more participants, but I'm not getting any response.
DR. OAKES: Jeannie Oakes is here.
MS. PELAEZ: Thank you. Dr. Singer?
DR. TERANISHI: This is —
MS. PELAEZ: Go ahead.
DR. TERANISHI: Oh, this is Robert Teranishi.
MS. PELAEZ: Hi, thank you.
DR. LONG: And Bridget Long.
MS. PELAEZ: Hi, thank you. Dr. Chard? Dr. Neild?
DR. NEILD: Present.
MS. PELAEZ: Dr. Brock?
DR. BROCK: Here.
MS. PELAEZ: Dr. Lesnick?
DR. LESNICK: Here.
MS. PELAEZ: Dr. McLaughlin?
DR. McLAUGHLIN: Here.
MS. PELAEZ: Dr. Carr?
DR. CARR: Here.
MS. PELAEZ: Mr. Thompson?
(No response.)
MS. PELAEZ: Dr. Bianche?
(No response.)
MS. PELAEZ: Ms. Rust?
(No response.)
MS. PELAEZ: Dr. Heit?
DR. HEIT: Yes, here.
MS. RUST: Ms. Rust is here.
MS. PELAEZ: Thank you. Thank you.
Did I miss anyone? Is anybody else on the call? Any other members?
(No response.)
MS. PELAEZ: Okay. So I'd next like to welcome in particular a new member, Dr. Oakes, and our re-appointed members, Dr. Chard and Dr. Hedges. Congratulations. Welcome, and welcome back.
DR. OAKES: Thank you.
MS. PELAEZ: Thank you. So our first item of business is to approve the agenda. As a reminder, please state your name before you speak. Do I have a motion to approve the agenda as written?
DR. HEDGES: So moved. It's Larry Hedges.
MS. PELAEZ: Thank you. Do I have a second?
DR. GUTIERREZ: Second. Kris Gutierrez.
MS. PELAEZ: Thanks, Kris.
All those in favor of approving the agenda as written please say aye.
(Chorus of ayes.)
MS. PELAEZ: Those opposed please say nay.
(No response.)
MS. PELAEZ: The agenda is approved. Thank you.

Okay, so we're now going to move into the chair nomination and election phase of today's meeting. So I'm going to open up the floor or the line, if you will, for nominations. Please state your name and then your nomination.
DR. GUTIERREZ: This is Kris Gutierrez. I'd like to nominate Larry Hedges.
MS. PELAEZ: Okay. Do I have a second for this nomination?
DR. SINGER: This is Judy Singer. I second that nomination.
MS. PELAEZ: Do we have a second for the nomination? I'm sorry.
DR. SINGER: This is Judy Singer. I second that nomination.
MS. PELAEZ: Thank you, Judy. Thank you.
Any other nominations?
Hold on one second, everybody on the phone. We're trying to work out a technical problem. We're just going to mute it for one second.
(Pause.)
MS. PELAEZ: Some people were unable to speak, so I think we sent the right code to David, so we should be good. I'll just wait a second. Well, I'll make one more call for nominations if there are any more.
THE COURT REPORTER: Excuse me. This is the court reporter. We do not have an updated list of the names. We have maybe half of the names. So if there's a way for us to get an updated list.
MS. PELAEZ: Yes, we can send you the names.
THE COURT REPORTER: Okay.
DR. CHARD: Ellie, this is David Chard. I just got on the speaking —
MS. PELAEZ: Great.
DR. CHARD: — number. Sorry.

MS. PELAEZ: Great. No problem. Thank you, David. So now that you're on and I think we've got all the members who were supposed to call in and talk, one more call for nominations for chair. I have Larry Hedges. Okay, I don't hear any more nominations. So we're going to proceed with sending out the ballot. Give me a few seconds to get this ready.
DR. FEUER: Ellie, this is Michael Feuer. I'm in a car using my iPhone, and I would just ask if you and everyone could speak as loudly as you can because I'm not sure that I just heard the last thing you said.
MS. PELAEZ: Sure. Just give me a few seconds to get the ballot ready and then I will send it out by email.
DR. FEUER: Oh, okay.
MS. PELAEZ: So there might be a little bit of a lull here.
DR. FEUER: And is that going to be the only way to vote? Because I can't do that while I'm talking on my phone.
MS. PELAEZ: That's right. That's the only way. I have to send it by email. It has to be anonymous.
DR. FEUER: All right. Well, there you go.
MS. PELAEZ: Anonymous ballot. Yes, you could vote. I think you can still stay on the call and access your email and vote that way.
DR. FEUER: I don't think I can access my email while I'm on the phone. That's the problem.
DR. GUTIERREZ: Michael, this is Kris. I think I would suggest just voting then and then get back on the call. You've got to do it somehow.
DR. FEUER: Okay, I can do that.
FEMALE VOICE: Did you already send, Ellie, or are you still sending it?
MS. PELAEZ: Sorry, I'm still working on it.
FEMALE VOICE: Oh, okay.
MS. PELAEZ: I will give you a heads-up when it's on its way. Thank you.
THE COURT REPORTER: Hi. This is the court reporter again. Just to remind everyone to identify yourselves before you speak.
MS. PELAEZ: Okay, I just sent the email, so it's coming from me, but it will be a link to Survey Monkey, so you should all get it because we've emailed in the last few days, so I don't think it will go to your junk mail or spam folder.

DR. FEUER: Okay. So, Ellie, this is Michael Feuer again. I'm going to click off the call so I can look at my email, cast a ballot, and then I'll just dial back in. Is that going to be okay?
MS. PELAEZ: Okay. I think that's fine. Thank you.
DR. FEUER: Okay.
(Pause.)
MS. PELAEZ: Hi, this is Ellie. We have four votes, waiting for four more. Again, let me know if you don't get the email.
(Pause.)
MS. PELAEZ: Okay, we're at six. Two more.
(Pause.)
MS. PELAEZ: We're up to seven.
DR. FEUER: Hello?
MS. PELAEZ: Hello?
DR. FEUER: Yes, this is Michael Feuer. I have dialed back in.
MS. PELAEZ: Okay, thank you.
(Pause.)
MS. PELAEZ: Is there anyone who hasn't voted?
DR. HEDGES: Yes. Actually, I haven't voted. This is Larry Hedges. I'm still waiting for Survey Monkey to let me in.
MS. PELAEZ: Okay.
(Pause.)

MS. PELAEZ: Larry, are you having any luck with it? Should I re-send it?
DR. HEDGES: I still see the little circle going around and nothing's happening, so maybe I should — I tried it a second time. Maybe I should try it a third time.
MS. PELAEZ: Yes, just to —
DR. HEDGES: Okay, I think it just came up.
MS. PELAEZ: Okay.
DR. HEDGES: Oh, actually, maybe not.
(Pause.)
DR. HEDGES: Okay.
MS. PELAEZ: Larry, I think we got you. We're at eight.
DR. HEDGES: Good.
MS. PELAEZ: So the votes have been easily tabulated and Larry Hedges is our next Board Chair. So congratulations.
DR. HEDGES: Great. Thank you.
MS. PELAEZ: You're welcome to make a few brief remarks if you'd like at this point.

DR. HEDGES: Well, thank you, colleagues, for that vote of confidence. I'm not sure it's justified, but I'll do my best to make it so. Beyond that, I wasn't prepared for this, so I haven't prepared any remarks.

MS. PELAEZ: Okay, that's fine. Thank you. Thank you, Larry. So I guess we'll move now into the IES updates from the commissioners and Ruth.

DR. NEILD: Okay. Good afternoon. This is Ruth Neild speaking, and I guess I should say good morning for those who might be outside of — who might be on Pacific Time or further.

At IES, we do want to thank the National Board for Education Sciences for convening to elect a chair in what is almost certainly its first WebEx meeting, so thank you very much for this flexibility. We are really looking forward to benefitting from the counsel of Larry and the entire Board. The Board, as you know, has had a very major role in helping to strengthen IES, most recently with the encouragement you have given us to communicate clearly and widely about the data collections, assessments, research, evaluation, and dissemination that IES supports and how these impact education practice and policy.

For the next 50 minutes or so, maybe a little longer since we have some extra time, we will provide an overview of major undertakings and accomplishments at IES during the past year. I will speak for a few more minutes and then each commissioner or acting commissioner will provide an update of no more than 10 minutes. After those presentations, we may have time for a few questions from board members. It looks like we will since we are a little bit early on the agenda, so that's good.

As the agenda indicates, there are five minutes at the end of the meeting for the Chair to make closing remarks. So, Larry, you could be thinking about if you have any remarks in the next hour. The meeting will end no later than 3:30 p.m. Eastern Time.

So let me begin by talking about a key focus at IES in the last year and that is improved communications. This focus has had two dimensions. First, we greatly increased our resources for effective communications. In November 2015, we welcomed a seasoned communications director who has been a journalist and had also worked at a state education agency and a local education agency. With his leadership and a great deal of work by IES staff, we rolled out a new look for the IES website in April of this year.

So what you see on your screen right now is the home page of the IES website. The design is more contemporary and less cluttered, and it also enables us to efficiently highlight what we think are really important qualities of IES. In this case, looking at this screen, that IES is independent and non-partisan, and that the work it does is both relevant and practical. The new website is also mobile friendly, and that was an important improvement that we wanted to make because we see that our traffic increasingly is coming from mobile devices.

This page seeks to balance an overall vision about IES with information about each of the four centers. Each of the boxes circled in red here contains a live link to that center's home page. The home pages for the centers and the Standards and Review Office have a similar look and feel to that of the IES home page. What you are looking at now is the home page of the National Center for Education Research, where, again, we are highlighting what we believe are important features of IES-supported research; namely, that the research is rigorous and that it addresses important 21st century challenges in education. Over time, our intent is to redesign the pages of individual programs below the level of the center.

We also expanded our social media presence this year and began to provide short blog posts to help our audiences understand what is being learned from IES-funded work and how it is being used. We now have a main IES Twitter account. We never had that before. We had a number of different Twitter accounts for different programs and centers, but not one for IES as a whole. We do have that now, and this Twitter account provides updates about activities across IES, including videos from the IES YouTube page and blog posts from across the centers, and what you're seeing on the screen now is an example of one of our blogs.

One of the challenges we have faced at IES is showing the full range of IES support for an evidence-informed education system. It's a good challenge to have in many ways because there is a lot that is going on at IES, but we have struggled sometimes to explain how all those pieces fit together.

So a second dimension of our communications work has been to build a stronger, more cohesive narrative about IES as a whole. We are working on a series of short videos about IES and to date we have published two of them. The one you're looking at now — well, it's not the video, but it is a screen shot from the video — is an introduction to IES and it can be found on the IES home page. In this video, we describe six broad types of work in statistics, research, and evaluation that IES supports and we talk about why each of them is important.

To emphasize the overall identity of IES, we don't talk about the individual centers until the end of the video, and we're continuing to work on materials that convey these important messages and the overall narrative about IES.

So what I've described is the communications work that has occurred at the institute level. Each of the commissioners will talk more about how they and their staff have sought to explain activities and findings clearly and in an engaging manner and encourage their grantees and contractors to do the same.

In addition to communications the commissioners will address other key areas of work at IES this year, specifically, a range of support and tools for data access, research evaluation, and reporting carried out in formalized partnership between staff at state or local education agencies on the one hand and researchers from outside those agencies on the other. The commissioners will also describe some important new work at IES, including new grant opportunities, tools, and approaches to support the collection and use of data and research.

Much of the work that the commissioners will describe has been in planning and development for a number of years, so we think it was fortuitous that this work was nearing completion at the time that the Congress passed and the President signed the Reauthorization of the Elementary and Secondary Education Act, known as the Every Student Succeeds Act or ESSA.

A key set of provisions in this December 2015 law requires recipients of funding under ESSA to select activities, strategies, or interventions that have specific levels of evidence of their effectiveness. The law implicitly identifies a hierarchy of evidence of effectiveness, with the strongest evidence from experimental studies, followed by that from quasi-experiments, then from correlational studies, and, finally, from a logic model supported by other high-quality evidence.

There isn't any time today to delve into these provisions, so perhaps the Board may want to consider them in a future meeting, but I'm mentioning them today because, one, I would argue that this part of the law would not have been thinkable without the work of IES and its staff, grantees, and contractors over the past 15 years; and, two, the law has intensified the conversation among education agencies about how to incorporate research evidence into decisionmaking, and clearly that calls on the work of all of the centers of IES.

So, at this important moment, we at IES are thinking hard about other ways that IES — that the IES infrastructure of data and statistics collections, research funding, research summaries, standards, both research standards and statistical standards, and training, technical support, and dissemination mechanisms, can help states and districts make good use of scientific evidence, and we are looking forward to working with the Board and to getting your advice and your good counsel on ways that IES can use this infrastructure in this way.

So finally, I want to briefly mention some important work that IES is supporting to improve transparency and reproducibility of scientific results as well as to promote access to findings from IES-funded research. A project supported by IES through a grant to the Society for Research on Educational Effectiveness is the development of a registry for studies of efficacy and effectiveness. You are very likely aware of growing concerns in a number of scientific fields about the so-called file drawer problem, where null results are not published, and the so-called p-hacking problem, otherwise known as going on a fishing expedition for statistically significant results rather than specifying a hypothesis and analysis plan ahead of time.

The result can be that the field gets a distorted picture of the effectiveness of an intervention. So study registries are thought to be one way to address these problems.

What you see now on your screen is the opening page of SREE's online registry which is expected to be live in March. The study registry that had been maintained by the What Works Clearinghouse has been shut down and their records will be transferred to the new SREE registry. The SREE registry will allow a lot more detail to be recorded about studies than the What Works Clearinghouse registry did. So we see this as an advance in the field and are happy to be supporting it.

I also want to mention IES has been implementing a public access policy to research in which grantees are required to deposit the full text of peer-reviewed articles in ERIC within one year of publication. We have created materials to help grantees understand and comply with this policy and ultimately, we're hopeful that the registry will include links to completed papers in ERIC.

So I hope that this introduction has increased your appetite to hear more about what IES has been working on in the past year, and so now we will turn over to the commissioners for some updates, and next we will have an update from Dr. Peggy Carr, who is the acting commissioner of the National Center for Education Statistics.

DR. CARR: Thank you, Ruth, and I want to congratulate Larry Hedges on his new appointment, and welcome others to the WebEx here today.

THE COURT REPORTER: Excuse me. Who is speaking? This is the court reporter, so I need to know who is speaking right now.

DR. CARR: This is Peggy Carr.
THE COURT REPORTER: Okay, thank you.

DR. CARR: I have a very short presentation with only three components. I want to update you on the budget, and perhaps I said a little bit about this before, but this is an update; then a little bit about published reports, recently published ones and some upcoming ones that I want to bring to your attention; and finally what we will be doing in 2017 as we venture into our major data collection.

Here you see the budget layout for FY '16, and as you might recall, there was a $20 million annual increase for the National Assessment of Educational Progress, from $129 million to $149 million. The statistics budget also received an increase of $8 million, up to $112 million, in 2016.

Here you see what we did with those funds, or what we're anticipating doing with those funds. With regard to the national assessment, or NAEP, we will be transitioning to a digitally-based assessment; I'll say a little bit more about that as it will be implemented in 2017. We are also going to be funding the full schedule of NAEP assessments, which includes writing at 8th grade. It also includes social studies at the 8th grade starting in 2018.

We're going to be adding to our cadre of Trial Urban District Assessment (TUDA) jurisdictions. We had 21 districts, and we're going to be moving up to 27 districts starting in 2017.

The $8 million increase for statistics is going to be used to fund two international studies. The International Early Childhood Education Study is really a new study; it has not been fielded as of yet. In fact, our own Dana Kelly here at the National Center for Education Statistics is the chair of that committee, and they're working on the design.

The International Computer and Information Literacy Assessment has been fielded before; 2018 will be its second time in operation. Both of these, if I did not mention it, will be in 2018. Only a handful of countries participated before, but this assessment will be administered in 8th grade and it looks very interesting, in practice very similar to the NAEP TEL assessment, the Technology and Engineering Literacy assessment.

I also want to point out that we're going to use some of the funds to support an administrative data collection for NAEP. We're very excited about this opportunity because it will reduce burden and hopefully increase the efficiency and accuracy of NAEP data.

For many years now, and I don't have my data here for K–12, we have been funding and producing studies on safety and crime, but with no official allotment, and so we're very pleased to have funds dedicated to the collection of these data moving forward.

Moving very quickly to just a list of the recently released reports, I won't go into any of the details here. Perhaps if there is time for questions later, I can certainly address any questions you might have. I would point out that our most recent one, Science 2015, got a lot of attention. Also, I would call your attention to the remedial course-taking study, which has had very interesting findings around who really needs remediation, so I invite you to pay attention to those in particular.

If we could go next — yes, we'll go to the graduation rates for high schools, which were just released. President Obama had a little something to say about these graduation rates recently when they were released to the public at a local high school, but the important data to note here is that we see an increase over the time period from 2010 to 2015, now a 4.2 percentage point increase in graduation rates, and if you look at the subgroups, you'll see that they are also increasing, some of them at a faster rate.

In terms of our upcoming releases, in a few days we'll have data from the PIAAC assessment, that's the Program for the International Assessment of Adult Competencies, but for a prison population: incarcerated adults 18 and older. The TIMSS release is forthcoming November 29, on math and science, of course, and I mention the science in particular because the science assessment was captured during the same time period and in a contiguous window to the NAEP science data that we just released a few weeks ago.

Finally, the PISA data will be released on December 6. It, too, will have science data in it, of course, along with math and reading.

Just a little bit about the data collections that are coming up. I won't go through these individually. I can always come back if someone has questions, but here we have some postsecondary longitudinal data collections going on and we're very happy to see that the NTPS or the old SASS is going to be back in the field.

And I promised I would say just a little bit about NAEP 2017. Now, with that, the first large-scale, digitally-based collection of NAEP data for reading and math on tablets will happen in 2017 for 4th and 8th grades. We will be piloting this effort for social studies during the same time period.

We're going to be bridging from the past into the future, from our paper and pencil to digitally-based items, for all states and for all TUDAs. We will not be reporting out on our paper and pencil, but we'll have an opportunity to evaluate the differences that may occur, and there will be a special study of our adaptive design for mathematics in 2017.

So I'll stop there and, again, if there's questions later, I can always go back to some of these items.

DR. BROCK: Okay. Good afternoon. This is Tom Brock. I'm the commissioner for the National Center for Education Research, and let me just say it's a pleasure to be part of this virtual meeting.

I'd like to update you on some of the major activities of the National Center for Education Research during the past year, and I'm going to focus on three major topics: first, our 2016 and 2017 grant competitions; secondly, the innovative new programs we recently started; and third, the efforts we're making to improve our dissemination and outreach.

Let me just say also at the outset that two of the research centers, NCER and NCSER, coordinate on many areas of our work, and you'll hear both from Joan McLaughlin and me about some of these joint initiatives.

So, first, I wanted to share with you the results of our 2016 competitions. IES reviewed over 500 applications and made a total of 81 new awards for our competitive research and training grants. As usual, the large majority of grants were in our Education Research Grants Program in which we support field-initiated studies across a wide range of education topics.

The second largest was in our new Research Networks Program, which I will tell you more about in a few minutes. We also made several awards for researcher/practitioner partnerships and researcher training, and we made one grant to a new research and development center focused on virtual learning.

Some of you may remember that heading into 2016 we anticipated a tight year for funding new grants, and so to avoid a situation in which we receive many more applications than we knew we could fund, we limited the scope of some of our competitions. Most important, in the Education Research Grants Program, we did not invite applications for development and innovation or goal two research projects. We also announced caps on the number of awards we would make in several of our competitions.

These steps proved helpful. With these limits in place and with a small increase in the IES budget from Congress, we were able to fund all of the outstanding and excellent applications in our education research grants competition. The funding rate across all of our competitions in 2016 was a little more than 15 percent, which is much better than we were able to do in the years following sequestration.

Last spring we announced our FY 2017 research and training grant competitions and largely owing to the budget agreement that Congress and the White House reached last December, we are more optimistic about our ability both to compete more programs and to make more awards in the coming year. As you can see from the slide, we are running six major competitions. We are once again inviting applications for development and innovation or goal two projects in education research, and we lifted the caps on the number of awards in most of our competitions.

Applications were due last August and the review process is now in full swing. I can tell you that the overall number of applications we received is up considerably from 2016, and next year when the Board meets again in the spring or early summer, I'll update you on the results of the current competition.

I'd like to turn now to some of the innovations in our research and training grant competitions.

As you may recall, NCER and NCSER undertook a number of efforts back in 2014 and 2015 to get input from researchers and practitioners on our programs and on ways we could make the research we fund more impactful and useful. We issued a request for public comment. We held technical working group meetings with experts, and we engaged in conversations with you, the Board.

One piece of advice we heard consistently was that in an environment of limited resources, it's important to identify some research priorities where we think concentrated investments will make a difference. Another recommendation was to look for ways to encourage more collaboration among researchers to tackle problems jointly.

We heard that perhaps an unintended consequence of IES grant competitions may be that they lead to a lot of disconnected research projects and to research that's being carried out by individuals working in relative isolation. The thinking was that perhaps we could get more breakthroughs or at least make faster progress on some of these problems if we could find ways to encourage more joint problem solving.

So, in response, we developed a new program called Research Networks Focused on Critical Problems of Policy and Practice. Each network involves several research teams that are working on the same problem or issue. The research teams are independent, meaning they each have their own grant and their own ideas about how to tackle an issue, but each of the teams also agrees to work with other research teams on activities that will benefit their collective work, such as developing and using common measures.

A network lead is responsible for coordinating network activities. IES is also setting aside up to $1 million for collaborative activities that may be approved by IES and which may include things like jointly developed measurement tools or research synthesis at the end of the project. In 2016, we competed and funded two research networks. One is focused on early learning. Specifically, the network is examining the transition to early elementary school to understand why it is the positive gains that we often see among low-income children in pre-K seem to fade once they begin regular elementary school.

The second network is focused on college completion. This came out of a review of many of the grant awards we've already made in the postsecondary field and realizing that most of the work we have funded to date is focused on college access or on the transition to college. We know much less about how to get students through college and to a degree, so this network is focused on developing and evaluating interventions to support college completion.

As noted earlier, we made nine grant awards to begin building these networks in 2016, and we are running another networks competition in 2017.

A second new program that we launched is called Pathways to the Education Sciences. As we discussed with you at the June 2015 board meeting, there's broad concern in the field about the lack of economic, ethnic, and racial diversity amongst people entering and working in the education sciences field.

The IES-funded pre-doctoral training programs are expected to have a recruitment plan that takes diversity into account, but we heard from the PIs of many of these programs and other experts that we still need to do more to identify and develop young scholars further back in the pipeline.

The pathways program is designed to do just that. It focuses on juniors and seniors in college, recent college graduates, or Master's degree students who may be interested in pursuing education research as a career. It specifically targets students from low-income backgrounds and groups that are underrepresented in Ph.D. programs. It is designed to provide instruction in scientific methods used by education researchers, a paid research apprenticeship, and assistance with graduate school selection and application. Minority-serving institutions are involved by requirement, either as a host institution or as a partner.

We funded four pathways grants in 2016 and are competing this program again in 2017.

A third new program that we launched is called Low Cost, Short Duration Evaluations of Education Programs, and both NCER and NCSER began this effort as a pilot in 2016. We see it as a tool to help states and school districts, working in partnership with researchers, get credible and timely information that they can use to make decisions.

As Ruth noted, the Every Student Succeeds Act really elevates the need for states and school districts to have reliable evidence on program effectiveness at their fingertips.

So what do we mean exactly by low cost and short duration? Our answer is $250,000 for an evaluation that can be completed in two years. Now obviously we do not expect the studies funded at this level to do the same amount of data collection and analysis as our larger grants, but we do expect a rigorous design, either a randomized controlled trial or a regression discontinuity design. We think the key to making such studies relatively cheap and fast is to focus the studies on short-term interventions that last no more than a single school year and to rely on administrative records to measure student outcomes.

NCES grants, among others, have helped many states make improvements in their data systems, and so we're hoping that this program will help states and school districts use these systems to test whether or not their interventions are making a difference.

NCER made three awards to low-cost, short duration studies in 2016, and NCSER made one award, and both centers are running this competition again in 2017.

So I will conclude by mentioning our ongoing efforts to strengthen dissemination and outreach. We encourage our researchers to publish in the scientific literature, but we also encourage them to consider ways to share their learning with policymaker and practitioner audiences. Again, under ESSA, this could not be more important.

Both NCER and NCSER require applicants to our research grant programs to describe the methods they will use to disseminate their research findings. In addition, in our researcher/practitioner partnership grants and our low-cost, short duration evaluation grants, we require at least one briefing for state and local partners at the conclusion of the study.

The two research centers are also taking steps to amplify major findings and lessons through our research blogs and social media, such as Twitter. Recent blogs, for instance, have focused on sharing information about research to improve reading instruction for children with Down Syndrome, and on some of the award-winning educational games that have been developed through our Small Business Innovation Research Program.

Finally, we're working with the principal investigators of our research and development centers and other large grants to brief policymakers on the Hill and in the education department. Last summer, for instance, we brought the principal investigators of our Reading for Understanding Initiative to Washington to speak to leaders and staff in the department about their results. This briefing was simultaneously broadcast over the Internet for other Ed employees and the general public.

We're enthusiastic about the progress we're making on communications, but we also know there's a lot more to do. So we welcome your feedback and advice on this and all areas of our programming.

So, with that, let me turn it over to my colleague, Joan McLaughlin.

DR. McLAUGHLIN: Good afternoon or good morning. This is Joan McLaughlin from the National Center for Special Education Research. I'm happy to be talking with you.

I will spend my time today updating you briefly on the results of the National Center for Special Education Research, or NCSER's, 2016 grant competitions and then on the grant programs we're currently competing in fiscal year 2017. I'll then update you on some other activities we've been involved with, including two technical working group meetings and the Department of Education's plan for public access to research.

In 2016, NCSER released three requests for applications. The first was for our field-initiated Special Education Research Grant Program. This is our primary grant program that supports research in 11 broad topics and the five IES research goals. We funded 36 grants in this competition across all topics and goals.

Next was our Research Training Program grants competition, which has three components. Our early career development and mentoring grants provide support for an integrated research and career development plan for investigators in the early stages of their academic careers who have an established interest in special education or early intervention research, and in 2016 we funded four of these early career grants.

Our Postdoctoral Research Training Program grants are made to doctoral-granting institutions to further prepare researchers to conduct high-quality independent special education or early intervention research, and we funded two of these grants in 2016.

We also funded a methods training grant. This is to provide training on using single-case design and its aim is to help current education researchers maintain and enhance their research and data analysis skills related to single-case design.

Lastly, we also had the inaugural year for our low-cost, short duration evaluations of special education interventions in both NCSER and NCER. As Tom mentioned, these grants are intended to provide evidence of impact that state and district education agencies can use in making timely decisions regarding the scaling up or revision of special education interventions, and the partnership between researchers and state and local agencies is primary here. NCSER funded one low-cost grant award in this inaugural year.

By limiting the number of competitions we ran in 2016, we were able to fund all 44 applications rated outstanding or excellent by peer reviewers across all three of these competitions.

Next slide.

So our 2017 competitions are well underway. As Tom talked about, the requests for applications were released in the spring, and applications were due at the beginning of August. Given an even tighter budget outlook for new awards in 2017 than in 2016 (and this is a little bit different from NCER), NCSER needed to further limit what we competed. For our main special education research grant competition, we limited the scope of what we competed. Specifically, we're targeting a critical need area as the focus of all applications, and that critical area is research on teachers and other instructional personnel responsible for educating students with or at risk for disabilities.

We're interested in a broad range of research within this targeted focus that helps us to better understand and improve the knowledge and skills that help teachers and other instructional personnel improve student education outcomes.

For our Research Training Grant Program in 2017, we're competing the early career development and mentoring grants only, and we are competing the second year of our low-cost, short duration evaluation of special education intervention grants. Again, with ESSA we see a great need for this in the field and also the requirements that OSEP has for states to target certain areas in their state plans. So stay tuned for what happens in 2017.

So the two research centers generally hold two to three technical working group meetings over the course of a year to help us plan future research activities. These meetings bring together about 15 to 20 expert researchers and practitioners whose input can help us identify ways that we might encourage more research in an area where we typically haven't seen much work, better understand critical issues in a topic area, address training or technical assistance needs for researchers, and identify directions that we should take to help us generally improve the research we support.

So, in this past year, we have held two technical working groups involving both research centers. The first was held in September and focused on writing at the secondary level. Both research centers have funded writing research since we started funding grants in the early 2000s, but the portfolios have been relatively small and have focused on how young children learn to write. There has been relatively little work on how older children and young adults learn to be proficient writers or how to intervene to help them improve their writing skills.

This TWG focused on the research needs in the area of middle and high school writing for students, including English learners and, of course, from my perspective, those with or at risk for disabilities, and we targeted the four areas that you see on your screen: argumentative writing, support for struggling writers, engaging adolescents in writing, and assessment and feedback. It was a productive and engaging meeting and it gave us much to consider in our future grant activities. We have a summary of the meeting under review and this will be posted on our website.

The second technical working group, held a few weeks ago, we called "Building Evidence: What Comes After an Efficacy Study?". Our intent was to receive experts' guidance and recommendations for how best the IES research centers can help build evidence following the successful completion of an efficacy study. This includes conducting further efficacy trials and establishing the effectiveness of programs, policies, and practices, as well as addressing the specific issues and challenges that go along with doing this work.

Some of the guiding questions that were posed I put up on the screen and they include: What should effectiveness studies accomplish? What are the challenges of initiating and completing an effectiveness study? What steps are needed to encourage more replication research? How do we advance our understanding of causal mechanisms and variation in impacts? And what role should IES training programs play?

We had an interesting discussion around these questions and we received some great input and, again, this information will be used as we plan future activities in the coming years. A summary of this technical working group meeting is currently being prepared and we will post that on the webpages for each of the research centers.

Tom has already spoken to you about activities that we focused on in the research centers related to communication and dissemination, and I want to mention a milestone on another effort that the research centers have been working on for three and a half years and may be responsible for some of the gray hairs in my head.

On October 21, the department approved the official plan and policy development guidance for ensuring public access to research findings. This plan addresses access to the results of Department of Ed funded research across all offices within the department and includes publications and data stemming from the research. With this approval, the department now joins a host of other federal departments and agencies that have completed this process in line with the White House directive back in 2015.

While it is newly approved for the department, the IES research centers began implementation awhile back and will continue to build on these efforts. Beginning in 2012, we required our grantees to submit their peer-reviewed scholarly publications to ERIC, as Ruth mentioned, and before then we encouraged them to do so.

In 2013, we began to roll out the requirements on the data access part so that the public could have access to final data from our funded research activities.

Applications for our efficacy and effectiveness grants must include a data management plan, and this describes the method of data sharing, the types of data to be shared, and documentation that will be created to promote responsible use of data.

We still have a great deal to work on, for example, getting our compliance rates up for submission of articles into ERIC and providing more technical assistance on the data access process. But now that we've worked on the plan and had it approved departmentwide, we can focus on this next phase.

I will end here and thank you and I look forward to engaging with you more in our next meeting.

DR. LESNICK: Thank you. Good afternoon or good morning. My name is Joy Lesnick and I have been the acting commissioner of the National Center for Education Evaluation and Regional Assistance since July of 2015. I was unable to attend the last board meeting a year ago, so I'm glad to be able to be here today to give you a brief update of our work in NCEE even if it's from afar and after more time than expected.

So the last year has been a very productive one in NCEE. We released 11 evaluation reports on a variety of topics and tried out different ways of sharing the information from those reports, including an interactive application for exploring the results of the Race to the Top evaluation that came out in the last few weeks and a few videos to summarize study findings too.

One of the evaluations that wrapped up this year, in September, was our third large-scale random assignment evaluation of teacher professional development. This most recent study and the two earlier ones are part of an ongoing evaluation agenda that we've developed to advance understanding of how to help teachers improve.

They were designed to build upon one another to form a coherent and useful body of knowledge. But these evaluations were separated by a number of years, which makes it hard for readers to connect the dots. So, to address that challenge, last week NCEE released a follow-up brief that synthesizes the findings from the three professional development evaluations and offers lessons learned. We also wrote a blog post to explain more about the findings and what we've learned, and used videos and study snapshots along the way to share the findings in accessible ways.

So what have we learned? Well, all three studies examined professional development programs in reading and math that emphasized to somewhat varying degrees the building of teachers' content knowledge and knowledge about content-specific pedagogy. These are skills that educators and researchers have argued are important for effective teaching, but the amount of rigorous evidence is limited.

The programs combined summer institutes with periodic teacher meetings and coaching during the school year. These programs were compared to the substantially less intensive professional development that teachers typically received in study districts.

So the studies all found that the professional development improved teachers' knowledge and some aspects of their practice, but improving teachers' knowledge and practice did not lead to positive impacts on student achievement, and most of the measured aspects of teachers' knowledge and practice were not correlated with student achievement.

On the plus side, the consistent pattern of findings suggests that the intensive summer institute coupled with meetings and coaching during the school year may be a promising format for changing teacher knowledge and behaviors in the classroom.

However, it also suggests that the translation from teacher knowledge and practice to actual student learning is complex. It appears we still have much work to do to fully understand what exactly it is that teachers need to know and be able to do well in order to best promote student learning, and once we have that down, we also need to think about whether the current formats we use for professional development are moving the essential aspects of knowledge and practice enough to translate into impacts on student achievement.

This is a critical lesson that we hope will inspire researchers, developers, and providers to think more carefully about the logic and design of the next generation of professional development interventions.

So shifting gears to our knowledge utilization projects, in the past year, we released 14 What Works Clearinghouse intervention reports which are systematic reviews. We have new reports in a variety of topic areas, including three new topic areas in postsecondary education, which are transition to college, supporting postsecondary success, and developmental education. The What Works Clearinghouse has also released two practice guides in the past year, both in literacy, one on foundational skills to support beginning reading and the other on teaching secondary students to write effectively.

And for a sneak peek, in three weeks we expect to release our twenty-third What Works Clearinghouse practice guide, and this will be our first in postsecondary education. This upcoming guide provides recommendations on supporting postsecondary students in developmental education.

Our Regional Educational Laboratory Program, or REL Program, has also been quite busy, releasing 97 REL reports in the past year. Even more of the program's work is captured on the right, which shows new topic pages released last week on the IES website that compile REL Program work by topic rather than just organizing it by REL. These pages all have links to publications, videos of archived Webinars, infographics, and works in progress for the topic areas with the largest portfolios of REL work across the entire REL Program.

I should note that the past year represents the peak of releasing REL reports in a five-year contract cycle and this contract cycle comes to an end in early January. So this number will not be as high the next time we talk, but we are pleased with the high-quality, relevant work done in partnership with our research alliance members on a variety of topics across the nation and outlying areas.

The overall focus on communication at IES has also been an NCEE focus, and we've tackled that in a few different ways. One way has been through the tools that NCEE projects have developed to support the use of knowledge from high-quality research. We have some really innovative recent examples on the slide here. The first one is RCT-YES, which is a free downloadable software program that allows users to easily analyze results from impact studies and quasi-experimental designs as well as RCTs. Even though it's RCT-YES, it does not replace the researcher but instead helps support and simplify the analysis process.

Going clockwise we also have a few podcasts, including a series from REL Midwest on developing early learning systems in Minnesota and Wisconsin which was part of a year-long REL programwide learning series on early learning systems that included tools, technical assistance events, infographics, research reports and user-friendly summaries.

And, finally, the REL Program has an entire tools product line that includes self-study guides for implementing early literacy interventions, tools for developing surveys in your own context, tools for creating a research practice partnership research agenda, two guides for facilitating a professional learning community based on What Works Clearinghouse practice guides, and many, many others.

NCEE has also continued to invest in partnerships over the past year, including nearly 80 REL-supported research alliances across the country.

Another notable partnership in NCEE is between the What Works Clearinghouse and REL investments, through bridge events in which RELs share information from What Works Clearinghouse practice guides and other high-quality research through events and technical assistance.

A major achievement in NCEE this past year was an overhaul of the What Works Clearinghouse website. This work has been in process for nearly two years and the revised website was launched in September. This is a screen shot of the new What Works Clearinghouse home page, which directs users to explore the evidence by choosing a topic right on the front page, and I encourage you to try out the live demo version, but I'll show you a few screen shots here to take you through some of the key features and how it's changed dramatically.

This screen shot has already selected math as the topic area, and I'll give you a brief tour of some of the features and invite you to try it out on your own later.

So, when you select a topic, the next page provides a list of all the interventions that have intervention reports on that topic, and you can also choose more than one topic on the list. All results are listed and they are ordered by the interventions with studies with positive effects first.

A brand new feature is in the upper left with the yellow arrow: that is the "Students Like Yours" button, where you can add variables of interest to you and your context.

Here's an example of adding student characteristics from your context. This will add an additional column to the results list, and I've also added grades 6, 7, and 8, but there are sliders here for the demographics that apply to your particular context.

The new results list has added a column for students like yours with one to three green ovals. So even if the research is not similar to the characteristics submitted, there will still be one green oval because there could still be relevant information for the decisionmaker even if the research took place in a different context. The three green ovals means research was conducted in situations very similar to yours.

There's lots and lots of clickable information on this page. Almost everything is clickable. You can also compare interventions, but I selected the second intervention in the list with math outcomes. This is Teach For America, interesting to get this as a math intervention, but it has math outcomes, so that's how it continues to be organized, and I'll show you a little bit more here.

This is an intervention page. It gives an overview of the findings from studies of the intervention. At the top right, you can export or print the information, and you can also do this for the underlying data from all studies and all interventions from the find evidence section of our main menu. But from this page you can drill down into information about each of the studies that met standards and contributed to this intervention report, which I'll do here for math achievement.

Go ahead, next slide there. Yes. Here, six studies met standards. In the live demo version, that would open up to show you all six of the studies, and each of them would as well, and then you could continue to drill down and click more.

So there is a study page. This is one of the six studies underneath that "six studies met standards" entry for the math achievement domain. This page provides all of the information from the study review, including the study rating and whether there is at least one statistically significant positive finding in that study. The study pages also indicate that the review may not reflect the full body of research evidence for this intervention, so it's directing users back to the results of the systematic review of all of the evidence rather than just one study.

So my final slide is a view of the search screen to search for individual studies. This is another area, a brand new functionality on the What Works Clearinghouse website and users can search by the What Works Clearinghouse rating, by the research design, by the topic, or studies that have statistically significant positive findings by choosing a check box.

In the live version, again, the list is clickable. It will take you back to the study page, which will then direct you back to an intervention page, so you can see there's lots of different ways to get into the data and navigate among the different resources available on the website.

This has been a major undertaking and a major change to the underlying database, the website and user experience. Our goal has been to solidify and enhance the What Works Clearinghouse infrastructure to support the use of evidence now and into the future. So I encourage you to explore the live version and we welcome your feedback and suggestions for future improvements.

With that, I'll turn it back over to Ruth. I'm also happy to answer any questions you may have about this and other NCEE work during the remaining time. Thank you.

DR. NEILD: We have, I think, about 15 minutes for questions and comments. Larry, if that's okay with you since you are the Board Chair and you are really in charge of this meeting at this point.
If you do have a question or comment, please state your name.
DR. FEUER: I'm not sure if that's Ellie or Ruth speaking. This is Michael Feuer. If I might barge in with a couple of comments and a question so that I can also catch my flight home. If it's okay for me to go first, I would appreciate that.
DR. HEDGES: Go ahead, Michael.
DR. FEUER: Thank you. A couple of very quick comments. First of all, congratulations, Larry. I'm delighted and I'm looking very much forward to working with you, and congratulations to all of us, and I also want to say a special thank you to David for very fine leadership and I hope for continued collaboration.
I have a question that — actually, it's a question/comment for Peggy and a comment based on what Tom has presented, and so let me just be brief about it.

Peggy, I know that one of the issues that you face with respect to NAEP has been the status and the prospects for the long-term trends, and we don't have time probably this afternoon to get into all of the details, but I just wonder if there are aspects of NCES's constraints vis-a-vis the long-term trend and the decision that you think the Board might be able to help with one way or the other.

I have, as you know, mixed feelings about these issues and I'm sure that the decision about where the long-term trend is and when the next one is going to be administered went through a lot of very careful thought, and I just want to suggest that if you think the Board can be of some help to you, I would certainly be glad to participate in that kind of a conversation.

The comments I have based on Tom's report, it has to do with the statistics that you reported, Tom, on first-time award rates for new grants in the various programs. As I was driving I couldn't write them all down. I got a sense that things had improved at least slightly over previous years, which is, of course, terrific.

But my suggestion would be, and maybe this is actually something for Ruth and for others and the other commissioners to think about, my suggestion is to develop a more systematic and uniform way of actually reporting the data on grant applications, awards, hit rates, and to provide that kind of information in an easily accessible way, disaggregated if possible by characteristics of the applicant and the applicant's institution and the type of grant. This is information that would be extremely helpful, I believe, not just to me but to other deans who are trying to cultivate and encourage their early career faculty in particular to apply for competitive federal grants on these sorts of topics.

Having that kind of data in a regular place, I think, and I know it is accessible, but it's sometimes a little bit awkward to get to it, and I would just encourage you to think of providing that more easily to people like us who could really use it.

So those are my comments and just to say in general that was a superb set of presentations and I wish I actually could have written most of it down because I found it very valuable and informative, and I thank you.

DR. CARR: Thank you for your question. This is Peggy Carr. Thank you for your questions, Michael. Very insightful. Perhaps you have seen the discussion by Tom Loveless and others being very concerned about the long-term trend and sort of the back and forth amongst the assessment community.

I should share that the National Assessment Governing Board is launching sort of a series of stakeholder outreach conversations on the topic of long-term trends. Ed Puddles (phonetic) has written a paper on the topic and I think it might be available in the NACSE (phonetic) briefing book when it's released in a few days. He, along with Jill Wilhardt (phonetic) and Andrew Holm, is pulling together people to comment on the paper and a series of outreach to the community.

So I see a wonderful opportunity. Just generally, any stakeholder I think should be available to participate in this dialogue and I would welcome you to be part of that conversation. I'll get you more information about where we are with these activities as they unfold.

DR. FEUER: Thank you so much, Peggy. Thanks very, very much.

DR. BROCK: Michael, this is Tom Brock from NCER. I just want to say indeed we do collect the kind of statistics that you're asking for about number of applications reviewed and number funded. We can go back in time. We can break down by specific competitions. There are various ways we can look at the data.

I think two things: One, you are a relatively new board member and we have not met with the Board for a year now, but in the past we actually did present exactly those data and those trends, and we would be happy to come back and do that again in the future. So perhaps working with Larry now we can think about when and which agenda to put that on in the future.

DR. FEUER: Thank you. Let me just add that I think that kind of data also relates to something that's clearly a priority of yours and of IES in general and that's the expansion of the opportunities for research to communities of researchers who may be underrepresented in our field, and that kind of disaggregation of the data I think would be very informative. So thank you for whatever you can provide perhaps between now and subsequent meetings and continue that conversation. Very much appreciate it.

DR. HEDGES: In the spirit of Michael's comments, I think this was a — this is Larry Hedges. I think this was a terrific briefing and I hope you'll make available the complete collection of slides to all of us in some form that we can hang onto in the near future.

DR. NEILD: This is Ruth Neild, and we will be happy to do that.
DR. HEDGES: Good. Are there other comments from other board members?
MR. CHARD: This is David Chard. I want to commend Joy and her team for the work done with the What Works Clearinghouse. I know we've all been in these conversations for a number of years, but I'm now hearing from people in the field who just are remarking on how much more accessible and useful the tool is, and so bravo.
DR. LESNICK: Great. Thank you so much, David. This is Joy Lesnick.
DR. OAKES: This is Jeannie Oakes. I, too, very, very much appreciated the briefings this morning and really think the progress is quite amazing. I have a suggestion on the communications front based on my own personal pride in what we did at AERA last year.

You know, we took about 32 researchers and put them through a communications training process to teach them how to talk about the key findings of their work in about seven minutes, like TED Talk-style talks, and then, so it didn't look like they were just sort of off-the-top-of-their-heads TED Talks, they also did fact sheets which gave a brief and layman-like summary of the research base from which their talks were drawn, with a nice long bibliography.

Those are all on the AERA website now and downloadable, both the fact sheets and the talks. But what we noticed, and we had the experience and the great privilege of presenting some of these talks both at the White House for administration officials but also at a convening with a lot of thought leaders outside the field of education, and they found them really, really compelling and an additional benefit was that the researchers really — Kris Gutierrez was one of them, so she knows — really communicated how human and accessible they were, as well as having a great deal of technical expertise, and the use of that tool for not only conveying information but to help build relationships between people in policy and practice and researchers I think was a real benefit.

So I don't know whether you considered that kind of a format as a way of reporting the results of your funded studies, but I would certainly recommend just taking a look and see what you think.

DR. GUTIERREZ: This is Kris Gutierrez. This is a great suggestion, because there has been more uptake of those videos than we ever imagined. I just used one. It's interesting to see how many people in some of these sectors of education are utilizing those TED Talks.

DR. McLAUGHLIN: This is Joan McLaughlin from the Special Ed Research Center. We have our grantees come in once a year; the principal investigators come in for a big meeting, and both last year and this year we have tried to focus on communication and dissemination. This year we're having people prepare videos that we can upload and show and hopefully put on our website later. We're also going to have some people talk about why research matters in a very short period of time, to give them sort of the elevator talk, or the snippet of the elevator talk, of why the research matters. Again, it's to get them thinking about communicating this to people who don't have much time, getting at the essence of it. I think we're also working in these meetings to give people more skills in communicating their data in a better way and in talking to reporters and those kinds of things, which I think are all important beyond the journal articles for getting the messages out.

DR. BROCK: And this is Tom Brock. I would just add that in our various training programs we have also placed increasing emphasis on fellows and trainees gaining this kind of communications experience that you're talking about. I have actually witnessed this directly in the cluster randomized controlled trial workshops that Larry and colleagues run at Northwestern. It's actually quite inspiring to see. We're trying to infuse this across the board, in our predoctoral training and postdoctoral training.

Let me just also say, if anyone listening today has ideas for a new training program specifically focused on communication, that would also be something we would entertain through our annual training grant competition. We certainly would be interested in proposals along those lines.

DR. NEILD: This is Ruth Neild. And I would just add that we have also been making a lot of efforts to increase awareness and skill in communicating among IES staff. We've had a number of trainings this year for senior staff on presentation skills, and NCES has had day-long training on data visualization, so we're trying to make sure that our staff are also aware and skilled and able to support and help grantees and contractors.

DR. HEDGES: We're coming close to the end of our session, so perhaps we can have one more question or comment from folks on the Board before we begin to bring the meeting to a close.
DR. FEUER: This is Michael Feuer. I have nothing to add. I just want to say thank you and I'm going to sign off so I can catch my flight. Again, Larry, congratulations and thanks for your service.
DR. GUTIERREZ: This is Kris Gutierrez. Several of us are going off the board and I just want to say it's been a privilege working with you all.
DR. LONG: The same here. This is Bridget. It's been a wonderful many years. I know I will continue to see most of you, but again, thank you, everyone, for your partnership and hard work.
FEMALE VOICE: And from IES we would really like to thank those board members as well. Each of you has made a very helpful contribution. We can think of many ways in which we have taken your specific counsel to heart, and I think some of the things that you've seen today have really been a direct result of the ways that you have pushed us in the past. So thank you very much for your service.

DR. HEDGES: Well, let me say — this is Larry Hedges — I am very humbled to be elected chair, particularly following David Chard's terrific service as chair of the board. He really helped us move forward in important ways.

I think that, as everybody is no doubt aware, today is Election Day, and that implies that one way or another there will be a transition in the government. Transitions are important times for agencies. They are times in which old relationships are broken by turnover among individuals in the government, and times in which priorities can change and agencies need to reestablish their relationships and the understanding that others have of them.

And I think, although IES is widely appreciated and increasingly emulated in various parts of the government, it's important that they continue to be understood. I think that we on the Board can play an important role during this transition period in helping to communicate the strengths and successes of IES, to help build a broader understanding of what they're doing and why it's important, so that they can continue the good work that they're doing.

I think the presentation today showed how much IES has taken to heart the finding in the GAO report that, although they had appropriately accomplished a lot in the area of rigor, they had some work to do in the area of relevance, and I think we saw a variety of responses to that. It's important for us to help get the word out that IES has made strides in making the work not only rigorous but also increasingly relevant to American education.

I think we also see that IES is appropriately concerned with what has been called in other areas the replication crisis, which has engulfed both medicine and other social sciences and has implications for education as well. I think it's important for us on the Board to help them stay ahead of what could be a very difficult problem for them, and I would applaud the after efficacy twig and the registry as two very good steps in that direction.

I look forward to working with all of you, both on the Board and in the agency, to help you do your work as well as you have, and even better if that's possible. If there are no further remarks before adjourning, it's just about time to adjourn.

DR. NEILD: Thank you.
DR. HEDGES: Do we need a motion to adjourn?
DR. GUTIERREZ: I make a motion to adjourn. This is Kris.
DR. LONG: Seconded. This is Bridget.
DR. HEDGES: Shall we vote? All those in favor.
(Chorus of ayes.)
DR. HEDGES: Opposed?
(No response.)
DR. HEDGES: Then I look forward to the next meeting and I'll declare this meeting adjourned. Thank you all.
ALL: Thank you.
(Whereupon, at 3:30 p.m., the meeting in the above-entitled matter concluded.)

REPORTER'S CERTIFICATE

DOCKET NO.: None
CASE TITLE: NBES Meeting
HEARING DATE: November 8, 2016
LOCATION: Washington, D.C.

I hereby certify that the proceedings and evidence are contained fully and accurately on the tapes and notes reported by me at the hearing in the above case before the United States Department of Education.

Date: November 8, 2016

Evelyn Sobel
Official Reporter
Heritage Reporting Corporation
Suite 206
1220 L Street, N.W.
Washington, D.C. 20005-4018
The National Board for Education Sciences is a Federal advisory committee chartered by Congress, operating under the Federal Advisory Committee Act (FACA; 5 U.S.C., App. 2). The Board provides advice to the Director on the policies of the Institute of Education Sciences. The findings and recommendations of the Board do not represent the views of the Agency, and this document does not represent information approved or disseminated by the Department of Education.