Charting the Future of NCES: Our First NASEM Panel Discussion
On May 10, NASEM and IES staff convened the first public panel meeting to discuss the future of NCES. I thought I would share some of the themes we explored and hopefully tempt you to attend the next panel, slated for May 26. NASEM will solicit public comments at various times during its discovery and writing processes, so I am taking the opportunity to give you a heads-up.
Mapping the future, unrestrained by the past
Despite the title of the project ("A Vision and Roadmap for Education Statistics in 2030 and Beyond"), I made clear that we are interested in a five-year time frame. So many changes are underway in data collection, data merging, and privacy protection that a blueprint for change beyond the next five years would resemble a work of fiction more than a roadmap for needed changes.
During our discussion, I also stressed that this panel should be forward- rather than backward-looking. True, NCES, the second oldest designated principal statistical agency in the nation, has a long history and a "product mix" that has evolved over time. However, our history should not prescribe our future. For this reason, I laid out an approach to structure NASEM's work around the information we need rather than the products we have.
Note that this approach is agnostic regarding existing approaches to data collection. Rather than saying we need to improve longitudinal surveys or IPEDS or any particular product, the challenge is to decide what information is needed and then identify the best way to gather that information. Some traditional but outdated data sources might be put on the chopping block; others might be targets for major investments; still others might need to be created.
But the approach is not agnostic when it comes to data use.
IES is Congressionally charged with disseminating research, evaluation, and statistical information to many different stakeholders, including "parents, educators, students, researchers, and the general public." I asked the NASEM panel to consider whether we are matching our products with the audiences that need our insights and our information. And I asked NASEM to consider how we can extend our reach and impact through better data visualizations, shorter reports, more interactivity, or other modes of dissemination. NCES (and IES) needs to take the use of knowledge far more seriously than we have in the past.
Managing a rare staffing arrangement
What I asked for is expansive but hopefully not unrealistic. But we need to realize that for years, maybe decades, staff size has dictated how NCES functions. I am not a budget- or personnel-maximizing bureaucrat (I still remember Niskanen's model), but at some point we need to recognize that the NCES staff is not growing in proportion to its expanding list of responsibilities.
Consider the following staffing and direct funding data from 2019. In that year, NCES was funded at $328 million with a permanent staff of 105; on average, each NCES employee supervised over $3 million in government funds. The Bureau of Labor Statistics had a budget a little less than twice NCES' ($615 million) but close to 2,000 staff. The Energy Information Administration (EIA) was funded at $125 million that year but had 368 permanent employees. The exact numbers vary across the 13 officially designated principal statistical agencies, but these figures give you some idea of just how different NCES' funding and staffing levels are from those of other federal statistical agencies. This staffing arrangement—in which NCES has lots of work but few people to do it—translates into a simple fact: rather than doing statistical work, NCES staff mostly supervise contractors.
There is an old joke about the Soviet economy: this is an average year—worse than last; better than next. As the demand on the agency has outpaced staffing, our relative workload has met that definition of average.
I asked NASEM to develop realistic assumptions and strategies about personnel and about matching work to staff, and then to present credible estimates of future staffing levels. The challenge can be simply put: if we get additional personnel, where should they go? Which divisions within NCES represent the future? Which the past? Should existing personnel be moved across divisions—which essentially means ending some products to move NCES in different, more modern directions?
What stays? What goes?
I posed a specific question that encompasses so many of these issues: What is the future of the survey research division of NCES versus the administrative data division?
Surveys give us unique details and context about behavior and attitudes that administrative data cannot. But surveys are usually expensive and response rates are tumbling. We have already canceled one major survey—the middle schools survey—and more problems are certainly on the way.
We must consider what data points we can get from surveys that can be moved to administrative data collections. For example: We periodically conduct the National Teacher and Principal Survey (NTPS), which gives us rich data on things like teacher turnover, wages, and the makeup of the teacher and principal workforce. But as more and more state HR systems are linked into other state data systems, is there a more efficient way of getting these data?
This illustrates the approach I emphasized to NASEM: If we think about the data we need, rather than the historical systems we have built to get the data, then we can better think about how to be more efficient.
By way of a mea culpa: in preparing for the NASEM presentation, I went over historical trends in staffing for each division within NCES, focusing on recent FTE counts in the survey research and administrative data divisions.
Both divisions lost personnel, but the administrative data division is down by about a third, while the survey division is down by around 10 percent. It is incumbent on IES and NCES to allocate personnel, the scarcest of all resources, in a way that reflects current and future priorities, rather than letting attrition determine staffing levels.
I know that there are many ways of writing essays, blogs, and scholarly articles. But I find that engaging with data usually leads to points (and passions) that were not evident in my best-laid writing plans. So it was with this blog, where the data made clear the needs of different parts of NCES. And the data also made crystal clear the importance of the NASEM panel.
I once used my favorite legal term in an earlier blog: res ipsa loquitur (thanks to my lawyer daughter and her expensive education). The NCES data speak for themselves. The size of our staff necessitates a sophisticated managerial strategy, and I (as well as my predecessors) have been less than strategic about managing scarce resources against the always evolving changes in data and statistics.
In legal proceedings, there is usually a definable endpoint. The arguments lead to a ruling from a judge or jury. We don't have a definable endpoint at which we will say "Thank you, NASEM. Now we know what to do." Hopefully, the NASEM report will help point the way—but your input as we re-think NCES' role and structure will matter.