IES Blog

Institute of Education Sciences

Statistical concepts in brief: How and why does NCES use sample surveys?

By Lauren Musu-Gillette

EDITOR’S NOTE: This is the first in a series of blog posts about statistical concepts that NCES uses as a part of its work. 

The National Center for Education Statistics (NCES) collects survey statistics in two main ways—universe surveys and sample surveys.

Some NCES statistics, such as the number of students enrolled in public schools or postsecondary institutions, come from administrative data collections. These data represent a nearly exact count of a population because information is collected from all potential respondents (e.g., all public schools in the U.S.). These types of data collections are also known as universe surveys because they involve the collection of data covering all known units in a population. The Common Core of Data (CCD), the Private School Universe Survey (PSS), and the Integrated Postsecondary Education Data System (IPEDS) are the key universe surveys collected by NCES.

While universe surveys provide a wealth of important data on education, data collections of this magnitude are not realistic for every potential variable or outcome of interest to education stakeholders. That is why, in some cases, we use sample surveys, which select smaller subgroups that are representative of a broader population of interest. Using sample surveys can reduce the time and expense that would be associated with collecting data from all members of a particular population of interest. 


Example of selecting a sample from a population of interest

The example above shows a simplified version of how a representative sample could be drawn from a population. The population shown here has 60 people, with 2/3 males and 1/3 females. The smaller sample of 6 individuals is drawn from this larger population, but remains representative with 2/3 males and 1/3 females included in the sample.
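The idea in the example above can be sketched in code. The snippet below is purely illustrative (it is not an NCES procedure): it draws a proportional stratified sample from the 60-person population described above, allocating sample slots to each group in proportion to its population share.

```python
# Illustrative sketch (not NCES code): drawing a sample that preserves
# population proportions, as in the 60-person example above.
import random

random.seed(42)

# Population of 60: 40 males (2/3) and 20 females (1/3).
population = [{"id": i, "sex": "M" if i < 40 else "F"} for i in range(60)]

def proportional_sample(pop, key, n):
    """Sample n units, giving each stratum a share of the sample equal
    to its share of the population."""
    strata = {}
    for unit in pop:
        strata.setdefault(unit[key], []).append(unit)
    sample = []
    for units in strata.values():
        k = round(n * len(units) / len(pop))
        sample.extend(random.sample(units, k))
    return sample

sample = proportional_sample(population, "sex", 6)

# A sample of 6 keeps the 2/3 male, 1/3 female split: 4 males, 2 females.
counts = {"M": 0, "F": 0}
for unit in sample:
    counts[unit["sex"]] += 1
print(counts)  # {'M': 4, 'F': 2}
```

Real NCES sample designs are far more elaborate (multi-stage selection, clustering, oversampling), but the proportional allocation shown here is the core intuition.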


For instance, the National Postsecondary Student Aid Study (NPSAS), Baccalaureate and Beyond (B&B), and the Beginning Postsecondary Study (BPS) select institutions from the entire universe of institutions contained in the Integrated Postsecondary Education Data System (IPEDS) database. Then, some students within those schools are selected for inclusion in the study.

Schools and students are selected so that they are representative of the entire population of postsecondary institutions and students. Some types of institutions or schools can be sampled at higher rates than their representation in the population to ensure additional precision for survey estimates of that population. Through scientific design of the sample of institutions and appropriate weighting of the sample respondents, data from these surveys are nationally representative without requiring that all schools or all students be included in the data collection.
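The weighting idea mentioned above can be illustrated with a toy calculation. The numbers below are invented for illustration, but the mechanics are standard: each respondent's base weight is the inverse of its selection rate, so groups sampled at higher rates do not distort population-level estimates.

```python
# Hypothetical sketch of design weights. Stratum sizes and sampling rates
# here are invented; the point is that weighting by the inverse of the
# selection rate lets an oversampled group "count" correctly.

strata = {
    #            population size N, sample size n
    "public":  {"N": 150, "n": 15},   # sampled at 10%
    "private": {"N": 50,  "n": 25},   # oversampled at 50% for precision
}

# Base weight = N / n: each sampled school "stands in" for weight schools.
weights = {name: s["N"] / s["n"] for name, s in strata.items()}

# Weighted totals recover the population count despite unequal rates.
estimated_total = sum(s["n"] * weights[name] for name, s in strata.items())
print(weights)          # {'public': 10.0, 'private': 2.0}
print(estimated_total)  # 200.0
```

Operational NCES weights also include adjustments for nonresponse and calibration to known totals, but the inverse-probability base weight is the starting point.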

Many of the NCES surveys are sample surveys. For example, NCES longitudinal surveys include nationally representative data for cohorts of students at the elementary (Early Childhood Longitudinal Study), middle (Middle Grades Longitudinal Study), high school (High School Longitudinal Study), and college (Beginning Postsecondary Students) levels. The National Household Education Surveys Program gathers information on parental involvement in education, early childhood programs, and other topics using household residences rather than schools as the population. The National Postsecondary Student Aid Study gathers descriptive information on college students and their participation in student aid programs. Additionally, characteristics of teachers and principals and the schools in which they teach are obtained through the Schools and Staffing Survey and the National Teacher and Principal Survey.

By taking samples of the population of interest, NCES is able to study trends on a national level without needing to collect data from every student or every school. However, the structure and the size of the sample can affect the accuracy of the results for some population groups. This means that statistical testing is necessary to make inferences about differences between groups in the population. Stay tuned for future blogs about how this testing is done, and how NCES provides the data necessary for researchers or the public to do testing of their own.
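To see why statistical testing matters with sample data, consider the standard two-proportion z-test, a common way researchers check whether an observed gap between two sampled groups could plausibly be chance. The rates and sample sizes below are invented; this is not an NCES procedure, just a textbook illustration.

```python
# Illustrative sketch (invented numbers): a two-proportion z-test, the
# kind of check needed before concluding that two sampled groups differ.
import math

def two_proportion_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions,
    using the pooled estimate under the null hypothesis of no difference."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# E.g., 62% vs. 58% estimated rates from two samples of 1,000 students each.
z = two_proportion_z(0.62, 1000, 0.58, 1000)
significant = abs(z) > 1.96  # 5% significance level, two-sided
print(round(z, 2), significant)  # 1.83 False
```

Here a 4-point gap between samples of this size is not statistically significant at the 5 percent level: the difference is within what sampling error alone could produce.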

Should ESSA Evidence Definitions and What Works Study Ratings be the Same? No, and Here's Why!

By Joy Lesnick, Acting Commissioner, NCEE

The Every Student Succeeds Act (ESSA), the new federal education law, requires education leaders to take research evidence into account when choosing interventions or approaches. ESSA  defines three “tiers” of evidence—strong, moderate, and promising—based on the type and quality of study that was done and its findings.  

Are the ESSA definitions the same as those of Institute of Education Sciences’ What Works Clearinghouse (WWC)?  Not exactly.  ESSA definitions and WWC standards are more like cousins than twins.

Like ESSA, the WWC has three ratings for individual studies – meets standards without reservations, meets standards with reservations, and does not meet standards. The WWC uses a second set of terms to summarize the results of all studies conducted on a particular intervention. The distinction between one study and many studies is important, as I will explain below.

You may be wondering: now that ESSA is the law of the land, should the WWC revise its standards and ratings to reflect the tiers and terminology described in ESSA?  Wouldn’t the benefit of making things nice and tidy between the two sets of definitions outweigh any drawbacks?

The short answer is no.

The most basic reason is that the WWC’s standards come from a decision-making process that is based in science and vetted through scholarly peer review, all protected by the independent, non-partisan status of the Institute of Education Sciences (IES). This fact is central to the credibility of the WWC’s work.  We like to think of the WWC standards as an anchor representing the best knowledge in the field for determining whether a study has been designed and executed well, and how much confidence we should have in its findings.

WWC Standards Reflect the Most Current Scientific Knowledge – and are Always Evolving

WWC standards were developed by a national panel of research experts. After nearly two years of meetings, these experts came to a consensus about what a research study must demonstrate to give us confidence that an intervention caused the observed changes in student outcomes.

Since the first WWC standards were developed over a decade ago, there have been many methodological and conceptual advances in education research. The good news is that the WWC is designed to keep up with these changes in science. As science has evolved, the WWC standards have evolved, too.

One example is the WWC’s standards for reviewing regression discontinuity (RD) design studies.  The first version of RD standards was developed by a panel of experts in 2012.  Since then, the science about RD studies has made so much progress that the WWC recently convened another panel of experts to update the RD standards. The new RD standards are now on the WWC website to solicit scholarly comment.  

When it Comes to Evidence, More is Better

The evidence tiers in ESSA set a minimum bar, based on one study, to encourage states, districts, and schools to incorporate evidence in their decision making. This is a very important step in the right direction.  But a one-study minimum bar is not as comprehensive as the WWC’s approach.

In science, the collective body of knowledge on a topic is always better than the result of a single study or observation. This is why the primary function of the WWC is to conduct systematic reviews of all of the studies on a program, policy, practice, or approach, the results of which are published in Intervention Reports.

The results of individual studies are important clues toward learning what works. But multiple studies, in different contexts, with different groups of teachers and students, in different states, and with different real-world implementation challenges tell us much more about how well a program, policy, practice or approach works. And that, really, is what we’re trying to find out.

An Improved WWC Search Tool and Ongoing Support for States and Districts

One area where the WWC will make changes is in how users find studies with the characteristics described in ESSA’s evidence tiers. For the past 16 months, the WWC team has been hard at work behind the scenes to develop, code, and user-test a dramatically improved Find What Works tool. We expect to release this tool, along with other changes to the WWC website, in fall 2016. (More on that in another post.)

These changes should further increase the utility of the WWC website, which already gets more than 300,000 hits each month and offers products that are downloaded hundreds of thousands of times each year.

We know that providing information on a website about evidence from rigorous research is just a first step.  States and districts may need additional, customized support to incorporate evidence into their decision-making processes in ways that are much deeper than a cursory check-box approach.

To meet that need, other IES programs are ready to help. For example, IES supports 10 Regional Educational Laboratories (RELs) that provide states and districts with technical support for using, interpreting, and applying research. At least two researchers at every REL are certified as WWC reviewers (meaning they have in-depth knowledge of the WWC standards and how the standards are applied), and every REL has existing relationships with states and districts across the nation and outlying regions. Because the RELs are charged with meeting the needs of their regions, every chief state school officer (or designee) sits on a REL Governing Board, which determines the annual priorities of the REL in that area.

As states prioritize their needs and identify ways to incorporate evidence in their decisions according to the new law, the WWC database of reviewed studies will provide the information they need, and the RELs will be ready to help them use that information in meaningful ways.


Building a Better RFA

The Institute of Education Sciences (IES) is committed to continuous improvement and that includes the process by which people apply for and access grants.

Since its authorization in 2002, IES’ research centers—the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER)—have  been making efforts to improve the Requests for Applications (RFAs) we put out each year. In this spirit, we have conducted surveys of applicants the past few years and used that feedback to improve the current RFAs.

In Fiscal Years (FY) 2014 and 2015, all Principal Investigators (PIs) who submitted an application to the Education Research Grants Program RFA (CFDA #84.305A) or the Partnerships and Collaborations Focused on Problems of Practice or Policy Program RFA (CFDA #84.305H) were contacted via e-mail and asked to participate in a web-based survey. In FY 2015, applicants to the Special Education Research Grants Program (CFDA #84.324A) were also included in the survey request. The response rates were good for all surveys:

Grant Program                              FY 2014   FY 2015
Education Research Grants Program            62%       66%
Partnerships and Collaborations Program      59%       73%
Special Education Research Grants            n/a       55%


Survey respondents generally provided positive feedback in both years. Most respondents indicated they felt the RFAs were clear and helpful, though there were some areas that generated some confusion and criticism.  For example, in FY 2014:

  • Applicants to the Education Research Grants program thought it was inconvenient to have to refer to two separate documents, the RFA and the Application Submission Guide, in order to complete their application.
  • Applicants to the Partnerships and Collaborations program reported some confusion about the distinction between partnership activities and research activities.

In response to the FY 2014 RFA survey results, the Institute made a number of changes. For the FY 2015 Education Research Grants and Special Education Research Grants, changes included combining the RFA and the Application Submission Guide into one document to provide all the necessary information in one place. According to responses from the FY 2015 RFA survey, this change was positively received. The majority of the respondents to the Education Research Grants and Special Education Research Grants surveys (n = 398; 83%) reported that combining the RFA and Application Submission Guide was much better or somewhat better than having two separate documents. Overall, a majority of respondents (n = 161; 56%) felt the FY 2015 RFA was much better or somewhat better than in previous years, while another 43 percent felt it was neither better nor worse.

For the Partnerships and Collaborations RFA, a number of changes were made to the FY 2015 RFA in response to the surveys. For example, the requirements for the research activities were disentangled from the requirements for the partnership in order to reduce redundancy within the application. Most respondents to the FY 2015 RFA survey (n = 53; 73%) felt this change made the RFA much better or somewhat better.

Respondents to the FY 2015 RFA survey also had some criticisms, and the Institute addressed those concerns in the FY 2016 RFAs. Specifically, in the Education Research Grants and Special Education Research Grants RFAs, more detail was added to the requirements for the dissemination plan and for the cost analysis plan.  For the Education Research Grants RFA, the language around research gaps was expanded to clarify that these are not priorities. Changes made in the Special Education Research Grant RFA in response to the feedback from the survey included streamlining application requirements related to student disability, age range or grade level, outcomes, and settings across its 11 research topics.  More details were added about the partnership tracking strategy (an area of confusion for many applicants) in the Partnerships and Collaborations FY 2016 RFA.

IES continues to strive toward improving RFAs and welcomes comments and suggestions for improvement. More information on the RFA results is available here: https://ies.ed.gov/ncer/projects/.

New FY 2017 RFAs are being posted on the IES Funding Opportunities page. If you have comments, please write to us at IESresearch@ed.gov.

By Christina Chhin (NCER), Rebecca McGill-Wilkinson (NCER), Phill Gagné (NCER) and Kristen Rhoads (NCSER)*

* Since this blog post was written, Dr. Rhoads has taken a position with the U.S. Department of Education's Office of Special Education and Rehabilitative Services. 

Improving Transitions: How NCSER-supported Work is Helping Prepare Students for Success

Talk of “transition” on Capitol Hill frequently focuses on political issues, such as the transition from one administration to the next. But on March 4, the conversation was about a very different type of transition—promoting positive outcomes for students with disabilities after high school.

For students with disabilities, post-high school goals are often similar to those of their non-disabled peers, but preparing these students for success requires planning, support, and targeted interventions.

Over the past several years, the National Center for Special Education Research (NCSER) in the Institute of Education Sciences (IES) has funded research to innovate and develop as well as rigorously assess interventions that help students make successful transitions after high school.

This month, a briefing was held on Capitol Hill to share recent research on transition conducted by experts in the field. These experts have all received NCSER funding to help us better understand the transition challenges facing students with disabilities and to develop research-based programs and supports that increase their chances of success.

"Young people with disabilities want the very same things as anyone else. A satisfying job, close relationships, a comfortable and safe place to live, a college degree, involvement in their community, friends they can count on, a chance to give something back, and an opportunity to be part of caring communities."

– Dr. Erik Carter, Vanderbilt University

Mary Wagner, of SRI International, began the briefing by talking about the National Longitudinal Transition Study-2 (known as the NLTS2), the most recent longitudinal study of the experiences of youth with disabilities as they transitioned from secondary school into postsecondary life over a 10-year period. Dr. Wagner presented findings showing that there has been progress in preparing students for, and engaging them in, postsecondary education. Additional academic courses, a paid job, and participation in transition planning and goal setting in high school were associated with increases in postsecondary education enrollment for these students after high school. However, the improvements have been uneven for some groups of students with disabilities, and many challenges remain. For example, rates of employment have not increased over time.

David W. Test, of the University of North Carolina at Charlotte, presented information about the innovative program “Communicating Interagency Relationships and Collaborative Linkages for Exceptional Students,” or CIRCLES. This program involves three levels of interagency collaboration to promote positive outcomes for students with disabilities in secondary schools. The program connects students to more information and resources and provides mentoring support and partnerships. Ongoing research indicates that CIRCLES is having a positive impact on student outcomes compared with students who receive the school services typically provided to support transition. In addition, participating students overwhelmingly agreed that they were “prepared for life after school,” and their parents strongly agreed that they had “a better understanding of their child’s needs” and reported playing an active role in transition preparation.

The final two speakers discussed programs aimed at helping with transitions for students who face some of the greatest challenges.

Laurie E. Powers, of Portland State University, presented a research-based intervention program, “My Life,” for youth in foster care who also have disabilities. The program combines youth-directed coaching, workshops, partnerships, and mentoring to help students identify goals, and it provides the information and guidance they need to succeed and to understand that their goals are achievable. Youth in foster care face extreme challenges in general: higher levels of unemployment, poverty, homelessness, abuse, and mental health issues, and incarceration rates 10 times higher than those of the general population. In addition, about 6 in 10 receive special education services, and many also have developmental disabilities.

Research results have been positive. Students in the My Life program were found to be better prepared for postsecondary education and careers, and more were graduating from high school and fewer were homeless. After one year, postsecondary employment rates were up and rates of incarceration were down compared to the students who received services as usual.

Lastly, Erik Carter, of Vanderbilt University, presented research on improving workplace transitions for high school youth with intellectual disabilities (ID) through a summer job support program. Although a disability does not predict aspirations, it does often predict post-high school experiences. Based on an analysis of NLTS2 data, most youth with ID have a goal of employment, but only about 15 percent of all adults with ID are employed. One factor that positively predicted outcomes for these students was the high expectations of those teaching them.

Project Summer embodies high expectations for these students and involves individual summer-focused transition planning, identification of community resources, and opportunities for youth to connect to community support and employment opportunities. Research indicates that the youth involved in Project Summer were much more likely to obtain employment or volunteer experiences in their community (66%) than their peers (19%) and all were paid above the minimum wage. This research also demonstrated that schools and communities have the capacity to support and promote the employment of youth with severe disabilities.

The briefing was sponsored by Senator Lamar Alexander, of Tennessee, Representative Suzanne Bonamici, of Oregon, and Representative Michael Honda, of California, and was arranged by the Friends of IES, a group that advocates for education research. Certainly, there is much more work to be done to help students with disabilities successfully transition from high school and achieve their goals. But this month’s briefing demonstrated that progress is being made.

By Kimberley Sprague, Senior Research Scientist/Education Analyst, NCSER, and Dana Tofig, Communications Director, IES


New Release: Forum Guide to Elementary/Secondary Virtual Education Data

By The National Forum on Education Statistics Virtual Education Working Group

Rapid advancements and innovations in virtual education are providing education agencies, educators, and students with new opportunities for teaching and learning. That growth increases the need for accurate, high-quality data about virtual education that provides a full picture of successes and challenges. A new resource released earlier this month can help with this important work.

In recent years, virtual education has become an integral part of K-12 education and nearly every student is exposed to virtual learning in some context—whether as a single aspect of a traditional course or program, in an entirely virtual program, or in any combination of traditional and virtual learning.

Virtual education is often a core aspect of curricula and class instruction, and students and teachers are increasingly adept at integrating lectures, lessons, and group work delivered via computers, tablets, and other devices into day-to-day teaching and learning. Moreover, many students and teachers no longer distinguish between virtual and traditional learning—the technology and tools used in virtual education are familiar to them and are no more novel than a pencil.

Despite widespread interest in enhancing and expanding virtual teaching and learning, many state education agencies and school districts do not yet have the ability to collect accurate, high-quality virtual education data. Some organizations have not yet specified the data they want to collect, while others have not developed reliable processes for gathering and managing data. The prevalence of virtual education, the increasing diversity in virtual education opportunities, and the rapid pace of technological change require new ways of thinking about how to modify data elements and systems to effectively identify, collect, and use virtual education data to inform and improve education.

Local and state members of the National Forum on Education Statistics (the Forum) identified this problem and established a Virtual Education Working Group, tasked with developing a resource to assist education agencies as they: 1) consider the impact of virtual education on established data elements and methods of data collection, and 2) address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education. On February 4th, 2016, the Forum Guide to Elementary/Secondary Virtual Education Data was released.

In the document, the Forum Working Group members discuss topics that support virtual education data, such as the organizational structure of virtual education, user experiences, challenges in collecting virtual education data, policy implications, and privacy and confidentiality protections. The document also includes common data elements for K-12 virtual and blended learning data.

The working group also identifies elements that exist for traditional schools that are useful for virtual education. Finally, the Guide provides real-world examples and common practices implemented by state departments, local districts, and schools to modify their data systems and add elements that better reflect the needs unique to virtual education. 

As virtual education continues to expand in elementary/secondary school systems, education data collection and reporting systems need to evolve as well. It is important for all virtual education stakeholders – teachers, parents, education administrators, data systems administrators, and policymakers – to come together and creatively address the challenges of building a sound data infrastructure that considers the unique aspects of virtual education.

It is our hope that the Forum’s new guide can be a helpful tool in that process.


About the National Forum on Education Statistics

The work of the National Forum on Education Statistics is a key aspect of the National Cooperative Education Statistics System. The Cooperative System was established to produce and maintain, with the cooperation of the states, comparable and uniform education information and data that are useful for policymaking at the federal, state, and local levels. To assist in meeting this goal, the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, established the Forum to improve the collection, reporting, and use of elementary and secondary education statistics. The Forum addresses issues in education data policy, sponsors innovations in data collection and reporting, and provides technical assistance to improve state and local data systems.

Members of the Forum establish working groups to develop best practice guides in data-related areas of interest to federal, state, and local education agencies. They are assisted in this work by NCES, but the content comes from the collective experience of working group members who review all products iteratively throughout the development process. After the working group completes the content and reviews a document a final time, publications are subject to examination by members of the Forum standing committee that sponsors the project. Finally, Forum members (approximately 120 people) review and formally vote to approve all documents prior to publication. NCES provides final review and approval prior to online publication.

The information and opinions published in Forum products do not necessarily represent the policies or views of the U.S. Department of Education, IES, or NCES. For more information about the Forum, please visit http://www.nces.ed.gov/forum or contact Ghedam Bairu at Ghedam.bairu@ed.gov.