IES Blog

Institute of Education Sciences

Introducing a New Resource Page for the IPEDS Outcome Measures (OM) Survey Component

The National Center for Education Statistics (NCES) has introduced a new resource page for the Integrated Postsecondary Education Data System (IPEDS) Outcome Measures (OM) survey component. This blog post provides an overview of the webpage and is the first in a series of blog posts that will showcase OM data.

Measuring Student Success in IPEDS: Graduation Rates (GR), Graduation Rates 200% (GR200), and Outcome Measures (OM) is a new resource page designed to help data reporters and users better understand the value of OM data and how the OM survey component works, particularly when compared with the Graduation Rates (GR) and Graduation Rates 200% (GR200) survey components.

The OM survey component was added to IPEDS in 2015–16 to capture postsecondary outcomes for more than just so-called “traditional” college students. From 1997–98 to 2015–16, IPEDS graduation rate data were collected only for first-time, full-time (FTFT) degree/certificate-seeking (DGCS) undergraduates through the GR and GR200 survey components. Unlike those components, OM collects student outcomes for all entering DGCS undergraduates, including non-first-time students (i.e., transfer-in students) and part-time students.

Outcome measures are useful because the characteristics of students vary by level of institution. In 2009, some 4.7 million students began at 2-year postsecondary institutions, and 25 percent were full-time students attending college for the first time. In the same year, some 4.5 million students began at 4-year institutions, and 44 percent were first-time, full-time students.1

The new resource page answers several important questions about OM, GR, and GR200, including the following:

  • Which institutions complete each survey component?
  • Does the survey form vary by institutional type?
  • What student success measures are included?
  • Which students are included in the cohort?
  • What is the timeframe for establishing student cohorts?
  • Which subgroups (disaggregates) are included?
  • What is the timing of data collection and release?

In answering these questions, the resource page highlights that OM provides a more comprehensive view of student success than do GR and GR200. Furthermore, it suggests that OM, GR, and GR200 are not directly comparable, as the survey components differ in terms of which institutions complete them, which students are captured, and how each measures cohorts. Here are some of the key differences:

  • Institutions with FTFT cohorts complete the GR and GR200 components, whereas degree-granting institutions complete the OM component.
  • GR and GR200 include only FTFT DGCS undergraduates, whereas OM includes all DGCS undergraduates.
  • GR and GR200 cohorts are based on a fall term for academic reporters and a full year (September 1–August 31) for program reporters, whereas OM cohorts are based on a full year (July 1–June 30) for all degree-granting institutions.

Finally, the resource page outlines how OM works, including how cohorts and subcohorts are established, which outcomes are collected at various status points, and when the public has access to submitted data. Exhibit 1 presents the current 2021–22 data collection timeline, including the cohort year, outcome status points, data collection period, and public release of OM data.


Exhibit 1. 2021–22 Outcome Measures (OM) data collection timeline (2013–14 entering degree/certificate-seeking cohort)

Infographic showing the 2021–22 OM data collection timeline, including the cohort year, outcome status points, data collection period, and public release of OM data


Data reporters and users are encouraged to utilize the new OM survey component resource page to better understand the scope of OM, how it works, and how it differs from GR and GR200. Stay tuned for a follow-up blog post featuring data from OM that further highlights the survey component’s usefulness in measuring student success for all DGCS undergraduate students.

 

By Tara Lawley, NCES; Roman Ruiz, AIR; Aida Ali Akreyi, AIR; and McCall Pitcher, AIR


[1] U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS), Winter 2017–18, Outcome Measures component; and IPEDS Fall 2009, Institutional Characteristics component. See Digest of Education Statistics 2018, table 326.27.

NCES’s Top Hits of 2021

As 2021—another unprecedented year—comes to a close and you reflect on your year, be sure to check out NCES’s annual list of top web hits. From reports and Condition of Education indicators to Fast Facts, APIs, blog posts, and tweets, NCES releases an array of content to help you stay informed about the latest findings and trends in education. Don’t forget to follow us on Twitter, Facebook, and LinkedIn to stay up-to-date in 2022!
 

Top five reports, by number of PDF downloads

1. Condition of Education 2020 (8,376)

2. Digest of Education Statistics 2019 (4,427)

3. Status and Trends in the Education of Racial and Ethnic Groups 2018 (3,282)

4. Indicators of School Crime and Safety: 2019 (2,906)

5. Trends in High School Dropout and Completion Rates in the United States: 2019 (2,590)

 

Top five indicators from the Condition of Education, by number of web sessions

1. Students With Disabilities (100,074)

2. Racial/Ethnic Enrollment in Public Schools (64,556)

3. Characteristics of Public School Teachers (57,188)

4. Public High School Graduation Rates (54,504)

5. Education Expenditures by Country (50,20)

 

Top five Fast Facts, by number of web sessions

1. Back-to-School Statistics (162,126)

2. Tuition Costs of Colleges and Universities (128,236)

3. Dropout Rates (74,399)

4. Graduation Rates (73,855)

5. Degrees Conferred by Race and Sex (63,178)

 

Top five NCES/EDGE API requested categories of social and spatial context GIS data, by number of requests

1. K–12 Schools (including district offices) (4,822,590)

2. School Districts (1,616,374)

3. Social/Economic (882,984)

4. Locales (442,715)

5. Postsecondary (263,047)

 

Top five blog posts, by number of web sessions

1. Understanding School Lunch Eligibility in the Common Core of Data (8,242)

2. New Report Shows Increased Diversity in U.S. Schools, Disparities in Outcomes (3,463)

3. Free or Reduced Price Lunch: A Proxy for Poverty? (3,457)

4. Back to School by the Numbers: 2019–20 School Year (2,694)

5. Educational Attainment Differences by Students’ Socioeconomic Status (2,587)

 

Top five tweets, by number of impressions

1. CCD blog (22,557)

2. NAEP dashboard (21,551)

3. IPEDS data tools (21,323)

4. ACGR web table (19,638)

5. Kids’ Zone (19,390)

 

By Megan Barnett, AIR

The “Where” of Going to College: Residence, Migration, and Fall Enrollment

Newly released provisional data from the Integrated Postsecondary Education Data System’s (IPEDS) Fall Enrollment (EF) survey provide an updated look at whether beginning college students are attending school in their home state or heading elsewhere.

In fall 2018, the number of first-time degree/certificate-seeking students enrolled at Title IV postsecondary institutions (beginning college students) varied widely across states, ranging from 3,700 in Alaska to 400,300 in California (figure 1). College enrollment is strongly correlated with the number of postsecondary institutions in each state, as more populous and geographically larger states have the institutional capacity to enroll more students. Most states (32 out of 50) and the District of Columbia enrolled fewer than 50,000 beginning college students in fall 2018, and only six states (California, Texas, New York, Florida, Pennsylvania, and Ohio) enrolled more than 100,000 beginning college students.


Figure 1. Number of first-time degree/certificate-seeking undergraduate students enrolled at Title IV institutions, by state or jurisdiction: Fall 2018

SOURCE: U.S. Department of Education, National Center for Education Statistics, IPEDS, Spring 2019, Fall Enrollment component (provisional data).


As a result of students migrating outside their home states to attend college, some postsecondary institutions enroll students who are not residents of the state or jurisdiction in which the institution is located. Among beginning college students in fall 2018, the share who were residents of the same state varied widely, from 31 percent in New Hampshire to 93 percent in Texas and Alaska (figure 2). In a majority of states (27 out of 50), residents comprised at least 75 percent of total beginning college student enrollment. Only three states (Rhode Island, Vermont, and New Hampshire) and the District of Columbia enrolled more nonresidents than residents among their fall 2018 beginning college students.


Figure 2. Percent of first-time degree/certificate-seeking undergraduate students enrolled at Title IV institutions in the state or jurisdiction who are residents of the same state or jurisdiction: Fall 2018

SOURCE: U.S. Department of Education, National Center for Education Statistics, IPEDS, Spring 2019, Fall Enrollment component (provisional data).


States experience varying levels of out-migration (i.e., residents leaving the state to attend college) and in-migration (i.e., nonresidents coming into the state to attend college). For example, in fall 2018, California experienced the largest number of residents out-migrating to attend college in a different state (44,800) but gained 37,800 nonresidents in-migrating to attend college in the state, for an overall negative net migration of beginning college students (figure 3). In contrast, New York also experienced a large number of residents out-migrating for college (33,800) but gained 43,300 nonresidents, for an overall positive net migration of beginning college students.
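Net migration is simple arithmetic: in-migration minus out-migration. A quick sketch using the rounded figures above:

```python
# Net migration = nonresidents migrating in minus residents migrating out.
# Counts are the rounded fall 2018 figures cited above.
migration = {
    "California": {"in": 37_800, "out": 44_800},
    "New York": {"in": 43_300, "out": 33_800},
}

for state, counts in migration.items():
    net = counts["in"] - counts["out"]
    sign = "positive" if net > 0 else "negative"
    print(f"{state}: net migration {net:+,} ({sign})")
```

California nets out to −7,000 beginning college students, while New York gains a net +9,500.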


Figure 3. Number of first-time degree/certificate-seeking undergraduate students at Title IV institutions who migrate into and out of the state or jurisdiction: Fall 2018

NOTE: The migration of students refers to students whose permanent address at the time of application to the institution is located in a different state or jurisdiction than the institution. Migration does not indicate that a permanent change of address has occurred. Migration into the state or jurisdiction may include students who are nonresident aliens, who are from other U.S. jurisdictions, or who reside outside the state or jurisdiction and are enrolled exclusively in online or distance education programs. Migration into the state or jurisdiction does not include individuals whose state or jurisdiction of residence is unknown.

SOURCE: U.S. Department of Education, National Center for Education Statistics, IPEDS, Spring 2019, Fall Enrollment component (provisional data).


Approximately three-quarters of states (37 out of 50) and the District of Columbia had a positive net migration of beginning college students in fall 2018 (figure 4). The remaining one-quarter of states (13 out of 50) had more residents out-migrate for college than nonresidents in-migrate for college, resulting in a negative net migration of beginning college students. Net migration varied widely by state, with New Jersey experiencing the largest negative net migration (28,500 students) and Utah experiencing the largest positive net migration (14,400 students).


Figure 4. Net migration of first-time degree/certificate-seeking undergraduate students at Title IV institutions, by state or jurisdiction: Fall 2018

NOTE: Net migration is the difference between the number of students entering the state or jurisdiction to attend school (into) and the number of students (residents) who leave the state or jurisdiction to attend school elsewhere (out of). A positive net migration indicates more students coming into the state or jurisdiction than leaving to attend school elsewhere.

SOURCE: U.S. Department of Education, National Center for Education Statistics, IPEDS, Spring 2019, Fall Enrollment component (provisional data).


The newly released IPEDS Fall Enrollment data provide tremendous insights into the geographic mobility of beginning college students. Additional analyses on residence and migration can be conducted using the full IPEDS data files. For example, the data can identify to which states and types of institutions beginning college students out-migrate and, conversely, from which states postsecondary institutions recruit their incoming classes.

 

By Roman Ruiz, AIR

From Data Collection to Data Release: What Happens?

In today’s world, much scientific data is collected automatically from sensors and processed by computers in real time to produce instant analytic results. People have grown accustomed to instant data and expect to get information quickly.

At the National Center for Education Statistics (NCES), we are frequently asked why, in a world of instant data, it takes so long to produce and publish survey data. Although federal data releases have become timelier, there are fundamental differences between data compiled by automated systems and specific data requested from federal survey respondents. Federal statistical surveys are designed to capture policy-related and research data from a range of targeted respondents across the country, who may not always be willing participants.

This blog post provides a brief overview of the survey data processing framework; it is worth noting that the survey design phase is itself a highly complex and technical process. In contrast to a management information system, in which an organization has complete control over data production processes, federal education surveys are designed to represent the entire country and require coordination with other federal, state, and local agencies. After the necessary coordination activities have concluded and the survey response periods have ended, much work remains before the data can be released.

Survey Response

One of the first sources of potential delay is that some jurisdictions or individuals are unable to complete their surveys on time. Unlike opinion polls and online quizzes, which rely on whoever chooses to respond (convenience samples), NCES surveys use rigorously constructed samples meant to properly represent specific populations, such as states or the nation as a whole. To preserve that representation, NCES follows up with nonresponding sampled individuals, education institutions, school districts, and states to secure the maximum possible participation. Some large jurisdictions also have their own extensive survey operations to conclude before they can provide information to NCES. For example, the New York City school district, which is larger than about two-thirds of all state education systems, must first gather information from all its schools before it can respond. Receipt of data from New York City and other large districts is essential to compiling nationally representative data.

Editing and Quality Reviews

Waiting for final survey responses does not mean that survey processing comes to a halt. One of the most important roles NCES plays in survey operations is editing and conducting quality reviews of incoming data, which take place on an ongoing basis. In these quality reviews, a variety of strategies are used to make cost-effective and time-sensitive edits to the incoming data. For example, in the Integrated Postsecondary Education Data System (IPEDS), individual higher education institutions upload their survey responses and receive real-time feedback on responses that are out of range compared to prior submissions or instances where survey responses do not align in a logical way. All NCES surveys use similar logic checks in addition to a range of other editing checks that are appropriate to the specific survey. These checks typically look for responses that are out of range for a certain type of respondent.
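A logic check of this kind can be sketched in a few lines; the field names and the 20 percent tolerance below are illustrative assumptions, not the actual IPEDS edit rules:

```python
def flag_out_of_range(current: dict, prior: dict, tolerance: float = 0.20) -> list:
    """Flag fields whose new value differs from the prior submission by
    more than the given relative tolerance. Illustrative sketch only --
    not the actual IPEDS edit logic."""
    flags = []
    for field, prior_value in prior.items():
        new_value = current.get(field)
        if new_value is None or prior_value == 0:
            continue
        change = abs(new_value - prior_value) / prior_value
        if change > tolerance:
            flags.append((field, prior_value, new_value, round(change, 2)))
    return flags

# A 50 percent jump in fall enrollment is flagged for review,
# while a 5 percent change in completions passes.
prior = {"fall_enrollment": 10_000, "completions": 2_000}
current = {"fall_enrollment": 15_000, "completions": 2_100}
flags = flag_out_of_range(current, prior)
print(flags)
```

In the real system, a flag like this would prompt the institution to confirm or correct its entry before the submission locks.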

Although most checks are automated, some particularly complicated or large responses may require individual review. For IPEDS, the real-time feedback described above is followed by quality review checks conducted after the full dataset has been collected. This can result in individualized follow-up and review with institutions whose data still raise substantive questions.

Sample Weighting

In order to lessen the burden on the public and reduce costs, NCES collects data from selected samples of the population rather than taking a full census of the entire population for every study. In all sample surveys, a range of additional analytic tasks must be completed before data can be released. One of the more complicated tasks is constructing weights based on the original sample design and survey responses so that the collected data can properly represent the nation and/or states, depending on the survey. These sample weights are designed so that analyses can be conducted across a range of demographic or geographic characteristics and properly reflect the experiences of individuals with those characteristics in the population.
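As a simplified sketch of how base weights work (real NCES weights also fold in nonresponse and calibration adjustments, so this is only an illustration): each respondent's weight is the inverse of their probability of selection, so someone sampled at a rate of 1 in 100 stands in for 100 people in the population.

```python
# Simplified design-based weighting: base weight = 1 / selection probability.
# Outcomes and sampling rates below are invented for illustration.
respondents = [
    # (outcome value, probability of selection)
    (1, 0.01),  # sampled at 1-in-100, so base weight 100
    (0, 0.01),
    (1, 0.10),  # sampled at 1-in-10, so base weight 10
    (1, 0.10),
]

weights = [1 / p for _, p in respondents]
weighted_total = sum(w * y for (y, _), w in zip(respondents, weights))
population_size = sum(weights)
print(f"Estimated population proportion: {weighted_total / population_size:.2f}")
```

Note that the unweighted mean of the outcomes is 0.75, but the weighted estimate is about 0.55: the two 1-in-100 respondents represent far more of the population than the two 1-in-10 respondents, which is exactly the distortion weighting corrects.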

If the survey response rate is too low, a nonresponse bias analysis must be completed to ensure that the results will be sufficiently reliable for public use. For longitudinal surveys, such as the Early Childhood Longitudinal Study, multiple sets of weights must be constructed so that researchers can appropriately account for respondents who answered some but not all of the survey waves.

NCES surveys also include “constructed variables,” such as socioeconomic status or family type, to facilitate more convenient and systematic use of the data. Other types of survey data require special analytic considerations before they can be released. Student assessment data, such as those from the National Assessment of Educational Progress (NAEP), require a number of highly complex processes to ensure proper estimation for the various populations represented in the results. For example, the standardized scoring of multiple-choice and open-ended items alone can take thousands of hours of design and analysis work.

Privacy Protection

Release of data by NCES carries a legal requirement to protect the privacy of our nation’s children. Each NCES public-use dataset undergoes a thorough evaluation to ensure that it cannot be used to identify responses of individuals, whether they are students, parents, teachers, or principals. The datasets must be protected through item suppression, statistical swapping, or other techniques to ensure that multiple datasets cannot be combined in such a way as to identify any individual. This is a time-consuming process, but it is incredibly important to protect the privacy of respondents.
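One common disclosure-avoidance technique, suppression of small cells, can be sketched as follows; the threshold of 10 and the counts are illustrative assumptions, not NCES’s actual rules:

```python
def suppress_small_cells(table: dict, threshold: int = 10) -> dict:
    """Replace counts below the threshold with a suppression symbol so
    small groups cannot be identified. Illustrative sketch only; real
    disclosure avoidance must also handle complementary suppression,
    since a suppressed cell could otherwise be recovered from totals."""
    return {k: (v if v >= threshold else "‡") for k, v in table.items()}

counts = {"Group A": 240, "Group B": 7, "Group C": 58}
print(suppress_small_cells(counts))
```

As the docstring notes, suppressing one cell is rarely enough on its own: if a row total is published, a second cell must often be suppressed as well so the hidden value cannot be reconstructed by subtraction.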

Data and Report Release

When the final data have been received and edited, the necessary variables have been constructed, and the privacy protections have been implemented, there is still more that must be done to release the data. The data must be put in appropriate formats with the necessary documentation for data users. NCES reports with basic analyses or tabulations of the data must be prepared. These products are independently reviewed within the NCES Chief Statistician’s office.

Depending on the nature of the report, the Institute of Education Sciences Standards and Review Office may conduct an additional review. After all internal reviews have been conducted, revisions have been made, and the final survey products have been approved, the U.S. Secretary of Education’s office is notified 2 weeks in advance of the pending release. During this notification period, appropriate press release materials and social media announcements are finalized.

Although NCES can expedite some product releases, preparing survey data for release often takes a year or more. NCES strives to balance timeliness with the reliable, high-quality information expected of a federal statistical agency, while also protecting the privacy of our respondents.

 

By Thomas Snyder

Data Tools for College Professors and Students

Ever wonder what parts of the country produce the most English majors? Want to know which school districts have the most guidance counselors? The National Center for Education Statistics (NCES) has all the tools you need to dig into these and lots of other data!

Whether you’re a student embarking on a research project or a college professor looking for a large data set to use for an assignment, NCES has you covered. Below, check out the tools you can use to conduct searches, download datasets, and generate your own statistical tables and analyses.

 

Conduct Publication Searches

Two search tools help researchers identify potential data sources for their study and explore prior research conducted with NCES data. The Publications & Products Search Tool can be used to search for NCES publications and data products. The Bibliography Search Tool, which is updated continually, allows users to search for individual citations from journal articles that have been published using data from most surveys conducted by NCES.

Key reference publications include the Digest of Education Statistics, which is a comprehensive library of statistical tabulations, and The Condition of Education, which highlights up-to-date trends in education through statistical indicators.

 

Learn with Instructional Modules

The Distance Learning Dataset Training System (DLDT) is an interactive online tool that allows users to learn about NCES data across the education spectrum. DLDT’s computer-based training introduces users to many NCES datasets, explains their designs, and offers technical considerations to facilitate successful analyses. Please see the NCES blog Learning to Use the Data: Online Dataset Training Modules for more details about the DLDT tool.
 

Download and Access Raw Data Files

Users have several options for conducting statistical analyses and producing data tables. Many NCES surveys release public-use raw data files that professors and students can download and analyze using statistical software packages such as SAS, Stata, and SPSS. Some data files and syntax files can also be downloaded using NCES data tools:

  • Education Data Analysis Tool (EDAT) and the Online Codebook allow users to download several survey datasets in various statistical software formats. Users can subset a dataset by selecting a survey, a population, and variables relevant to their analysis.
  • Many data files can be accessed directly from the Surveys & Programs page by clicking on the specific survey and then clicking on the “Data Products” link on the survey website.
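Once downloaded, these files can be read with standard tools. A minimal sketch using Python’s standard library, with an invented two-row sample standing in for a real public-use file (the variable names are illustrative):

```python
import csv
import io

# Invented stand-in for a downloaded public-use file; real files come
# from each survey's Data Products page, and column names will differ.
raw = """unitid,instnm,total_enrollment
100654,Institution A,4523
100663,Institution B,21790
"""

rows = list(csv.DictReader(io.StringIO(raw)))
total_enrollment = sum(int(r["total_enrollment"]) for r in rows)
print(f"{len(rows)} institutions, {total_enrollment:,} students")
```

For a real analysis you would point `csv.DictReader` (or a statistical package) at the downloaded file rather than an in-memory string, then subset to the variables relevant to your research question.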

 

Generate Analyses and Tables

NCES provides several online analysis tools that do not require a statistical software package:

  • DataLab is a tool for making tables and regressions that features more than 30 federal education datasets. It includes three powerful analytic tools:
    • QuickStats—for creating simple tables and charts.
    • PowerStats—for creating complex tables and logistic and linear regressions.
    • TrendStats—for creating complex tables spanning multiple data collection years.

    DataLab also contains the Tables Library, which houses more than 5,000 published analysis tables searchable by topic, publication, and source.
  • National Assessment of Educational Progress (NAEP) Data Explorer can be used to generate tables, charts, and maps of detailed results from national and state assessments. Users can identify the subject area, grade level, and years of interest and then select variables from the student, teacher, and school questionnaires for analysis.
  • International Data Explorer (IDE) is an interactive tool with data from international assessments and surveys, such as the Program for International Student Assessment (PISA), the Program for the International Assessment of Adult Competencies (PIAAC), and the Trends in International Mathematics and Science Study (TIMSS). The IDE can be used to explore student and adult performance on assessments, create a variety of data visualizations, and run statistical tests and regression analyses.
  • Elementary/Secondary Information System (ElSi) allows users to quickly view public and private school data and create custom tables and charts using data from the Common Core of Data (CCD) and Private School Universe Survey (PSS).
  • Integrated Postsecondary Education Data System (IPEDS) Use the Data provides researcher-focused access to IPEDS data and tools that contain comprehensive data on postsecondary institutions. Users can view video tutorials or use data through one of the many functions within the portal, including the following:
    • Data Trends—Provides trends over time for high-interest topics, including enrollment, graduation rates, and financial aid.
    • Look Up an Institution—Allows for quick access to an institution’s comprehensive profile. Shows data similar to College Navigator but contains additional IPEDS metrics.
    • Statistical Tables—Equips power users to quickly get data and statistics for specific measures, such as average graduation rates by state.