IES Blog

Institute of Education Sciences

NCER’s Investments in Education Research Networks to Accelerate Pandemic Recovery: Network Lead Spotlight on Dr. Thomas Brock, ARCC Network

We hope you enjoyed the first NCER network lead spotlight! Today, we would like to introduce Dr. Thomas Brock, director of the Community College Research Center. Dr. Brock’s network, the Accelerating Recovery in Community Colleges (ARCC) Network, aims to provide timely, actionable research that policymakers and practitioners can use to help community colleges recover from the challenges introduced by the COVID-19 pandemic. Happy reading!

NCER: What are the mission and goals of the Accelerating Recovery in Community Colleges Network?

Dr. Brock: The primary goal of the ARCC Network is to provide timely, actionable research that policymakers and practitioners can use to help community colleges recover from the challenges introduced by the COVID-19 pandemic. These include steep drops in enrollment—particularly for students of color and male students—and learning losses associated with illness, stress, and challenges of online learning.

 

NCER: Why is the ARCC Network important to you?

Dr. Brock: The ARCC Network is important because community colleges are important. They enroll about one-third of all undergraduate college students in the U.S., including many who are from low-income backgrounds and the first in their families to attend college. The nation needs strong community colleges to help students advance educationally and economically. The nation also needs community colleges to prepare workers and support the economy in essential fields such as health care, information technology, construction trades, and manufacturing.

NCER: I understand that you had a central role in establishing the research networks grant program at IES. What is your view of a research network, and how does it differ from a traditional education research project?

Dr. Brock: There is an old adage that the whole is greater than the sum of its parts. With the research networks, IES intends to generate a body of work on a critical education problem or issue that is more impactful than an individual research project is likely to generate. This is because members of a research network come together regularly to discuss their ideas, tackle common methodological challenges, share data collection tools, and make sense of their emerging findings. They think about how to distill, align, and communicate research results from the early stages rather than as an afterthought. This benefits policymakers and practitioners, who look to researchers for insights and guidance. It also benefits the research community by building consensus on what has been learned and what new questions need to be addressed.

NCER: How do you think the ARCC Network will impact our nation’s community colleges?

Dr. Brock: Our hope is that the ARCC Network will help policymakers to be attentive to the needs of community colleges and shed light on the populations and places that need the most help. We also hope that the network will help identify promising policies and practices to promote rapid recovery.

NCER: What are some of the biggest challenges to recovery in community colleges?

Dr. Brock: Community colleges are funded largely based on enrollment. To date, the decline in enrollment has not forced major reductions in academic programs or services because of the Higher Education Emergency Relief Fund (HEERF) authorized by Congress. However, HEERF funding ends after the 2022–23 academic year. If enrollments do not rebound quickly, community colleges will have to make significant cuts. This could lead to a downward spiral in which even fewer students enroll or persist because they cannot find the courses or services they need.

Another challenge to recovery is learning loss. We know from the National Assessment of Educational Progress that there have been declines in reading and math achievement in K–12 schools during the pandemic. As these students mature and enter postsecondary education, they may be less well prepared for college-level work. Community colleges have made significant reforms to developmental education programs in recent years but will need to do more to ensure entering students succeed in college-level courses and make progress toward their academic and career goals.

Finally, we know that the pandemic has taken a severe toll on physical and mental health. Community colleges will need to find ways to reduce stress and promote wellness for everyone in their campus community: students, faculty, and staff.

NCER: What are some effective ways to translate education research into practice so that your work will have a direct impact on states and community college systems? What are some barriers to uptake of research outcomes by these organizations?

Dr. Brock: The ARCC Network will actively disseminate the research findings produced by individual research teams and by our national scan of community college enrollments and recovery practices. We will build a website that functions as an information hub for the most recent enrollment trends and reliable evidence on recovery strategies. We will conduct interactive workshops and webinars for state and local community college leaders and staff who are interested in learning from and adapting research-based practices to support pandemic recovery. We will use our connections with national organizations, like the American Association of Community Colleges and Achieving the Dream, and social media to ensure we reach a broad audience.

NCER: Are there some generalizable tools or lessons learned that are likely to come out of this network project that you think will benefit the education research community as a whole?

Dr. Brock: Yes. One area of focus for ARCC researchers, for example, is how to design and deliver effective online learning. Prior to the pandemic, most research on online learning in community colleges indicated it was not as effective as in-person instruction, but many colleges have upped their game with improved technology and better training and support for faculty who teach online. We have also seen from the pandemic that online learning benefits some students who might not otherwise attend community college, including students who live far from campus (especially in rural areas) or who are juggling demands of work and parenting. We hope to reframe the research debate so that it is less about online versus in-person instruction and more about how to provide online instruction most effectively to students who prefer this modality. We expect the lessons and tools from the ARCC Network will be broadly relevant to community colleges and may be adapted to other education sectors.


Thank you for reading our conversation with Dr. Thomas Brock! Come back tomorrow for our final grantee spotlight!  

NCES Celebrates IES and NCES Anniversaries With Retrospective Report on Federal Education Statistics

This year marks the 20th anniversary of the Institute of Education Sciences (IES) and 155 years since the creation of a federal agency to collect and report education statistics for the United States, a role now fulfilled by the National Center for Education Statistics (NCES). To celebrate both of these anniversaries, NCES has just released a new commemorative report—A Retrospective Look at U.S. Education Statistics—that explores the history and use of federal education statistics.



The 11 statistical profiles in phase I of this report can be found within two tabs: Elementary and Secondary Education and Postsecondary Education. Users can toggle between these two tabs and then select a particular statistical profile in the drop-down menu, such as Number of Elementary and Secondary Schools, High School Coursetaking, Enrollment in Postsecondary Institutions, and Postsecondary Student Costs and Financing.


Image of report website showing tabs for Elementary and Secondary Education and Postsecondary Education and the drop-down menu to select individual statistical profiles


Each of the statistical profiles in this report is broken down into the following sections:

  • what the statistic measures (what the data may indicate about a particular topic)
  • what to know about the statistic (the history of the data collection and how it may have changed over time)
  • what the data reveal (broad historical trends/patterns in the data, accompanied by figures)
  • more information (reference tables and related resources)

Each statistical profile can be downloaded as a PDF, and each figure within a profile can be downloaded or shared via a link or on social media.

For background and context, this report also includes a Historical Event Timeline. In this section, readers can learn about major periods of prolonged economic downturn, periods of military action, and periods when U.S. troops were drafted as a part of military action—as well as major pieces of federal legislation—and how some of these events could have disrupted the nation’s social life and schooling or impacted education across the country.

The report also includes a brief overview of NCES, which can be accessed by expanding the dark blue bar labeled NCES Overview: Past, Present, and Future. This section covers the history of NCES and its mission, the evolution of NCES reports and data collections, and current and future changes to NCES’s reporting methods.


Image of report website showing introductory text and the NCES Overview blue bar


This commemorative guide to federal education statistics is not intended to be a comprehensive report on the subject but rather a resource that provides an in-depth look at a selection of statistics. Stay tuned for the release of phase II next year, which will include additional statistical profiles. Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date!

 

By Megan Barnett, AIR

Comparing College-Based to Conventional Transition Approaches for Improving Outcomes for Youth with Disabilities

In honor of National Disability Employment Awareness Month, we discussed NCSER-funded research on transition support for students with disabilities with principal investigators Meg Grigal and Clare Papay. Transition services prepare students for life after school and can include activities such as job training, postsecondary education, and support for independent living and community participation. This research team’s project, Moving Transition Forward: Exploration of College-Based and Conventional Transition Practices for Students with Intellectual Disability and Autism, examines outcomes for two transition approaches: college-based transition services and the conventional approach provided by most local education agencies. In the interview below, the researchers discuss recent results and how this information can improve the quality of transition services for students with disabilities.

What is the purpose of your project? What motivated you to conduct this research?


The bulk of existing transition research reflects knowledge about conventional transition services, or those services received by students with disabilities in high schools. An alternative approach, called college-based transition services, has been around for over 20 years, providing students with intellectual disability and autism a chance to experience college while continuing to receive support through special education. We wanted to explore and compare these two types of transition experiences and assess the outcomes for students. Using two existing datasets, our project conducted a series of interrelated analyses to look more closely at the transition services students with intellectual disability and/or autism (ID/A) are accessing and the association with youth outcomes in employment. Our hope is that our findings will contribute to the knowledge base on research-based college and career preparation for youth with ID/A.

Could you explain the difference between the two transition approaches (college-based and conventional) you are examining and how each prepares students for post-school life?

“Conventional transition services” is our way of describing the transition services typically provided to youth with disabilities across the United States. These services are documented in the data from the National Longitudinal Transition Study 2012 (NLTS 2012). College-based transition services, also known as dual enrollment or concurrent enrollment, provide students with intellectual disability access to college courses, internships, and employment and other campus activities during their final 2 to 3 years of secondary education. These experiences enable students to participate in career planning with a person-centered planning approach, enroll in college classes for educational and personal enrichment, engage in social activities alongside their college peers, and participate in community-based, paid work experiences that align with their employment goals.

What do the results from your research say about the employment outcomes and other transition outcomes of students with intellectual disability and autism participating in these transition programs?

To be blunt, our findings tell us that conventional transition services are not supporting students with ID/A to become employed after high school. We found a very low prevalence of school-based predictors of post-school success for students receiving conventional transition services. As an example, in our analysis of data from NLTS 2012, we found only 32% of youth with ID/A had paid employment in the previous 12 months. Paid employment in high school is a strong predictor of post-school employment. Additionally, there was low prevalence of other critical transition activities, including self-determination/self-advocacy, self-care/independent living skills, occupational courses, and work-study. Our findings highlight points of stagnation in access to college and career preparation for students with ID/A. Past low engagement in these college and career preparation activities may have contributed to the limited access youth with ID/A have had to positive employment outcomes and to postsecondary education.

On a more promising note, when we look at data on students with ID/A who are enrolled in college-based transition programs, the picture is much brighter. We’ve found moderate to high prevalence of activities reflecting important predictors of post-school success (including paid employment while in high school, interagency collaboration, and learning skills in community settings). Students in college-based transition programs are enrolling in courses for college credit and taking courses to help them prepare for careers. These students are leaving K-12 education in a much better position to successfully be employed after high school than many of their peers who are receiving conventional transition services.

Based on what you have learned, what are the implications for practice and policy?

With increased access and opportunities to pursue further education after high school, youth with ID/A need college preparation activities to be a part of their standard education experience. Our findings suggest college-based transition services offer an approach that addresses both employment and college preparation. However, the availability of college-based transition programs depends upon whether school districts have established partnerships with a college or university. Greater availability of college-based transition services would give the field a better understanding of the essential elements of practice and associated outcomes of this approach. Our findings also show the need for substantial improvement in access to college and career preparation for youth with ID/A in conventional transition services. Finally, these studies highlight the need for additional and more robust data in federal data systems reflecting information about the transition experiences of students with intellectual disability, autism, and other developmental disabilities. We need to know what their experiences between the ages of 18 and 22 look like, how inclusive these experiences are, and what outcomes they achieve after they leave K-12 education.

How can families find more information regarding college-based transition programs in their area?

We are glad you asked! The Think College website has a College Search feature that includes all the college and university programs enrolling students with ID/A in the United States, including those who are working with transitioning youth. This is a great way for families to explore local options. When options don't exist, we encourage families to speak with their school administrators to work on developing partnerships with local colleges or universities. Think College has many resources about college-based transition available on our website. Additionally, our national help desk is always available to answer questions or offer help to those seeking information about inclusive higher education and college-based transition services. Send us questions at thinkcollegeta@gmail.com.

Many thanks to Drs. Grigal and Papay for sharing their work with our readers! If you want to learn more about this project, including the results of their research, please visit the following website: https://thinkcollege.net/projects/mtf.

Meg Grigal is a senior research fellow at the Institute for Community Inclusion at the University of Massachusetts, Boston. At the Institute, she is co-director of Think College, a national organization focused on research, policy, and practice in inclusive higher education. Clare Papay is a senior research associate at the Institute for Community Inclusion.

This blog was produced by Shanna Bodenhamer, virtual student federal service intern at IES and graduate student at Texas A&M University, and Akilah Nelson, program officer for NCSER’s Transition to Postsecondary Education, Career, and/or Independent Living program.

 

 

Knock, Knock! Who’s There? Understanding Who’s Counted in IPEDS

The Integrated Postsecondary Education Data System (IPEDS) is a comprehensive federal data source that collects information on key features of higher education in the United States, including characteristics of postsecondary institutions, college student enrollment and academic outcomes, and institutions’ employees and finances, among other topics.

The National Center for Education Statistics (NCES) has created a new resource page, Student Cohorts and Subgroups in IPEDS, that provides data reporters and users an overview of how IPEDS collects information related to postsecondary students and staff. This blog post highlights key takeaways from the resource page.

IPEDS survey components collect counts of key student and staff subgroups of interest to the higher education community.

Data users—including researchers, policy analysts, and prospective college students—may be interested in particular demographic groups within U.S. higher education. IPEDS captures data on a range of student and staff subgroups, including race/ethnicity, gender, age categories, Federal Pell Grant recipient status, transfer-in status, and part-time enrollment status.

The Outcome Measures (OM) survey component stands out as an example of how IPEDS collects student subgroups that are of interest to the higher education community. Within this survey component, all entering degree/certificate-seeking undergraduates are divided into one of eight subgroups by entering status (i.e., first-time or non-first-time), attendance status (i.e., full-time or part-time), and Pell Grant recipient status.
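Because each of the three statuses is a binary split, the eight OM subgroups are simply the Cartesian product of the three attributes. A minimal sketch of that enumeration (illustrative only; the labels below paraphrase the categories and are not official IPEDS terminology):

```python
from itertools import product

# The three binary statuses that define the eight Outcome Measures subgroups
entering_status = ["first-time", "non-first-time"]
attendance_status = ["full-time", "part-time"]
pell_status = ["Pell Grant recipient", "non-Pell Grant recipient"]

# Every entering degree/certificate-seeking undergraduate falls into
# exactly one of the 2 x 2 x 2 = 8 combinations
subgroups = [
    f"{e}, {a}, {p}"
    for e, a, p in product(entering_status, attendance_status, pell_status)
]

for s in subgroups:
    print(s)
print(len(subgroups))  # 8
```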

Although IPEDS is not a student-level data system, many of its survey components collect counts of students and staff by subgroup.

Many IPEDS survey components—such as Admissions, Fall Enrollment, and Human Resources—collect data as counts of individuals (i.e., students or staff) by subgroup (e.g., race/ethnicity, gender) (exhibit 1). Other IPEDS survey components—such as Graduation Rates, Graduation Rates 200%, and Outcome Measures—also include selected student subgroups but monitor cohorts of entering degree/certificate-seeking students over time to document their long-term completion and enrollment outcomes. A cohort is a specific group of students established for tracking purposes. The cohort year is based on the year that a cohort of students begins attending college.


Exhibit 1. IPEDS survey components that collect counts of individuals by subgroup

Table showing IPEDS survey components that collect counts of individuals by subgroup; column one shows the unit of information (student counts vs. staff counts); column two shows the survey component


IPEDS collects student and staff counts by combinations of interacting subgroups.

For survey components that collect student or staff counts, individuals are often reported in disaggregated demographic groups, which allows for more detailed understanding of specific subpopulations. For example, the Fall Enrollment (EF) and 12-month Enrollment (E12) survey components collect total undergraduate enrollment counts disaggregated by all possible combinations of students’ full- or part-time status, gender, degree/certificate-seeking status, and race/ethnicity. Exhibit 2 provides an excerpt of the EF survey component’s primary data collection screen (Part A), in which data reporters provide counts of students who fall within each demographic group indicated by the blank cells.


Exhibit 2. Excerpt of IPEDS Fall Enrollment (EF) survey component data collection screen for full-time undergraduate men: 2022–23


Image of IPEDS Fall Enrollment survey component data collection screen for full-time undergraduate men in 2022–23

NOTE: This exhibit reflects the primary data collection screen (Part A) for the 2022–23 Fall Enrollment (EF) survey component for full-time undergraduate men. This screen is duplicated three more times for undergraduate students, once each for part-time men, full-time women, and part-time women. For survey materials for all 12 IPEDS survey components, including complete data collection forms and detailed reporting instructions, visit the IPEDS Survey Materials website.


As IPEDS does not collect data at the individual student level, these combinations of interacting subgroups are the smallest unit of information available in IPEDS. However, data users may wish to aggregate these smaller subgroups to arrive at larger groups that reflect broader populations of interest.

For example, using the information presented in exhibit 2, a data user could sum all the values highlighted in the green column to arrive at the total enrollment count of full-time, first-time men. As another example, a data user could sum all the values highlighted in the blue row to determine the total enrollment count of full-time Hispanic/Latino men. Note, however, that many IPEDS data products provide precalculated aggregated values (e.g., total undergraduate enrollment), but data are collected at these smaller units of information (i.e., disaggregated subgroup categories).
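The row and column sums described above amount to aggregating a table of disaggregated counts along one dimension at a time. A minimal Python sketch of the idea, using made-up numbers (hypothetical figures, not real IPEDS data):

```python
# Hypothetical excerpt of disaggregated IPEDS-style enrollment counts for
# full-time undergraduate men, keyed by (degree-seeking status, race/ethnicity).
# The numbers are illustrative only.
counts = {
    ("first-time", "Hispanic/Latino"): 120,
    ("first-time", "White"): 340,
    ("first-time", "Black or African American"): 95,
    ("non-degree-seeking", "Hispanic/Latino"): 15,
    ("non-degree-seeking", "White"): 40,
}

# "Column" sum: all first-time students, across race/ethnicity categories
first_time_total = sum(
    v for (status, _), v in counts.items() if status == "first-time"
)

# "Row" sum: all Hispanic/Latino students, across degree-seeking statuses
hispanic_total = sum(
    v for (_, race), v in counts.items() if race == "Hispanic/Latino"
)

print(first_time_total)  # 555
print(hispanic_total)    # 135
```

Because the interacting subgroups are the smallest unit of information IPEDS collects, any broader population of interest has to be built up this way, by summing the relevant cells.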

Student enrollment counts and cohorts align across IPEDS survey components.

There are several instances when student enrollment or cohort counts reported in one survey component should match or very closely mirror those same counts reported in another survey component. For example, the number of first-time degree/certificate-seeking undergraduate students in a particular fall term should be consistently reported in the Admissions (ADM) and Fall Enrollment (EF) survey components within the same data collection year (see letter A in exhibit 3).


Exhibit 3. Alignment of enrollment counts and cohorts across IPEDS survey components

Infographic showing the alignment of enrollment counts and cohorts across IPEDS survey components


For a full explanation of the alignment of student counts and cohorts across IPEDS survey components (letters A to H in exhibit 3), visit the Student Cohorts and Subgroups in IPEDS resource page.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube, follow IPEDS on Twitter, and subscribe to the NCES News Flash to stay up-to-date on IPEDS data releases and resources.

 

By Katie Hyland and Roman Ruiz, AIR

Timing is Everything: Understanding the IPEDS Data Collection and Release Cycle

For more than 3 decades, the Integrated Postsecondary Education Data System (IPEDS) has collected data from all postsecondary institutions participating in Title IV federal student aid programs, including universities, community colleges, and vocational and technical schools.

Since 2000, the 12 IPEDS survey components occurring in a given collection year have been organized into three seasonal collection periods: Fall, Winter, and Spring.

The timing of when data are collected (the “collection year”) is most important for the professionals who report their data to the National Center for Education Statistics (NCES). However, IPEDS data users are generally more interested in the year that is actually reflected in the data (the “data year”). As an example, a data user may ask, “What was happening with students, staff, and institutions in 2018–19?"


Text box that says: The collection year refers to the time period the IPEDS survey data are collected. The data year refers to the time period reflected in the IPEDS survey data.


For data users, knowing the difference between the collection year and the data year is important for working with and understanding IPEDS data. Often, the collection year comes after the data year, as institutions need time to collect the required data and check to make sure they are reporting the data accurately. This lag between the time period reflected by the data and when the data are reported is typically one academic term or year, depending on the survey component. For example, fall 2021 enrollment data are not reported to NCES until spring 2022, and the data would not be publicly released until fall 2022.
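Under the typical lag just described, the Fall Enrollment example can be sketched as a simple mapping from data year to reporting and release dates (`fall_enrollment_timeline` is a hypothetical helper for illustration; actual schedules vary by survey component):

```python
def fall_enrollment_timeline(data_year: int) -> dict:
    """For fall enrollment observed in `data_year`, return when the data are
    reported to NCES and when they are provisionally released, assuming the
    typical one-term reporting lag and subsequent release described above."""
    return {
        "data_year": f"fall {data_year}",
        "reported": f"spring {data_year + 1}",
        "provisional_release": f"fall {data_year + 1}",
    }

print(fall_enrollment_timeline(2021))
# {'data_year': 'fall 2021', 'reported': 'spring 2022', 'provisional_release': 'fall 2022'}
```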

After the data are collected by NCES, there is an additional period before they are released publicly, during which the data undergo various quality and validity checks. About 9 months after each seasonal collection period (i.e., Fall, Winter, Spring) ends, there is a Provisional Data Release, and IPEDS data products (e.g., web tools, data files) are updated with the newly released seasonal data. During this provisional release, institutions may revise their data if they believe the data were reported inaccurately. A Revised/Final Data Release the following year includes any revisions made to the provisional data.

Sound confusing? The data collection and release cycle can be a technical and complex process, and it varies slightly for each of the 12 IPEDS survey components. Luckily, NCES has created a comprehensive resource page that provides information about the IPEDS data collection and release cycles for each survey component as well as key details for data users and data reporters, such as how to account for summer enrollment in the different IPEDS survey components.

Table 1 provides a summary of the IPEDS 2021–22 data collection and release schedule information that can be found on the resource page. Information on the data year and other details about each survey component can also be found on the resource page.


Table 1. IPEDS 2021–22 Data Collection and Release Schedule

Table showing the IPEDS 2021–22 data collection and release schedule


Here are a few examples of how to distinguish the data year from the collection year in different IPEDS data products.

Example 1: IPEDS Trend Generator

Suppose that a data user is interested in how national graduation rates have changed over time. One tool they might use is the IPEDS Trend Generator, a ready-made web tool that allows users to view trends over time in the subject areas most frequently asked about in postsecondary education. The Graduation Rate chart below displays the data year (shown in green) in the headline and on the x-axis. The “Modify Years” option also allows users to filter by data year. Information about the collection year (shown in gold) can be found in the source notes below the chart.


Image of IPEDS Trend Generator webpage


Example 2: IPEDS Complete Data Files

Imagine that a data user was interested enough in 6-year Graduation Rates that they wanted to run more complex analyses in a statistical program. IPEDS Complete Data Files include all variables for all reporting institutions by survey component and can be downloaded by these users to create their own analytic datasets.

Data users should keep in mind that IPEDS Complete Data Files are organized and released by collection year (shown in gold) rather than data year. Because of this, even though files might share the same collection year, the data years reflected within the files will vary across survey components.


Image of IPEDS Complete Data Files webpage


The examples listed above are just a few of many scenarios in which this distinction between collection year and data year is important for analysis and understanding. Knowing about the IPEDS reporting cycle can be extremely useful when it comes to figuring out how to work with IPEDS data. For more examples and additional details on the IPEDS data collection and release cycles for each survey component, please visit the Timing of IPEDS Data Collection, Coverage, and Release Cycle resource page.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube, follow IPEDS on Twitter, and subscribe to the NCES News Flash to stay up-to-date on all IPEDS data releases.

 

By Katie Hyland and Roman Ruiz, American Institutes for Research