IES Blog

Institute of Education Sciences

IES Makes Two New Awards for the Development of Web-based Tools to Inform Decision Making by Postsecondary Students

In June, the Institute of Education Sciences (IES) announced two new awards to technology firms to develop web-based tools that inform student decision making in postsecondary education. The projects will focus on generating a measure of the return on investment (ROI) for different educational training programs and careers so that high school and college students have access to data-driven information to guide their decisions.

The awards were made through a special topic offered by the IES Small Business Innovation Research (known as ED/IES SBIR) program, which funds the research and development of commercially viable education technology. (For information on the 21 awards made through the IES 2019 standard solicitation, read here.)

Background and Awards

While websites like College Scorecard and CareerOneStop provide information to explore training programs in colleges and occupations of interest, there is no tool that helps students understand the costs and benefits of individual postsecondary programs in an integrated, customizable, and user-friendly manner.  

The special topic SBIR solicitation requested proposals from small businesses to develop new ROI tools that combine information on fees, time to completion, and projected earnings so that students can easily compare college and career pathways. The IES-funded ROI tools aim to improve student program completion rates and to support higher employment and earnings, lower education-related debt, and greater satisfaction with selected paths. The solicitation offered up to $200,000 for firms to develop and evaluate a prototype of their ROI tool.

Two awards were made through this special topic:

  • Illinois-based BrightHive, Inc. is developing a prototype of the Training, Education, and Apprenticeship Program Outcomes Toolkit (TEAPOT). Designed to inform student training and educational decision making across a variety of potential pathways, TEAPOT will improve the flow and accuracy of data, resulting in improved estimates of the ROI for different postsecondary education pathways. The team will develop a data interoperability system and simplified toolkit for states and for local postsecondary and workforce development organizations. The toolkit will provide more consistent, granular, and higher-quality information on postsecondary outcomes. The prototype will calculate ROI using student information, programmatic information (with an emphasis on net program costs to allow for variations by program type at the same institution), and access to wage and employment data sets.
  • Virginia-based Vantage Point Consultants is developing a prototype of a user-contextualized ROI tool that prospective students will use to make sense of the lifetime costs and opportunity tradeoffs associated with different degree programs offered by postsecondary institutions. The ROI tool will incorporate information on student goals and on academic, professional, and personal characteristics. The prototype will include an interface that guides decision making based on an ROI calculation that discounts projected earnings cash flows under current and future career and education assumptions and subtracts college costs. In the first phase of work, the project will use information from data partners, including Burning Glass Technologies, and from public sources at the Department of Labor and Department of Education.
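Neither announcement specifies the firms' actual formulas, but the style of calculation both tools describe, discounting projected earnings and subtracting program costs, can be sketched as a net present value. Everything below (the function name, discount rate, and dollar figures) is an illustrative assumption, not either firm's method:

```python
# Illustrative net-present-value ROI sketch; all figures are invented.

def program_roi(annual_earnings_gain, years_of_earnings, total_program_cost,
                discount_rate=0.03):
    """Discount the projected annual earnings gain back to today's dollars,
    then subtract the program's total cost."""
    npv_earnings = sum(
        annual_earnings_gain / (1 + discount_rate) ** t
        for t in range(1, years_of_earnings + 1)
    )
    return npv_earnings - total_program_cost

# A credential costing $20,000 that raises earnings by $5,000 per year
# over 20 working years, discounted at 3%:
roi = program_roi(5_000, 20, 20_000)
```

A real tool would also vary these assumptions by program type and career path, which is exactly the contextual data both projects plan to incorporate.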

After developing prototypes, researchers will analyze whether the tools function as intended and are feasible for students to use. Research will also test whether the tools show promise for producing a meaningful and accurate measure of ROI. Both firms are eligible to apply for additional funding to complete full-scale development of their ROI tools, including developing an interface to improve user experience and conducting additional validation research.

Stay tuned for updates on Twitter (@IESResearch) as IES projects drive innovative forms of technology.

Written by Edward Metz, program manager, ED/IES SBIR

Investing in the Next Generation of Education Technology

Millions of students in thousands of schools around the country have used technologies developed through the IES Small Business Innovation Research program (ED/IES SBIR). The program emphasizes rapid research and development (R&D), with rigorous research informing iterative development and evaluating the promise of products for improving the intended outcomes. The program also focuses on commercialization after development is complete so that products can reach schools and be sustainable over time.

At the end of June, ED/IES SBIR announced 21 new awards for technology products for students, teachers, or administrators in education and special education. (IES also announced two additional awards through a special topic solicitation in postsecondary education. Read about these awards here.) Of the 21 awards, 13 are for prototype development and 8 are for full-scale development (a YouTube playlist of the full-scale development projects is available here).

Many of the new 2019 projects continue education and technology trends that have emerged in recent years. These include the three trends below.

Trend 1: Bringing Next-Generation Technologies to Classrooms
For educators, it can be challenging to integrate next-generation technologies into classroom practice to improve teaching and learning. Many of the current awardees are seeking to make this happen. Schell Games is developing a content creation tool for students to create artistic performances in Virtual Reality (VR), and Gigantic Mechanic is designing a class-wide role-playing game facilitated by a tablet-based app. codeSpark is building a game for children to learn to code by creating story-based narratives. Killer Snails, Lighthaus, and AP Ventures are all creating educational content for VR headsets, and Parametric Studios, Innovation Design Labs, and LightUp are employing Augmented Reality (AR) to support learning STEM concepts. Aufero is bringing modern design principles to a traditional board game for students to gain foundational computer science and coding skills.

Trend 2: Personalized Learning

Several 2019 awards are building technologies to provide immediate feedback to personalize student learning. Graspable, Inc. and Apprendis are developing adaptive engines that formatively assess performance as students do activities in algebra and physical science, and Sirius Thinking is building a multimedia platform to guide and support pairs of students as they read passages. Charmtech is developing a prototype to support English learners in reading, Cognitive Toybox is creating a game-based school readiness assessment, Hats & Ladders is developing a social skills game, and IQ Sonics is refining a music-based app for children with or at risk for disabilities to practice speaking.

Trend 3: Platforms that Host and Present Data
School administrators and teachers are always seeking useful information and data to guide decision making and inform instruction. Education Modified is developing a platform for special education teachers to implement effective Individualized Education Programs (IEPs) for students with or at risk for disabilities, and VidCode is developing a dashboard that offers teachers real-time performance metrics on coding activities. LearnPlatform is developing a prototype platform that generates reports to guide teachers in implementing new education technology interventions in classrooms, and Liminal eSports is developing a platform administrators and teachers can use to organize eSports activities in which students participate in group game activities to learn.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Written by Edward Metz, Program Manager, ED/IES SBIR

Fieldnotes: Reflections from an Adult Education Instructor on Research and Practice

Approximately 18 percent of US adults are at the lowest levels of literacy and nearly 30 percent are at the lowest levels of numeracy. The adult education system serves adults with low skills, but many education researchers know little about the students or the setting.  Recently, NCER convened a working group of adult education instructors, administrators, and researchers to discuss adult education’s research and dissemination needs.

Mr. Marcus Hall, an adult education instructor at the Community Learning Center and JEVS Human Services in Philadelphia, participated in this working group. He spoke with Meredith Larson, NCER program officer for adult education, about his experiences and interests in research. A copy of the working group meeting summary is available here.

Please describe your adult education classroom.

I once taught a 7-week course with students ranging from 18 to over 60 years old who had low literacy or math scores. I tried to contextualize instruction around their career interests and differentiate it to their learning needs. For example, some students were proficient readers but needed comprehension and math practice while others struggled with one or more of the basic components of reading. Somehow, I needed to help those learning phonics and those struggling with fluency while also challenging those ready for comprehension work. It’s hard to meet student needs in such a short time without teacher aides or adaptive technologies.

Why is research particularly important for adult education?

The challenges we face are monumental. Despite the large number of adults in need, adult education feels under-funded, under-staffed, and under-appreciated. Our students need complex, comprehensive, and well-rounded intervention, but we often have to make the most out of slightly targeted, inexpensive, and difficult-to-implement solutions. We need researchers to provide practical information and recommendations that we can use today to help adults learn and retain information.

How have you used research in your teaching?

Specifically, for reading instruction, I use techniques and activities built on evidence-based reading interventions. I start with tested diagnostic assessments to determine the needs of my students followed by strategies such as Collaborative Oral Reading or Repeated Reading exercises to support my students.

What topic during the meeting stood out to you?

The discussion about the workforce and professional development resonated with me. Many of our educators are part-time, come from K-12, are close to retirement, and may not have specific training for working with adults. They are asked to teach subjects they may not have any certification in, and their programs may not be able to provide the professional development they need. Just as we need supports for our learners, we need research to develop supports for us educators.

What additional research would you like to see?

Many of my students have had traumatic experiences that, when relived in the classroom, can cause them to disengage or struggle. I feel that understanding triggers and signs of discomfort has greatly enhanced my ability to help my students. Many educators want to leverage mental health approaches, like trauma-informed care, but we could use help learning how to integrate these strategies into instruction.

What do you hope researchers and educators keep in mind regarding one another?

It seems that researchers publish and promote their work to other researchers and then move to the next topic. This may be due to time constraints, publishing demands, or institutional requirements. I hope researchers take the time to come into our settings and observe us in action. I want researchers to work with us to help us understand and accept what is and isn’t working.

As for educators, we need to avoid trying things and then abandoning them when something unexpected occurs. At times, we revert to what we know and are most comfortable with in the classroom. We educators can and must think critically about our norms and be ready and willing to enhance our practice with new information.

Guiding Principles for Successful Data Sharing Agreements

Data sharing agreements are critical to conducting research in education. They allow researchers to access data collected by state or local education agencies to examine trends, determine the effectiveness of interventions, and support agencies in their efforts to use research-based evidence in decision-making.

Yet the process for obtaining data sharing agreements with state or local agencies can be challenging and often depends on the type of data involved, state and federal laws and regulations regarding data privacy, and specific agency policies. Some agencies have a research application process and review timeline available on their websites. Others may have a more informal process for establishing such agreements. In all instances, these agreements determine how a researcher can access, use, and analyze education agency data.

What are some guiding principles for successfully obtaining data sharing agreements? 

Over several years of managing projects that require data sharing agreements, I have learned a few key principles for success. While they may seem obvious, I have witnessed data sharing agreements fall apart because one or more of these principles were not met:

  • Conduct research on a topic that is a priority for the state or local education agency. Given the time and effort agencies invest in executing a data sharing agreement and preparing data, researchers should design studies that provide essential information to the agency on a significant topic. It can be helpful to communicate exactly how and when the findings will be shared with the agency and possible actions that may result from the study findings.
  • Identify a champion within the agency. Data sharing agreements are often reviewed by some combination of program staff, legal counsel, Institutional Review Board staff, and research or data office staff. An agency staff member who champions the study can help navigate the system for a timely review and address any internal questions about the study. That champion can also help the researcher work with the agency staff who will prepare the data.
  • Be flexible and responsive. Agencies have different requirements for reviewing data sharing agreements, preparing and transferring data, securely handling data, and destroying data upon study completion. A data sharing agreement often requires some back-and-forth to finalize the terms. Researchers need to be prepared to work with their own offices and staff to meet the needs of the agency.
  • Work closely with the data office to finalize data elements and preparation. Researchers should be able to specify the sample, timeframe, data elements, and whether they require unique identifiers to merge data from multiple files. I have found it beneficial to meet with the office(s) responsible for preparing the data files in order to confirm any assumptions about the format and definitions of data elements. If the study requires data from more than one office, I recommend having a joint call to ensure that the process for pulling the data is clear and feasible to all staff involved. For example, to link student and teacher data, it might be necessary to have a joint call with the office that manages assessment data and the office that manages employment data.
  • Strive to reduce the burden on the agency. Researchers should make the process of sharing data as simple and efficient as possible for agency staff. Strategies include providing a template for the data sharing agreement, determining methods to de-identify data prior to transferring it, and offering to have the agency send separate files that the researchers can link rather than preparing the file themselves.
  • Start early. Data sharing agreements take time, and the process almost always takes longer than expected, so begin as soon as possible. I have seen some agreements executed within a month, while others have taken up to a year. A clear, jointly developed timeline can help ensure that the work starts on time.
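One of the burden-reducing strategies above, having the agency send separate de-identified files that the researcher links, can be sketched with a keyed hash of the student identifier. The field names and the shared salt below are illustrative assumptions, not any agency's actual scheme:

```python
import hashlib

# A salt agreed upon in the data sharing agreement and kept confidential.
SHARED_SALT = "agreed-upon-secret"

def pseudonymize(student_id):
    """Replace a raw student ID with a keyed hash so separately delivered
    files can still be joined without exchanging raw identifiers."""
    return hashlib.sha256((SHARED_SALT + student_id).encode()).hexdigest()

# Two files prepared by different offices, each keyed on the hashed ID:
assessment = {pseudonymize("S001"): {"score": 88}}
enrollment = {pseudonymize("S001"): {"program": "nursing"}}

# The researcher links the two files on the hashed key alone.
key = pseudonymize("S001")
linked = {**assessment[key], **enrollment[key]}
```

In practice, an HMAC with a securely exchanged key (Python's `hmac` module) or agency-side tokenization is preferable to a bare salted hash, since low-entropy identifiers can otherwise be guessed by brute force.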

What resources are available on data sharing agreements?

If you are new to data sharing agreements or want to learn more about them, here are some helpful resources:

Written by Jacqueline Zweig, Ph.D., Research Scientist, Education Development Center. Dr. Zweig is the Principal Investigator on an IES-funded research grant, Impact of an Orientation Course on Online Students' Completion Rates, and this project relies on data sharing. 

Equity Through Innovation: New Models, Methods, and Instruments to Measure What Matters for Diverse Learners

In today’s diverse classrooms, it is both challenging and critical to gather accurate and meaningful information about student knowledge and skills. Certain populations present unique challenges in this regard – for example, English learners (ELs) often struggle on assessments delivered in English. On “typical” classroom and state assessments, it can be difficult to parse how much of an EL student’s performance stems from content knowledge, and how much from language learner status. This lack of clarity makes it harder to make informed decisions about what students need instructionally, and often results in ELs being excluded from challenging (or even typical) coursework.

Over the past several years, NCER has invested in several grants to design innovative assessments that will collect and deliver better information about what ELs know and can do across the PK-12 spectrum. This work is producing some exciting results and products.

  • Jason Anthony and his colleagues at the University of South Florida have developed the School Readiness Curriculum Based Measurement System (SR-CBMS), a collection of measures for English- and Spanish-speaking 3- to 5-year-old children. Over the course of two back-to-back Measurement projects, Dr. Anthony's team co-developed and co-normed item banks in English and Spanish in 13 different domains covering language, math, and science. The assessments are intended for a variety of uses, including screening, benchmarking, progress monitoring, and evaluation. The team used item development and evaluation procedures designed to ensure that both the English and Spanish tests are sociolinguistically appropriate for both monolingual and bilingual speakers.

  • Daryl Greenfield and his team at the University of Miami created Enfoque en Ciencia, a computerized adaptive test (CAT) designed to assess Latino preschoolers' science knowledge and skills. Enfoque en Ciencia is built on 400 Spanish-language items that cover three science content domains and eight science practices. The items were independently translated into four major Spanish dialects and reviewed by a team of bilingual experts and early childhood researchers to create a consensus translation appropriate for 3- to 5-year-olds. The assessment is delivered via touch screen and is equated with an English-language version of the same test, Lens on Science.

  • A University of Houston team led by David Francis is engaged in a project to study factors that affect the assessment of vocabulary knowledge among ELs in unintended ways. Using a variety of psychometric methods, the team explores data from the Word Generation Academic Vocabulary Test to identify features that affect item difficulty and to examine whether these features operate similarly for current ELs, former ELs, and students who have never been classified as ELs. The team will also provide a set of recommendations for improving the accuracy and reliability of extant vocabulary assessments.

  • Researchers led by Rebecca Kopriva at the University of Wisconsin recently completed work on a set of technology-based, classroom-embedded formative assessments intended to support and encourage teachers to teach more complex math and science to ELs. The assessments use multiple methods to reduce the overall language load typically associated with challenging content in middle school math and science. The tools use auto-scoring techniques and can provide immediate feedback to students and teachers in the form of specific, individualized, data-driven guidance to improve instruction for ELs.

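A CAT such as Enfoque en Ciencia chooses each question based on what it has learned about the child so far. The generic sketch below illustrates the common maximum-information selection rule under a Rasch model; it is not the project's actual engine, and the item difficulties are invented:

```python
import math

def item_information(theta, difficulty):
    """Fisher information of a Rasch item at ability level theta:
    p * (1 - p), which peaks when item difficulty matches ability."""
    p = 1.0 / (1.0 + math.exp(-(theta - difficulty)))
    return p * (1.0 - p)

def next_item(theta, item_bank):
    """Pick the item in the bank that is most informative at theta."""
    return max(item_bank, key=lambda d: item_information(theta, d))

bank = [-2.0, -1.0, 0.0, 1.0, 2.0]  # invented item difficulties, in logits
chosen = next_item(0.3, bank)       # selects the difficulty nearest 0.3
```

After each response, a real engine re-estimates theta and repeats this selection, which is how an adaptive test stays well matched to each child with far fewer items than a fixed form.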
By leveraging technology, developing new item formats and scoring models, and expanding the linguistic repertoire students may access, these teams have found ways to allow ELs – and all students – to show what really matters: their academic content knowledge and skills.
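The item-feature analyses in the Houston project above are in the spirit of differential item functioning (DIF) checks, which ask whether an item behaves the same for two groups once ability is matched. A minimal Mantel-Haenszel DIF sketch, with invented response counts, looks like this:

```python
# Mantel-Haenszel DIF sketch; the response counts below are invented.

def mantel_haenszel_odds_ratio(strata):
    """strata: one 2x2 table (a, b, c, d) per matched-ability stratum, where
    a = focal group correct,     b = focal group incorrect,
    c = reference group correct, d = reference group incorrect."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Three ability strata comparing, say, current ELs (focal) with
# never-EL students (reference) on one vocabulary item:
tables = [(10, 10, 12, 8), (15, 5, 16, 4), (18, 2, 19, 1)]
ratio = mantel_haenszel_odds_ratio(tables)
```

An odds ratio near 1.0 suggests the item functions similarly for both groups once ability is matched; values far from 1.0 flag the item for review.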

Written by Molly Faulkner-Bond (former NCER program officer).