Inside IES Research

Notes from NCER & NCSER

Literacy and Deafness: Helping Students Who Are D/HH Improve Language and Writing Skills

Image: Dr. Hannah Dostal (left), University of Connecticut, and Dr. Kimberly Wolbers (right), University of Tennessee

September is National Literacy Month and Deaf Awareness Month. To celebrate both occasions, we spoke with two IES-funded principal investigators about their intervention, which aims to increase the writing and language skills of students who are deaf or hard of hearing through teacher professional development targeting writing instruction and the use of multilingual strategies. Together with their team, Dr. Kimberly Wolbers (University of Tennessee) and Dr. Hannah Dostal (University of Connecticut) developed Strategic and Interactive Writing Instruction (SIWI) and tested it for efficacy. The team is now analyzing the effects of SIWI on both student and teacher outcomes.

What are some challenges facing deaf and hard of hearing (D/HH) students in the area of literacy? How does your project address these student-related challenges?

Children who are D/HH are highly diverse with respect to language modality (spoken, sign) and proficiency. Understanding this diversity is the foundation of their literacy learning and academic engagement. Working between languages and across modalities when engaged in literacy tasks is a unique challenge for D/HH writers. For example, a student may use American Sign Language (ASL), which does not have a written form, while learning and using English text as they read and write. Strategies used during writing instruction that scaffold bilingual and multilingual development have the potential to leverage student knowledge of languages to support literacy development. During SIWI, teachers engage students in explicitly comparing and contrasting ASL and English with the intention of increasing metalinguistic knowledge and translation abilities.

Another unique challenge is that a number of D/HH students lack consistent exposure to accessible language at home and school. They may not hear sufficient amounts of spoken language to acquire its complexities, and they may not have sufficient early exposure to sign language to acquire visual language. Such language deprivation directly impacts literacy development. Based on what we are seeing and learning from our school partners, the shift to online learning during COVID-19 exacerbated delays in academic progress, as D/HH students experienced greater language isolation during this time.

Teachers implementing SIWI tackle expressive language delays head on. They use a designated space in the classroom, called the Language Zone, to develop, translate, and revise ideas generated in ASL and English.

What are some challenges facing teachers of D/HH in the area of literacy? How does your project address these teacher-related challenges?

It is becoming increasingly challenging to find qualified teachers of the deaf. Not only are there shortages in the field, but many current teachers also point to limited preparation and a lack of assessment materials, curriculum, and instructional resources specifically designed for D/HH students with distinct language histories.

The SIWI professional development (PD) program is designed to address these challenges facing teachers. It is a multi-component PD program that is intensive and sustained over a 3-year period and consists of a summer institute, site visits, and individual biweekly online coaching. Teachers not only learn about effective approaches but also how to flexibly enact the approaches with students who have diverse language histories and literacy skills.

What have you found so far?

SIWI has been implemented across settings with D/HH students, and studies so far suggest SIWI results in significant language and literacy growth. Results from IES-funded studies using a variety of methods demonstrate SIWI’s positive impact on student outcomes. For example, we found a relationship between SIWI and gains in students’ effective use of genre-related writing traits and grammar and conventions, including increases in the length, clarity, and complexity of their writing. Recent analyses, currently in press, demonstrate that in one academic year, students participating in SIWI gained an average of 1.2 grade levels as measured by the Woodcock Johnson IV.

The SIWI PD program has also resulted in notable outcomes for SIWI teachers. The randomized controlled trial demonstrated significant increases in teachers’ knowledge of writing instruction, efficacy in teaching writing, and use of evidence-based practices compared to teachers in the business-as-usual control group (a manuscript is currently in progress).

What are the next steps for your research?

Analyses of student outcomes in the efficacy trial are currently underway. In addition to analyzing the impact of SIWI on writing and language outcomes, we are also examining the impact on reading comprehension, vocabulary knowledge, handwriting, and motivation to write.

Additionally, we are investigating whether implementation fidelity of SIWI is positively associated with student outcomes. We intend to examine whether teachers with higher implementation fidelity in their second or third year of teaching SIWI demonstrate a significantly greater impact on their students’ writing and language growth.

Dr. Kimberly Wolbers is a Deaf Education Professor and Co-Director of Undergraduate Studies for the Department of Theory & Practice in Teacher Education at the University of Tennessee, and Dr. Hannah Dostal is an Associate Professor of Reading Education and an advisory board member of the Aetna Chair of Writing at the University of Connecticut. This interview was produced and edited by Julianne Kasper, Virtual Student Federal Service Intern at IES and graduate student in Education Policy & Leadership at American University.

Peer to Peer: Career Advice for Aspiring Education Researchers from Pathways to the Education Sciences Alumni

This blog is part of an ongoing series featuring IES training programs as well as our blog series on diversity, equity, inclusion, and accessibility (DEIA) within IES grant programs. For more information, see this DEIA update from Commissioners Elizabeth Albro (National Center for Education Research) and Joan McLaughlin (National Center for Special Education Research).


In 2015, IES launched the Pathways to the Education Sciences Research Training Program to broaden participation in education research. Pathways grants are awarded to minority-serving institutions (MSIs) and their partners to provide up to year-long training fellowships to undergraduate, post-baccalaureate, and master’s students. Each Pathways program has a specific education theme, such as literacy, equity/social justice in education, student success, or education pipelines. Pathways fellows receive an introduction to scientific research methods and their program’s education theme, as well as meaningful opportunities to participate in education research, professional development, and mentoring. Currently, there are seven funded Pathways programs; IES recently awarded the newest, focused on learning analytics and data science, to the University of California, Irvine. Over 250 students have participated in Pathways, and many (39 at last count) have already started doctoral programs.

In honor of HBCU week (September 7-10), Hispanic-Serving Institutions Week (September 13-19), and Hispanic Heritage Month (September 15-October 15), we reached out to six Pathways alumni who are in graduate school to ask them for advice for other students who wish to pursue graduate study related to education research. Here is what they shared with us.


Comfort Abode

RISE Training Program, University of Maryland, College Park/Bowie State University (HBCU)

Doctoral Student, Indiana University

My number one piece of advice for students who want to become education researchers would be to keep in mind the purpose of your research. If nobody understands it, it is not helpful. And in order for people to understand it, you yourself need to understand it. You cannot teach what you do not know. Especially considering that the research is in education, the goal should be to educate teachers, students, faculty, or whomever else, about what is being studied and (hopefully) the steps that can be taken toward improving that area. You have to keep your audience in mind, and while the work should not be “dumbed down,” you have to make sure your point gets across clearly. In order for that to happen, you have to know what you are talking about. Project RISE was especially helpful in that there were many mentors and people willing to help you understand the scope of the research, as well as provide comments and feedback on areas to improve.


Jeremy Flood

RISE Training Program, North Carolina Central University (HBCU)/University of North Carolina Wilmington/Pennsylvania State University

Doctoral Student, North Carolina Agricultural and Technical State University

My only advice would be to remember the mission of solving challenges in education. Within the body of education research, there are several ways to accomplish this—whether through policy research, grounded theory, ethnography, or experiments, there is quite a diversity of tools at a researcher’s disposal, so many that it may seem overwhelming at first. Do not stress if you find this to be true; you are not the first or the last to feel overwhelmed! Instead, use it as an opportunity to rededicate yourself to the mission and allow your dedication to choose the research path that is best for you. Whichever one, two, or three (or more) you choose, make sure that the end goal seeks to improve the practice of education.


Jessala Grijalva

AWARDSS Training Program, University of Arizona/College of Applied Science and Technology at the University of Arizona

Doctoral Student, University of Notre Dame

I advise Pathways fellows to take the time to reflect on and internalize the cultural competency components of the program. The Pathways program will not only prepare you with the hard and soft skills that you need to be a successful researcher, but also help you become an all-around culturally competent researcher. Sometimes, we assume that as students of color or students from diverse backgrounds we are inherently culturally competent; yet there is so much more to learn and to be aware of. From my experience as a participant in the Pathways program, I’ve learned ways to extend cultural competency beyond research and into my interactions with other researchers, colleagues, mentors/mentees, and the broader community. To be an effective researcher, it’s important not only to conduct culturally competent research, but also to work with people from all walks of life and to be able to disseminate our research and findings to the public. Training in cultural competency is very rare and very valuable, and something we may not fully appreciate, so take advantage of this opportunity and make cultural competency an important priority in your conduct as a researcher.


Camille Lewis

PURPOSE Program, Florida State University/Florida Agricultural and Mechanical University (HBCU)

Doctoral Student, Florida State University

There is an African proverb that states: “Knowledge is like a garden. If it is not cultivated, it cannot be harvested.” On the quest to become an education researcher, it is easy to get caught up in the hype of being “the expert.”  My #1 piece of advice to anyone who is interested in education research is to remain a student of life. Your journey to becoming an education researcher will be filled with many opportunities to learn, adapt, and understand the process of learning. Embrace these experiences; allow your researcher identity to be shaped and influenced by new discoveries and new interests. Continue to seek new information and allow your knowledge base to be cultivated. My experience as a public-school teacher, PURPOSE fellow, and doctoral student has shown me the importance and necessity of continually seeking advice, experiences, knowledge, and professional development related to learning and education. This pursuit of knowledge has informed and shaped not just my research, but my life outside academia as well. I never allow myself to become a “know it all.” This keeps me humble and allows me to continue to make improvements in every facet of my life.  


Christopher Terrazas, MA

Pathways Program, University of Texas at San Antonio (UTSA; HSI)

Doctoral Student, University of Texas at Austin

UTSA Pathways was instrumental in developing my identity as a researcher and graduate student. The other day, I described my experience as being in a rocket, with Pathways providing the fuel to take off and get one step closer to my goals as a researcher. During my time, I made it a priority to be curious, always. I did this by attending all the seminars offered and asking questions—even questions that I thought were not the right ones to ask at the time. You never know who may share a similar experience, or perhaps a differing one, to support you in your endeavors. Be bold and use your voice as an instrument to understand the world of research and graduate school during this exciting journey. It is crucial to get into this mindset because this will be your experience, perhaps your first. You will want to make sure that you are well prepared for this process as an aspiring researcher and scholar because this is your future. With that said, my number one piece of advice is to look inward and reflect on your own life experiences. Use these thoughts to feed your inner sense of self, because you know better than anyone what you want your future to be.


Erica Zamora

Pathways Program, California State University, Sacramento

Doctoral Student, University of Arizona

The Pathways Fellows Program had a tremendous impact on my growth as a scholar and education researcher. My advice to students is to engage in research that not only reflects their scholarly interests but also reflects their values as community members and educators. My experience in the program gave me a deeper understanding of the importance of social justice and equity work in research. Education has the potential to transform communities and encourage growth and development, but it can also perpetuate various forms of oppression. Engaging in education research that centers the voices of historically marginalized groups and the issues they experience could lead to transformative outcomes at postsecondary institutions.



Written by Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council. She is also the program officer for the Pathways to the Education Sciences Research Training Program and the new Early Career Mentoring Program for Faculty at Minority Serving Institutions, the two IES training programs for minority-serving institutions, including Alaska Native and Native Hawaiian-Serving Institutions, American Indian Tribally Controlled Colleges and Universities, Asian American and Native American Pacific Islander-Serving Institutions (AANAPISI), Hispanic-Serving Institutions (HSIs), Historically Black Colleges and Universities (HBCUs), Predominantly Black Institutions, Native American-Serving Nontribal Institutions, and any other minority-serving institution as specified in the request for applications.

Updates on Research Center Efforts to Increase Diversity, Equity, Inclusion, and Accessibility

As we begin a new school year, NCER and NCSER want to share with our community some of the work we have been doing—and are seeking to do more of—in relation to diversity, equity, inclusion, and accessibility (DEIA). We plan to provide occasional updates via this blog to share progress and keep the conversations going.

Actions on Diversity

At the end of 2020, IES convened a Technical Working Group (TWG) to get feedback on ways that the research centers could improve our investments focused on DEIA. Under the leadership of Drs. Katina Stapleton and Christina Chhin, we convened a stellar panel that participated in a robust conversation. That conversation and the recommendations from the panel are available in this summary document. We are already implementing some of the recommendations and wanted to share steps that we have taken and our plans for next steps to advance DEIA in IES-funded research.

  1. One of the first steps we took in response to the TWG recommendations was to take a close look at our Requests for Applications (RFAs), identify potential barriers to applicants from underrepresented groups, and revise and/or add language that more clearly articulates our commitment to DEIA, both in terms of those who conduct the research and the populations studied. These changes were reflected in our FY 2022 RFAs, and we will continue to revise and improve our application materials.
  2. IES has been committed to building expertise among a broad range of scholars in the education sciences for nearly two decades. The TWG noted, however, that there is a pressing need to provide funds for early career investigators who may be working at minority-serving institutions (MSIs), teaching-intensive institutions, and/or institutions with limited opportunities for research mentorship. In response, IES launched an Early Career Mentoring for Faculty at MSIs research program. This new program extends our FY 2016 training investment in MSIs that we recompeted in FY 2021: the Pathways to the Education Sciences Training program, which is designed to encourage undergraduate, post-baccalaureate, and master’s-level students in groups that are historically underrepresented in doctoral education to pursue graduate study relevant to education research. Currently, there are seven IES-funded Pathways training programs in the United States, hosted by MSIs and their partners. We are excited to see who applied in this first round of the Early Career Mentoring program and anticipate investing in this program in FY 2023 and beyond.
  3. The TWG also recommended that IES intentionally reach out to the MSI community to ensure that they know about the opportunities available at IES. We held the first such event on September 7, 2021, when IES hosted a virtual listening session during HBCU week. More than 250 scholars attended that session and provided valuable feedback on barriers to HBCU-based researchers applying for research funding from IES. We are in the process of scheduling additional listening sessions with other communities of researchers to provide more opportunities for input from diverse stakeholders and underrepresented groups.
  4. The TWG also recommended that IES take a deeper look at the demographic and institutional data of applicants to our grants programs to identify which groups of researchers and institutions are underrepresented. Data indicate that the percentage of applications received from MSIs between 2013 and 2020 was very small—4% of applications to NCER and 1% to NCSER. Of the applications that were funded, 10% of NCER’s awards were made to MSIs and none of NCSER’s awards were made to MSIs. IES also reviewed the demographic information that FY 2021 NCER and NCSER grant applicants and awardees voluntarily submitted. Among those who reported their demographic information, we found the following:
    • Gender (response rate of approximately 82%): The majority of principal investigators who applied for (62%) and received (59%) funding from IES identified as female.
    • Race (response rate of approximately 75%): The majority of principal investigators who applied for (78%) and received (88%) funding from IES identified as White, while 22% of applicants and 13% of awardees identified as non-White or multi-racial.
    • Ethnicity (response rate of approximately 72%): The majority of principal investigators who applied for (95%) and received (97%) funding identified as non-Hispanic.
    • Disability (response rate of approximately 70%): The majority of principal investigators who applied for (97%) and received (96%) funding identified as not having a disability.

These data underscore the need for IES to continue to broaden and diversify the education research pipeline, including institutions and researchers, and to better support the needs of underrepresented researchers in the education community. However, tracking our progress has proven to be a challenge. Responding to the demographic survey was voluntary, so a significant number of applicants chose not to respond to particular questions. We strongly encourage all our grant applicants to respond to the demographic survey so that we will be better able to track our progress in improving diversity in our grant programs.

Addressing Misconceptions that Limit Diversity in IES Applicants

TWG panel members and attendees at the HBCU session highlighted a series of misconceptions that the education sciences community holds about the funding process at IES and recommended that IES identify communication strategies to address these misconceptions. IES hears that message loud and clear and wants to address at least a few of those misconceptions here.

Myth: IES only funds randomized controlled trials, limiting the range of researchers and institutions that can be competitive for IES grants.

Reality: IES funds a range of research, including measurement work, exploratory research, intervention development and testing, and efficacy and replication studies. We also fund a wide range of methods, including various experimental and quasi-experimental designs and mixed methods that combine quantitative and qualitative methods.

Myth: IES doesn’t support course buyout or summer salary.

Reality: IES supports grant personnel time to carry out research-related activities. This can include course buyout and summer salary. Principal investigators on grants coordinate their budget planning with sponsored projects officers to ensure that their budgets comply with institutional guidelines as well as federal guidelines.

Myth: IES program officers are too busy to help novice applicants.

Reality: Because IES program officers are not involved in the peer review of applications, they can provide in-depth technical assistance and advice throughout the application process. They can even review drafts of proposals prior to submission! IES program officers can be your best resource in helping you submit a competitive grant proposal.


If you’d like to learn more about DEIA at IES, please see our Diversity Statement. You can also subscribe to our Newsflash and follow us on Twitter (@IESResearch) for announcements of future listening sessions. Please send any feedback or suggestions to NCER.Commissioner@ed.gov (National Center for Education Research) or NCSER.Commissioner@ed.gov (National Center for Special Education Research). Also, watch this blog over the next few months to read about the wide range of IES grantees and fellows from diverse backgrounds and career paths. Next up is our Hispanic Heritage Month (Sept. 15-Oct. 15, 2021) blog series.


Christina Chhin (Christina.Chhin@ed.gov), Katina Stapleton (Katina.Stapleton@ed.gov), and Katie Taylor (Katherine.Taylor@ed.gov) assisted Commissioners Albro and McLaughlin in writing this blog.

Working to Understand the Policy Process in the Development of Michigan’s Read by Grade Three Law

In recent decades, there has been an emphasis on quantitative, causal research in education policy. These methods are best suited for answering questions about the effects of a policy and whether it achieved its intended outcomes. While the question “Did it work?” remains critical, there is a need for research that also asks, “Why did it work? For whom? In what contexts?” To answer these types of questions, researchers must incorporate rigorous qualitative methods into their quantitative studies. Education research organizations like the Association for Education Finance and Policy have explicitly invited proposals using qualitative and mixed methodologies in an effort to elevate research addressing a range of critical education policy questions. Funding organizations, including IES, encourage applicants to incorporate qualitative methods into their research. In this guest blog, Amy Cummings, Craig De Voto, and Katharine Strunk discuss how they are using qualitative methods in their evaluation of a state education policy.


In our IES-funded study, we use qualitative, survey, and administrative data to understand the implementation and impact of Michigan’s early literacy law—the Read by Grade Three Law. Like policies passed in 19 other states, the Read by Grade Three Law aims to improve K-3 student literacy skills and mandates retention for students who do not meet a predetermined benchmark on the state’s third-grade English language arts assessment. Although the retention component of these policies remains controversial, similar laws are under consideration in several other states, including Alaska, Kentucky, and New Mexico. Below are some of the ways we have integrated qualitative methods into our evaluation study to better understand the policy process in the development of the Read by Grade Three Law.

Collecting qualitative sources helped us understand how the policy came to be, which in turn shaped our data collection for examining the Law’s implementation and subsequent effects. In our first working paper stemming from this study, we interviewed 24 state-level stakeholders (policymakers, state department of education officials, early literacy leaders) involved in the development of the Law and coded state policy documents related to early literacy to assess the similarity between Michigan’s policy and those of other states. Understanding the various components of the Law and how they ended up in the policy ensured that we asked educators about their perceptions and implementation of these components in the surveys that are also part of our evaluation. For example, because our interviews made clear how controversial the retention component was during the Law’s development, we included survey questions to assess educators’ perceptions and intended implementation of this component. This understanding also confirmed the importance of our plan to use state administrative retention and assessment data to evaluate the effect of retention on student literacy outcomes.

To trace the Read by Grade Three Law’s conception, development, and passage, we analyzed these qualitative data using two theories of the policy process: Multiple Streams Framework (MSF) and policy transfer. MSF says that policy issues emerge on government agendas through three streams: problem, policy, and political. When these streams join, a policy window is opened during which there is a greater opportunity for passing legislation. Meanwhile, policy transfer highlights how policies enacted in one place are often used in the development of policies in another.

We found that events in the problem and political streams created conditions ripe for the passage of an early literacy policy in Michigan:

  • A national sentiment around improving early literacy, including a retention-based third-grade literacy policy model that had been deemed successful in Florida
  • A pressing problem, evidenced by the state’s consistently below-average fourth-grade reading scores on the National Assessment of Educational Progress
  • A court case addressing persistently low test scores in a Detroit-area district
  • Previous attempts by the state to improve early literacy

As a result of these events, policy entrepreneurs—those willing to invest resources to get their preferred policy passed—took advantage of political conditions in the state and worked with policymakers to advance a retention-based third-grade literacy policy model. The figure below illustrates interviewee accounts of the Read by Grade Three Law’s development. Our policy document analysis further reveals that Michigan’s and Florida’s policies are very similar, diverging on only nine of the 50 elements we coded.

Image: Interviewee accounts of the Read by Grade Three Law’s development

Although this study focuses on the development and passage of Michigan’s early literacy law, our findings highlight both practical and theoretical elements of the policy process that can be useful to researchers and policymakers. To this end, we show how particular conditions, coupled with the efforts of policy entrepreneurs, spurred Michigan’s consideration of such a policy. It is conceivable that many state education policies beyond early literacy have taken shape under similar circumstances: a national sentiment combined with influential brokers outside government. In this way, our mixed-methods study provides a practical model of what elements might combine to enact policy change more broadly.

From a theoretical standpoint, this research also extends our understanding of the policy process by showing that MSF and the theory of policy transfer can work together. We learned that policy entrepreneurs can play a vital role in transferring policy from one place to another by capitalizing on conditions in a target location and coming with a specific policy proposal at the ready.

There is, of course, more to be learned about the intersection between different theories of the policy process, as well as how external organizations as opposed to individuals operate as policy entrepreneurs. As the number of education advocacy organizations continues to grow and these groups become increasingly active in shaping policy, this will be an exciting avenue for researchers to continue to explore.

This study is just one example of how qualitative research can be used in education policy research and shows how engaging in such work can be both practically and theoretically valuable. The most comprehensive evaluations will use different methodologies in concert with one another to understand education policies, because ultimately, how policies are conceptualized and developed has important implications for their effectiveness.


Amy Cummings is an education policy PhD student and graduate research assistant at the Education Policy Innovation Collaborative (EPIC) at Michigan State University (MSU).

Craig De Voto is a visiting research assistant professor in the Learning Sciences Research Institute at the University of Illinois at Chicago and an EPIC affiliated researcher.

Katharine O. Strunk is the faculty director of EPIC, the Clifford E. Erickson Distinguished Chair in Education, and a professor of education policy and, by courtesy, economics at MSU.

Data Collection for Cost Analysis in an Efficacy Trial

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

In one of our periodic conversations about addressing cost analysis challenges for an efficacy trial, the Cost Analysis in Practice (CAP) Project and Promoting Accelerated Reading Comprehension of Text-Local (PACT-L) teams took on a number of questions related to data collection. The PACT-L cost analysts have a particularly daunting task with over 100 schools spread across multiple districts participating in a social studies and reading comprehension intervention. These schools will be served over the course of three cohorts. Here, we highlight some of the issues discussed and our advice.

Do we need to collect information about resource use in every district in our study?

For an efficacy study, you should collect data from all districts at least for the first cohort to assess the variation in resource use. If there isn’t much variation, then you can justify limiting data collection to a sample for subsequent cohorts.

Do we need to collect data from every school within each district?

Similar to the previous question, you would ideally collect data from every participating school within each district and assess variability across schools. You may be able to justify collecting data from a stratified random sample of schools within each district, based on study-relevant characteristics, and presenting a range of costs to reflect differences. You might consider this option if funding for cost analysis is limited. Note that “district” and “school” here reflect one common setup in an educational randomized controlled trial; other blocking and clustering units can stand in for other study designs and contexts.
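For readers who want to see the sampling idea in concrete terms, here is a minimal Python sketch of stratified random sampling. The school roster, strata, and 50% sampling fraction are all hypothetical illustrations, not part of the CAP Project’s or PACT-L’s actual tooling:

```python
import random

# Hypothetical roster: (school_id, district, size_category)
schools = [
    ("S01", "District A", "small"), ("S02", "District A", "large"),
    ("S03", "District A", "small"), ("S04", "District B", "large"),
    ("S05", "District B", "small"), ("S06", "District B", "large"),
]

def stratified_sample(schools, fraction=0.5, seed=42):
    """Draw a fixed fraction of schools from each (district, size) stratum."""
    rng = random.Random(seed)
    strata = {}
    for record in schools:
        _, district, size = record
        # Group schools by district and a study-relevant characteristic.
        strata.setdefault((district, size), []).append(record)
    sampled = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))  # keep at least one per stratum
        sampled.extend(rng.sample(members, k))
    return sampled

print(stratified_sample(schools))
```

Fixing the random seed keeps the sample reproducible for reporting; in practice, you would stratify on whatever characteristics your study design identifies as relevant.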

How often should we collect cost data? 

The frequency of data collection depends on what the intervention is, the length of implementation, and the types of resources (“ingredients”) needed. People’s time is usually the most important resource used in educational interventions, often accounting for 90% of total costs, so that is where you should spend the most effort collecting data. Unfortunately, people are notoriously bad at reporting their time use, so ask for time use as often as you can (daily, weekly). Make it as easy as possible for people to respond, and offer financial incentives if possible. For efficacy trials in particular, be sure to collect cost data for each year of implementation so that you accurately capture the resources needed to produce the observed effects.

What’s the best way to collect time use data?

There are a few ways to collect time use data. The PACT-L team has had success with 2-question time logs (see Table 1) administered at the end of each history lesson during the fall quarter, plus a slightly longer 7-question final log (see Table 2).


Table 1. Two-question time log. Copyright © 2021 American Institutes for Research.
1. Approximately, how many days did you spend teaching your [NAME OF THE UNIT] unit?  ____ total days
2. Approximately, how many hours of time outside class did you spend on the following activities for [NAME OF UNIT] unit? 

Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)

   a. Developing lesson plans _____ hour(s)
   b. Grading student assignments _____ hour(s)
   c. Developing curricular materials, student assignments, or student assessments _____ hour(s)
   d. Providing additional assistance to students _____ hour(s)
   e. Other activities (e.g., coordinating with other staff; communicating with parents) related to unit _____ hour(s)


Table 2. Additional questions for the final log. Copyright © 2021 American Institutes for Research.
3. Just thinking of summer and fall, to prepare for teaching your American History classes, how many hours of professional development or training did you receive so far this year (e.g., trainings, coursework, coaching)? _____ Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
4. So far this year, did each student receive a school-provided textbook (either printed or in a digital form) for this history class? ______Yes     ______No
5. So far this year, did each student receive published materials other than a textbook (e.g., readings, worksheets, activities) for your American history classes? ______Yes     ______No
6. So far this year, what percentage of class time did you use the following materials for your American History classes? Record the average percentage of time you used these materials (responses must total 100%).
   a. A hardcopy textbook provided by the school _____%
   b. Published materials that were provided to you, other than a textbook (e.g., readings, worksheets, activities) _____%
   c. Other curricular materials that you located/provided yourself _____%
   d. Technology-based curricular materials or software (e.g., books online, online activities) _____%
       Total 100%
7. So far this year, how many hours during a typical week did the following people help you with your American History course? Please answer for all that apply. Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
   a. Teaching assistant _____ hours during a typical week
   b. Special education teacher _____ hours during a typical week
   c. English learner teacher _____ hours during a typical week
   d. Principal or assistant principal _____ hours during a typical week
   e. Other administrative staff _____ hours during a typical week
   f. Coach _____ hours during a typical week
   g. Volunteer _____ hours during a typical week


The PACT-L team also provided financial incentives. If you cannot use time logs, interviews with a random sample of participants will likely yield more accurate information than surveys of all participants because the interviewer can prompt the interviewee and clarify responses that don’t make sense (see the CAP Project Template for Cost Analysis Interview Protocol under Collecting and Analyzing Cost Data). In our experience, participants enjoy interviews about how they spend their time more than trying to enter time estimates in restricted survey questions. There is also good precedent for collecting time use through interviews: the American Time Use Survey is administered by trained interviewers who follow a scripted protocol lasting about 20 minutes.

Does it improve accuracy to collect time use in hours or as a percentage of total time?

Both methods of collecting time use can lead to less-than-useful estimates, like the teacher whose percentages of time on various activities added up to 233%, or the coach who miraculously spent 200 hours training teachers in one week. Either way, always be clear about the relevant time period. For example, “Over the last 7 days, how many hours did you spend…” or “Of the 40 hours you worked last week, what percentage was spent on…” Mutually exclusive multiple-choice answers can also help ensure reasonable responses. For example, the answer options could be “no time; less than an hour; 1-2 hours; 3-5 hours; more than 5 hours.”
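Implausible responses like these can also be caught automatically at data intake. The sketch below is a hypothetical example: the field names and the 80-hour weekly cap are our own assumptions, not rules from the PACT-L instruments:

```python
def check_weekly_hours(hours_by_activity, max_weekly_hours=80):
    """Flag hour totals that are implausible for a single week."""
    problems = []
    total = sum(hours_by_activity.values())
    if total > max_weekly_hours:  # catches the 200-hour coach
        problems.append(f"Reported {total} hours in one week (cap: {max_weekly_hours}).")
    problems += [f"Negative hours for {name}." for name, h in hours_by_activity.items() if h < 0]
    return problems

def check_percentages(pct_by_activity, tolerance=1.0):
    """Flag percentage allocations that do not sum to roughly 100%."""
    total = sum(pct_by_activity.values())
    if abs(total - 100.0) > tolerance:  # catches the 233% teacher
        return [f"Percentages sum to {total}%, not 100%."]
    return []

print(check_weekly_hours({"training teachers": 200}))
print(check_percentages({"teaching": 150, "planning": 83}))
```

Flagged responses can then be followed up with the respondent while the reporting period is still fresh, rather than discovered at analysis time.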

What about other ingredients besides time?

Because ingredients such as materials and facilities usually represent a smaller share of total costs for educational interventions and are often more stable over time (for example, the number of hours a teacher spends preparing to deliver an intervention may fluctuate from week to week, but the classrooms tend to be available for a consistent amount of time each week), the burden of gathering data on these resources is often lower. You can add a few questions about facilities, materials and equipment, and other resources such as parental time or travel to a survey once or twice per year, or better yet to an interview, or better still, to both. One challenge is that even though these resources have less of an impact on bottom-line costs, they can involve quantities that are more difficult for participants to estimate than their own time, such as the square footage of their office.
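Once quantities for all ingredients are in hand, the cost arithmetic itself is simple: multiply each ingredient’s quantity by a price and sum. Here is a minimal sketch with invented quantities and prices (purely illustrative, not figures from PACT-L or the CAP Project):

```python
# Hypothetical ingredients list: (name, quantity, unit_price_in_dollars)
ingredients = [
    ("teacher time (hours)",    120, 45.00),  # personnel usually dominates
    ("coach time (hours)",       20, 55.00),
    ("printed materials (sets)", 30,  8.00),
    ("classroom use (hours)",   120,  2.50),  # facilities, prorated hourly
]

total = sum(qty * price for _, qty, price in ingredients)
for name, qty, price in ingredients:
    cost = qty * price
    print(f"{name:26} ${cost:8,.2f}  ({cost / total:5.1%} of total)")
print(f"{'TOTAL':26} ${total:8,.2f}")
```

With these made-up numbers, personnel time accounts for roughly 92% of the total, consistent with the rule of thumb above that time often dominates intervention costs.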

If you have additional questions about collecting data for your own cost analysis and would like free technical assistance from the IES-funded CAP Project, submit a request here. The CAP Project team is always game for a new challenge and happy to help you brainstorm data collection strategies appropriate for your analysis.


Robert D. Shand is an Assistant Professor in the School of Education at American University.

Iliana Brodziak is a senior research analyst at the American Institutes for Research.