IES Blog

Institute of Education Sciences

Data Collection for Cost Analysis in an Efficacy Trial

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

In one of our periodic conversations about addressing cost analysis challenges for an efficacy trial, the Cost Analysis in Practice (CAP) Project and Promoting Accelerated Reading Comprehension of Text-Local (PACT-L) teams took on a number of questions related to data collection. The PACT-L cost analysts have a particularly daunting task with over 100 schools spread across multiple districts participating in a social studies and reading comprehension intervention. These schools will be served over the course of three cohorts. Here, we highlight some of the issues discussed and our advice.

Do we need to collect information about resource use in every district in our study?

For an efficacy study, you should collect data from all districts at least for the first cohort to assess the variation in resource use. If there isn’t much variation, then you can justify limiting data collection to a sample for subsequent cohorts.

Do we need to collect data from every school within each district?

Similar to the previous question, you would ideally collect data from every participating school within each district and assess variability across schools. If funding for cost analysis is limited, you may be able to justify collecting data from a stratified random sample of schools within each district, based on study-relevant characteristics, and presenting a range of costs to reflect differences. Note that “district” and “school” here refer to one common setup in an educational randomized controlled trial; in other study designs and contexts, other blocking and clustering units can stand in.
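To make the sampling step concrete, the stratified draw described above can be sketched as follows. The school records, the "locale" stratum, and the 25% sampling fraction are all hypothetical stand-ins; in practice you would stratify on whatever study-relevant characteristics matter for your trial:

```python
import random

# Hypothetical school roster; "locale" is one example stratifying characteristic.
schools = [
    {"id": f"S{i:03d}", "locale": locale}
    for i, locale in enumerate(["urban"] * 12 + ["suburban"] * 10 + ["rural"] * 8)
]

def stratified_sample(units, stratum_key, fraction, seed=0):
    """Draw a simple random sample of the given fraction within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for unit in units:
        strata.setdefault(unit[stratum_key], []).append(unit)
    sample = []
    for members in strata.values():
        k = max(1, round(fraction * len(members)))  # keep at least one per stratum
        sample.extend(rng.sample(members, k))
    return sample

picked = stratified_sample(schools, "locale", fraction=0.25)
```

Reporting cost ranges separately by stratum then reflects the variation that motivated sampling in the first place.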

How often should we collect cost data? 

The frequency of data collection depends on the nature of the intervention, the length of implementation, and the types of resources (“ingredients”) needed. People’s time is usually the most important resource used in educational interventions, often accounting for 90% of total costs, so that is where you should spend the most effort collecting data. Unfortunately, people are notoriously bad at reporting their time use, so ask for time use as often as you can (daily, weekly). Make it as easy as possible for people to respond, and offer financial incentives if possible. For efficacy trials in particular, be sure to collect cost data for each year of implementation so that you accurately capture the resources needed to produce the observed effects.
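As a rough illustration of why personnel time dominates, the ingredients-method tally is simply quantity times price, summed over ingredients. All of the ingredient names, quantities, and prices below are hypothetical:

```python
# Hypothetical per-school ingredients list for one year of implementation.
ingredients = [
    {"name": "teacher time",  "quantity": 120, "unit_price": 45.0},  # hours x $/hour
    {"name": "coach time",    "quantity": 30,  "unit_price": 55.0},
    {"name": "materials",     "quantity": 25,  "unit_price": 12.0},
    {"name": "classroom use", "quantity": 40,  "unit_price": 8.0},
]

# Total cost is the sum of quantity x price over all ingredients.
total = sum(i["quantity"] * i["unit_price"] for i in ingredients)

# Personnel share: in this illustrative tally it lands near the 90% figure.
personnel = sum(i["quantity"] * i["unit_price"]
                for i in ingredients if "time" in i["name"])
share = personnel / total
```

With these made-up numbers the personnel share comes out around 92%, which is why the measurement effort belongs on time use.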

What’s the best way to collect time use data?

There are a few ways to collect time use data. The PACT-L team has had success with 2-question time logs (see Table 1) administered at the end of each history lesson during the fall quarter, plus a slightly longer 7-question final log (see Table 2).

 

Table 1. Two-question time log. Copyright © 2021 American Institutes for Research.
1. Approximately, how many days did you spend teaching your [NAME OF THE UNIT] unit?  ____ total days
2. Approximately, how many hours of time outside class did you spend on the following activities for [NAME OF UNIT] unit? 

Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)

   a. Developing lesson plans _____ hour(s)
   b. Grading student assignments _____ hour(s)
   c. Developing curricular materials, student assignments, or student assessments _____ hour(s)
   d. Providing additional assistance to students _____ hour(s)
   e. Other activities (e.g., coordinating with other staff; communicating with parents) related to unit _____ hour(s)

 

Table 2. Additional questions for the final log. Copyright © 2021 American Institutes for Research.
3. Just thinking of summer and fall, how many hours of professional development or training (e.g., trainings, coursework, coaching) did you receive so far this year to prepare for teaching your American History classes? _____ Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
4. So far this year, did each student receive a school-provided textbook (either printed or in a digital form) for this history class? ______Yes     ______No
5. So far this year, did each student receive published materials other than a textbook (e.g., readings, worksheets, activities) for your American history classes? ______Yes     ______No
6. So far this year, what percentage of class time did you use the following materials for your American History classes? Record the average percentage of time you used these materials (responses must total 100%)
   a. A hardcopy textbook provided by the school _____%
   b. Published materials that were provided to you, other than a textbook (e.g., readings, worksheets, activities) _____%
   c. Other curricular materials that you located/provided yourself _____%
   d. Technology-based curricular materials or software (e.g., books online, online activities) _____%
       Total 100%
7. So far this year, how many hours during a typical week did the following people help you with your American History course? Please answer for all that apply. Record time to the nearest half hour (e.g., 1, 1.5, 2, 2.5)
   a. Teaching assistant _____ hours during a typical week
   b. Special education teacher _____ hours during a typical week
   c. English learner teacher _____ hours during a typical week
   d. Principal or assistant principal _____ hours during a typical week
   e. Other administrative staff _____ hours during a typical week
   f. Coach _____ hours during a typical week
   g. Volunteer _____ hours during a typical week

 

The PACT-L team also provided financial incentives. If you cannot use time logs, interviews of a random sample of participants will likely yield more accurate information than surveys of all participants because the interviewer can prompt the interviewee and clarify responses that don’t make sense (see CAP Project Template for Cost Analysis Interview Protocol under Collecting and Analyzing Cost Data). In our experience, participants enjoy interviews about how they spend their time more than trying to enter time estimates in restricted survey questions. There also is good precedent for collecting time use through interviews: the American Time Use Survey is administered by trained interviewers who follow a scripted protocol lasting about 20 minutes.

Does it improve accuracy to collect time use in hours or as a percentage of total time?

Both methods of collecting time use can yield less-than-useful estimates, such as the teacher whose percentages of time on various activities added up to 233%, or the coach who miraculously spent 200 hours training teachers in one week. Either way, always be clear about the relevant time period. For example, “Over the last 7 days, how many hours did you spend…” or “Of the 40 hours you worked last week, what percentage was spent on…” Mutually exclusive multiple-choice answers can also help ensure reasonable responses. For example, the answer options could be “no time; less than an hour; 1-2 hours; 3-5 hours; more than 5 hours.”
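Implausible responses like these can also be flagged automatically before analysis. A minimal sketch of such sanity checks follows; the field names, the tolerance, and the 80-hour weekly cap are all illustrative assumptions, not part of any particular instrument:

```python
# Hypothetical survey responses; field names are illustrative only.
responses = [
    {"id": "T01", "pct_activities": [40, 35, 25], "hours_last_week": 6.5},
    {"id": "T02", "pct_activities": [120, 80, 33], "hours_last_week": 4.0},   # sums to 233%
    {"id": "T03", "pct_activities": [50, 30, 20], "hours_last_week": 200.0},  # implausible hours
]

MAX_WEEKLY_HOURS = 80  # generous upper bound for a one-week recall period

def flag_implausible(resp, tol=1.0):
    """Return a list of reasons this response needs follow-up, if any."""
    problems = []
    if abs(sum(resp["pct_activities"]) - 100) > tol:
        problems.append(f"percentages sum to {sum(resp['pct_activities'])}%")
    if not 0 <= resp["hours_last_week"] <= MAX_WEEKLY_HOURS:
        problems.append(f"{resp['hours_last_week']} hours in one week is out of range")
    return problems

flagged = {}
for r in responses:
    problems = flag_implausible(r)
    if problems:
        flagged[r["id"]] = problems
```

Flagged respondents can then be re-contacted for clarification rather than silently dropped or corrected.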

What about other ingredients besides time?

Ingredients such as materials and facilities usually represent a smaller share of total costs for educational interventions and are often more stable over time (for example, the number of hours a teacher spends preparing to deliver an intervention may fluctuate from week to week, but classrooms tend to be available for a consistent amount of time each week), so the burden of gathering data on these resources is often lower. Once or twice per year, you can add a few questions about facilities, materials and equipment, and other resources such as parental time or travel to a survey, or better yet to an interview, or better still to both. One challenge is that even though these resources may have less impact on bottom-line costs, they can involve quantities that are more difficult for participants to estimate than their own time, such as the square footage of their office.

If you have additional questions about collecting data for your own cost analysis and would like free technical assistance from the IES-funded CAP Project, submit a request here. The CAP Project team is always game for a new challenge and happy to help other researchers brainstorm data collection strategies that would be appropriate for your analysis.


Robert D. Shand is an Assistant Professor in the School of Education at American University.

Iliana Brodziak is a Senior Research Analyst at the American Institutes for Research.

Exploring the Growing Impact of Career Pathways

Career pathways programs for workforce development are spreading across the country at both the secondary and postsecondary levels. Based on a synthesis of studies examining career pathways programs that integrate postsecondary career-technical education (CTE), the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide presents five recommendations for implementing evidence-based practices:

  1. Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment.
  2. Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways.
  3. Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways.
  4. Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways.
  5. Develop and continuously leverage partnerships to prepare students and advance their labor market success.

Led by the WWC’s postsecondary contractor, Abt Associates, this practice guide was created by an expert panel of researchers and practitioners to provide examples of career pathways strategies and components, along with guidance for implementing them; advise on strategies to overcome potential obstacles; and summarize evidence from rigorous research studies that met WWC standards.

As a long-time researcher of postsecondary CTE and many other important aspects of community college education, I welcome the opportunity to reflect on these five recommendations. I hope that my blog will help readers understand how this new practice guide fits into a larger landscape of research focusing on programs, policies, and practices aligned with the career pathways framework. Far from new, the notion of career pathways goes back several decades; thus, it is not surprising that we see an evolution in research to measure students’ education and employment outcomes. And still, there is a need for more rigorous studies of career pathways.

The Abt team located about 16,000 studies that were potentially relevant to the practice guide. Those studies used a wide variety of methods, data (quantitative and qualitative), and analysis procedures. However, only 61 of them were eligible for review against the WWC standards, and only 21 of those met the standards. Interestingly, most of those 21 studies focused on non-degree postsecondary credentials rather than on college degrees, with policies and programs associated with workforce development and adult education well represented. Thus, lessons from the practice guide speak more directly to career pathways programs that culminate in credentials below the associate degree than to programs leading to associate or baccalaureate degrees.

This dearth of rigorous career pathways research is problematic, as educational institutions of all types, including community colleges, seek to deliver positive, equitable outcomes to students during and beyond the COVID-19 pandemic.

Focus on Career Pathways

After we examined the evidence from the studies that met the WWC standards, it was clear that the evidence converged around career pathways programs following requirements in the Strengthening Career and Technical Education for the 21st Century Act and the Workforce Innovation and Opportunity Act (WIOA). In alignment with the WIOA definition of career pathways, the studies in the practice guide examine a “combination of rigorous and high-quality education, training, and other services” that aligns with the skill needs of industries in the region or state and accelerates participants’ educational and career advancement, to the extent practicable.

As defined by WIOA, career pathways support learners in pursuing their education and career goals, lead to at least one postsecondary credential, and provide entry or advancement in a particular occupation or occupational cluster. Because a growing number of community colleges employ a career pathways approach, as advocated by the federal legislation, it made sense to focus the practice guide on rigorous results and evidence-based recommendations that may help to move career pathway design and implementation forward.

The Five Recommendations

Recommendation 1: Intentionally design and structure career pathways to enable students to further their education, secure a job, and advance in employment. Our panel advocated for the intentional design and structure of career pathways for good reason. Whereas all educational institutions enroll students in courses and programs, career pathways prioritize the student’s entire educational experience, from access and entry, to completion and credentialing, and on to employment and career advancement. This purposeful approach to supporting student attainment is theorized to lead to positive student outcomes.

Applying the meta-analysis process required by the WWC, we determined from the 21 studies whether career pathways were achieving this crucial goal. We found that nine of the studies showed overall statistically significant, positive results on industry-recognized credential attainment. Of the 12 studies supporting this recommendation, most measured non-degree credentials; only two measured degree attainment—an important point to recognize, because these are the studies that have been conducted thus far.

This very small number of rigorous studies measuring degree attainment leaves open the question of whether career pathways increase postsecondary degree attainment—specifically the predominant credential in the community college context, the associate degree—and calls for greater investment in research on student completion of associate degrees (as well as baccalaureate degrees, a growing phenomenon in the United States).

Recommendation 2. Deliver contextualized or integrated basic skills instruction to accelerate students’ entry into and successful completion of career pathways. Studies that met WWC standards showed a positive impact of career pathways on college credit accumulation and industry-recognized credential attainment. Only one study measured postsecondary degree attainment relative to contextualized and basic skills instruction, and it reported statistically significant and negative results. However, descriptive and correlational studies suggest that contextualized and basic skills instruction contribute to positive educational outcomes for students enrolled in Adult Basic Education in addition to postsecondary CTE and workforce training.

The fact that results of rigorous research complement descriptive studies, some of which provide rich details on program implementation, is useful for scaling up community college career pathways. Having said this, we still need to know more about how contextualized, basic skills instruction—and other applied instructional interventions—affect the outcomes of students, especially those from racially minoritized groups, with low incomes, and who are the first generation to attend college, all purported to be well served by the career pathways approach.

Recommendation 3: Offer flexible instructional delivery schedules and models to improve credit accumulation and completion of non-degree credentials along career pathways. Studies supporting this recommendation focused on five education outcomes: industry-recognized credential attainment, academic performance, technical skill proficiency, credit accumulation, and postsecondary degree attainment. As seen with the previous two recommendations, results on industry-recognized credential attainment were statistically significant and positive. Results on academic performance, technical skill proficiency, and credit accumulation were indeterminate, meaning findings could be positive or negative but were not statistically significant.

What is important to reiterate here is that nearly all the studies that met the WWC standards focused on non-degree credentials, providing limited information about postsecondary degree attainment. To be clear, our panel is not saying career pathways should focus exclusively on non-degree credentials; rather, the results on postsecondary degree attainment are not definitive. Even so, the positive findings linking flexible scheduling and non-degree credential attainment are important to know now, when the country is dealing with the pandemic.

Community colleges nationwide are rethinking instructional delivery to better meet students’ dire health, family, and employment needs. Rigorous research on career pathways interventions, such as flexible delivery, is needed, particularly studies involving diverse student populations. In times of economic and social struggle, it is essential that community college career pathways produce the equitable outcomes they purport to provide.

Recommendation 4: Provide coordinated comprehensive student supports to improve credit accumulation and completion of non-degree credentials along career pathways. The rigorous studies meeting WWC standards and measuring outcomes relative to comprehensive student supports focused on the education outcome domain only. Similar to the previous recommendation on flexible scheduling, findings on industry-recognized credential attainment were statistically significant and positive. However, on supports, findings on credit accumulation were statistically significant and positive, reinforcing findings generated by other studies showing holistic supports improve student outcomes. For example, a meta-analysis of studies of the Trade Adjustment Assistance Community College and Career Training grants that used rigorous evaluation designs reported favorable results for holistic supports in counseling and advising, case management, and various other support services and educational outcomes.

Consistent with the recommendations in this practice guide, a growing body of evidence favors integrating comprehensive student supports with career pathways. These supports are intended to meet the needs of the diverse population of students who attend community colleges; so, they should demonstrate equitable results on educational outcomes. More rigorous research is needed to measure whether and how career pathways provide access, opportunity, and outcomes for racially minoritized, low-income, and other underserved student groups. These studies should ascertain the impact of student supports on both education and employment outcomes, recognizing that students seek a high-quality credential and a good job that offers economic security and career mobility.

Recommendation 5: Develop and continuously leverage partnerships to prepare students and advance their labor market success. This recommendation specifically emphasizes labor market success, based on studies that examine labor market outcomes only. Supporting this recommendation were findings from studies of four labor market outcomes: short-term employment, short-term earnings, medium-term employment, and medium-term earnings. (The studies did not include long-term findings.)

Overall, statistically significant and positive outcomes were found in the meta-analysis for short-term employment, short-term earnings, and medium-term earnings. However, for medium-term employment, the meta-analysis results were indeterminate. To clarify, this does not mean employment-focused partnerships do not lead to labor market success; instead it points to a dearth of research that tracks students through training and into employment for long enough to measure long-term outcomes.

Even so, these initial findings from the meta-analysis are promising and suggest that developing and leveraging such partnerships may help move the needle on short- and medium-term employment outcomes. Longitudinal research that tracks students for periods sufficient to know whether long-term employment and earnings are affected should be a priority in the future.

Moving Forward

As I reflect on the research that I have conducted on career pathways over the years, I am gratified to see mounting evidence of positive student outcomes. As a first-generation college student myself, it has always made sense to me to demystify the college education process. Helping learners understand the entire educational journey, from start to finish, is bound to help them see how what they are learning may contribute to future education and career choices. I went to college not knowing what it would be like or whether I would be able to succeed, and I benefited from faculty and advisors who helped me see how my future could progress.

For other students like me who enter college without the benefit of family members sharing their stories of college-going, and for those who have to balance school with work and family care-taking responsibilities, it is important to know how a college education, including postsecondary CTE, can lead to positive educational and employment outcomes. Student groups underserved by postsecondary education deserve our most resolute and far-reaching efforts.

To this end, additional rigorous evidence on the impact of postsecondary CTE on college degree attainment could help to inform career pathways design, funding, and implementation. Also, as I reflected on the five recommendations, I was struck by the modest amount of research on medium-term labor market outcomes and the lack of any studies of long-term labor market outcomes. When the focus of career pathways is creating a path to living-wage employment and career advancement over the long term, it isn’t enough to know that students’ immediate employment outcomes were improved. When many students attending community colleges are already working, it isn’t even clear what immediate employment means.

If the outcome of interest for the majority of community college students who are adults and working is whether they get a better job and higher salary than they were getting pre-education, more nuanced measures and longer follow-up periods are needed than those provided by any of the research reviewed for this practice guide. It seems to me that finding more evidence of medium- and long-term outcomes could also provide more useful evidence of how career pathways work for diverse learner groups who are under-studied at the present time.

I was honored to help develop the practice guide with Hope Cotner, Grant Goold, Eric Heiser, Darlene Miller, and Michelle Van Noy. What an enormously gratifying experience it was to work with these professionals, the WWC team at Abt, and the Institute of Education Sciences staff. Working on this practice guide has left me feeling more optimistic about what we could learn with a more sizeable federal investment in research on postsecondary CTE in general, and on career pathways specifically. Rigorous evidence is needed to test models, explore interventions, and understand results for the plethora of learner groups who attend community colleges.

As the nation struggles to pull out of the pandemic that continues to rage in pockets across the country, it is the right time to invest in research that helps prepare students for good jobs that advance living-wage careers over a lifetime. A true commitment to equity in CTE programming is necessary for the nation, and now is the time to invest.

_____________________________________________________________________________________________________________

Debra D. Bragg, PhD, is president of Bragg & Associates, Inc., and the founder of research centers focusing on community college education at the University of Illinois at Urbana-Champaign and the University of Washington. She spent the first 15 years of her career in academe studying postsecondary CTE for federally funded research centers, having devoted her entire research agenda to improving education- and employment-focused policies, programs, and practices to create more equitable outcomes for community college students. She served as an expert panelist for the What Works Clearinghouse (WWC)’s Designing and Delivering Career Pathways at Community Colleges practice guide.

 

 

Why School-based Mental Health?

In May 2021, we launched a new blog series called Spotlight on School-based Mental Health to unpack the why, what, when, who, and where of providing mental health services in schools. This first post in the series focuses on the why by discussing three IES-funded projects that highlight the importance of these services.

Increasing access to needed services. A primary benefit of school-based mental health is that it can increase access to much-needed services. A 2019 report from the Substance Abuse and Mental Health Services Administration (SAMHSA) indicates that 60% of the nearly 4 million 12- to 17-year-olds who reported a major depressive episode in the past year did not receive any treatment whatsoever. What can be done to address this need? One idea being tested in this 2019 efficacy replication study is whether school counselors with clinician support can provide high school students a telehealth version of a tier-2 depression prevention program with prior evidence of efficacy, Interpersonal Psychotherapy-Adolescent Skills Training (IPT-AST). Through individual and group sessions, the IPT-AST program provides direct instruction in communication and interpersonal problem-solving strategies to decrease conflict, increase support, and improve social functioning.

Improving access to services for Black youth. Social anxiety (SA) is a debilitating fear of negative evaluation in performance and social situations that can make school a particularly challenging environment. The connection between SA and impaired school functioning is likely exacerbated in Black youth who often contend with negative racial stereotypes. In this 2020 development and innovation project, the research team aims to expand Black youth’s access to mental health services by improving the contextual and cultural relevance of a promising school-based social anxiety intervention, the Skills for Academic and Social Success (SASS). Through community partnerships, focus groups, and interviews, the team will make cultural and structural changes to SASS and add strategies to engage Black students in urban high schools who experience social anxiety.

Reducing stigma by promoting well-being. The second leading barrier cited by adolescents for not seeking mental health treatment is social factors such as perceived stigma and embarrassment. One way to counteract these barriers is to frame intervention in more positive terms with a focus on subjective well-being, a central construct in positive psychology. In this 2020 initial efficacy study, the research team is testing the Well-Being Promotion Program in middle schools in Florida and Massachusetts. In 10 core sessions, students low in subjective well-being take part in group activities and complete homework assignments designed to increase gratitude, acts of kindness, use of signature character strengths, savoring of positive experiences, optimism, and hopeful or goal-directed thinking.

These three projects illustrate why we need to carefully consider school-based mental health as a logical and critical part of success in school, particularly as we navigate the road to helping students recover from disengagement and learning loss during the coronavirus pandemic.  

Next in the series, we will look at the what of school-based mental health and highlight several projects that are developing innovative ways to support the mental health of students and staff in school settings.


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research at IES

 

Perspective Matters: How Diversity of Background, Expertise, and Cognition Can Lead to Good Science

IES funds cutting-edge researchers who often bring multiple disciplines together. Dr. Maithilee Kunda (Vanderbilt University) is one such researcher who stands at the juncture of multiple fields, using artificial intelligence (AI) to address questions related to cognition and autism spectrum disorder. Recently, Dr. Kunda received an award from the National Center for Special Education Research to develop an educational game that leverages AI to help students with autism spectrum disorder better infer and understand the beliefs, desires, and emotions of others. As a computer scientist and woman of color performing education research, Dr. Kunda exemplifies the value that diverse backgrounds, experiences, and disciplines bring to the field.

Bennett Lunn, a Truman-Albright Fellow at IES, asked Dr. Kunda about her work and background. Her responses are below.

As a woman of color, how have your background and experiences shaped your scholarship and career?


In college, I was a math major on the theory track, which meant that my math classes were really hard! I had been what one might call a “quick study” in high school, so it was a new experience for me to be floating around the bottom quartile of each class. The classes were mostly men, but it happened that there was a woman of color in our cohort—an international student from Colombia—and she was flat-out brilliant. She would ask the professor a question that no one else even understood, but the professor’s eyes would light up, and the two of them would start having some animated and incomprehensible discussion about whatever “mathy” thing it was. That student’s presence bestowed upon me a valuable gift: the ability to assume, without even thinking twice, that women of color quite naturally belong in math and science, even at the top of the heap! I don’t even remember her name, but I wish I could shake her hand. She was a role model for me and for every other student in those classes just by being who she was and doing what she did.

I have been extremely lucky to have seen diverse scientists and academics frequently throughout my career. My very first computer science teacher in high school was a woman. At a high school science camp, my engineering professor was a man who walked with two forearm crutches. Several of my college professors in math, chemistry, and robotics were women. My favorite teaching assistant in a robotics class was a Black man. In graduate school, I remember professors and senior students who were women, LGBTQ people, and people of color. Unfortunately, I know that the vast majority of students do not have access to such a wealth of diverse role models. It is heartening, though, that even a single role model—just by showing up—has so much power to positively shape the perceptions of everyone who sees them in their rightful place, be it in STEM, academia, or whatever context they inhabit.

What got you interested in a career in education science?

I read a lot of science fiction and fantasy growing up, and in high school, I was wrestling with why I liked these genres so much. I came up with a pet theory about fiction writing. All works of fiction are like extended thought experiments; the author sets up some initial conditions—characters, setting, etc.—and they run the experiment via writing about it. In general fiction, the experiments mostly involve variables at the people scale. In sci-fi and fantasy, on the other hand, authors are trying to run experiments at civilization or planetary scales, and that’s why they have to create whole new worlds to write about. I realized that was why I loved those genres so much: they allowed me to think about planetary-scale experiments! 

This “what if” mindset has continued to weave itself throughout my scholarship and career.

How did it ever become possible for humans to imagine things that don’t exist? Why do some people think differently from others, and how can we redesign the workings of our societies to make sure that everyone is supported, enriched, and empowered to contribute to their fullest potential? These kinds of questions fuel my scientific passions and have led me to pursue a variety of research directions on visual thinking, autism, AI, and education.

How does your research contribute to a better understanding of the importance of neurodiversity and inclusion in education?

Early in graduate school, and long before I heard the term neurodiversity, the first big paper I wrote was a re-analysis of several research studies on cognition in autism. This research taught me there can be significant individual variation in how people think. Even if 99 other people with similar demographic characteristics happen to solve a problem one particular way, that does not mean that the hundredth person from the same group is also going to solve the problem that way.

I realized much later that this research fits very well into the idea of neurodiversity, which essentially observes that atypical patterns of thinking should be viewed more as differences than as being inherently wrong or inadequate. Like any individual characteristics you have, the way you think brings with it a particular set of strengths and weaknesses, and different kinds of thinking come with different strengths and weaknesses.

Much of my team’s current research is a continuation of this theme. For example, in one project, we are developing new methods for assessing spatial skills that dig down into the processes people use to solve problems. This view of individual differences is probably one that teachers know intuitively from working one-on-one with students. One of the challenges for today’s education research is to continue to bring this kind of intuitive expertise into our research studies to describe individual differences more systematically across diverse learner populations.

In your area of research, what do you see as the greatest research needs or recommendations to address diversity and equity and improve the relevance of education research for diverse communities of students and families?

For the past 3 years, I have been leading an IES project to create a new educational game called Film Detective to help students with autism spectrum disorder improve their theory of mind (ability to take another’s perspective) and social reasoning skills. This was my first experience doing research on an interactive application of this kind. I was a newcomer to the idea of participatory design, which basically means that instead of just designing for some particular group of users, you bring their voices in as active contributors early in the design process. Our amazing postdoc Dr. Roxanne Rashedi put together a series of early studies using participatory methods, so we had the opportunity to hear directly from middle schoolers on the spectrum, their parents, and their teachers about what they needed and wanted to see in this kind of technology.

In one of these studies, we had students try out a similar education game and then give us feedback. One young man, about 11 or 12 years old, got frustrated in the middle of the session and had a bit of a meltdown. After he calmed down, we asked him about the game and what he would like to see taught in similar games. He told us that he would really like some help in learning how to handle his frustration better so that he could avoid having those kinds of meltdowns. Impressed by his self-awareness and courage in talking to us about his personal challenges, we ended up designing a whole new area in our game called the Relaxatron arcade. This is where students can play mini-games that help them learn about strategies for self-regulation, like deep breathing or meditation. This whole experience reinforced for me the mindset of participatory design: we are all on a team—researchers, students, parents, and teachers—working collaboratively to find new solutions for education.

We are also proud to work with Vanderbilt’s Frist Center for Autism and Innovation to make our research more inclusive and participatory. One of the many excellent programs run by this center is a software internship program for college students or recent graduates on the spectrum. This summer, we are pleased to be welcoming three Frist Center interns who will be helping us on our Film Detective project.

What has been the biggest challenge you have encountered and how did you overcome the challenge?

Throughout my career, I seem to have gravitated towards questions that not many other people are asking, using methods that not many other people are using. For example, I am a computer scientist who studies autism. My research investigates visual thinking, but not vision. I work in AI, but mostly in areas out of the mainstream.

I get a lot of personal and intellectual satisfaction out of my research, but I do face some steep challenges that I believe are common for researchers working in not-so-mainstream areas. For instance, it is sometimes harder to get our papers published in the big AI conferences because our work does not always follow standard patterns for how studies are designed and implemented. And I do experience my share of impostor syndrome (feeling unqualified for your job even when you are performing well) and FOMO (fear of missing out), especially when I come across some trendy paper that already has a thousand citations in 3 months and I think to myself, “Why am I not doing that? Should I be doing that?”

I try to remember to apply the very lessons that my research has produced, and I am fortunate to have friends and colleagues who help lift me out of self-doubt. I actively remind myself about the importance to our species of having diverse forms of thinking and how my own individual view of things is a culmination of my unique lifetime of educational and intellectual experiences. That particular perspective—my perspective—is irreplaceable, and, more than any one paper or grant or citation, it is the true value I bring to the world as a scientist.

How can the broader education research community better support the careers and scholarship of researchers from underrepresented groups?

I think research communities in general need to recognize that inclusion and diversity are everybody’s business, regardless of what someone’s specific research topic is. For example, we assume that every grant proposal and paper follows principles of rigorous and ethical research design, no matter the specific methodology. While some researchers in every discipline specialize in thinking about research design from a scholarly perspective, everyone has a baseline responsibility for knowing about it and for doing it.

Similarly, while we will always want and need researchers who specialize in research on inclusion and diversity, these topics should not be considered somehow peripheral to “real science.” They are just as much core parts of a discipline as anything else is. As I constantly remind my students, science is a social enterprise! The pool of individual minds that make our discoveries for us is just as important as any piece of equipment or research method.

What advice would you give to emerging scholars from underrepresented, minoritized groups who are pursuing a career in education research?

A few years ago, when I was a newly minted assistant professor, I went to a rather specialized AI symposium where I found myself to be one of only two women there—out of over 70 attendees! The other woman was a senior researcher whom I had long admired but never met, and I felt a bit star-struck at the idea of meeting her. During one of the coffee breaks, I saw her determinedly heading my way. I said to myself as she approached, “Be cool, Maithilee, be cool, don’t mention the women thing…”  I was gearing myself up to have a properly research-focused discussion, but when she arrived, the very first words out of her mouth were, “So, there’s only the two of us, huh!” We both burst out laughing, and over the next couple of days, we talked about our research as well as about the lack of diversity at the symposium and in the research area more broadly.

The lesson I learned from this wonderful role model was that taking your rightful place in the research community does not mean papering over who you are. Some of us are going to be rarities, at least for a while, because of aspects of who we are, but that is nothing to hide. The value we bring as scientists comes from our whole selves, and we should not just accept that but embrace and celebrate it.

This blog is part of a series of interviews showcasing a diverse group of IES-funded education researchers who are making significant contributions to education research, policy, and practice. For the first blog in the series, please see Representation Matters: Exploring the Role of Gender and Race on Educational Outcomes.

Dr. Maithilee Kunda is the director of the Laboratory for Artificial Intelligence and Visual Analogical Systems and founding investigator for the Frist Center for Autism and Innovation at Vanderbilt University. This interview was produced and edited by Bennett Lunn, Truman-Albright Fellow for the National Center for Education Research and the National Center for Special Education Research.


Timing is Everything: Collaborating with IES Grantees to Create a Needed Cost Analysis Timeline

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.


A few months ago, a team of researchers conducting a large, IES-funded randomized controlled trial (RCT) on the intervention Promoting Accelerated Reading Comprehension of Text-Local (PACT-L) met with the Cost Analysis in Practice (CAP) Project team in search of planning support. The PACT-L team had just received funding for a 5-year systematic replication evaluation and was consumed with planning its execution. During an initial call, Iliana Brodziak, who is leading the cost analysis for the evaluation study, shared, “This is a large RCT with 150 schools across multiple districts each year. There is a lot to consider when thinking about all of the moving pieces and when they need to happen. I think I know what needs to happen, but it would help to have the key events on a timeline.”

Such comments and feelings of overload are common even for experienced cost analysts like Iliana because conducting a large RCT requires extensive thought and planning. Ideally, planning for a cost analysis at this scale is integrated with the overall evaluation planning at the outset of the study. For example, the PACT-L research team developed a design plan that specified the overall evaluation approach along with the cost analysis. Those who save the cost analysis for the end, or even for the last year of the evaluation, may find they have incomplete data, insufficient time or budget for analysis, and other avoidable challenges. Iliana understood this, and her remark set off a spark for the CAP Project team: developing a timeline that aligns the steps for planning a cost analysis with RCT planning.

As the PACT-L and CAP Project teams continued to collaborate, it became clear that the PACT-L evaluation would be a great case study for crafting a full cost analysis timeline for rigorous evaluations. The CAP Project team, with input from the PACT-L evaluation team, created a detailed timeline for each year of the evaluation. It captures the key steps of a cost analysis and integrates the challenges and considerations that Iliana and her team anticipated for the PACT-L evaluation and similar large RCTs.

In addition, the timeline provides guidance on the data collection process for each year of the evaluation:

  • Year 1: The team designs the cost analysis data collection instruments. This process includes collaborating with the broader evaluation team to ensure the cost analysis is integrated into the IRB application, setting up regular meetings with the team, and creating and populating spreadsheets or some other data entry tool.
  • Year 2: Researchers plan to document the ingredients or resources needed to implement the intervention on an ongoing basis. The timeline recommends collecting data, reviewing the data, and revising the data collection instruments in Year 2.
  • Year 3 (and maybe Year 4): The iteration of collecting data and revising instruments continues in Year 3 and, if needed, in Year 4.
  • Year 5: Data collection should be complete, allowing for the majority of the analysis. 

This is just one example of the year-by-year guidance included in the timeline. The latest version of the Timeline of Activities for Cost Analysis is available to guide other researchers as they plan and execute their economic evaluations. As a planning tool, the timeline gathers all the moving pieces in one place. It includes detailed descriptions and notes for consideration for each year of the study and provides tips to help researchers.

The PACT-L evaluation team is still in the first year of the evaluation, leaving time for additional meetings and collective brainstorming. The CAP Project and PACT-L teams hope to continue collaborating over the next few years, using the shared expertise among the teams and PACT-L’s experience carrying out the cost analysis to refine the timeline.

Visit the CAP Project website to find other free cost analysis resources or to submit a help request for customized technical assistance on your own project.


Jaunelle Pratt-Williams is an education researcher at SRI International.

Iliana Brodziak is a senior research analyst at the American Institutes for Research.

Katie Drummond is a senior research scientist at WestEd.

Lauren Artzi is a senior researcher at the American Institutes for Research.