Inside IES Research

Notes from NCER & NCSER

The Importance of Partnering with Practitioners in English Learner Research

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In FY 2014, Dr. Karen Thompson was awarded a grant for The Oregon English Learner Alliance: A Partnership to Explore Factors Associated with Variation in Outcomes for Current and Former English Learners in Oregon to determine best practices to support academic achievement among current and former English learners. Dr. Thompson and her colleagues wrote a guest blog post describing the work that the partnership undertook to better understand and improve the performance of English learners in Oregon. In this blog post, we interviewed Dr. Thompson—three years after the end of the grant—to get her perspectives on the partnership, the outcomes of their work, and where things currently stand.

 

What was the purpose of your research and what led you to do this work?

When I came to Oregon from California in 2012, there was growing momentum in the state to better understand and meet the needs of the state’s multilingual student population, particularly students classified as English learners (ELs). The state had developed an ambitious EL strategic plan, which included a variety of goals and action steps, such as identifying model programs and sharing best practices. I noticed that Oregon did not have publicly available information about the state’s former EL students. In prior work, other researchers and I had demonstrated that analyzing data only about students currently classified as English learners without also analyzing data about former EL students can provide incomplete and misleading information. Therefore, for Oregon to realize its goals and truly understand which programs and practices were most effectively educating its multilingual students, the state needed to make changes to its data systems. This was the seed that led to the Oregon Department of Education/Oregon State University English Language Learner Partnership. Our first goal was to simply determine how many former EL students there were in the state. Then, once the state had created a flag to identify former EL students, we were able to conduct a wide range of analyses to better understand opportunities and outcomes for both current and former EL students in ways that have informed state reporting practices and policy decisions.
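
As a purely hypothetical sketch of why the flag matters (the column names, numbers, and code below are invented for illustration and are not drawn from Oregon’s actual data system), the difference between reporting on current ELs alone and on current plus former ELs comes down to who is included in the average, because students exit the EL category once they reach proficiency:

    import pandas as pd

    # Invented example data: two students still classified as EL, two former ELs
    # (identified by the new flag), and two students who were never ELs.
    students = pd.DataFrame({
        "current_el": [True, True, False, False, False, False],
        "former_el":  [False, False, True, True, False, False],
        "score":      [45, 50, 82, 78, 75, 70],
    })

    # Averaging only current ELs leaves out the students who were served most
    # successfully, so the group looks perpetually low-performing.
    current_only = students.loc[students["current_el"], "score"].mean()  # 47.5

    # Combining current and former ELs ("ever EL") gives a fuller picture.
    ever_el_mask = students["current_el"] | students["former_el"]
    ever_el = students.loc[ever_el_mask, "score"].mean()  # 63.75

    print(current_only, ever_el)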

 

How does this research differ from other work in the field? Why do you think partnerships with practitioners were necessary to carry out the work?

When we began our partnership, collecting and analyzing information about both current and former EL students was not common. Happily, more and more researchers and education agencies have now adopted these approaches, and we think our partnership has helped play a role in this important and illuminating shift.  

It was crucial to conduct this work via partnerships between researchers and practitioners. Practitioner partners had deep knowledge of the state’s current data systems, along with knowledge about which reporting and analysis practices could shift to incorporate new information about current and former EL students. Research partners had the bandwidth to conduct additional analyses and to lead external dissemination efforts. Our regular partnership meetings enabled our work to evolve in response to new needs. 

 

What do you think was the most important outcome of your work and why?

I think the most important outcome of our work is that educators across Oregon now have information about both their current and former English learner students and can use this data to inform policy and practice decisions. Other analyses we conducted have also informed state actions. For example, our analysis of how long it takes Oregon EL students to develop English proficiency and exit EL services informed the state’s EL progress indicator under the Every Student Succeeds Act.

 

What are the future directions for this work?

Our IES-funded partnership led to funding from the Spencer Foundation to do further research about EL students with disabilities in Oregon, which has impacted practices in the state. In addition, I am excited to be one of the collaborators in the new IES-funded National Research and Development Center to Improve Education for Secondary English Learners (PI: Aída Walquí, WestEd). As part of the Center’s research, I am working with colleagues at the University of Oregon and the University of California, Los Angeles to analyze malleable factors impacting content-course access and achievement for secondary EL students. We are collaborating with four states in this work, and as in our ODE/OSU partnership, we will be analyzing data for both current and former EL students. At a policy level, colleagues and I are involved in conversations about how data collection and reporting at the federal level could also incorporate analysis of data for both current and former EL students, including ways this might inform future reauthorizations of the Elementary and Secondary Education Act.

 

---

Dr. Karen Thompson is an Associate Professor at the College of Education at Oregon State University. Her research focuses on how curriculum and instruction, teacher education, and policy interact to shape the classroom experiences of K-12 multilingual students.

 

Written by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for the English Learner Program, National Center for Education Research.

CALM - Child Anxiety Learning Modules: From Research to Practice at Scale in Education

Many elementary school students experience anxiety that interferes with learning and achievement, but few receive services. To expand the network of support for these young students, IES-funded researchers have turned to school nurses as a potential front-line resource. The Child Anxiety Learning Modules (CALM) intervention incorporates cognitive-behavioral therapy (CBT) and other evidence-based strategies for school nurses to use when a child has vague somatic complaints that often signal underlying anxiety.

 

 

In 2014, IES funded a Development and Innovation grant to support the development of CALM to enhance the capacity of elementary school nurses to help children with anxiety. Based on promising findings of feasibility, reduced anxiety, and fewer school absences, the development team is launching an initial efficacy trial this fall to investigate the scale-up potential of the CALM intervention.

 

We asked the developers of CALM—Golda Ginsburg (University of Connecticut School of Medicine) and Kelly Drake (Founder/Director of the Anxiety Treatment Center of Maryland; Johns Hopkins University School of Medicine)—to answer a few questions for our blog. Here’s what they had to say.

 

Can you describe how the CALM intervention was developed? What led you to develop an intervention for school nurses to implement?

We have been developing and evaluating psychosocial interventions for youth with anxiety for the last two decades, and we’ve learned a lot about effective, evidence-based strategies. We know that CBT, which consists of coping strategies that target the physical, cognitive, and behavioral manifestations of anxiety, is effective in helping youth manage and reduce anxiety. Unfortunately, we’ve also learned that most youth do not receive these—or any—services to help them. To address this gap in service utilization, our efforts have focused on ways of improving access to these therapeutic strategies by broadening the pool of potential providers. Given that early interventions can reduce the long-term consequences of untreated anxiety AND that youth with anxiety often complain of troublesome physical symptoms at school, we naturally thought of school nurses as key providers with enormous potential. However, although nurses reported spending a lot of time addressing mental health issues, they received minimal training in doing so. That’s when the idea of the CALM intervention was born. We developed the initial CALM intervention using an iterative process in which versions of the intervention and its implementation procedures were sequentially refined in response to feedback from expert consultants, school nurses, children, parents, and school personnel until it was usable in the school environment by school nurses.

 

Was it part of the original plan to develop an intervention that could one day be used at scale in schools?

Yes—absolutely! Members of the National Association of School Nurses have been on our advisory team throughout to help us plan for how to scale up the intervention if we find it helps students.

 

What was critical to consider during the research to practice process?

A central focus was to minimize burden on school staff and to integrate the intervention within the goals and mission of schools’ interdisciplinary teams. Therefore, using a multidisciplinary support team was critical in taking the intervention from a research idea to an intervention that school nurses could deliver in their real-world practice setting—schools! As clinical psychologists, we also relied on our multidisciplinary team to ensure the intervention was usable by school nurses in terms of content, and flexible and feasible for their busy school day. Indeed, school nurses and school nurse organizations provided critical support for the development of CALM with a focus on feasible strategies and methods for nurses to implement. They also provided invaluable feedback regarding perceived barriers to successful implementation of the intervention and adoption by nurses and school systems, and solutions to potential barriers and options for scaling up the intervention. We also relied on experts in school-based mental health programs and those with expertise in designing, evaluating, and implementing evidence-based prevention programs in schools. In addition, we leveraged state-level expertise by consulting with school health experts in the Connecticut State Department of Education and the Connecticut Nurses Association regarding mental health education for nurses.

 

What model are you using for dissemination and sustainability?

A wide variety of methods will be used to disseminate findings from the current study to reach different stakeholders. We will present and publish findings 1) at national scientific and practitioner-oriented conferences, 2) to the Maryland and Connecticut State Departments of Education and participating school districts, and 3) in relevant peer-reviewed journals. In addition, should the findings reveal a beneficial impact of the intervention, we will have the final empirically supported training and intervention materials available for broad-scale implementation. The CALM intervention will be packaged to include a training seminar, training videos, a nurse intervention manual, child intervention handouts, a consultation/coaching plan, and assessment materials. The research team will offer training seminars with all supporting materials to school nurse organizations at the national, state, and local levels. We will also engage nurse supervisors to identify nurses—or to volunteer themselves—to become trainers for newly hired nurses in the future. Finally, our current Advisory Board, which consists of members of the National Association of School Nurses (NASN), school nurses, and researchers with expertise in large-scale school-based mental health program implementation and evaluation, will assist in broad dissemination and sustainability efforts.

 


Golda S. Ginsburg, Ph.D., Professor of Psychiatry, University of Connecticut School of Medicine and Adjunct Professor at The Johns Hopkins University School of Medicine, has over 25 years of experience developing and evaluating school-based interventions including school-based interventions for anxiety delivered by school clinicians, teachers, and nurses.

Kelly Drake, Ph.D., Founder/Director of the Anxiety Treatment Center of Maryland, Research Consultant with UConn, and Assistant Professor of Psychiatry in the JHU School of Medicine, has extensive training and experience in clinical research with anxious youth and training clinicians in delivering CBT for children.

This interview was produced by Emily Doolittle (Emily.doolittle@ed.gov) of the Institute of Education Sciences. This is part of an ongoing interview series with education researchers, developers, and partners who have successfully advanced IES-funded education research from the university laboratory to practice at scale.

Using Mistakes as a Vehicle for Learning in Mathematics: From Research to Practice at Scale in Education

Every student makes mistakes. But not every student is given the opportunity to learn from mistakes. Left unaddressed, the mathematical misconceptions that underlie many mistakes can keep students from progressing in mathematics.

 

At the request of districts in the Minority Student Achievement Network (MSAN), a Strategic Education Research Partnership (SERP) team was convened in 2007 to address a widening achievement gap in Algebra I. The team was charged with identifying an intervention strategy, subject to several district constraints:

  1. The solution would need to be applied to all students in the regular classroom to avoid the stereotype threat associated with separating students based on performance and to protect the intervention from budget cuts that target supplemental, after-school, and summer programs first.
  2. A new curriculum was off the table because it would create upheaval for a time and would be followed by a decline in student performance during the period of adjustment.
  3. Extensive teacher training was considered undesirable because it would be costly and because algebra teachers consider themselves more expert in mathematics teaching than central office staff who would be requiring the training.

 

Julie Booth joined the partnership, and with funding from IES, led the iterative development and testing of worked example assignments that, with the input of teachers and administrators, fit within the routines of the classroom. The result—AlgebraByExample—consists of 42 uniquely designed assignments that address misconceptions, harness the power of explanation, and use mistakes as a vehicle for learning.

Typical math assignments require students to solve problems on their own. If a student’s work is incorrect, the student may never focus on what went wrong. ByExample assignments also give students problems to solve, but they first provide a solution to a similar problem that is marked right or wrong. Students are prompted with questions that target common misconceptions and errors before solving a similar problem on their own. Each assignment contains several strategically designed item pairs.
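
For illustration only (this example is invented here and is not an actual AlgebraByExample item), an item pair might combine a marked-up worked example, an explanation prompt, and a practice problem:

Worked example (marked incorrect): Solve 3x + 5 = 20. The student writes 3x = 25, so x = 25/3.
Explanation prompt: When the student moved the 5 to the other side of the equation, what should have happened to its sign?
Now you try: Solve 4x + 7 = 31.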

 

 

Designed in collaboration with teachers from districts in several states, the assignments can be easily incorporated into any Algebra I curriculum, and teachers can choose how and in what order to use them. The assignments were tested in randomized trials in classrooms in eight districts with more than 6,000 students. Not only did students using AlgebraByExample improve by an average of 7 percentage points on an assessment of standardized test items, but students at the lower end of the distribution improved the most. The PDF downloads of the assignments are freely available for anyone to use.

The success of AlgebraByExample led to further IES funding of MathByExample for Grades 4 and 5 and GeometryByExample for high school geometry.

 

Resources:

AlgebraByExample website

MathByExample website

Booth et al., 2015

NSF STEM for All Video Submission 2019

 

Interview with Dr. Suzanne Donovan (SERP), Dr. Julie Booth (Temple University), and Allie Huyghe (SERP), the developers of the ByExample interventions.

 

 

Was it part of the original plan to develop an intervention that could one day be used at scale in schools?

Yes. SERP partnerships begin with problems of practice nominated by district partners, but the partnership agreement distinguishes SERP from a consultant. The intention from the start is to frame the problem and design a solution that can be used at scale. SERP has developed in-house, user-centered design expertise so that resources (such as the ByExample products) developed through partnerships meet the needs of teachers and students. Products scale when they improve the experience of teachers and students. Both the model and the internal design capacity allow SERP to move from problem framing through research, development, and dissemination of a product with IES grant funding.

 

Describe the initial research and development that occurred.

Dr. Julie Booth drafted initial assignments drawing on the mathematics misconceptions literature. SERP held regular partnership meetings with teachers and administrators at which assignments were reviewed and additional misconceptions were nominated for attention in the assignments. Administrators agreed to randomization of the assignments across classrooms and within-teacher. Assignments were first tested in individual topic blocks and revised in accordance with student performance data, observations, and teacher feedback. A year-long pilot study was then conducted using the full set of assignments.

 

Beyond IES or ED grants, what additional funding was needed to develop the intervention?

For the ByExample work, additional funding was provided by the Goldman Sachs Foundation in the initial phase to support partnership formation, problem framing, and solution generation. IES grants funded the research and development, along with initial dissemination activities to make the materials available to the public. With the IES grant funds, we were also able to develop an online platform to allow for digital use.

 

What model was used for dissemination and sustainability?

The assignments are available as free downloads on SERP’s website, and as printed workbooks through SERP’s partner print-on-demand company. They have been publicized through online communications, journal articles, presentations at conferences of various types, social media, and word of mouth. There will be a small fee for use of the digital platform to support its maintenance, but the PDFs will remain as free downloads. We have been able to sustain the collaboration of the partnership team by responding to requests from educators to expand the approach to other grade levels and submitting additional proposals to IES that have been awarded.

 

What advice would you provide to researchers who are looking to move their research from the lab to market? What steps should they take? What resources should they look for?

First, I would note that it is difficult to persuade educators to use a product that solves a problem they don’t believe they have. Listen to educators and apply research expertise to address the challenges that they experience on a day-to-day basis. Design for ease of use by teachers. No matter how good your strategy or marketing is, if it’s too much work for an already busy teacher to use, you may get uptake by a few committed teachers, but not at scale. Finally, pay attention to where teachers get their information. For AlgebraByExample, we got a big boost from the Marshall Report, produced by a teacher for other teachers to call attention to usable research.  

 

In one sentence, what would you say is most needed for gaining traction and wide scale use by educators?

Design for the routines of the classroom.

 


Suzanne Donovan, PhD, is the founding Executive Director of the SERP Institute, an education research, development, and implementation organization incubated at the National Academies. SERP leads collaborations of educators, researchers, and designers to generate research-based, scalable, and sustainable solutions to critical problems of practice. 

Julie Booth, PhD, is a Professor of STEM Education and Psychology and the Deputy Dean of Academic and Faculty Affairs at Temple University’s College of Education and Human Development. Her work focuses on translating between cognitive science and education to better understand students’ learning and improve instruction, primarily in mathematics education. She is currently an Executive Editor for the Journal of Experimental Education.

Allie Huyghe is the Assistant Director of the SERP Institute, where she manages several projects, including the IES-funded MathByExample and GeometryByExample projects. She is also closely involved with other SERP areas of work, participating in the design of materials from early development through release to the public.

 

This interview was produced by Christina Chhin (Christina.Chhin@ed.gov) and Edward Metz (Edward.Metz@ed.gov) of the Institute of Education Sciences. This is the fifth in an ongoing series of blog posts examining the move from university research to practice at scale in education.

 

 

A2i: From Research to Practice at Scale in Education

This blog post is part of an interview series with education researchers who have successfully scaled their interventions.

Assessment-to-Instruction (A2i) is an online Teacher Professional Support System that guides teachers in providing individualized literacy instruction and assessments to students in Kindergarten to Grade 3. Students complete the assessments independently online, without the teacher taking time away from instruction. A2i generates instantaneous teacher reports with precise recommendations for each student as well as for groups of students. See a video demo here. Between 2003 and 2017, researchers at Florida State University (FSU) and Arizona State University (ASU), led by Carol Connor, developed and evaluated A2i with the support of a series of awards from IES and the National Institutes of Health. Findings from all publications on A2i are posted here.

While results across seven controlled studies demonstrated the effectiveness of A2i, feedback from practitioners in the field indicated that implementation often required substantial amounts of researcher support and local district adaptation, and that the cost was not sustainable for most school district budgets. In 2014, the development firm Learning Ovations, led by Jay Connor, received an award from the Department of Education (ED) and IES’s Small Business Innovation Research program (ED/IES SBIR) to develop a technologically upgraded and commercially viable version of A2i ready to be used at scale in classrooms around the country. With the support of a five-year Education Innovation and Research (EIR) expansion grant from ED awarded in 2018 and totaling $14.65 million, A2i is now used in more than 110 schools across the country, with plans for further expansion.

 

Interview with Carol Connor (CC) and Jay Connor (JC)

From the start of the research in the early 2000s, was it always the goal to develop a reading intervention that would one day be used on a wide scale?
CC: Yes and no. First, we had to answer the question as to whether individualization was effective in achieving student literacy outcomes. Once the research established that, we knew that this work would have wide-scale application.

When did you start thinking about a plan for distribution?
CC: Before embarking on the cumulative results studies, in 2008, Jay said that we needed to know who the “customer” was… i.e., how purchasing decisions were made at scale. His 2008 Phase I ED/IES SBIR award was critical in shifting our research focus from individual classrooms to school districts as the key scaling point.

Did you work with a technology transfer office at the university?
CC: Only to the extent of contractually clarifying intellectual property (IP) ownership and licensing. 

Who provided the support on the business side?
CC: Jay, who has an MBA/JD and has been a senior officer in two Fortune 100 companies, was very instrumental in guiding our thinking about this evolution from important research to practical application.


Do you have any agreement about the IP with the university? What were the biggest challenges in this area?

JC: Yes, Learning Ovations has a 60-year renewable exclusive licensing agreement with FSU Foundation. FSU couldn’t have been better to work with.  Though there were expected back-and-forth elements of the original negotiations, it was clear that we shared the central vision of transforming literacy outcomes.  They continue to be a meaningful partner.

When and why was Learning Ovations first launched?
JC: In order to pursue SBIR funding, we needed to be a for-profit company. At first, I used my consulting business – Rubicon Partners LLP – as the legal entity for a 2008 Phase I award from ED/IES SBIR. When we considered applying for (and eventually won) a Fast Track Phase I & II award from SBIR in 2014, it was clear that we needed to create a full C-Corp that could expand with the scaling of the business; thus, Learning Ovations was formed.

Who has provided you great guidance on the business side over the years? What did they say and do?
JC: Having run large corporate entities, worked with small business start-ups in conjunction with Arizona State University (Skysong) and the University of California, Irvine (Applied Innovation at The Cove), and taught entrepreneurship at The Paul Merage School of Business at UC Irvine, I had the experience and network to connect to whatever business guidance we needed. Further, having attended a number of reading research conferences with Carol, I was quite conversant in the literacy language, both from the research side and from the district decision maker’s side.

How do you describe the experience of commercializing the A2i? What were the biggest achievements and challenges in terms of preparing for commercialization?

JC: Having coached scores of entrepreneurs at various stages, I can safely say that there is no harder commercialization than one that must stay faithful to the underlying research. A key strategy for most new businesses is being able to pivot as you find a better (easier) solution, but that flexibility is often circumscribed by the “active ingredients” of the underlying research. Knowing this, we imbued Learning Ovations with a very strong outcomes mission – all children reading at, or above, grade level by 3rd grade. This commitment to outcomes certainty is only assured by staying faithful to the research. Thus, a possible constraint became our uncontroverted strength.

Do you have advice for university researchers seeking to move their laboratory research in education into wide-spread practice? 
JC: Start with the end in mind. As soon as you envision wide-scale usage, learn as much as you can about the present pain and needs of your future users and frame your research questions to speak to this. Implementation should not be an after-the-fact consideration; build it into how you frame your research questions. On one level you are asking simultaneously “will this work with my treatment group” AND “will this help me understand/deliver to my end-user group.” I can’t imagine effective research being grafted onto a business after the fact. One key mistake that we see a number of researchers make is thinking in very small fragments, whereas application (i.e., the ability to go to scale) is usually much more systemic and holistic.

In one sentence, what would you say is most needed for gaining traction in the marketplace?
JC: If not you, as a researcher, someone on your team of advisors needs to know the target marketplace as well as you know the treatment protocols in your RCT.

____________

Carol Connor is a Chancellor’s Professor in the UC Irvine School of Education. Previously, she was a professor of Psychology and a Senior Learning Scientist at the Learning Sciences Institute at ASU. Carol’s research focuses on teaching and learning in preschool through fifth grade classrooms – with a particular emphasis on reading comprehension, executive functioning, and behavioral regulation development, especially for low-income children.

Joseph “Jay” Connor, JD/MBA, is the Founder/CEO of Learning Ovations, Inc, the developer of the platform that has enabled the A2i intervention to scale.  Jay has 20+ years of experience in senior business management at the multi-billion dollar corporate level, and has experience in the nonprofit and public policy arenas.

This interview was produced by Edward Metz of the Institute of Education Sciences.

SELweb: From Research to Practice at Scale in Education

With a 2011 IES development grant, researchers at Rush University Medical Center, led by Clark McKown, created SELweb, a web-based system to assess the social-emotional skills of children in Kindergarten to Grade 3. The system (watch the video demo) includes illustrated and narrated modules that gauge children’s social acceptance with peers and assess their ability to understand others’ emotions and perspectives, solve social problems, and self-regulate. The system generates teacher reports with norm-referenced scores and classroom social network maps. Field trials with 8,881 children in seven states demonstrate that the system produces reliable and valid measures of social-emotional skills. Findings from all publications on SELweb are posted here.

In 2016, with support from the university, McKown launched a company called xSEL Labs to further develop and ready SELweb for use at scale and to facilitate the launch of SELweb into the school marketplace. SELweb is currently used in 21 school districts in 16 states by over 90,000 students per year.

Interview with Clark McKown of Rush University Medical Center and xSEL Labs

 

From the start of the project, was it always a goal for SELweb to one day be ready to be used widely in schools?

CM: When we started, our aspiration was to build a usable, feasible, scientifically sound assessment and to show that it could be done. When the end of the grant got closer, we knew that unless we figured out another way to support the work, this would be yet another good idea that would wither on the vine after showing evidence of promise. In the last year and a half of the grant, I started thinking about how to get this into the hands of educators to support teaching and learning, and how to do it in a large-scale way.

 

By the conclusion of your IES grant to develop SELweb, how close were you to the version that is being used now in schools? How much more time and money was it going to take?

CM: Let me answer that in two ways. First is how close I thought we were to a scalable version. I thought we were pretty close. Then let me answer how close we really were. Not very close. We had built SELweb in a Flash-based application that was perfectly suited to small-scale data collection and was economical to build. But for a number of reasons, there was no way that it would work at scale. So we needed capital, time, and a new platform. We found an outstanding technology partner, the 3C Institute, which has a terrific ed tech platform that is well suited to our needs, robust, and scalable. And we received funding from the Wallace Foundation to migrate the assessment from the original platform to 3C’s. The other thing I have learned is that technology is not one and done. It requires continued investment, upkeep, and improvement.

What experiences led you to start a company? How were you able to do this as an academic researcher?

CM: I could tell you that I ran a children’s center, had a lot of program development experience, had raised funds, and all that would be true, and some of the skills I developed in those roles have transferred. But starting a company is really different than anything I’d done before. It’s exciting and terrifying. It requires constant effort, a willingness to change course, rapid decision-making, collaboration, and a different kind of creativity than the academy. Turns out I really like it. I probably wouldn’t have made the leap except that the research led me to something that I felt required the marketplace to develop further and to realize its potential. There was really only so far I could take SELweb in the academic context. And universities recognize the limitations of doing business through the university—that’s why they have offices of technology transfer—to spin off good ideas from the academy to the market. And it’s a feather in their cap when they help a faculty member commercialize an invention. So really, it was about finding out how to use the resources at my disposal to migrate to an ecosystem suited to continuing to improve SELweb and to get it into the hands of educators.

How did xSEL Labs pay for the full development of the version of SELweb ready for use at scale?

CM: Just as we were getting off the ground, we developed a partnership with a research funder (the Wallace Foundation) who was interested in using SELweb as an outcome measure in a large-scale field trial of an SEL initiative. They really liked SELweb, but it was clear that in its original form, it simply wouldn’t work at the scale they required. So we worked out a contract that included financial support for improving the system in exchange for discounted fees in the out years of the project.

What agreement did you make with the university in order to start your company and commercialize SELweb?

CM: I negotiated a license for the intellectual property from Rush University with the university getting a royalty and a small equity stake in the company.

Did anyone provide you guidance on the business side?

CM: Yes. I lucked into a group of in-laws who happen to be entrepreneurs, some in the education space. And my wife has a sharp business mind. They were helpful. I also sought and found advisors with relevant expertise to help me think through the initial licensing terms, and then pricing, marketing, sales, product development, and the like. One of the nice things about business is that you aren’t expected to know everything. You do need to know how and when to reach out to others for guidance, and how to frame the issues so that guidance is relevant and helpful.

How do you describe the experience of commercializing SELWeb?

CM: Commercialization is, in my experience, an exercise in experimentation and successive approximations. How will you find time and money to test the waters? Commercialization is an exciting and challenging leap from the lab to the marketplace. You can’t do it alone, and even with great partners, competitive forces and chance factors make success at scale hard to accomplish. Knowing what you don’t know, and finding partners who can help, is critical.

I forgot who described a startup as a temporary organization designed to test whether a business idea is replicable and sustainable. That really rings true. The experience has been about leaving the safe confines of the university and entering the dynamic and endlessly interesting bazaar beyond the ivory tower to see if what I have to offer can solve a problem of practice.

In one sentence (or two!), what would you say is most needed for gaining traction in the marketplace?

CM: Figure out who the customer is, what the customer needs, and how what you have to offer addresses those needs. Until you get that down, all the evidence in the world won’t lead to scale.

Do you have advice for university researchers seeking to move their laboratory research into wide-spread practice?

CM: It’s not really practical for most university researchers to shift gears and become an entrepreneur. So I don’t advise doing what I did, although I’m so glad I did. Most university researchers should continue doing great science and, when they recognize a scalable idea, consider commercialization as an important option for bringing the idea to scale. My impression is that academic culture often finds commerce to be alien and somewhat grubby, which can get in the way. The truth is, there are whip-smart people in business who have tremendous expertise. The biggest hurdle for many university researchers will be to recognize that they lack expertise in bringing ideas to market; they will need to find that expertise, respect it, and let go of some control as the idea, program, or product is shaped by market forces. It’s also a hard truth for researchers, but most of the world doesn’t care very much about evidence of efficacy. They have much more pressing problems of practice to attend to. Don’t get me wrong—evidence of efficacy is crucial. But for an efficacious idea to go to scale, usability and feasibility are the biggest considerations.

For academics, getting a product into the marketplace requires a new set of considerations. For example, universities and granting mechanisms reward solo stars; the marketplace rewards partnerships. That is a big shift in mindset, and not easily accomplished. Think partnerships, not empires; listen more than you talk.

Any final words of wisdom in moving your intervention from research to practice?

CM: Proving the concept of an ed tech product gets you to the starting line, not the finish. Going to scale benefits from, probably actually requires, the power of the marketplace. Figuring out how the marketplace works and how to fit your product into it is a big leap for most professors and inventors. Knowing the product is not the same as knowing how to commercialize it.

 ____________________________________________________________________________

Clark McKown is a national expert on social and emotional learning (SEL) assessments. In his role as a university faculty member, Clark has been the lead scientist on several large grants supporting the development and validation of SELweb, Networker, and other assessment systems. Clark is passionate about creating usable, feasible, and scientifically sound tools that help educators and their students.

This interview was produced by Ed Metz of the Institute of Education Sciences. This post is the third in an ongoing series of blog posts examining the move from university research to practice at scale in education.