For us at ideas42, testing is one of the most exciting parts of implementing behavioral science interventions. After months of research and interviews, and countless design iterations, we are finally able to gain insight into the extent of our impact. But what happens once we’ve run the regressions, printed the tables, and synthesized our key takeaways? Once we have evidence about what works—and what doesn’t work—how do we apply our findings in new contexts?
Among the many takeaways from the Student Persistence Initiative, ideas42’s five-year collaboration with The City University of New York (CUNY) and the New York City Mayor’s Office for Economic Opportunity (NYC Opportunity), is this: behavioral science interventions in post-secondary education, even those that are email- and text-based, are rarely plug-and-play. To scale successful interventions within a school network, or indeed to other school systems, we need to put them in the hands of education experts and equip those experts with the tools they need to adapt each intervention to the realities of their own school contexts.
Making the leap from “test” to “scale” is the motivation behind the Student Success Toolkit, an actionable guide for education experts looking to scale and adapt the interventions that our rigorous research suggests can help students succeed. The toolkit distills insights and lessons learned from 13 randomized controlled trials (RCTs) conducted over five years across six CUNY community colleges, touching on challenges ranging from increasing the number of students who annually renew the Free Application for Federal Student Aid (FAFSA) to encouraging students to take enough course credits to graduate within two to three years.
As we set out to create this toolkit, we reflected on our own implementation experience. Between 2015 and 2020, as we iterated on behavioral interventions, we had to adapt our designs for new contexts. We learned that the individual colleges within the CUNY network, the nation’s largest urban university system, differ in many unexpected ways—some purely semantic (different monikers for their respective advising offices) and some more meaningful (different transfer office structures that support students planning to transfer to a bachelor’s program). Additionally, each academic year is different—sometimes in predictable ways (school events fall on different dates each year), and sometimes in less predictable ways (as instruction moved online this year due to the COVID-19 pandemic, we adjusted the timing of our intervention messages to make sure they’d still feel relevant).
With these experiences in mind, we created a resource that not only shares open-source versions of our intervention materials (like text messages and emails), but also describes the methods and tools we used to consistently adapt and improve our interventions over a five-year period.
We divided the toolkit into two sections in order to systematically guide readers through our behavioral diagnosis and design methodology. The first section describes the behavioral barriers—the situational, psychological, or contextual features that prevent students from making optimal decisions or acting on their good intentions along the path to academic success. These insights, generated from hundreds of conversations with students and staff across the CUNY system, as well as from our own user testing (including testing we conducted while enrolled as students ourselves), are the foundation for all of our intervention designs.
In the second section, we provide case studies on our five most successful interventions to demonstrate how we used behavioral design at CUNY to help students overcome the diagnosed barriers. Specifically, we outline strategies for effective communication and illustrate how we applied those strategies in our designs. For example, we found that simple language and an urgent tone work well in emails and texts reminding students to renew FAFSA. Along with sample messages, we highlight why we used particular language and images. Students at CUNY who received our FAFSA messages were 4 to 9 percentage points more likely to renew FAFSA than students in a control group; if scaled, the low-cost designs presented in the toolkit could help educators at CUNY and elsewhere bring millions of additional dollars in financial aid to students, at minimal additional cost to the schools.
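To make the arithmetic behind that claim concrete, here is a rough back-of-envelope sketch. Only the 4 to 9 percentage-point lift comes from the trials described above; the enrollment and average-award figures are hypothetical placeholders, not CUNY data.

```python
# Hypothetical back-of-envelope: translating a percentage-point lift in FAFSA
# renewal into additional aid dollars. All inputs except the reported 4-9
# point lift are illustrative placeholders, not CUNY figures.

eligible_students = 50_000  # hypothetical number of students due to renew FAFSA
avg_award = 4_000           # hypothetical average annual aid unlocked per renewal, in dollars

for lift in (0.04, 0.09):   # the 4-9 percentage-point lift reported in the RCTs
    extra_renewals = eligible_students * lift
    extra_aid = extra_renewals * avg_award
    print(f"{lift:.0%} lift -> ~{extra_renewals:,.0f} extra renewals, ~${extra_aid:,.0f} in additional aid")
```

Under these placeholder assumptions, even the lower end of the observed lift would translate into millions of dollars in additional aid each year.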
At ideas42, we are already using the Student Success Toolkit’s insights and designs to further promote student persistence and success beyond CUNY. For example, with support from the Michael & Susan Dell Foundation, we’re currently working to deliver the tested content from the Student Success Toolkit to hundreds of new schools by integrating our behaviorally informed campaigns into a leading student communications platform. Through this multi-year project, we hope to make it seamless for schools across the country to select, adjust, and implement messages that are relevant to their students’ needs.
As behavioral scientists, we constantly return to the question at the heart of our work: what drives human behavior? Testing is indeed exciting, but for evidence-based interventions to help students succeed across the country, we must continue to share foundational insights about human behavior along with best practices for behavioral design, including practical ways to translate research evidence into practice. With this in mind, we will continue to seek ways to make it as easy as possible for educators—the experts in their own contexts—to make the leap from knowing what works to making it work for their students.