
The limitations of low-touch behavioral interventions for student success

A conversation with Professor Phil Oreopoulos

Philip Oreopoulos and other economists who study economic policy were understandably excited when a study showed that students who were given 10 minutes of help filling out financial aid forms were more likely to attend college than students who didn't receive such support. This now-famous study showed that a simple, inexpensive intervention, helping students and their parents navigate the Free Application for Federal Student Aid (FAFSA), could have large effects on students.

Were there other similar actions, which economists call behavioral interventions, that could help more students? In particular, what support would most help students from low-income backgrounds find resources and improve their study habits in ways that lead to higher academic performance and subsequent success in the labor market? To find out, Oreopoulos and a colleague set up the Student Achievement Lab at the University of Toronto to create and test low-cost, scalable programs aimed at increasing student success. Greater student success translates in many ways into greater labor market success, which should make for a more dynamic labor force and grow the economy's productive potential, increasing what economists call maximum employment.

One set of experiments focused on using text messaging and online formats to convey information targeted to help students stay in school or improve academic performance. Participating students would fill out online surveys and get texts with motivational tips such as advice on how to study, how to manage time, how to strike a balance between working and studying, and other supportive messages. Some also received in-person coaching. Oreopoulos found that neither that experiment nor nearly 20 others involving similar low-touch methods, with 3,000 to 5,000 student participants, had any impact on student success. Surveys showed that students felt positive about the programs, but that didn't translate into measurable achievement gains.

This hard-earned knowledge highlights the value, in the social sciences, of running experiments and of finding out what does not work. Without this evidence, one could be well intentioned and thoughtful about the intended outcome yet advocate devoting considerable resources to interventions that turn out to be ineffective. Read about how the evidence changed Oreopoulos's, and my, mind during our conversation about his research.

The following conversation has been edited and condensed. Listen to an extended version of this conversation.

Early inspiration for behavioral interventions in education

Dionissi Aliprantis

I'd be curious to hear about some of the work that you've done on the effects of educational attainment and compulsory schooling laws.

Philip Oreopoulos

Sure. Compulsory schooling is a really interesting policy change that affects whether people get more or less education. I found an increase in earnings for those who were exposed to more restrictive dropout ages, in effect not being allowed to leave early, so they had to stay on for more education. Which is interesting. Why would someone who's basically being compelled to stay in school longer against their wishes, who would otherwise leave earlier, actually earn much more than if the law had allowed them to leave earlier? Because standard economic theory, the investment model of human capital, suggests that if there were such gains to be had from staying on longer, then you would, right?

Dionissi Aliprantis

Yeah, exactly.

Philip Oreopoulos

So it opens up a lot of questions around whether the model itself is wrong. That steered me toward thinking about the behavioral side of education. At that time, behavioral economics was also just getting warmed up, pointing out examples where, whenever we make decisions involving long-term benefits and short-term costs, we often don't behave in the rational, forward-looking way that economists assume we do.

So there was a lot of work done on that, but surprisingly not much on education. And so I got interested in this, thinking, "Well, if there's any one group that's prone to procrastination and all these other sorts of barriers, it's teenagers."

So after the compulsory schooling work, there was some further interesting thinking along these lines: if individuals are making poor long-term decisions, especially around schooling, because, you know, we all prefer to go out on Friday night rather than stay home and study, or we get distracted easily, what if we incentivized actions in the present to offset the immediate cost and realize the long-term gain?

So, in the early 2000s, a lot of us came to this idea around the same time: What if we paid students to do better in school? And it was more than just that simple idea. There was some logic to it, the idea being that if students started to work hard in school, maybe they would realize they have a lot more potential than they thought, or maybe they would realize that they actually like the challenge, and we could actually change their habits, change their attitudes toward school and grades.

And so, we were offering first-year college students $2,000, $3,000, $4,000, or $5,000 for getting high grades. This was a project joint with Josh Angrist. And we saw small reactions, especially among women, but not the kind of transformational response that we were hoping for. And I think it's fair to say that a lot of other people working in that area were trying this on high school students as well. It was sort of the same story: trying to incentivize grades was not going to transform education. So, many of us kind of moved on.

An early low-touch intervention that succeeded

Philip Oreopoulos

And so, in that space, while I was still interested in behavioral responses in education, there were a bunch of researchers working together with the Brookings Institution and H&R Block on behavioral interventions that might improve saving. Saving is like the prototypical example of where a nudge or behavioral intervention might work. Some research by Esther Duflo and colleagues came out showing that if you made it easier to save at the moment someone was in a position to make that decision, when they had some money coming back, they would save more.

And from that experience, the administration at H&R Block at the time became interested in expanding their services and put out a call for proposals to anyone who might have an idea that worked within their business model. And if the idea was interesting, they would allow you to work with them to go ahead and do an experiment.

So, I was sitting down with some other colleagues working on education, Eric Bettinger and Bridget Long, and, around that time, Susan Dynarski came out with this really interesting paper about putting the FAFSA on a postcard. She was pointing out that the federal financial aid application is really a pain to fill out. A lot of the questions don't need to be asked, and you could really simplify it; you could actually fit it on a postcard and make it a lot easier for people to get it done.

And that was a really intriguing idea. And so, when Eric and Bridget and I were sitting down, we thought, "Hey, what if we work with H&R Block to make it really easy to fill out the FAFSA?"

Dionissi Aliprantis

I would just interject that I think the counter argument would be: This is a lot of money, right? When we're talking about the FAFSA, this is something that is a lot of money, and so you would expect that people would be willing to sit down and fill out two, five pages. I don't know how many pages it was. But it's almost shocking in a sense, I think, especially when you're thinking of it from that perspective of how much money is on the line.

I just want to give some color on the thinking at the time, why this was such a surprising finding, or why, for some people, it might have changed their beliefs a lot.

Philip Oreopoulos

You know, I think for a long time in public policy, administrators sort of just took for granted that if you build a form, and if people stand to benefit from it, they'll complete it. And, it's true, one attitude is that if someone can't bring themselves to complete this form, then why should we expect them to do well in the program they want to go into? Of course, if you take a step back and realize how much help often goes into filling out these forms, or if you look at the forms yourself, it doesn't take long to realize that they are a pain to fill out. And it is not so hard to imagine that someone at the margin of thinking about college, with no parents who have gone before them and no support around them, might put it off.

Dionissi Aliprantis

Exactly. But, I think for me, that's one of the key insights here. And maybe we should be thinking about this for a lot of public programs when we think about uptake: these frictions and barriers that from one perspective might seem very small or surmountable might, from another, like you just said, actually discourage a lot of people from taking up the program. Do you want to talk more about the experiment itself and the work that you all did?

Philip Oreopoulos

We proposed to them to take advantage of the fact that when someone comes to get their taxes done at H&R Block, which often serves a disadvantaged population, a lot of the information needed to complete the financial aid application for college is the same information that was just collected. And so, we had two groups: the independent adult who might potentially want to go back to school, and the parent coming into H&R Block with a child exiting high school; in that case, we would help the child complete the form. And, you know, working quite meticulously over each line item, trying to figure out which items follow which, we worked out all the skip logic so that we got it down to about 10 minutes: by staying a little bit longer after the tax interview, they could complete the form.

So then it was just a matter of randomizing who got this extra invitation. The control group had to get something, so we gave them a little pamphlet about college. We also had a second treatment group which got a pamphlet and a rough guess of how much they would be eligible for if they went back home and completed the form on their own.

Dionissi Aliprantis

That's interesting.

Philip Oreopoulos

So that extra information did nothing, maybe not surprisingly. Just telling them how much aid that they were eligible for didn't seem to move the dial.

Dionissi Aliprantis

Which is kind of surprising to me, actually. Because I am sure that those were nontrivial sums for a lot of people.

Philip Oreopoulos

Yeah. I mean, we thought we might have a little bit of an effect. But we really got zero across the board. For the independents, we had a lot of precision in saying there was just no impact.

So, if a 20-year-old who had just finished high school did their taxes at H&R Block and then got a notice that, if they applied for financial aid, they might be eligible for $6,000, $7,000 in grants—that did nothing.

But, if you worked with them for a little bit longer, and filled out the form, and they got an official letter from the Department of Education a few weeks later saying, "This is the money that's now available for you for going to college," that did move the dial, especially for the sample where we helped the parents. In that sample of youth just transitioning out of high school, the fraction going on to college the next year went up from about 40 percent to 48 percent. So, an eight percentage point increase just from helping them fill out that form. That was exciting.

Dionissi Aliprantis

10 minutes. Yeah.

Philip Oreopoulos

I think that got a lot of people interested. It certainly lined up with Sue Dynarski's suggestion that we can make this process a lot easier. And I think it opened the door to people realizing that there was a lot of room to make the transition from high school to college easier. And there are a lot of different ways to do it: not just the financial aid form, but help in choosing a field, help in completing the application, and lots of other things that we might want to consider.

Dionissi Aliprantis

I have to just say, I think it's kind of a wild finding. I mean, it's an incredible result. And I think it's one of those findings where it makes you think about everything a little bit differently.

Phil set up a lab to run experiments testing behavioral interventions…

Dionissi Aliprantis

So, I really wanted to talk about this piece in Education Next called “Nudging and Shoving Students Toward Success.” I have read this multiple times. I think it's such a nice summary of findings and of just clear thinking; it's a really valuable piece in synthesizing what we know. You talk about some of the other behavioral interventions that also gave momentum to this kind of thinking about what's on the table: where it's just that we have too much paperwork, or there are information frictions and people just don't know, and with a light touch we don't have to spend that much money or totally rearrange the system, but we could get really good results from some types of interventions.

And I was wondering if you might talk about some of those. I'm thinking about the Hastings and Weinstein paper, where they mailed information about schools to families in North Carolina. There's some work with text message reminders. There might be others. I'm curious if you might speak about those?

Philip Oreopoulos

Sure. That was a fun paper to write because it allowed me to clarify my own thoughts. You know, it's one of those things where you don't really know what you're thinking until you write your ideas down.

I think that there were a lot of us initially excited about the idea of using technology to try to simplify the process of making education decisions. The growing use of text messages, and the low cost of sending an e-mail or mailing a package, seemed to really open up the idea of providing information at the time you're thinking of making these decisions, in a way that could really make the difference between someone making a good decision or a bad one.

I think that there were some initial promising results suggesting that you could have almost the same kind of impact helping with college applications if you just sent text messages.

And there's a really neat attraction to that: how low the cost per person is of sending these messages out, and even allowing the possibility of students texting back to someone who could then respond or speak with them on the phone. The ability to communicate with the people you're trying to reach just became a lot easier. And so, communicating a supportive message or a key reminder to someone who may want to be doing this seemed really attractive.

So a lot of people started trying to use low-cost messaging in the realm of education for things like college applications, but also for coaching: helping students, reminding them when they have homework due, or reminding parents if their child missed a deadline. Also giving them motivational tips, you know, inspiring them when they might be down or struggling, encouraging them to have a positive mindset.

A lot of research was going on to try to incorporate this technology into these behavioral interventions. And, of course, the attraction was that even a small effect would be worthwhile because the cost was so small.

That seemed like a great research agenda, and it really seemed to be taking off. For me, the question shifted: okay, we can help individuals get into college, but could we help them complete it as well? And any one of us teaching at the college level realizes that just getting them in is not the end of the story.

Dionissi Aliprantis

Step one. (laughs)

Philip Oreopoulos

Step one. A lot of students struggle. The dropout rate is concerningly high, especially in the US. There's a lot of good advice on how to make the most out of college and how to have a good time and do well there, but a lot of students don't seem to be following it. There are issues around study skills: how to work and do your homework, how to study effectively, how to manage your time, how to approach different things.

And so, I started looking into using text messaging and online formats as a way to effectively scale good advice and good information, almost as if we were offering a University 101 type of class.

In that setting I had an idea, which I thought was quite clever, of teaming up with instructors of large first-year classes to make it mandatory to do a warm-up exercise: we would get all students to go online and spend an hour completing a type of survey for a small participation grade.

And in that context, it was really easy to set up an experiment, and we could try a lot of social-psychological interventions. And what I really liked about that was that we could iterate, in this A/B testing way, and have a very large research agenda on figuring out what works, and what works better, and then keep going.

Dionissi Aliprantis

And it's just hugely beneficial, (laughs) right? Everything we learn about what helps students persist and graduate is just tremendously beneficial for those students. They're not going to end up without a degree and with debt; they're going to have skills, they're going to have attainment.

Philip Oreopoulos

So we could learn a lot along the way from doing this, and it seemed logical to start with the research that was already out there. I borrowed from the social-psych literature of these very small experiments that seemed to suggest that just getting students to think about their future goals and write them down would have massive effects on GPA. Or just getting them to adopt a growth mindset: if they saw setbacks as opportunities to learn and realized that learning is necessary for them to grow and excel, then they could keep at it, and everything would work out.

Dionissi Aliprantis

And I think there's lots of reasons to think that those kinds of interventions would work too.

Philip Oreopoulos

There's a lot of published material that seems to suggest this would be obvious. And in my setup, I had 3,000 to 5,000 students every year that I could set up these experiments with. And that was really exciting.

All my first-year econ instructor colleagues seemed willing to let me do this. I thought it was going to be this fantastic lab. This was joint work with Uros Petronijevic at York University. He and I just sort of started this Student Achievement Lab, putting together these online sites that students would have to go through. And then some of the students would also get follow-up text messages or even meetings set up with real coaches.

Dionissi Aliprantis

I mean, even now, hearing about it, I'm excited about it.

It's exciting because you're going to be able to tinker with this very critical moment, and you're going to have all of these students; you can learn a lot, and you have a bunch of ideas about what you think will work. So, this is a little bit of a cliffhanger, but I'm excited. (laughs)

…and found that behavioral intervention after behavioral intervention had zero effect on student success

Philip Oreopoulos

And so, Uros and I tried, in the first year, a goal-setting experiment that a previous study said had these massive effects. We found zero. Just no effect, a precise zero. Not even a third of what they found.

Dionissi Aliprantis

This must have been crushing, right? I mean, just on a personal level. As a social scientist, it's like, what the data says is what the data says. I've got to follow that, but–

Philip Oreopoulos

Even the co-author from that previous study didn't really want to work with me on the project once he realized that we weren't finding anything. We could have a long conversation about that.

Dionissi Aliprantis

It's interesting. Finding nothing is still learning, though, right? And I think it's huge. Maybe we should communicate that to the public.

Philip Oreopoulos

I think it's hard to defend the paper when you find nothing. It's not as exciting. And a lot of people will quickly say, "Well, did you try this, or did you do that? Maybe this didn't go well. Maybe it's you." And so we spend the whole seminar and paper trying to say, "It's not me. I really did everything I could, and we still found nothing."

Dionissi Aliprantis

And we've done it with thousands of students (laughs).

Philip Oreopoulos

So, it was frustrating. But then the next step was, "Well, we have this system in place now for doing more experiments; maybe we just adopted the wrong one." So then Uros and I started to partner with David Yeager and Greg Walton and other social psychologists to work on growth mindset. We started putting together our own coaching studies, with text message coaches. We would hire these really keen undergrads to provide positive feedback to students.

The way I sometimes describe it is that when a college student gets back their first midterm and they've completely bombed it, there are different ways to react. One way is just to say, "Well, I guess college isn't for me. I'm not as smart as I thought I was. This is not the environment for me. I'm going to, basically, accept that I'm not a good student like everyone else around me, and maybe I'm going to quit."

Whereas another way to react is to accept that you're going to have ups and downs. You're still learning, and when you fail, the first thing is, "Let me try to understand what went wrong. Let me understand that. Now I sort of see what happened, and I will try not to make the mistake again. I know that this is part of my learning process, and from this learning process, I will get better. I need not take this failure as a sign that I will continue to fail."

Dionissi Aliprantis

I think people can hear that and hear why it seems like that is an approach to life and to learning that will bring success.

Philip Oreopoulos

Right. And there are also lots of other kinds of mindsets. When the growth mindset didn't work, we said, "Okay, why don't we look at the belonging mindset," where we have a sense that some people, especially first-generation college students or international students, may arrive on campus not feeling that they belong. Some people may react in a negative way when they arrive and don't feel that they fit in. And others may realize, "Well, I'm just getting started. Eventually, I will fit in and get to meet more friends, and lots of other people are in the same boat."

So, we went on to try an international student mindset. I think we even tried an economics mindset. We just kept trying different types of studies. Every time we tried a new study where we thought, "Okay, this time, we've got it. We understand," we kept getting zero results.

Dionissi Aliprantis

Oh, man. (laughs)

Philip Oreopoulos

And Uros and I kept iterating. I think, eventually, over five years, we did about 15 to 20 separate experiments. In all of them we had the power to detect even just small effects, and we never found even those. It was just consistently, over five years, zero, after zero, after zero.
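As a rough illustration of what it means to have the power to detect even small effects, here is a minimal sketch of a conventional two-group power calculation in Python; the sample split, significance level, and power target below are illustrative assumptions, not figures from these studies.

# Minimal sketch of a two-sample power calculation (illustrative numbers only)
from statsmodels.stats.power import TTestIndPower

n_per_group = 2000  # assume roughly 4,000 students split evenly into treatment and control
mde = TTestIndPower().solve_power(nobs1=n_per_group, alpha=0.05, power=0.80, ratio=1.0)
print(f"Minimum detectable effect: {mde:.3f} standard deviations")  # roughly 0.09

Under these assumptions, any true effect of about 0.09 standard deviations of the outcome (GPA, say) or larger would be detected with 80 percent power, which is why repeated null findings from experiments of this size read as precise zeros rather than inconclusive results.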

Dionissi Aliprantis

I'm thinking about this from your perspective as a social scientist. You've got to call it how you see it. But, I'm sure, on a personal level, you were rooting for these, right? You don't want to go in and find zeros. You want to find, "Hey, we do this intervention, and, look, students are more successful," right?

Philip Oreopoulos

No one gets into the business of doing large field experiments expecting to find no effects. A lot of effort goes into trying to produce high fidelity, so that the experience on the participant side is one that you hope is going to benefit or help them.

And we were really happy with how the interventions looked. And looking at how the students responded, what they wrote down when we asked them to reflect on mindset, or on what they were learning, or on what they were going to do, it all seemed to suggest they got the message. And these were keen first-year students; they were very inspired.

Dionissi Aliprantis

It wasn't you. It wasn't the design.

Philip Oreopoulos

And even some of the interactions that they were having with our coaches over text messages were really encouraging. It's just that when we actually did the data analysis, it did not translate into an aggregate effect.

If you asked the students whether they liked the program, or whether they felt they gained a lot from it, they all loved it. They all thought it was really transformational, or that this type of program should continue.

Dionissi Aliprantis

It felt good.

Philip Oreopoulos

And so I've had to take a little bit of a philosophical perspective on this five-year effort that I thought was going to make my career, or offer the potential for large-scale impact. I do think that, along the way, other researchers in this area of behavioral interventions have also come to realize the limitations of low-touch interventions, meaning cheap e-mail, text, and mail interventions, in producing meaningful large-scale change.

And I think if you look at the literature as a whole in this area of education, at best, the effects are small; at worst, they're zero.

For those of us in education policy who really want to make a difference, it feels like we could make better use of our time looking at other types of policies that can work at a large scale.

Dionissi Aliprantis

I think it's so easy to see the world how we want it to be and not how it actually is. And so, I just want to take a moment to acknowledge how cool I think it is, and how privileged I feel, to sit here listening to (laughs) you talk about the hard-earned wisdom and knowledge that came from five years of these kinds of experiments. Of actually saying, "We did it, and it didn't turn out how we were hoping." And I think that's just so valuable. I just want to appreciate that. That you let the evidence change your mind.

Philip Oreopoulos

I appreciate that, for sure.

I do think it has helped me clarify my own thinking about how the scientific process can help with policy.

Hal Martin and Andrew Zajac contributed to this article.

The opinions expressed in this article are those of the participants and do not necessarily represent the views of the Federal Reserve Bank of Cleveland or the Board of Governors of the Federal Reserve System.

Philip Oreopoulos

Professor of Economics and Public Policy
University of Toronto

Philip Oreopoulos is a professor of economics and public policy at the University of Toronto. He received his PhD from the University of California at Berkeley and his MA from the University of British Columbia. He is a research associate of the National Bureau of Economic Research and research fellow at the Canadian Institute for Advanced Research. He has held previous visiting appointments at Harvard University and the Massachusetts Institute of Technology and has served as an editor for several top journals in economics. Professor Oreopoulos’s current work focuses on education policy, especially the application of behavioral economics to education and child development. He often examines this field by initiating and implementing large-scale field experiments, with the goal of producing convincing evidence for public policy decisions.
