High-stakes Nudges: Providing social information can affect what job you choose and keep

A single sentence at the end of an email can affect whether someone accepts a job—and stays at it for years to come. Lucas Coffman (Harvard), Clayton Featherstone (University of Pennsylvania), and J-PAL affiliate Judd Kessler (University of Pennsylvania) found that providing social information—in this case, the job acceptance rate of the previous year’s admitted applicants—increased the likelihood that an individual would accept a teaching job with Teach for America. Those told about the previous year’s matriculation rate were not only more likely to accept the job, but also to complete training, start the job, and return for a second year. Based on this study’s results, Teach for America has started including a line about its historical matriculation rate in all admissions letters.

Teach for America, a non-profit organization that recruits recent college graduates and professionals to teach in under-resourced public schools, sends thousands of offer letters via email to job applicants each year. To test how the provision of social information would influence an applicant’s likelihood of accepting the job, the researchers added one line to the end of randomly selected job offer letters: “Last year, more than 84 percent of admitted applicants made the decision to join the corps, and I sincerely hope you join them.”

Admitted applicants who received offer letters with this sentence were 1.8 percentage points more likely to accept the position. In other words, including this information persuaded 8.4 percent of admitted applicants who would otherwise not have joined Teach for America to do so. Furthermore, the researchers found that providing social information increased matriculation by 3 to 5 percentage points among three subgroups of individuals who were likely to be “on the fence” about joining Teach for America, meaning it persuaded 12 to 14 percent of “on-the-fence” admits who would otherwise not have joined. In addition, teachers who received this information returned for their second year of teaching at higher rates, which suggests that the social information did not simply induce applicants who were likely to drop out later to join the teaching program.
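
To see how these two figures relate: the 1.8 percentage-point effect is measured over all admitted applicants, while the 8.4 percent figure is the share of would-be decliners who were persuaded. A back-of-the-envelope check, assuming the roughly 79 percent control-group acceptance rate these numbers imply (the exact control rate is not reported here):

\[
\text{share persuaded} = \frac{p_{\text{treatment}} - p_{\text{control}}}{1 - p_{\text{control}}} \approx \frac{0.018}{1 - 0.786} \approx 0.084
\]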

Why might one simple sentence convince individuals to change their decisions? According to the researchers, information on their peers’ choices may influence admitted applicants’ opinion of the value of the Teach for America experience. For example, the high acceptance rate may signal that the program is particularly effective at improving student outcomes, or that it looks especially good on resumes.

While social information has been found to influence low-stakes decisions (e.g., donating to charity or taking an environmentally friendly action), Coffman, Featherstone, and Kessler show for the first time that social information—in this case, information about the previous decisions of others—can affect high-stakes behavior, and do so persistently. As policymakers seek out cheap, subtle interventions to shape behavior, this study shows that providing social information can be a powerful option.

Read the full paper, published in the American Economic Journal: Applied Economics, here.

Featured Affiliate Interview: Philip Oreopoulos

This interview was originally completed in February 2015.

Philip Oreopoulos is a Professor of Economics and Public Policy at the University of Toronto.

What got you interested in economics and the economics of education in particular?
In high school I thought I wanted a career in business, but then after taking an economics class I realized what I really wanted was to understand how the world of business worked, why some ended up happy and successful and others ended up homeless. Economics provides an appealing approach to understanding how best to deal with constraints (be they financial, time, or otherwise) and a wide set of tools for trying to make things better.

My training at Berkeley was in labor economics, with a focus on empirical methods for generating convincing causal inference. At that time, the field of behavioral economics was just starting to get off the ground, with research on procrastination and hyperbolic discounting, applied mostly to savings and finance. I started a project looking at the causal effects of compulsory schooling on wealth and happiness, and it dawned on me that the typical investment model of schooling could not easily explain high returns from compelling students to stay in school who otherwise would have left earlier. Behavioral models that incorporate adolescents’ lack of maturity or ability to ‘think things through’ are, I think, much better suited to explaining dropout behavior. I’m very interested now in students’ own contribution to their schooling production function and what that production function actually looks like. We know surprisingly little about these topics, but that is starting to change.

What is one current research project that you’re particularly excited about?
I recently finished an experiment in which high school seniors at low-transition schools were guided through the college application process in class, for free. The goal was to help every Grade 12 student exit with an offer of acceptance from a college program they helped pick, along with a financial aid package, regardless of how certain they were about wanting to go. Many students at these schools don’t receive encouragement to go to college and must invest a great deal of time and focus to complete all the application steps on their own. By incorporating the application process into the curriculum for everyone, students less sure about college can discover the variety of programs available to them. The option to go becomes much more real. The program, randomized at the school level, produced an increase in college-going of 5 percentage points among the entire Grade 12 class – an increase of about 12 students per school.

What is your “dream evaluation”? (It doesn’t have to be feasible!)
I’d like to be able to evaluate long-term effects for students at the margin of going to college. My interest, and that of others, in encouraging more youth to go to college (including possibly two-year vocational programs) is based on past research that is either outdated or not all that good. There’s room for improvement, and I’m hoping an experiment will come along with a treatment effect large enough (encouraging youth into well-matched programs they otherwise would not have attended) to evaluate intermediate and long-term impacts on skills, finances, health, and wellbeing.

I can’t resist also suggesting a systematic evaluation of how different parenting methods affect children in the long term. There is not a lot of good evidence-based advice for parents on how they could be spending their time and money to foster children’s patience, grit, compassion, and so on. What activities are best for my kids? How should I manage screen time? How should I react to misbehavior, and what’s the best way to discipline it? Do the answers to these questions depend on gender or socioeconomic circumstance? It would be very difficult to implement field experiments that change parenting styles, but since you’re asking me to dream…

What has been your craziest experience implementing a research project?
I had a bad experience once with a funder that did not like the one-year results of a two-year study. They requested changes to the program design and site location, or else they would withdraw support. Funders usually partner to implement program evaluations because they think the program will work. It’s stressful when things don’t go exactly as planned, and even when they do, there’s no guarantee the program will work as expected.

See Phil Oreopoulos’ bio and evaluations.