High-stakes Nudges: Providing social information can affect what job you choose and keep

A single sentence at the end of an email can affect whether someone accepts a job—and stays at it for years to come. Lucas Coffman (Harvard), Clayton Featherstone (University of Pennsylvania), and J-PAL affiliate Judd Kessler (University of Pennsylvania) found that providing social information—in this case, the job acceptance rate of the previous year’s admitted applicants—increased the likelihood that an individual would accept a teaching job with Teach for America. Those told about the previous year’s matriculation rate were more likely not only to accept the job, but also to complete training, start the job, and return for a second year. Based on this study’s results, Teach for America has started including a line about its historical matriculation rate in all admissions letters.

Teach for America, a non-profit organization that recruits recent college graduates and professionals to teach in under-resourced public schools, sends thousands of offer letters via email to job applicants each year. To test how the provision of social information would influence an applicant’s likelihood of accepting a job, the researchers added one line to the end of randomly selected job offer letters: “Last year, more than 84 percent of admitted applicants made the decision to join the corps, and I sincerely hope you join them.”

Admitted applicants who received offer letters with this sentence were 1.8 percentage points more likely to accept the position. Since the reported figures imply that roughly 21 percent of admits would otherwise have declined, this means that including the information persuaded 8.4 percent of admitted applicants who would not have joined Teach for America to do so. Furthermore, the researchers found that providing social information increased matriculation by 3 to 5 percentage points among three subgroups of individuals who were likely to be “on the fence” about joining Teach for America, persuading 12 to 14 percent of “on-the-fence” admits who would not otherwise have joined. Teachers who received this information also returned for their second year of teaching at higher rates, which suggests that the social information did not simply induce people who were likely to drop out later to join the teaching program.
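To see where the 8.4 percent figure comes from, divide the 1.8 percentage point effect by the share of admits who would have declined without the added sentence. A minimal Python sketch of this arithmetic follows; the 78.6 percent baseline acceptance rate is an assumption backed out from the reported numbers, not a figure stated in this summary.

```python
# A minimal sketch of the arithmetic above. The 78.6% baseline acceptance
# rate is an assumption implied by the reported figures (1.8pp / 8.4%),
# not a number stated directly in this summary.

effect_pp = 1.8                        # treatment effect, in percentage points
baseline_accept = 78.6                 # assumed control-group acceptance rate (%)
would_decline = 100 - baseline_accept  # share of admits who would otherwise decline

share_persuaded = effect_pp / would_decline * 100
print(f"Persuaded {share_persuaded:.1f}% of admits who would not have joined")
# prints: Persuaded 8.4% of admits who would not have joined
```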

Why might one simple sentence convince individuals to change their decisions? According to the researchers, information on their peers’ choices may influence admitted applicants’ opinion of the value of the Teach for America experience. For example, the high acceptance rate may signal that the program is particularly effective at improving student outcomes, or that it looks especially good on resumes.

While social information has been found to influence low-stakes decisions (e.g., donating to charity or taking an environmentally friendly action), Coffman, Featherstone, and Kessler show for the first time that social information can affect high-stakes behavior and do so persistently. As policymakers seek out cheap, subtle interventions to shape behavior, this study shows that providing social information can be a powerful option.

Read the full paper, published in the American Economic Journal: Applied Economics, here.

Evaluation Summary: Duration Dependence and Labor Market Conditions

Does long-term unemployment hurt a person’s chance of returning to the workforce? According to a 2012 analysis by the U.S. Congressional Budget Office, long-term unemployment may “produce a self-perpetuating cycle wherein protracted spells of unemployment heighten employers’ reluctance to hire those individuals, which in turn leads to even longer spells of joblessness.” But despite widespread concern about this cycle of long-term unemployment, it has been difficult to credibly establish whether it actually exists.

In response, Kory Kroft (University of Toronto), Fabian Lange (McGill University), and J-PAL affiliate Matthew Notowidigdo (now at Northwestern University) conducted a randomized evaluation testing the impact of the length of job applicants’ unemployment spells on firms’ callback decisions. Using a major online job board in the United States, the researchers sent roughly 12,000 fictitious resumes to 3,000 job openings—four resumes per job. For each job, they constructed two high-quality and two low-quality resumes, and randomly assigned each resume’s employment status and, if unemployed, the length of the current unemployment spell. By randomly varying employment characteristics across high- and low-quality resumes, the researchers were able to isolate the effect of unemployment spells on firms’ callback decisions.
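As a rough illustration of this design, the randomization for a single posting might look like the hypothetical Python sketch below. This is not the authors’ code: the 50/50 employment split and the range of spell lengths are illustrative assumptions.

```python
import random

# Hypothetical sketch of the resume randomization for one job opening:
# two high-quality and two low-quality resumes, each independently assigned
# an employment status and, if unemployed, a current spell length.
# The 50/50 employment split and the 1-36 month spell range are
# illustrative assumptions, not the study's actual parameters.

SPELL_MONTHS = range(1, 37)  # possible unemployment spell lengths, in months

def randomize_resumes_for_job():
    resumes = []
    for quality in ("high", "high", "low", "low"):
        employed = random.random() < 0.5
        spell = None if employed else random.choice(SPELL_MONTHS)
        resumes.append({
            "quality": quality,
            "employed": employed,
            "unemployment_months": spell,
        })
    return resumes

# Example: draw the four resumes sent to one of the 3,000 postings.
for resume in randomize_resumes_for_job():
    print(resume)
```

Because quality and unemployment history are assigned independently, any systematic difference in callback rates across spell lengths can be attributed to the spell itself rather than to resume quality.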

Overall, the evidence from this study demonstrates that longer unemployment spells reduced interview callbacks, potentially because employers treat long unemployment as a signal of low worker quality. Over the first eight months of an unemployment spell, the likelihood of receiving a callback from employers declined sharply the longer an applicant had been unemployed; beyond eight months, additional time out of the workforce did not further reduce callbacks. In addition, the researchers found that this effect was stronger in tight local labor markets, where fewer workers were likely to be unemployed, and among young job seekers with limited experience. Future research is needed to examine how duration dependence affects older job seekers.

For more information, read the full evaluation summary on the J-PAL North America website.

Study Cited: Kroft, Kory, Fabian Lange, and Matthew J. Notowidigdo. 2013. “Duration Dependence and Labor Market Conditions: Evidence from a Field Experiment.” The Quarterly Journal of Economics 128: 1123-67.

Featured Affiliate Interview: Philip Oreopoulos

This interview was originally completed in February 2015.

Philip Oreopoulos is a Professor of Economics and Public Policy at the University of Toronto.

What got you interested in economics and the economics of education in particular?
In high school I thought I wanted a career in business, but after taking an economics class I realized what I really wanted was to understand how the world of business worked: why some people ended up happy and successful while others ended up homeless. Economics provides an appealing approach to understanding how best to deal with constraints (be they financial, time, or otherwise) and a wide set of tools for trying to make things better.

My training at Berkeley was in labor economics, with a focus on empirical methods for generating convincing causal inference. At that time, the field of behavioral economics was just starting to get off the ground, with research on procrastination and hyperbolic discounting applied mostly to savings and finance. I started a project looking at the causal effects of compulsory schooling on wealth and happiness, and it dawned on me that the typical investment model of schooling could not easily explain high returns from compelling students to stay in school who otherwise would have left earlier. I think behavioral models that incorporate adolescents’ lack of maturity or ability to ‘think things through’ are much better suited to explaining dropout behavior. I’m very interested now in students’ own contribution to their schooling production function and what that production function actually looks like. We know surprisingly little about these topics, but that is starting to change.

What is one current research project that you’re particularly excited about?
I recently finished an experiment in which high school seniors at low-transition schools were guided through the college application process in class, for free. The goal was to help every Grade 12 student exit with an offer of acceptance from a college program they helped pick and a financial aid package, regardless of how certain they were about wanting to go. Many students at these schools don’t receive encouragement to go to college and must invest considerable time and focus if they want to complete all the application steps on their own. By incorporating the application process into the curriculum for everyone, students less sure about college can discover the variety of programs available to them. The option to go becomes much more real. The program, randomized at the school level, produced an increase in college-going of 5 percentage points among the entire Grade 12 class – an increase of about 12 students per school.

What is your “dream evaluation”? (It doesn’t have to be feasible!)
I’d like to be able to evaluate long-term effects for students at the margin of going to college. My interest, and that of others, in encouraging more youth to go to college (including possibly two-year vocational programs) is based on past research that is either outdated or not all that good. There’s room for improvement, and I’m hoping an experiment will come along that encourages youth who otherwise would not have gone into well-matched programs, with a treatment effect large enough to evaluate intermediate and long-term impacts on skills, finances, health, and wellbeing.

I can’t resist also suggesting a systematic evaluation of how different parenting methods affect children in the long term. There is not a lot of good evidence-based advice for parents on how they could be spending their time and money to foster children’s patience, grit, compassion, and so on. What activities are best for my kids? How should I manage screen time? How should I react to misbehavior, and what’s the best way to discipline it? Do the answers to these questions depend on gender or socioeconomic circumstance? It would be very difficult to implement field experiments that change parenting styles, but since you’re asking me to dream…

What has been your craziest experience implementing a research project?
I had a bad experience once with a funder that did not like the one-year results of a two-year study. They requested changes to the program design and site location, or else they would withdraw support. Funders usually partner to implement program evaluations because they think the program will work. It’s stressful when things don’t go exactly as planned, and even when they do, it’s no guarantee the program will work as expected.

See Phil Oreopoulos’ bio and evaluations.

Featured Affiliate Interview: Matthew Notowidigdo

Matthew Notowidigdo is an Associate Professor of Economics at Northwestern University.

What got you interested in economics, and particularly in labor and health economics?

My interest in economics started in high school.  My AP economics teacher (Betsy Sidor) taught an interesting and challenging class that covered intro micro and macro.  That got me interested in studying economics in college; I majored in both econ and computer science. I didn’t end up focusing on labor and health until part of the way through graduate school.  I initially wanted to study either Industrial Organization or Corporate Finance.  This was primarily because I thought I’d have a comparative advantage in these areas. Since I had a strong computer science background (Bachelor’s and Master’s in Computer Science), I wanted to consider IO, which uses very sophisticated computational techniques.  Additionally, I had worked on Wall Street for a couple of years, which I thought would be useful for coming up with research ideas in Corporate Finance (I also took a great class from Antoinette Schoar, which I found very inspiring). However, after working as a Research Assistant for David Autor (Labor) and Amy Finkelstein (Health), I suppose my interest in labor and health came fairly directly from them.

What is one current research project that you’re particularly excited about?

I have just started a new RCT [randomized controlled trial] with Amy Finkelstein and J-PAL North America studying a “high-touch” application assistance program for elderly individuals who are eligible for food stamps but are not currently enrolled.  This is only my second large-scale RCT, and I am very excited about it.  As with my first RCT (which studied long-term unemployment), I expect that there will be some interest from the policy community in the results.  But I’m also excited about the fact that in designing the experiment, I think we have tried our best to go beyond “what works” and learn something of broad interest to other researchers.  In particular, I hope we’ll be able to learn something general about why eligible individuals do not currently take advantage of transfer programs.

What is your “dream evaluation”? (It doesn’t have to be feasible!)
That’s an interesting question! I came up with a few ideas. First, I’d be interested in running a large-scale job training experiment that’s targeted towards the long-term unemployed. My sense is that there is still much more to learn about “what works” when it comes to job training programs.

Second, I’ve spent the last couple of years studying the economic effects of winning the lottery (one neat thing about this is that we can analyze the effects of winning using an RCT framework, since we have very detailed information on the number of lottery tickets owned at the time of each drawing). One thing this paper has gotten me interested in is the difference between a universal basic income program (as is currently being debated in Switzerland and as is currently done in some developing countries) and the current US system of taxes and transfers, which is a patchwork that creates a very complicated set of incentives. It would be a “dream” evaluation for me to be able to compare the two systems in an RCT.

Lastly, my least feasible “dream” RCT would be a large-scale randomization of unemployment insurance (UI), similar to the cluster randomization of job search assistance by Bruno Crépon and co-authors, in order to learn more about the aggregate effects of UI. There is currently an active debate in labor and macro about the “spillover” effects of UI and the role of UI as an automatic stabilizer. There have been some clever natural experiments that try to learn about the aggregate effects of UI, but these well-executed papers have come to very different conclusions. So my hope is that a “dream” RCT giving more generous UI to everyone in a local labor market (choosing local markets at random) would help advance that debate.

What is your most memorable story from the field?

The most memorable story for me from the food stamps RCT mentioned above is when the lawyers on all sides of the table were finally able to agree to the terms of our data-use agreement. I was starting to worry that the project was not going to get off the ground. It helped remind me that the world didn’t revolve around our RCT! I am very grateful for all the patience and support we got during that part of the process, especially from the MIT legal team.
