Staff Profile: Elisabeth O’Toole


What do you do at J-PAL? I provide research support for the Nurse-Family Partnership study, coordinating between the research team and providers on study implementation and analysis. I have also helped facilitate several J-PAL conferences, including for the State & Local Innovation Initiative and the Health Care Delivery Initiative. Lastly, I have been helping edit a research resource document on common challenges in running RCTs.

What drew you to want to work at J-PAL? I rightfully anticipated that J-PAL would be a perfect fit for me to put my economics background to good use. It is incredible that the research resources created and impact evaluations conducted in our department have the potential to impact domestic and even global policy.

What is your favorite place in the world that you have been? Lake Atitlan, Guatemala – I went on a zip-line tour, and despite being incredibly nervous, I loved the views as we flew over coffee fields and rain forests looking out on the lake.

If you could have dinner with one person, dead or alive, who would it be? I would love the chance to speak again with the departed Fr. Ted Hesburgh, CSC – a former president of my alma mater whose actions had meaningful global impact.

If you could buy one material thing, and money was not an issue, what would you buy? A Viking stove top/oven to bake delicious cookies.

Read Elisabeth’s bio on the J-PAL North America website.

Evaluation Summary: Duration Dependence and Labor Market Conditions

Does long-term unemployment hurt a person’s chance of returning to the workforce? According to a 2012 analysis by the U.S. Congressional Budget Office, long-term unemployment may “produce a self-perpetuating cycle wherein protracted spells of unemployment heighten employers’ reluctance to hire those individuals, which in turn leads to even longer spells of joblessness.” But despite widespread concern about this cycle of long-term unemployment, it has been difficult to credibly establish whether this challenge actually exists.

In response, Kory Kroft (University of Toronto), Fabian Lange (McGill University), and J-PAL affiliate Matthew Notowidigdo (now at Northwestern University) conducted a randomized evaluation testing the impact of the length of job applicants’ unemployment spells on firms’ callback decisions. Using a major online job board in the United States, the researchers sent roughly 12,000 fictitious resumes to 3,000 job openings, four resumes per job. For each job, researchers constructed two high-quality resumes and two low-quality resumes, then randomly assigned each resume’s employment status and, if unemployed, the length of the current unemployment spell. By randomly varying these employment characteristics across high- and low-quality resumes, researchers were able to isolate the effect of unemployment spells on firms’ callback decisions.
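To make the design concrete, the sketch below shows what this kind of per-job randomization could look like. It is a hypothetical illustration in Python, not the researchers’ actual code; the 50/50 employment split and the 1–36 month spell range are assumptions made for the example.

```python
import random

def assign_resumes(job_id, seed=None):
    """Illustrative sketch: randomize employment status and, if unemployed,
    the current unemployment spell for the four resumes sent to one job."""
    rng = random.Random(seed)
    resumes = []
    for quality in ["high", "high", "low", "low"]:  # two resumes of each quality per job
        employed = rng.random() < 0.5  # assumed 50/50 chance of being currently employed
        resumes.append({
            "job_id": job_id,
            "quality": quality,
            "employed": employed,
            # Spell length in months, drawn only for unemployed applicants;
            # the 1-36 month range is an assumption for illustration.
            "unemployment_months": None if employed else rng.randint(1, 36),
        })
    return resumes

# Example: generate the four randomized resumes for one job posting
for resume in assign_resumes(job_id=101, seed=42):
    print(resume)
```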

Overall, the evidence from this study demonstrates that longer unemployment spells reduced interview callbacks, potentially because employers consider long unemployment to be an indicator of low worker quality. Over the first eight months of an unemployment spell, the likelihood of receiving a callback from employers declined sharply the longer an applicant had been unemployed. Beyond eight months, additional time out of the workforce did not further reduce callbacks. In addition, researchers found that this effect was stronger in tight local labor markets, where fewer workers were likely to be unemployed, and among young job seekers with limited experience. Future research is needed to examine how duration dependence affects older job seekers.

For more information, read the full evaluation summary on the J-PAL North America website.

Study Cited: Kroft, Kory, Fabian Lange, and Matthew J. Notowidigdo. 2013. “Duration Dependence and Labor Market Conditions: Evidence from a Field Experiment.” The Quarterly Journal of Economics 128: 1123-67.

Partnering with J-PAL North America: A Practitioner Perspective

Photo credit: © visitBerlin, Foto: Dirk Mathesius | Flickr Creative Commons

Benefits Data Trust (BDT), a nonprofit partner, is collaborating with J-PAL North America to identify effective outreach strategies for enrolling low-income households in benefits programs. We asked Rachel Cahill, Director of Policy at BDT, a few questions about her experience partnering with J-PAL North America to design an evaluation that will answer important questions about BDT’s work.

What made you decide to partner with J-PAL North America?

I think there is a benefit to the partnership. We were already doing this work and had another evaluation, although not an RCT, and we were confident that our program worked to help low-income households apply for benefits. Really, we were seeking to answer a related question: is the intervention that we already believe is working on SNAP take-up also having an impact on health outcomes?

We got connected through the Camden Coalition and began working with Amy [Finkelstein] and Matt [Notowidigdo], discussing what we already knew, to eventually arrive at the first-order research question in this evaluation: the effect of our program on SNAP take-up.

We didn’t begin with this particular research question. It was sort of a negotiation: Matt and Amy found that there really isn’t a lot of evidence on the effect of outreach and application assistance on benefits enrollment. We bought into that approach.

It was much more of a partnership to decide on the research question.

Can you discuss one or two examples of challenges in delivering your intervention in the context of a randomized evaluation and how you worked with J-PAL to overcome them?

There were various challenges. One of the nuances of the program is that we are using data that belongs to the Pennsylvania Department of Human Services (DHS). We had to broker a three-way agreement between BDT, MIT, and DHS, which was a very large barrier to overcome.

The current Data Use Agreement allowed us to use DHS data for outreach but not for research. Ultimately we did overcome that barrier, but it did delay the launch of the evaluation.

Because we were doing an RCT with an entitlement program, we had to develop a design that was really just a waitlist control. It would be much more straightforward to get at BDT’s core question of identifying the effect of the SNAP program by randomly assigning some people to receive SNAP and others not to receive it. However, people who are eligible cannot be denied the program, because it is an entitlement.

Instead, we had to think creatively to develop high-intensity, low-intensity, and control groups in an encouragement design. We had to negotiate with MIT because we were seeking simplicity in the design. Initially, J-PAL proposed a dozen different types of outreach letters to explore the effect of many different variations on the outreach strategy. BDT explained that we can manage a lot of nuance, but that beyond a certain point our ability to manage different treatment arms would decline.

There was a tradeoff: Do you keep the design simpler or do you have a dozen treatment arms and increase the probability that you make a mistake?

What lessons or insights about your program have emerged from this partnership and this evaluation?

We have learned a lot about doing research. I joked with Amy that I estimated I would spend 20 percent of my time on this project. She asked how I was going to spend 20 percent of my time on this project, and I said to Amy that it takes 20 percent of my time to answer her questions!

To do it right, which J-PAL does, it does take a lot of thought and planning. It really takes multiple people with different types of expertise. This requires a big resource commitment from the organization.

We also have to use some political capital with our state partners. There’s an opportunity cost there: you have a limited number of requests that you can make in a given time. Looking back on things now, I still would have done the evaluation, but I would have allocated more time and more resources.

For example, we thought BDT would be involved for the first 12 months, but we didn’t budget time for the data analysis because we figured that would be MIT’s role. We realized that the engagement doesn’t end when the 12 months end, and that we still have to follow up on things like data collection.

This experience will just make us smarter in doing future research.

There’s been tremendous value in terms of what we have learned about our program.

We very quickly saw that a marketing letter formatted in a particular way for the evaluation increased the response rate by half a percentage point, which may seem small but is a lot in our field of work. That was significant enough that we didn’t even wait until the end of the RCT; we immediately thought about how to incorporate this letter design in other states besides Pennsylvania.

The discipline of setting up an RCT was also helpful.

How will your organization use the knowledge generated by the evaluation?

Our hypothesis is that the high-touch intervention will increase take-up more than the light-touch intervention and certainly more than the control group.

A common misperception in this field of social services is that just sending out a letter will be enough to increase take-up. This is founded on the premise that low take-up is just an issue of awareness, but we know that the enrollment process takes a lot more than awareness raising. Many of the people we talk to know that SNAP exists—they just can’t imagine going through all the paperwork and enrollment procedures to access the benefits they are eligible for. This is especially the case for the SNAP program, which has one of the more archaic enrollment processes.

The real game changer is whether we can demonstrate long-term outcomes on health. Amy and Matt believe that we are not sufficiently powered to detect these effects in this particular evaluation.

People are really thinking about investments in social services to decrease healthcare costs. I don’t want to put it bluntly, but that’s where the money is—healthcare is an area where the government is spending so much money. We think this is a really significant opportunity to generate definitive evidence on whether social services can prevent future health costs rather than just having a hypothesis that social services might be helpful.

That’s really what we’re striving for, and we’re willing to go down the rabbit hole to figure that out.

I really like working with the MIT team. I would explain to any other nonprofit considering a randomized evaluation what a big deal an RCT is. It’s still a big challenge to get a nonprofit to go down that rabbit hole to answer really tough questions. With a big emphasis on rapid testing, people often don’t want to wait several years to see longer-term outcomes. Staff at nonprofits often transition within three years, so it can be hard to even have the same people working for the duration of the evaluation.

I don’t say this as a critique but just as advice to J-PAL folks, especially when thinking about partners other than government: build a level of transparency about the commitment that is required going into an evaluation. I was not aware of how much this would entail when BDT signed on.

Read partner testimonials from BDT and from the South Carolina Department of Health and Human Services here.

Staff Profile: Ting Wang

 

What do you do at J-PAL? I am a research assistant for MIT professor Amy Finkelstein, working on two of her RCT projects. One project, based in Pennsylvania, studies whether an outreach intervention helps increase take-up of SNAP, also known as “food stamps,” among poor and elderly populations. The other project investigates the impact of introducing decision-making technology on providers’ decisions to order potentially unnecessary radiology scans for patients.

What drew you to want to work at J-PAL? I audited the “Consumers, Firms and Markets in Developing Countries” class taught by professor Robert Jensen at UPenn when I was on exchange and felt inspired by how economic research can help the poor. I wanted first-hand experience doing this kind of research.

What is your favorite place in the world that you have been?  I love my hometown, Beijing, very much. Besides Beijing, Kyoto in Japan.

If you had to eat only one food for the rest of your life, what would it be? Dumplings (Jiaozi).

Read Ting’s bio on the J-PAL North America website.

 

Staff Profile: Jamie Daw

What do you do at J-PAL? I work with policymakers and health care delivery organizations to develop randomized evaluations of innovative programs and policies. My focus is on facilitating collaborations between J-PAL affiliates and the winners of the Health Care Delivery Innovation Competition, all of which are deploying novel health and social services to improve health outcomes and enhance the accessibility and affordability of quality health care.

What drew you to want to work at J-PAL? J-PAL’s mission to reduce poverty across the globe and the opportunity to act as a liaison between the worlds of research and policy.

What is your favorite place in the world that you have been? My former home, Vancouver B.C., will always have my heart! The juxtaposition of wild nature and a lively, cosmopolitan city is hard to beat.

If you had to eat only one food for the rest of your life, what would it be? The Adventurer Bowl at Life Alive in Cambridge. It tastes great and covers all the food groups!

If you had a million dollars to donate, what would you give it to? Advocacy for universal, affordable child care. I’m passionate about supporting policies that promote gender equality. Access to affordable child care is a huge piece of this puzzle!

Read Jamie’s bio on the J-PAL North America website.

J-PAL North America partnering with state and local governments to tackle key policy challenges

Philadelphia skyline | Photo credit: Shutterstock

We are excited to announce that we are now inviting Letters of Interest from U.S. state and local governments that are interested in partnering with J-PAL North America through our J-PAL State and Local Innovation Initiative. This initiative, which we launched last year, supports state and local leaders who want to use randomized evaluations to generate rigorous evidence about the effectiveness of their policies and programs. You can read our official announcement here.

State and local governments across the U.S. are tackling challenging social problems, almost always with very limited resources. The state and local leaders we speak with are eager to learn what’s worked in other jurisdictions and to test which of their own approaches are most effective. The goal of the J-PAL State and Local Innovation Initiative is to provide them with the support they need to be able to answer these questions.

The state and local governments selected through the J-PAL State and Local Innovation Initiative will receive pro bono technical support from J-PAL staff, flexible pilot funding of up to $100,000, and connections with leading academic researchers from J-PAL’s network to help them design high-quality, feasible randomized evaluations. State and local governments can later apply in partnership with a researcher for larger amounts of funding, typically $250,000-$500,000, to carry out the evaluation.

If you’re interested in learning more, please visit www.povertyactionlab.org/stateandlocal. You can also register for a webinar that we’re hosting on December 8th. If you have further questions or would like to discuss a specific proposal, feel free to contact me at jchabrier@povertyactionlab.org.

We look forward to receiving your letters, learning more about the innovative work being done by state and local governments across the U.S., and finding new ways to support state and local leaders in using rigorous evidence to improve their effectiveness and ultimately the lives of their residents.

Why pilot? Lessons from a Nurse-Family Partnership case study

The research team prepares iPads for surveying.

This week we released a case study of our Nurse-Family Partnership (NFP) Pay for Success Pilot in South Carolina. This document, produced in collaboration with our partners Nurse-Family Partnership, Social Finance, the South Carolina Department of Health and Human Services, and the Harvard T.H. Chan School of Public Health, highlights the value of incorporating a pilot period into a Pay for Success (PFS) project timeline.

In addition to providing sufficient time for the NFP team to strengthen referral pipelines, ramp up operations, and develop strategies for expansion, the pilot period was critical for the research team to test, refine, and finalize the integration of the randomized study design into NFP’s enrollment process.

Randomization of study participants into treatment and control arms is the linchpin of the evaluation. By comparing the outcomes of the two randomly assigned groups, researchers can obtain an accurate estimate of NFP’s effectiveness. Monitoring the random assignment function throughout the pilot period confirmed the process was working as intended and ensured that staff were successfully implementing procedures in the field. Since the completion of the pilot period, our project team has enrolled over 1,000 first-time moms into the study!
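As a rough illustration of what randomized assignment with a simple monitoring check can look like, here is a minimal sketch in Python. It is a hypothetical example, not the study’s actual enrollment system; the 50/50 assignment probability and the function names are assumptions for illustration.

```python
import random
from collections import Counter

def assign_participant(rng):
    """Assign one enrollee to 'treatment' or 'control' with equal probability.
    Illustrative only; the study's actual assignment procedure may differ."""
    return "treatment" if rng.random() < 0.5 else "control"

def monitor_assignments(participant_ids, seed=2016):
    """Assign a batch of enrollees and report the resulting split, a simple
    check that randomization is producing roughly balanced study arms."""
    rng = random.Random(seed)
    assignments = {pid: assign_participant(rng) for pid in participant_ids}
    counts = dict(Counter(assignments.values()))
    print(f"Assigned {len(assignments)} participants: {counts}")
    return assignments

# Example: simulate random assignment for 1,000 enrolled first-time moms
monitor_assignments(range(1000))
```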

Careful study design and thoughtful execution are critical for generating high-quality, robust estimates of a program’s impact. Read more about our pilot period in Charting the Course: Reflections on the South Carolina Nurse-Family Partnership Pay for Success Pilot.