By Rohini Pande, Charity Troyer Moore, and Eric Dodge
This article first appeared in Alliance for Useful Evidence on October 14, 2015.

If the development community agrees on one thing, it is the need to “bridge the gap” between researchers and practitioners. If it agrees on a second thing, it’s that misaligned incentives underlie this gap.

The usual narrative goes like this:

Policymakers are under pressure from higher-ups in their ministries who, in turn, face pressure from officials with an eye on the upcoming election. This creates demand for short project cycles and evidence that is specific to their little corner of the world.

By contrast, researchers seek to make a major contribution to their field. This reduces their incentives to investigate quirky local phenomena and puts them on a much longer timeline than policymakers.

But the usual narrative presumes that the only task at hand is to identify and test the policy solution. Before you can arrive at a solution, you need to identify the problem. If you find a way to bring researchers and practitioners together at the stage of problem identification, you may align their incentives and create valuable collaborations.

While this is no easy task, our recent work offers some perspective on how it can work. With support from DFID's Building Capacity to Use Research Evidence program, our team, based out of Harvard Kennedy School's Evidence for Policy Design, is working with the Government of India to make administrative data from the Mahatma Gandhi National Rural Employment Guarantee Scheme (MGNREGS), the world's largest public works program, usable by program officials, researchers, and the public.

MGNREGS benefits up to 50 million households at an annual cost of roughly $5.47 billion. When we began our collaboration in 2013, MGNREGS had one of the largest databases of any social program in the developing world. But the website providing access sprawled over thousands of pages and required extensive knowledge to navigate.

As we worked alongside technicians at India's National Informatics Centre on what would become the MGNREGA Public Data Portal, we identified factors that had effectively buried this gold mine of data. Predictably, some related to capacity (overstrained servers, too few computers), and others to bureaucratic quirks (measures against cronyism had prevented the hiring of technical staff).

But a key set of factors related to organizational structure. After witnessing frequent mistimed communications between bureaucrats and technicians, we introduced principles of agile software development: developers quickly produce prototypes, gain early and frequent feedback from the client, and then proceed through cycles of cooperative iteration.

This experience has influenced how we view project cycles. To those who fund impact evaluations and other late-stage aspects of research, we make the case for investing in the stage of joint problem discovery with the policy partner. Tweaking the problem definition up front can lead to a very different, and potentially more positive, evaluation outcome. The delivered policy solution will be more likely to work at scale and to match policymakers' needs.

Our embedded collaboration has, for now, produced a durable product. Since the launch of the Data Portal, the Ministry itself has maintained and updated the platform, making it more robust and versatile. As we write, a single component of the Portal, the Reports Dashboard, is approaching 150,000 views by over 100,000 users in one year.

The impact evaluation of the Data Portal must wait for future stages, in which we will measure how policymakers are using the data. But the nature of the collaboration makes it more likely that our results will be policy-relevant. And both researchers and policymakers can reap the benefits of a functioning product in the meantime.