An ex-post study of project impact can inform future programming and validate development approaches. But what factors should implementers consider when planning such studies? Ilisa Gertner and Caryl Merten share their recent ex-post experience.
As international development professionals, we strive for sustainable project results and an evidence base for our approaches. But the truth is we don’t always know what happens to our efforts after a project closes. Chemonics recently conducted a self-financed ex-post study of a former agriculture project to fill that information gap. While the team did come away with evidence supporting the sustainability of the project’s outcomes, we also gleaned some valuable lessons from the process itself.
Chemonics’ commitment to learning and project assessment prompted an ex-post study of USAID’s Maximizing Agricultural Revenue and Key Enterprises in Targeted Sites II (MARKETS II) project, which operated in Nigeria from 2012 to 2017. The study examined MARKETS II’s contribution to the resilience of smallholder farmers, including women and youth, and whether the agronomic practices and market linkages established during implementation continued more than one year after closeout. The team assessed the project’s quantitative data and conducted in-country key informant interviews and focus group discussions with former beneficiaries and public and private sector stakeholders.
Here’s what we learned in the process:
1. Ensure Objectivity
Throughout the process, managing potential bias was key. With Chemonics financing the study, and three of the five researchers being former MARKETS II staff with critical project and operational knowledge, we wanted to ensure the study’s objectivity from the outset. Chemonics did this by engaging a third-party consultant, Paul McNamara, an independent researcher and extension program specialist in agricultural and consumer economics, as the team’s lead advisor. The research team also focused on collecting objective qualitative data: the DC-based researchers held two orientation sessions to prepare the in-country researchers to ask questions without leading respondents. The team compiled open-ended question guides and digitally recorded all interviews and focus groups to guard against interviewees delivering the answers they thought were expected of them. This qualitative research yielded a rich set of stories and quotes from smallholder farmers and project stakeholders, including women and youth.
2. Establish Data Rigor from the Start
The development world’s move toward evidence-based planning and decisions requires a solid data foundation. We were reminded of this while working through data collection and management challenges during our research. The study showed us that M&E rigor from the outset is crucial for improving current and future projects. It cannot be emphasized enough that staff engaged throughout the project cycle, from proposal writers to technical experts, need a solid understanding of data quality, analysis, and management to ensure data sets are reliable, complete, and properly stored. M&E is not the job of the M&E specialist alone. Laying this foundation enables statistical analyses beyond performance monitoring, both during the project and for future ex-post studies. These analyses inform adaptive management, strengthen the evidence base for technical approaches, and demonstrate the impact of development interventions. They have value even if they do not respond directly to contractual targets.
MARKETS II collected and verified a wealth of data for project reporting. It also conducted baseline, mid-term, and end-line surveys that gathered data beyond contractual requirements and gave a sense of the project’s direction. Unfortunately, these additional data sets were too incomplete to support more complex statistical analysis. Because no future use of this data was planned at the time, the project’s M&E staff understandably refocused on collecting data pertaining directly to contractual targets. Reviewing project M&E data a year after closure, when M&E staff were no longer available for assistance, forced the team to decide which data sources were worth cleaning for statistical analysis. Sifting through many Excel files to determine which data could be used diverted valuable time and resources and delayed higher-level analysis. Though challenging, these roadblocks reminded us how crucial it is for implementers to lay strong data foundations from the start.
3. Expect the Unexpected During Data Collection
The study had clearly defined research objectives and timelines, but we also planned for agile decision-making. Notably, we held near-daily, real-time check-ins with field researchers, which allowed for flexible, creative decisions and ensured quality data collection. This was critical: despite the fluid security situation, our flexibility and constant communication helped us meet our target of 18 focus groups and 12 key informant interviews. Our original research concept called for qualitative data collection in Nigeria plus a statistically representative quantitative survey after closeout. However, the budget was not secured before the research began, so the team had to spend time away from research re-strategizing to ensure substantive qualitative findings in lieu of quantitative sampling.
The initial plan identified Kebbi, Kaduna, and Kano as key states for the study. Kebbi was the project’s flagship rice state; Kaduna had the highest concentration of MARKETS II farmers (more than 80,000) and four targeted value chains; and Kano had three targeted value chains. A shifting security situation led us to cancel the Kaduna visits and instead add meetings in Kebbi and Kano to maintain a broad information base, an approach that relied on stakeholders’ last-minute availability. We tried to replace the visits with calls to Kaduna stakeholders, but poor network connections limited those conversations. Finally, with no in-country presence to handle security logistics, the home office research team spent time on security issues, diverting resources from research. When designing the research, budget limitations must be defined and funds earmarked, with backup logistics and administrative support options considered. When conducting research, flexibility is important, but you also need to return to the research objectives and timeline to determine how best to accomplish your goals with the resources you have.
Conducting an ex-post study can be challenging, but it is often rewarding. In the end, Chemonics was able to show that MARKETS II’s approach was effective: agricultural practices and market linkages were sustained and evolving a year after project closure. With careful planning and the flexibility to maintain research credibility, development practitioners can overcome challenges and learn definitively whether their approaches have a lasting impact on the beneficiaries they serve.
Posts on the Chemonics blog represent the views of the authors and do not necessarily represent the views of Chemonics.