VIOLENCE AGAINST WOMEN PREVENTION PROGRAMMING GUIDE
Measure

What are other practical considerations when selecting an evaluation design?

Decision-making guides

Your evaluation design should be proportionate to the size of your program and to the time, budget and expertise you have available. The feasibility of your evaluation also depends on factors such as geographical accessibility, weather and season, and security; these should all be factored into decisions about your design. The questions below can help you think through these issues:

Available resources: questions and considerations

What budget do you have?

– An evaluation should be proportionate to the program's size and value and to what you wish to learn. A common rule of thumb is to allocate around 10% of the total program budget to evaluation, though this can sometimes be far higher. NB: An RCT generally costs $500,000 or more, so it is only justifiable for larger programs that have already shown promise (see the worked example after this list).
– A simple baseline/endline evaluation with direct beneficiaries is often the cheapest option, especially if it can be self-administered or completed with the support of facilitators before and after a program activity (for example, during the first and last session of a multi-week curriculum).
– Qualitative evaluation can be quite resource intensive, but it offers good value for money for understanding participant experiences of the program and processes of change (see the slide on common mistakes with qualitative methods).
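
Worked example (hypothetical figures): a program budgeted at $150,000 would, under the 10% rule of thumb, suggest an evaluation budget of around $150,000 × 10% = $15,000. Conversely, since an RCT typically costs $500,000 or more, the same guideline implies that an RCT is only proportionate for programs valued at roughly $5 million or above.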

What time do you have?

– To design and conduct a quantitative survey for an evaluation, you ideally need at least 6 months: to design relevant questions, ensure accurate translation, obtain ethical approval, train the research team and collect the data. Analysis can take another 3 months.
– Qualitative evaluation also takes time. As a general guide, estimate a maximum of 3-4 in-depth interviews or 2 focus group discussions (FGDs) per researcher per day, as illustrated below. Transcribing the data can take as long as collecting it, and synthesizing and analyzing it can take as long again.
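
Worked example (illustrative figures only): 24 in-depth interviews shared between two researchers, at 3 interviews per researcher per day, would take 24 ÷ (2 × 3) = 4 days of fieldwork. Applying the guide above, allow roughly another 4 days for transcription and 4 more for synthesis and analysis, or around 12 working days in total.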

What technical expertise is available to you?

– Regardless of the design you use, all evaluations of VAW prevention programs require research skills to ensure robust design of tools, sampling, training of research teams, quality control of data collection and rigorous analysis. Experimental and quasi-experimental designs additionally require specialist expertise in specific quantitative methods, and qualitative evaluation likewise requires its own specific skills.
