We are in the process of renovating our website, and one of the changes we are planning is to raise the profile of a set of resources which we hope will be increasingly useful as more local areas try to develop their understanding of policy effectiveness: evaluation case studies.
These were originally aimed primarily at an academic audience to illustrate how different evaluation methods measured up against the Maryland Scientific Methods Scale that we use to gauge robustness. They also unpack particularly tricky aspects of the evaluation process, such as randomisation, in a way that we hoped might be useful for areas considering innovative evaluation approaches.
As our library of these case studies has grown (to over 30 now), it has developed into a useful resource for our wider audience, providing great examples of evaluations to emulate, as well as guidance on policy approaches that are having positive impacts.
We published three new ones on the website last week. Here is a thumbnail sketch of each:
- Export promotion: A Randomised Controlled Trial of a UK Trade and Investment pilot providing web-based information on the benefits and challenges of exporting found that this relatively inexpensive approach increased exports for firms already in the export market. The information had no impact on firms that were not exporting. This suggests that focussing this sort of policy on existing exporters will maximise its impact.
- Job seekers: A Randomised Controlled Trial of a programme offering online advice to job seekers in Edinburgh found significant increases in interviews secured (albeit from a low base). The picture here is complex, however: there was a very small negative impact on jobs found for those in the programme. We need more evaluation of these types of programme to understand why the benefits do not seem to carry through from interviews to hires.
- ‘Promise’ programmes: Popular in the US, these programmes offer to pay university tuition fees for students in a given area – sometimes awarded on merit, sometimes offered to all. The evaluation is not as rigorous as the RCTs above, which is not unusual given the particular challenges of evaluating any area-based programme. But the study found that these programmes do seem to have a positive impact on house prices and school enrolments. There are important nuances around how to make sure that these programmes benefit the most disadvantaged students.
If you follow the links to these case studies you will find short, hopefully helpful, summaries of important lessons for anyone working in these policy areas. Each includes a further link to more detailed analysis should you require it.
When our improved website goes online, these case studies — and others like them — will be easier to find. If you come across robust evaluations that you’d like us to add to this resource please get in touch so we can write them up and make them available to all.