As I travel around the country talking to practitioners I’m asked a lot of questions. The one that comes up most frequently is: ‘So, what works?’
Perhaps unsurprising given the name of the Centre but also, I think, indicative of a certain amount of confusion about what the Centre hopes to achieve.
At one level, it would be great if impact evaluation clearly said that policy A works and policy B doesn’t. Unfortunately, as our reviews to date make clear (e.g. on employment training or business advice) this is rarely going to be the case. Even if it were, I’d be amazed if policymakers suddenly stopped providing, say, business advice, if the evaluation evidence was uniformly negative.
However, for the kind of broad policy areas that we consider in our reviews, we tend to find that specific interventions work some of the time, but not always. If we are lucky, and the evidence base is relatively large, we might be able to tease out some policy features that appear to be associated with success (e.g. on-the-job training appears to work better than classroom-based training) or to suggest that policy is more effective on some outcomes than others (e.g. business advice programmes appear to improve productivity more than employment). Very occasionally, we may even be able to talk about relative cost-effectiveness (although not in any of the areas we have looked at so far).
But this is about as far as the existing evidence base can take us and it is clearly a long way from answering the ‘so, what works’ question.
For some, this simply confirms their prejudices that impact evaluation isn’t very useful in tackling real world problems. If you can’t say what works then what’s the point?
It goes without saying that we strongly disagree with that conclusion. We can’t say much now, but future impact evaluation has a vital part to play in improving the policy design process.
In order for it to fulfil that potential, however, we have to recognise that ‘so, what works?’ is often the wrong question. The policy design question that most practitioners really want answered is ‘what works better?’ Given that we are going to provide employment training, what kind of training should we be providing? Short courses or long courses? On the job or off the job? What kind of support should business advice offer? Light-touch signposting to existing providers or more intensive ‘brokerage’? What kind of training helps entrepreneurs to succeed?
Answering these questions is the fundamental objective of the What Works project. When NICE provides guidance it doesn’t try to answer the broad question of what makes us healthy. Instead, it tries to decide what treatments work best in addressing particular conditions. Similarly, the Education Endowment Foundation focuses on assessing the effectiveness of very specific interventions (34 in total, including after-school programmes, arts participation, extended school time and feedback) on improving one specific outcome: the attainment of disadvantaged pupils of primary and secondary school age.
And as NICE and the EEF show, good evaluation has a crucial role to play in helping answer these questions. In the area of local economic growth, there’s still a long way to go in shifting people’s understanding of the role that evaluation can play in addressing these policy design questions. A small, but growing, group of practitioners at both central and local government level are trying to improve evaluation and embed it in the policy design process. We are working hard to support them but, as with all such efforts, there is a long way to go. Every journey begins with a first step, though, and asking the right question is crucial to that first step.
We won’t be rebranding, but helping organisations embed evaluation in the policymaking process to understand ‘what works better’ will be central to our work over the next few years.