A familiar observation from the What Works Centre’s evidence reviews is just how few of the thousands of papers found in our exhaustive trawls of academic journals, databases and Google clear the methodological bar for inclusion in our analysis and recommendations. A further observation is how few of those that do come from the UK. Across the OECD, tens of billions of pounds from tightly stretched budgets are spent on local economic development every year; in a period of austerity and slowing growth, that gap in the evidence is a scandal. The What Works Centre is as unhappy about this as anyone.
Some people argue that we should simply lower our evidence standards. We are already doing this, for example in our toolkit work and in our ongoing work with a number of local authorities on improving transport evaluation and appraisal. But we need to do it in a systematic and transparent way, so that local decision makers understand the basis on which we draw our conclusions and offer advice. The What Works Centre is an evidence centre, not an ‘assertion centre’, and we use evidence standards to help decision makers weigh competing, and often contradictory, evidence.
We’re also working to help more evaluations clear our evidence bar. That’s why, over the past 18 months, the focus of our work has shifted: from scanning the OECD for existing evaluations of sufficient quality to travelling around the country with our partners New Economy Manchester, helping local authorities, LEPs and combined authorities improve the quality of their policy evaluations.
Good evaluation can be expensive, but no evaluation, or bad evaluation, can be ruinous. That is true not just for authorities that never learn whether their programmes worked, but for other authorities in the UK and around the world that then step into the same policy bear traps after them. Evaluation should be an integral part of many more projects, rather than an add-on or an afterthought, and paying for it should be built into project budgets from the outset. A £100,000 evaluation of a £3 million programme (just over 3 per cent of the budget) may feel expensive in the short run. But if it reveals that the programme has not had the forecast impact, it will repay its cost many times over when the council comes to extend or renew similar projects, or when it or other authorities pursue alternatives with greater proven impact. More broadly, is it defensible to advocate spending hundreds of millions of pounds of taxpayers’ money in any other area without also setting out the purpose of that spending and robust measures for understanding the policy’s impact?
Too often, people assume that questions about policy effectiveness can be answered by generalising assertions about the impact of particular projects to other projects and contexts. Our evidence reviews, and the systematic approach we take to evidence, try to make clear that this is hardly ever justified. Our approach to building the evidence base aims to ensure there is less and less need for such generalisation in future.