
Working with policymakers to build the evidence base


As the What Works Centre for Local Economic Growth moves into its second phase, we shift our time and attention away from sifting, reviewing, and synthesising the evidence towards being collaborative creators of evidence. We want to work directly with local authorities, LEPs and others interested in driving local growth. 

Over the last three and a half years our team has sifted and reviewed more than 10,000 pieces of evidence to publish 13 evidence reviews and 18 toolkits across 11 key policy areas in the local economic growth arena. Our aim has been to create accessible summaries of what we think is the most reliable evidence on the impact of the most common policy levers used to promote local growth. And, beyond that, what we know about the relative success of different programme design features in delivering positive outcomes. All with the aim of helping practitioners easily access the evidence base when they’re thinking about the ‘best bets’ for their area, and how to maximise the impact of things they’re doing already. 

One of the things I think has frustrated users (and us!) is that, although we can learn a lot from reviewing the existing evidence, we still have many gaps. Here are just a few examples: 

  • We can say that apprenticeships improve labour market prospects (getting jobs, staying in work, moving up); and we know something about the relative effectiveness of mentoring, financial incentives and pre-apprenticeships; but we don’t yet have the breadth or richness of evidence to tell you whether they work better in some sectors than others.
  • ‘Managed brokerage’-style intensive business support interventions perform better than lighter touch interventions, but we don’t yet know whether the much higher cost of the intensive support justifies pursuing it. In fact, for some kinds of tailored support (which are likely more intensive), the evidence suggests that other approaches – such as mentoring or subsidised consultancy – may be more cost effective. 
  • We haven’t yet been able to tell you much about whether programmes do better when designed and delivered locally versus nationally. Devolved programmes should exploit local knowledge, so that should improve outcomes. But local resources and capacity may be thinner, especially right now. 
  • We have far less evidence than we would like on the impact of specific local growth policies in the UK. A lot of the evidence we are citing comes from other OECD countries – so we all want to know more about whether and how some of the findings translate to the UK context. 

That is why we think the best thing we can do over the next two years is to work with UK policymakers to develop the evidence base, and with early-career researchers to broaden the future capacity of the academic community to do applied public policy research. 

So, if you are delivering a programme aimed at improving local economic growth outcomes, our offer to you is: 

  • If you are working on devolution or funding bids and want help to develop a credible monitoring and evaluation plan to strengthen your bid, we will have a conversation with you about what’s robust and achievable for your programme. 
  • If you are wondering which way of delivering a programme will work better, and there isn’t enough in our evidence reviews and toolkits to help you, we can: talk to you about whether it’s possible to pilot variants of the programme; and help you think about how to get rapid feedback on which variant is working better, so you can roll out with greater confidence. 
  • If you’re commissioning evaluation but are not sure how to write your spec to ensure you get robust results, we can: offer advice on developing the brief, help review tender submissions, peer review deliverables, or sit on project steering groups.
  • If you are open to developing RCTs (Randomised Controlled Trials) of your local growth programmes, we can: advise on how to set up the randomisation (including helping you make the case to senior decision makers and elected officials); help you specify surveys or identify existing data sources you can use; and, in some cases, help you undertake aspects of the impact evaluation. 
  • If you are looking to carry out a robust statistical evaluation of area-based policies that are not suitable for randomisation, we can advise you on control areas, benchmarking, data sources and statistical methods and in some instances we may be able to help with parts of the impact evaluation. 

We will consider any policies or programmes that are targeted at driving local growth. At the moment, we’re particularly interested in: 

  • The impact of housing interventions on local growth outcomes
  • The impacts of non-rail public transport and active transport interventions 
  • The impacts of completed local rail or metro transport interventions beyond the major national-scale projects 
  • The relative merits (including cost-effectiveness) of different methods of delivering employment training for the unemployed and apprenticeships 
  • The relative merits (including cost-effectiveness) of different methods of delivering business support 
  • The impact of incubator, accelerator and move-on spaces aimed at increasing business start-ups, raising productivity or stimulating innovation 

But we’re open to other suggestions across the broad range of policies that support local growth. 

We want to support practitioners in getting fast feedback on what they’re doing, so that they can quickly adjust programme design in response to what looks like it’s working better. We’re really keen to help policymakers speed up the traditional policy cycle and to be more agile in programme delivery. 

We are not solely interested in supporting RCTs, but in contexts where they are feasible they are our highest priority. 

Finally, if there are areas you think are not well covered by our existing reviews and toolkits, we will consider developing more, if we can find sufficient evidence to do so. We may also update existing evidence reviews if compelling new evidence becomes available. However, we don’t want this to distract us from helping to grow the evidence base, so we will prioritise hands-on evidence creation where we can.