In putting together our evidence reviews, we’ve developed a minimum standard for the evidence that we consider. Impact evaluations need to be able to look at outcomes before and after a policy is implemented, both for the target group and for a comparison group. That feels simple enough, but we’ve found that the vast majority of local economic growth evaluations don’t meet this standard.
However, we do have enough robust studies to draw conclusions about which policies are more or less effective.
The table below summarises the evidence for employment effects: one of the key economic success measures for LEPs (Local Enterprise Partnerships) and for local economies, and the outcome with the most coverage across the different reviews.
[Table: for each policy area, the number of studies reviewed; of which SMS3 and above; of which look at employment effects; and of which show a positive impact on employment. Policy areas include Access to Finance and Sports & culture; the cell values did not survive extraction.]
First, we can see straight away that success rates vary. Active labour market programmes and apprenticeships tend to be pretty effective at raising employment (and at cutting time spent unemployed). By contrast, firm-focused interventions (business advice or access to finance measures) don’t tend to work so well at raising workforce jobs.
Second, the evidence does show that some programmes are better at meeting some objectives than others. This matters, since local economic development interventions often have multiple objectives.
For example, firm-focused policies turn out to be much better at raising firm sales and profits than at raising workforce head count. That might feed through to more money flowing in the local economy, but if employment is the priority, resources might be better spent elsewhere.
We can also see that complex interventions like estate renewal don’t tend to deliver job gains. However, they work better at delivering other important objectives – not least, improved housing and local environments.
Third, some policies will work best when carefully targeted. Improving broadband access is a good example: SMEs benefit more than larger firms; so do firms with a lot of skilled workers; so do people and firms in urban areas. That gives us some clear steers about where economic development budgets need to be focused.
Fourth, some programmes don’t have a strong economic rationale, but wider welfare considerations can still come into play. For example, if you think of the internet as a basic social right, then we need universal access, not just targeting around economic gains.
This point also applies particularly to area-based interventions such as sports and cultural events and facilities, and to estate renewal. The evidence shows that the net employment, wage and productivity effects of these programmes tend to be very small (although house price effects may be bigger). There are many other good reasons to spend public money on these programmes, just not from the economic development budget.
Finally, this table shows that we need more robust evidence. We hope that our workshops with New Economy, our How to Evaluate guidance, and lessons learned from our demonstration projects will help improve the evidence base. The demonstration projects will be the subject of blogs running through the end of the year – watch this space!