David Card and co-authors have just published an update of their 2010 meta-analysis of 207 active labour market programmes (ALMPs). The earlier version of this work fed into our very first evidence review, on the impact of employment training, so we've been interested to see the messages emerging from the update. It's also provided a chance to reflect on the way we produce our reviews and on how meta-evaluations should inform our work. This blog considers the substantive points. In a follow-up, we'll consider the methodological issues.
In terms of substantive messages, it’s important to note that the scope of the programmes considered here is broader than those we considered in our employment training review. Specifically, these programmes involve at least one of:
- Training (classroom or on the job)
- Job search assistance
- Sanctions for failing to search
- Subsidised employment, in either the private or public sector.
In practice these components are often packaged together: for example, the Work Programme offers training, job search assistance and sanctions, plus some work placement elements. Card et al. find that such ‘search and sanctions’ programmes are especially popular in the UK, other ‘Anglo-Saxon’ countries and some ‘Nordic’ countries.
What they find
First, ALMPs’ average effects are close to zero in the short run, but become more positive two to three years later. This is consistent with what we found for employment training: overall effects are ‘modest’; short-term effects are often very low (because of the ‘lock-in’ problem, for example), but impacts may rise over time.
Second, the time profile of impacts varies by programme type. Programmes that focus on building skills (such as training and private sector work placements) have bigger effects in the medium and long run, and overall. ‘Work first’ programmes of the search-and-sanctions type have bigger short-term effects, as we would expect. Finally, public sector employment subsidies tend to have negligible or even negative impacts at all horizons. Again, this is broadly consistent with what we found for employment training, although we didn’t look explicitly at the effects of different programme types over time (partly because we were drawing on a smaller body of evidence).
Third, effects vary across different types of people: women and the long-term unemployed experience the biggest gains from ALMPs. Card and co-authors find ‘suggestive’ evidence that ‘search and sanctions’ programmes work better for ‘disadvantaged’ participants (they don’t explain who counts as disadvantaged, or why that might be), while private sector employment subsidies work best for the long-term unemployed. Again, while we considered differences across types of people as part of our employment training review, the smaller evidence base made it hard to reach firm conclusions.
Fourth, ALMPs are more likely to show positive impacts in a recession. This chimes with other studies covering several European countries (Kluve 2010), Germany (Lechner and Wunsch 2009) and Sweden (Forslund et al. 2011). This is the one noticeable difference from what we found: the effects of employment training didn’t seem to differ much according to the state of the local economy. It’s not clear what explains the difference. ALMPs may have a more critical role to play in weak labour markets; alternatively, in recessions, when all kinds of people find themselves out of work, ALMPs might have a stronger set of participants to work with, not just the ‘hardest to help’.
Overall, it’s reassuring that many of these findings are in line with those we reported in our employment training review. But once again, this study highlights the importance of expanding the evidence base available to help us understand what works best.