
How to evaluate employment training: Adult education vouchers (RCT)


What was the programme and what did it aim to do?

This study tests the effectiveness of adult education vouchers on labour market outcomes, using Swiss data. The vouchers were part of a national programme that aimed to encourage lifelong learning by covering some or all of course costs, and the Swiss government organised a large-scale field experiment to test their effects. Vouchers were issued for 200, 500 and 1,500 Swiss Francs (approx. £134 / £333 / £1,000), which amounted to roughly a 13 per cent, 33 per cent or 100 per cent subsidy of an average course. Vouchers were targeted at programmes covering soft skills and vocational training, rather than formal qualifications.
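The subsidy percentages quoted above can be sanity-checked with a quick calculation. This is a back-of-envelope sketch, assuming an average course cost of roughly 1,500 CHF, which is implied by the 1,500 CHF voucher covering about 100 per cent of costs, not a figure stated in the paper:

```python
# Back-of-envelope check of the subsidy rates quoted in the text.
# The average course cost of ~1,500 CHF is an inference from the
# "100 per cent subsidy" figure, not a number from the paper.
avg_course_cost = 1500  # CHF (assumed)

for voucher in (200, 500, 1500):
    rate = voucher / avg_course_cost
    print(f"{voucher} CHF voucher covers about {rate:.0%} of an average course")
```

This reproduces the 13 / 33 / 100 per cent figures quoted above.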

What’s the evaluation challenge?

Evaluating the effect of employment training programmes is difficult because such programmes are used by specific types of individual. Typically, unemployed individuals will either volunteer for a programme and/or be selected by a caseworker. As a result of this selection, if we compare outcomes for individuals who received training with outcomes for those who did not, the differences may not reflect the impact of the programme. Instead, they may simply reflect differences in the types of individuals who went on the programme.

What did the evaluation do?

The study addressed the selection problem by implementing a randomised controlled trial. Participants were recruited through the Swiss Labour Force Survey: 1,422 people were randomly selected to receive vouchers, with 9,099 people in the control group. After randomisation, the treatment and control groups were tracked through the Labour Force Survey and asked about their subsequent earnings, employment status and take-up of further training.
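With random assignment, the headline analysis can be very simple: the intention-to-treat (ITT) effect is just the difference in mean outcomes between the voucher and control groups. A minimal sketch using simulated data (the group sizes match the study, but the outcomes and the 0.5 "true" effect are invented for illustration):

```python
import random
import statistics

random.seed(42)

# Simulated data, NOT the study's data: an earnings index with a
# small built-in treatment effect of 0.5 for the voucher group.
def simulate_outcome(treated: bool) -> float:
    return random.gauss(100.0, 10.0) + (0.5 if treated else 0.0)

treatment = [simulate_outcome(True) for _ in range(1422)]   # voucher group size from the study
control = [simulate_outcome(False) for _ in range(9099)]    # control group size from the study

# Intention-to-treat estimate: difference in group means.
itt = statistics.mean(treatment) - statistics.mean(control)
print(f"Estimated ITT effect: {itt:.2f}")
```

Because assignment was random, no further adjustment for observable or unobservable differences is needed for this comparison to be unbiased.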

How good was the evaluation?

According to our scoring guide, an RCT receives a maximum of 5 (out of 5) on the Maryland Scientific Methods Scale (Maryland SMS). This is because randomisation controls for both observable (e.g. age) and unobservable (e.g. motivation) characteristics of people that might affect outcomes. By randomising the value of the voucher, the experiment also helped to identify the most effective level of support. Of course, not everyone in the treatment group actually used their vouchers – the redemption rate was just under 19 per cent – but the researchers were able to observe what type of people were most likely to go through with the programme. And by selecting both groups through an existing national study, the research was able to minimise people dropping out (the 'attrition' issue). For these reasons, we score this study as a 5 on the Maryland SMS.
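Because only a minority of the treatment group redeemed their vouchers, the average (intention-to-treat) effect understates the effect on those who actually took up training. A standard, stylised adjustment, assuming the control group had essentially no access to vouchers, divides the ITT effect by the redemption rate (the Wald estimator). The numbers below are illustrative, not from the paper; only the roughly 19 per cent redemption rate comes from the study:

```python
# Stylised Wald / local average treatment effect (LATE) scaling.
# Only the ~19% redemption rate is taken from the study; the ITT
# value is a hypothetical placeholder for illustration.
redemption_rate = 0.19   # share of the voucher group who redeemed
itt_effect = 0.5         # hypothetical ITT estimate (earnings units)

effect_on_users = itt_effect / redemption_rate
print(f"Implied effect on voucher users: {effect_on_users:.2f}")
```

The low redemption rate also matters for statistical power: a given effect on voucher users shows up as a much smaller, harder-to-detect average effect across the whole treatment group.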

What did the evaluation find?

The research found that vouchers increased take-up of adult education, but only by very small amounts. As a result, the average effects of vouchers on earnings and employment one year later were small and non-significant. The least skilled participants saw the biggest gains – but they were the least likely to redeem their vouchers (perhaps because they were least able to take time off from work). Vouchers significantly increased the number of people taking further training courses, but almost 20 per cent of people who used vouchers did so instead of taking a course at work (paid for by an employer).

What can we learn from this?

The study provides very high quality evidence that training vouchers can boost labour market outcomes – but, in this case, only for the low-skilled. Could we replicate this work in the UK? Yes. It could be done by Whitehall – but also at city level, if local authorities have access to raw LFS data, some flexibility over adult skills policy, and can target the programme at a largely self-contained labour market.

In the absence of a UK trial for a similar programme, this evaluation still holds lessons for the UK. It suggests that programmes have to be designed and targeted carefully. Incentives to participate may need to be high, given the real participation barriers many low-skilled people face. Policymakers also need to think about unintended effects on other training activity – especially crowding out courses that firms might offer for free. Targeting vouchers at the unemployed rather than low-skilled in-work groups might be most effective from a value for money point of view (although, again, a further trial would help test this idea).

Reference

Schwerdt, G., Messer, D., Woessmann, L., and Wolter, S.C. (2012). The impact of an adult education voucher program: Evidence from a randomized field experiment. Journal of Public Economics 96, 569–583. [Study 223 from our Employment Training review]
