Thanks to all of those who joined us in Coventry last week for an extremely useful discussion of our preliminary findings on employment training. Over 30 representatives of local councils, LEPs and other organisations came to the Guildhall and gave us feedback on our review of the evidence.
At the meeting, we explained how we sifted an initial pool of over 1,000 pieces of evidence down to 71 impact evaluations (the kind of robust evaluations that are the focus of our review). Of the 71 evaluations reviewed, 35 found positive programme impacts, and a further 22 found mixed results. Nine studies found that training interventions didn’t work (i.e. they found no evidence of positive outcomes on employment or wages), and a further five found negative impacts.
There was lots of discussion on how individual programme design features, as well as the context in which the programme operated, influenced the impact of training. We can’t go into detail until our report is published, but suffice it to say that there were many more questions than answers.
The lack of robust evaluation evidence often comes as a big surprise to practitioners. It’s also a source of considerable frustration. This highlights the vital role the Centre will play in helping practitioners improve evaluation, and thus our understanding of what works. Encouraging those running training programmes to build evaluation into the commissioning process will give us better answers to some of the questions that arose, such as the effectiveness of peer-to-peer learning, and the benefits of combining brokerage services with training.
We will be releasing the final report of our findings at the end of March.