
What works for employment training? The key questions we need to answer to aid a skills-led Covid-19 recovery


The employment impacts of Covid-19 are not yet fully visible. Unemployment has not risen as much as expected, but other figures suggest the unemployment data may not yet be capturing the full extent of the change, and it is also possible that the furlough scheme is still masking a reduction in viable employment that will only become visible after it ends. As the economy opens up, and the furlough scheme draws to a close – assuming a successful vaccine rollout – we will know more. One thing we can be sure of, though, is that skills and employment training will have an important role to play in recovery, for individuals and for the economy as a whole. This would be the case after any major economic shock, but there are reasons to believe that it will be particularly true for the current crisis.

First, because it is not just unemployment that has prevented people from working. Furlough, illness and caring responsibilities have also kept people away from their jobs, leading them to miss out on the on-the-job experience that is central to skills development and maintenance. Second, because the distribution of jobs in the labour market has changed in response to Covid-induced trends such as more online shopping and services. The nation’s skills will need to adapt to fit the new distribution, but we don’t know yet whether the trends will persist or reverse as pandemic restrictions lift. Finally, even among those in stable employment, training needs are likely to have increased, with many people needing new skills to effectively and productively work from home.

In this context, questions about ‘what works’ to train and retrain working age people are crucial. These questions will be at the heart of the discussion at the upcoming Labour Market Roundtable hosted by the Economics Observatory and the International Public Policy Observatory.

At the What Works Centre for Local Economic Growth, we aim to provide evidence-based and actionable answers to questions like these, to support people designing and delivering public policy. To help us do this, we produced a series of systematic evidence reviews (published before the pandemic), including reviews of employment training and apprenticeships. Importantly, these reviews draw only on studies which measure the impact of a programme or policy. If a study shows that outcomes improved, but cannot rule out that the improvement was the result of, say, wider economic trends rather than the intervention itself, then it isn't included. This approach means the findings have direct implications for the design and delivery of effective programmes and policies. For example, the employment training review concludes that:

  • On-the-job training programmes tend to outperform classroom-based ones. Involving employers in design, and using activities that closely mirror actual jobs appear to be particularly important.
  • Shorter programmes (less than six months) are more effective for less formal training activity. When the content is skill-intensive, longer programmes deliver employment gains.
  • The state of the economy is not a major factor in the performance of training interventions; programme design features appear to be more important than macroeconomic factors.

The reviews are accompanied by a series of ‘toolkits’ which are based on the review findings, but also draw on a wider range of practical examples. The training-related toolkits examine the evidence for particular training design features, including financial incentives, careers counselling, pre-qualifications, reminders, mentoring, in-work progression support and high-involvement management practices.

The reviews and toolkits provide useful insights for decision makers. But they also highlight how many questions have not yet been addressed by high-quality studies. For the employment training review, of more than 1,000 relevant studies, only 71 met our evidence standards. While this is a higher number than for most areas of local growth policy, it is small compared to, for example, the thousands of studies assessing the effectiveness of school-age education interventions, and it points to the need for better evaluation of employment training programmes. Questions which we can’t yet confidently answer include:

  • What types of programme are most cost-effective, given that more impactful programmes are often more expensive?
  • What are the best ways to involve employers in training design and delivery?
  • How does the effectiveness of interventions differ for participants from different groups (men and women, younger and older workers, people of different ethnicities)?

We also need more evidence on the detail of programme delivery: What is the optimal length for different types of course? At what level should financial incentives and employer subsidies be set? What training do the trainers themselves need? Some of these evidence gaps are particularly frustrating given that, in general, training and skills interventions are especially amenable to impact evaluations that compare different versions of a programme.

Despite these gaps, however, the evidence on skills and training in general is relatively strong, and relatively positive, when compared to that for other interventions aimed at improving local economic outcomes. This is perhaps to be expected given that differences in the ‘skills composition’ of different areas explain between 50 percent and 90 percent of wage disparities.

The evidence base gives us a good place to start, then, but there is much more we need to know. With this in mind, there are three questions I will be bringing to the roundtable:

  • To what extent can evidence from previous studies be applied in the current context? What specific features of the post-pandemic labour market do we need to consider?
  • How do we get the evidence to the right people: how should we present the evidence to greatest effect, and how do we reach a wider audience?
  • How can we ensure that new interventions are designed in a way that allows us to measure and compare their effectiveness, improving the evidence base ready for the next major economic shock?

This blog was originally posted on The International Public Policy Observatory.