Following the (social) science: the strength of evidence in a crisis

Two billion pounds is a lot of money by anyone’s measure. Enough to fund 80 new schools, 16,000 new social homes or 26,000 nurses for a year. Or, if you’re a football fan, six times the cost of Liverpool’s Premier-League-winning starting eleven.

This is the amount Rishi Sunak pledged on Wednesday as the initial cash available for his Kickstart Scheme, designed to create new jobs for young people at risk of unemployment. And that’s just for starters: he has guaranteed that there will be no cap on the number of Kickstart places available.

The investment has been broadly welcomed. The scale of the hit from COVID is such that the Chancellor has had to commit previously unthinkable sums of money across a range of new and often unproven policies, in an attempt to shore up the economy.

What is significant about the Kickstart commitment is that, even with no cap, it is not a massive gamble in the face of an unprecedented problem. In fact, it’s a really good bet. How do we know? Because something just like it has been tried before, and crucially, it was evaluated.

I’m not the first to point out the resemblance of the new scheme to the Future Jobs Fund, launched in 2010 in response to the financial crisis of 2008.

Perhaps Kickstart looks so like the Future Jobs Fund simply because the two schemes have similar goals. But it seems likely that it’s also because the officials devising Wednesday’s Plan for Jobs could point to reliable analysis which estimated that the Fund delivered combined net benefits to participants, employers and society well above the estimated net cost to the exchequer of £3,100 per participant.

Certainly, the Chancellor was keen to demonstrate in his speech that he’s following the (social) science. In the section dedicated to supporting people to find work, he explicitly cited evidence on what works roughly once a minute.

He was able to do this because support for people seeking work is a rare case of a policy area where there is a long-standing commitment to building and using good evidence.

The only comparable topic is probably education. In that case, previous investment by government in synthesising evidence on what works to help struggling pupils has enabled the Department for Education to react to COVID with a £350 million tutoring fund. The department could be timely, but also confident that the investment would be likely to deliver the right educational outcomes.

In retrospect, these investments in the evidence seem like no-brainers. The cost of evaluating the Future Jobs Fund is dwarfed by what has been allocated to the Kickstart Scheme as a result of that evidence. And yet had the Chancellor wanted to take a similar approach to last week’s announcements on infrastructure, say, it would have been much harder. In many policy areas, investment in evidence of ‘what works’ is alarmingly low. Those areas need to follow the lead of education and employment.

Of course, we must wait and see whether Kickstart can match the impact of the Future Jobs Fund. Every good evaluator knows that the evidence can provide us with best bets, but never guarantees. The context now is different from 2010, and the devil is in the delivery detail.

We all hope that the long-term legacy of the Kickstart Scheme is a generation of young people protected from unemployment and its often devastating effects.

But the scheme is also a reminder that the perennially unglamorous and non-headline-grabbing work of evaluation and evidence synthesis is not an empty academic exercise. It is work that makes it possible for us to take bold steps in times of crisis, without playing fast and loose with billions of pounds of public money.

So I hope for an additional legacy. I hope that just a tiny fraction of the £2 billion will be allocated to evaluation, to give us even better knowledge of how to protect the next generation of young people who come face to face with a crisis like this.