
What Works? The UK’s new What Works Centres show their results


Yesterday saw the publication of a collection of early conclusions from the What Works Network – the family of institutions created to put evidence in the hands of commissioners and professionals.

Even if you hadn’t heard of the What Works Centres before, you’ll soon start to see their influence in many areas. The basic idea is simple: figure out ‘what works’ and put it in the hands of those who need to know it. The centres are dedicated to generating and collating evidence on ‘what works’ in healthcare, education, local economic growth and more; putting it in a form that busy practitioners can easily digest; and encouraging its active adoption.

The discussion was kicked off by Kevan Collins, the quietly spoken and impressive head of the Education Endowment Foundation (EEF). Since 2011, the EEF has funded 93 large-scale trials in around 4,500 schools. All but 5 of these trials have been randomised controlled trials, laying to rest the idea that ‘you can’t do RCTs in education’, as Kevan put it. Like most of the What Works Centres, a key output is their ‘toolkit’ – in this case, a summary of more than 11,000 studies in an easy-to-read format. Some of their work has already caused quite a stir, both by warning about the relative cost-ineffectiveness of teaching assistants and the negative impacts of repeating a year, and by highlighting cheap but effective approaches, such as peer-to-peer learning.

Alongside Kevan, there was a presentation by Kate Atkins, the Headteacher of a primary school in Lambeth. She explained how her school had developed an approach to ‘metacognition’ – helping kids learn how to learn, and to think about the learning process. Metacognition is one of the approaches that scores highly on the EEF toolkit. Kate is working with 30 schools across the UK to test a particular approach to training teachers in developing metacognition – the results aren’t in yet, but if it works it will provide a standardised approach that can be picked up much more widely.

Henry Overman, Director of the What Works Centre for Local Economic Growth, and scourge of expensive economic consultants, presented a great summary of the results from their first year of reviews. Skills development was effective, especially short training courses and those with employer involvement. In contrast, sporting and cultural events – though they might have other benefits – don’t boost local growth. He also highlighted just how poor the evidence base is in a range of areas, such as business advice and access to finance, and how much work is yet to be done to establish even basic cost-benefit advice.

Shirley Pearce gave a sneak preview of the toolkit from the What Works Centre at the College of Policing, and a glimpse of the fine judgements involved in balancing ease of comprehension against the underlying nuance of the results. This was also a theme picked up by Carey Oppenheim (CEO of the Early Intervention Foundation) and by Abdool Kara (Policy Spokesman on Evidence Based Policy at SOLACE, and CEO of Swale Borough Council) in the later panel. Abdool described how he makes sense of the enormous volume of material he and his colleagues receive purporting to be evidence, with relevance and credibility being key. ‘Who can I trust?’ he asked. ‘If the Core Cities group say that growth occurs in cities, I ignore it – they would say that, wouldn’t they?’

Sir Andrew Dillon, CEO of the National Institute for Clinical Excellence (NICE), gave a masterclass on the challenges of getting evidence adopted, even in the well-established field of medicine. He should know. NICE is the longest established of the What Works Centres by some margin, founded in 1999, with hundreds of thousands of studies to call on, and increasingly relied on for guidance by medical professionals across the world. He highlighted four challenges:

1. Getting information to those who actually make the decisions, at the right time and in the right format [note: EEF is currently in the field with a massive 6-arm trial testing exactly this with schools];
2. Organisational – ‘getting the right signals into the system…and to those who have stewardship of it’ [others also highlighted the important role regulators play in each sector in signalling that use of evidence by providers is expected, and even measured];
3. Fiscal – even much more cost-effective alternatives often bring some up-front costs, and this is particularly difficult in the context of high-pressure, annualised budgets; and
4. Behavioural – ‘no one likes being told by someone else that they can do their job better’.

We also had a presentation from Jon Baron, President of the US Coalition for Evidence-Based Policy, on developments in Washington, and a final discussion session with Oliver Letwin, Minister for Government Policy, chaired by Dawn Austwick, CEO of the Big Lottery Fund.

It’s a rare treat to hear from Oliver Letwin at such a public event. He is exceptionally thoughtful and blunt. He contrasted the What Works approach with the ‘policy-based evidence making’ that has, historically, often characterised government. He felt that we were at the beginning of an ‘immensely fruitful journey’. He noted that we ‘shouldn’t stop doing anything without evidence’ – it was important to keep innovating – but we also ‘had a duty to find out if it’s working’. His optimistic conclusion was that in 10 or 15 years we’d look back on this conversation and think it quaint: though it should never be a substitute for national debate and democracy, the application of robust methods of testing variations of approach in policy and practice – as pushed by the Behavioural Insights Team inside the centre of government itself – would be the new normal. Bring it on.

And if you haven’t looked at one of the What Works centres toolkits yet – where have you been?