Ten years ago this week, What Works Growth was launched. One central question lies at the heart of the work we’ve done over the last decade – what can policy do to increase local economic growth? We believe that careful research and evaluation have a crucial role to play in answering this question and increasing the effectiveness of policymaking.
Over those ten years, we’ve produced many evidence reviews and toolkits that summarise this evidence. We’ve worked with people across local and central government to understand the challenges they face in improving policy and how we might help them. We’ve advised local and central governments thinking of, or undertaking, evaluations of many different policies. We’ve run events on specific policies and training on using evidence, running evaluations, and more.
We’ve learnt a lot, but it’s far from job done. If anything, the impact of Brexit and Covid-19, the perilous state of local and central government finance, and the need to respond to climate change make the challenges to formulating good and effective policy larger than ever. But it’s not all doom and gloom as the rest of this blog post – which summarises highlights and lessons from our current and previous team members – makes clear.
Lynne Miles, Co-founder and Deputy Director to February 2020
Highlights: I’m really proud of the evidence reviews we developed that are the foundation of our work. I still find myself pointing people to them, regularly.
And using this picture of David Gandy to explain to the layperson why it’s important to understand that things other than your intervention might have an impact on outcomes. Actually, most people have an intuitive understanding of what good evaluation looks like, once you get past the maths and jargon. At the end of the day, most people understand the need not to compare apples with oranges. I really enjoyed going out and working with people in local government, local enterprise partnerships, and combined authorities – and watching them realise that they knew more than they thought they did about good evaluation.
Source: Vitabiotics, 2016
Most important lessons: With the best will in the world, sometimes the cards are stacked against evaluating a project well. Maybe it is too small, maybe data is not available, or too much other stuff is happening in the same place at the same time. What matters most is really understanding the evidence that’s already out there when you’re making recommendations or decisions about what to do.
Meg Kaufmann, Co-founder, Project Manager and Head of Outreach until September 2021
Highlights: The sessions we had at the very beginning arguing over how to interpret and then present the results of the original evidence reviews. A professional highlight was navigating that tension between the academic approach, simplicity of messaging, and meeting our audience’s expectations.
Most important lessons: Never get tired of telling the scared straight story to people who have never heard it before. It works every time. Sometimes a simple illustration is exactly what is needed (which brings us back to David Gandy).
Max Nathan, Co-founder and Deputy Director to February 2021
Highlights: I’m very proud that we all built something, from scratch, that’s shaped policy and practice for the better.
Most important lessons: First lesson – it’s a very slow burn! Building an institution and changing culture needed all ten of those years. Second lesson – most places now have *less* capacity to impact-test policy than in 2013. It’s more important than ever that What Works Centres help this to happen. I wish we had realised this sooner!
Megan Streb, Head of Outreach, joined in January 2022
Highlights: A recent highlight was hearing from someone who attended our logic model training last year. As a result of our workshop, they have embedded logic models within the business planning for their government department. This puts the use of evidence to understand need, and the role of monitoring and evaluation in tracking outcomes, front and centre.
Most important lessons: My ethos at What Works Growth has been to ‘meet people where they are’ when it comes to the use of evidence and evaluation. And a big part of that is acknowledging the wide range of expertise in local government economic development teams. There isn’t a one-size-fits-all approach to using evidence. As we’re a small organisation, it can be a challenge to find the right mix of activities as people have different existing knowledge and experience in policy areas.
Gonzalo Nunez Chaim, Research Economist, joined in July 2018
Highlights: Seeing the increasing interest of places in engaging and interacting with robust evidence.
Most important lessons: That even a small piece of advice could make a substantial difference. One data point can speak more than a thousand words.
Victoria Sutherland, Deputy Director, Evidence, joined in July 2020
Highlights: Any time we can help a policymaker make a better decision or help them to evaluate a policy. Local economic growth is a fundamentally positive field – policymakers want to improve the lives of their communities – so any opportunity to help them do that more effectively is a win. Taking an evidence-based approach can often be risky for them, so they deserve all the support they can get.
Most important lessons: That we don’t actually know that much about what works. It continues to astonish me how little robust evaluation evidence there actually is on local economic growth policies, and that this isn’t seen as more of a problem. But the second of these is beginning to change which, alongside improvements in methods, will help address the first. And it’s exciting to be part of that shift. I take comfort from the fact that other policy areas have gone through this process before us, and in many – such as medicine – it is now unthinkable that policies wouldn’t be evidence-based.
Those are some of our key insights and highlights from the last ten years. If you have your own, please share them with us. And, as ever, if you’re interested in any area of our work please get in touch.